Modern Optimization Methods > Chapter 1. Introduction > 1.1 About optimization
Hi, everyone.
Welcome to the course Optimization Methods.
First, let me introduce a little bit about this course.
My name is Qingna Li,
from the Beijing Institute of Technology.
This is some related information about this course.
Now, first we have several reference books.
All of them are in English.
For example, Numerical Optimization,
Convex Analysis, Convex Optimization,
as well as Convex Optimization Theory.
If you have more questions,
please contact me by email or cell phone,
or come to my office.
Now let's briefly talk about
what this course covers.
In this course,
we will introduce what is optimization.
Then we will go through
unconstrained optimization problems
as well as constrained optimization problems.
Finally, if we have time,
we will talk about some applications and software.
Now we start with chapter one,
the introduction.
We will mainly talk about three issues.
First is what is optimization.
Second is the classification of optimization.
Finally,
we will introduce preliminaries in convex analysis.
Now let's go to the first part of chapter one.
What is optimization?
So think about it: what is optimization?
Actually, optimization exists everywhere.
For example, people optimize.
Why?
As long as you have choices,
you do optimization.
For example, if you have a lot of money,
you may do portfolio selection.
You try to maximize your return.
Meanwhile, you try to avoid the risk.
This is a kind of optimization.
Moreover, for engineering people,
they will sometimes maximize
the efficiency of the system.
They try to select proper parameters in the system
so that the efficiency of the system is maximized.
Another very practical example
comes from daily life: when you go out travelling,
you may choose different paths.
For example, if I go to university,
I will search on Google Maps,
and it will show me several paths that I can choose from.
For example, if I choose the first one,
it will take the smallest amount of time.
However, it may cost some money
because it goes by the highway.
And the last path
actually takes the longest time;
however, it saves a lot of money.
So different paths
serve different aims,
which is a kind of optimization.
Now, for nature, it optimizes as well.
For example, a physical system
is always looking for the minimum potential energy
so that it reaches a stationary state.
And an isolated chemical system
will keep reacting
until it reaches its minimum energy state.
So both examples show that
nature optimizes as well.
The final example is that light also optimizes,
for example, it always travels
following the shortest path.
So this is really common sense.
Now you may ask, what is optimization?
There are three key points in optimization.
The first key point is objective.
So the objective means
what kind of aim you want to achieve;
for example, if I do portfolio selection,
I may try to maximize my return.
So in this case,
the profit is actually my objective.
I try to maximize the profit.
And in other applications,
for example, if you do path selection,
then you may try to save your time.
In this case,
time will be your aim or your objective.
So this is the first key point,
and the second key point is the variables,
or in other words, the unknowns.
They represent the decisions you want to make.
So in portfolio selection,
the decisions are how much of each stock to buy,
or which other shares to buy.
These are the decision variables you want to determine.
And the third key point is the constraints.
Constraints are actually very natural,
because in practice you face several restrictions,
and those restrictions are the constraints.
Overall, these are the three key points in optimization.
Now, to do optimization,
we actually follow three procedures.
The first procedure is modelling.
That means we have to translate
the real application problem from the natural world
into the mathematical world.
So this is the procedure of modelling.
The second step is that we try to solve
this optimization problem using some algorithms.
And this is actually the main part
we are going to focus on in this course.
Finally, the third part is checking:
how to stop the algorithm.
We need to set up a stopping criterion
in order to make the algorithm terminate.
The stopping criteria may be of several kinds.
For example, if the optimality conditions are satisfied,
then we stop the algorithm.
Otherwise, if we cannot reach
the optimality conditions in finite time,
we need to do some sensitivity analysis.
These are the three steps when we do optimization.
Now let's pay attention to the first step:
the mathematical modeling of optimization.
It is actually very important to do this first step well.
Mathematical modeling means that
you try to translate the natural-language statement
into a mathematical statement.
So optimization is the minimization
or maximization of a function
subject to some restrictions or conditions.
And when we do the optimization,
we need to pay attention
to the following three key points.
This is what we have mentioned before.
For example, the objective function:
we try to set up our objective function
in a reasonable way.
What is a reasonable way?
It means that
this f should represent our aim properly.
And usually, in a very natural way,
this f will be a function
from the n-dimensional space R^n to the real numbers R.
This is the objective.
And for the second key point, the variables,
naturally they are represented as a vector x
in n-dimensional space.
Of course, sometimes this x will be a matrix.
Anyway, in this class,
we will focus on the case of vectors.
The third part is the constraint.
We actually use constraint functions ci,
each of which also maps the n-dimensional space
to the real numbers.
And these restrictions actually represent
equality constraints and inequality constraints
that the variables must satisfy.
So we have put the key points
into mathematical form:
the objective function f,
the variable x, as well as the constraints ci.
Now we introduce the mathematical form
of the optimization problem.
So basically, it takes the following form.
We minimize the objective f(x)
subject to some equality constraints
and inequality constraints.
So you can find the three key points easily.
First is the objective here.
And second, you have the variables
that you want to make decisions about.
They sit below this min notation.
This is the second key point.
And the third key point appears here.
So you got equality constraints
and inequality constraints.
We also collect the corresponding indices
into the index sets E and I.
Finally, we have one extra piece of notation,
this "s.t." here.
It actually means "subject to";
in other words,
this variable x must
satisfy these constraints here.
So this is notation we will use
quite often in optimization problems.
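Written out in symbols, the general form just described is (f is the objective, x the variable, and E and I the index sets of the equality and inequality constraints):

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\qquad \text{s.t.} \qquad
c_i(x) = 0, \; i \in \mathcal{E},
\qquad
c_i(x) \ge 0, \; i \in \mathcal{I}.
```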
And the final issue is that if your aim is to maximize,
then we can easily transfer it into
a minimization problem by just adding a minus sign:
maximizing f is the same as minimizing -f.
Therefore, in our following lectures,
we will always talk about
this kind of optimization problem.
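As a quick numerical check of this max-to-min trick, here is a tiny sketch in Python; the one-dimensional function and the grid are made up for illustration, not taken from the lecture:

```python
# Maximizing f is the same as minimizing -f:
#   max_x f(x)  has the same solution as  min_x (-f(x)).
# Illustrative function (an assumption): f(x) = -(x - 3)^2 + 5,
# whose maximizer is x = 3.
def f(x):
    return -(x - 3) ** 2 + 5

grid = [i / 10 for i in range(61)]  # coarse grid over [0, 6]

x_max = max(grid, key=f)                # maximize f directly
x_min = min(grid, key=lambda x: -f(x))  # minimize -f instead

print(x_max, x_min)  # both give 3.0
```

Both calls land on the same point, which is exactly why we can restrict attention to minimization problems.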
Now let's take a look at a simple example.
We have a minimization problem
with respect to x1 and x2.
The objective is
a quadratic function that we minimize,
and we also have some inequality constraints.
So if we recall the three key points,
then for the first one,
the objective, we define
f(x) as the quadratic term here.
And for the variables, we collect x1 and x2
into the vector x;
then x is the decision variable.
Finally, there are two inequality constraints,
and we put them into the greater-than-
or-equal-to-zero form.
So this c1 is actually minus x1
squared plus x2 here:
we add a minus sign before the constraint
to make the constraint here
greater than or equal to zero.
And the inequality index set I will be {1, 2}.
We have no equality constraints,
so the equality index set E here is the empty set.
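To make this bookkeeping concrete, here is a small Python sketch with the same structure as the example; the particular quadratic objective and the second constraint are assumptions chosen only for illustration, not the lecture's exact numbers:

```python
# Three key points in code: objective f, variables x = (x1, x2),
# and constraints written in the c_i(x) >= 0 form.

def f(x):
    # assumed quadratic objective, for illustration only
    return (x[0] - 2) ** 2 + (x[1] - 1) ** 2

def c1(x):
    # -x1^2 + x2 >= 0  (the minus sign brings it into >= 0 form)
    return -x[0] ** 2 + x[1]

def c2(x):
    # assumed second inequality constraint, already in >= 0 form
    return 2 - x[0] - x[1]

def is_feasible(x):
    # a point is feasible when every inequality constraint holds
    return c1(x) >= 0 and c2(x) >= 0

print(is_feasible([1.0, 1.0]))  # True: both constraints hold (with equality)
print(is_feasible([2.0, 1.0]))  # False: c1 = -3 < 0
```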
Now let's take a look at
what this example tells us.
Let's look at this figure.
And think about what you can see from this figure.
So in this figure,
there are actually two types of lines.
One type is the dotted lines here,
if you can see them.
And the dotted lines actually
describe the contours of the function f.
A contour of the function f is defined
as a set of points
whose function values are the same.
These are the contours of function f.
Now here you see three dotted circles.
So the smaller the circle is,
the smaller the function value is.
Then you can find that
if we try to minimize this function value,
of course, the minimum will be reached
at the center of these circles.
So this would be the minimizer of the function f
if there were no restrictions on x.
However, we got some restrictions.
We got some inequality restrictions,
as we mentioned before.
The inequality restrictions actually
give the c1 constraint and the c2 constraint here.
So the intersection of the two parts
forms the feasible region.
This feasible region is actually
the set of points that satisfy the restrictions.
So actually we are looking for the point
in this feasible region
at which the function value is minimized.
Comparing the feasible region
with the contours of the function f,
I guess you can easily find the local minimizer.
Actually, the minimum is reached here,
at this point x*,
at the intersection of these contours
with the feasible region here.
So this is the minimizer.
So by this example,
we can actually find that
if the optimization problem is very simple,
then it is possible to demonstrate it
with a geometric graph
and find the local minimizer by hand.
However, in real applications,
the situation is not that easy.
Sometimes you have very complicated functions
and very complicated constraints,
so it is not trivial to solve the problem
by drawing pictures like this.
So that's why we need to design numerical algorithms.
We need to design optimization algorithms
to find the optimal solution.
So this is the main aim of this course.
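As a toy illustration of what a numerical algorithm does, here is a brute-force grid scan over the feasible region; the quadratic objective and the two constraints are illustrative assumptions, not the lecture's exact problem, and real algorithms iterate far more cleverly than this:

```python
# Brute-force sketch: scan a grid of candidate points, keep the best
# feasible one. The goal is the same as any optimization algorithm:
# minimize f over the feasible region.
# The objective and constraints below are assumptions for illustration.

def f(x):
    return (x[0] - 2) ** 2 + (x[1] - 1) ** 2

def feasible(x):
    # both inequality constraints in >= 0 form must hold
    return -x[0] ** 2 + x[1] >= 0 and 2 - x[0] - x[1] >= 0

best, best_val = None, float("inf")
step = 0.01
for i in range(-200, 201):       # x1 in [-2, 2]
    for j in range(-200, 201):   # x2 in [-2, 2]
        x = (i * step, j * step)
        if feasible(x) and f(x) < best_val:
            best, best_val = x, f(x)

print(best, best_val)  # minimizer found at (1.0, 1.0) with f = 1.0
```

For this instance the scan recovers the point where both constraints are active, matching the geometric picture of contours touching the boundary of the feasible region.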
Let's make a brief summary of this lecture.
In this lecture we introduced a little bit
about what optimization is,
the three key points in optimization,
and the mathematical formulation of optimization.
Finally, we showed an example to see
how we can obtain the solution of an optimization problem.
In the next lecture,
we will introduce more about optimization.
So we stop here for this class, and see you next time.