Chapter 3: Unconstrained Gradient-based Optimization
1 Introduction
- Previous chapter: broad overview; now less descriptive, more fundamental to numerical optimization.
→ This chapter covers unconstrained optimization.
* Definition: Unconstrained optimization with a single objective is defined as

    x* = min f(x),   x ∈ ℝ^(n_x),

  with x being the design variables, f: ℝ^(n_x) → ℝ being a scalar function, and x* being the optimal values of the design.
- We solve these problems using gradient information to determine steps.
~ The objective function is in general NONLINEAR; we assume it to be continuous and deterministic.
[Diagram: the Optimizer passes a design x to the Analysis, which returns f(x); no constraints; the loop converges to x*.]
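This loop (gradient information determining steps, no constraints) can be sketched with a basic steepest-descent iteration; the objective, starting point, and step size below are chosen only for illustration and are not from the notes.

```python
import numpy as np

# Sample quadratic objective with minimizer x* = (1, -0.5) (chosen for illustration).
def f(x):
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)])

# Steepest descent with a fixed step size: each step moves against the gradient.
x = np.array([5.0, 5.0])
for _ in range(200):
    x = x - 0.1 * grad_f(x)

print(x)  # converges toward the minimizer x* = (1, -0.5)
```

Practical methods choose the step size by a line search rather than fixing it, but the structure (evaluate gradient, step downhill, repeat) is the same.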
2 Fundamentals
* Derivatives and gradient
→ The gradient of a scalar objective function f(x) is the column vector

    ∇f(x) = [∂f/∂x_1, …, ∂f/∂x_(n_x)]^T

  ↳ Each component quantifies the rate of change of f with respect to the corresponding design variable.
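Each gradient component can be checked numerically with a forward difference; the objective below is a sample quadratic chosen for illustration.

```python
import numpy as np

def f(x):
    # Sample quadratic objective (coefficients chosen for illustration).
    return x[0] ** 2 + x[0] * x[1] + 2.0 * x[1] ** 2

def gradient_fd(f, x, eps=1e-6):
    # Forward-difference estimate of the column vector [df/dx_1, ..., df/dx_nx]^T:
    # perturb one design variable at a time and measure the change in f.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x)) / eps
    return g

x = np.array([1.0, 2.0])
print(gradient_fd(f, x))  # close to the analytic gradient [2*x1 + x2, x1 + 4*x2] = [4, 9]
```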
* Def: The directional derivative of a function f in the direction of p is defined as

    ∇_p f(x) = lim_(ε→0) [f(x + εp) − f(x)] / ε = ∇f(x)^T p = ‖∇f‖ ‖p‖ cos θ,

  with θ being the angle between ∇f and p.
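The two forms of the definition agree numerically; the sample function, point, and direction below are assumptions for illustration.

```python
import numpy as np

# Sample quadratic objective and its analytic gradient (chosen for illustration).
def f(x):
    return x[0] ** 2 + x[0] * x[1] + 2.0 * x[1] ** 2

def grad(x):
    return np.array([2 * x[0] + x[1], x[0] + 4 * x[1]])

x = np.array([1.0, 2.0])
p = np.array([1.0, -1.0])
eps = 1e-6

fd = (f(x + eps * p) - f(x)) / eps  # limit definition with a small finite eps
exact = grad(x) @ p                 # grad(f)^T p
print(fd, exact)                    # both close to -5
```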
* Curvature and Hessian
- The rate of change of the gradient is the curvature. This is also useful information: it tells us how the slope of the function changes.
- These second-order derivatives are represented by the Hessian, the symmetric (n_x × n_x) matrix H(x) with entries

    H_ij = ∂²f / ∂x_i ∂x_j.
- As for the gradient, we can find the rate of change of the gradient in an arbitrary direction p:

    H p = lim_(ε→0) [∇f(x + εp) − ∇f(x)] / ε
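This Hessian-vector product can be approximated by differencing gradients; the sample quadratic below (and hence its constant Hessian) is an assumption for illustration.

```python
import numpy as np

# Gradient of the sample quadratic f = x1^2 + x1*x2 + 2*x2^2 (chosen for illustration);
# for a quadratic the Hessian is constant.
grad = lambda x: np.array([2 * x[0] + x[1], x[0] + 4 * x[1]])
H = np.array([[2.0, 1.0], [1.0, 4.0]])

x = np.array([1.0, 2.0])
p = np.array([3.0, -1.0])
eps = 1e-6

hvp = (grad(x + eps * p) - grad(x)) / eps  # rate of change of the gradient along p
print(hvp, H @ p)                          # both close to [5, -1]
```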
* Def: To find the curvature of the one-dimensional function along a direction p, we project the vector Hp onto that direction:

    p^T H p
    (1 × n_x)(n_x × n_x)(n_x × 1) → a 1 × 1 scalar.
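The projection collapses the matrix to a single number per direction; the Hessian values below are assumed for illustration.

```python
import numpy as np

H = np.array([[2.0, 1.0], [1.0, 4.0]])  # Hessian of a sample quadratic (values assumed)

# Projecting Hp onto p gives a 1x1 scalar: the curvature along that direction.
for p in (np.array([1.0, 1.0]), np.array([1.0, -1.0])):
    print(p, p @ H @ p)  # 8.0 along (1, 1), 4.0 along (1, -1)
```

Different directions see different curvature; the extremes are attained along the eigenvectors of H, which the next point introduces.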
- It is possible to find v_i, i = 1, …, n_x, the eigenvectors of H:

    H v_i = κ_i v_i.

  The v_i represent the principal curvature directions, and the κ_i are the corresponding eigenvalues (curvatures) of H.
* Example: for the quadratic function

    f(x_1, x_2) = x_1² + x_1 x_2 + 2 x_2²,

  the Hessian is

    H = [ 2  1
          1  4 ].

  Eigenvalues: det(λI − H) = (λ − 2)(λ − 4) − 1 = λ² − 6λ + 7 = 0

    ⇒ λ = (6 ± √(36 − 28)) / 2 = 3 ± √2,

  so λ_1 = 3 + √2 and λ_2 = 3 − √2.
  Corresponding eigenvectors:

    v_1 = (1, 1 + √2)^T  and  v_2 = (1, 1 − √2)^T.
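The eigenvalue calculation can be verified numerically; the Hessian entries below, H = [[2, 1], [1, 4]], are the reconstructed example values and should be treated as an assumption.

```python
import numpy as np

H = np.array([[2.0, 1.0], [1.0, 4.0]])  # Hessian entries assumed for illustration

kappa, V = np.linalg.eigh(H)  # eigendecomposition of a symmetric matrix
print(kappa)                  # close to [3 - sqrt(2), 3 + sqrt(2)]

# Each column v_i satisfies H v_i = kappa_i v_i: a principal curvature direction.
for k, v in zip(kappa, V.T):
    print(np.allclose(H @ v, k * v))  # True for every eigenpair
```

`np.linalg.eigh` returns the eigenvalues in ascending order, so the smallest curvature comes first.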