Chapter 5: Gradient-free methods

1) Introduction
~ Gradient-based (GB) methods from Ch. 3 were useful and efficient at finding local minima for high-dimensional, nonlinear, constrained problems defined by smooth functions.
* The objective function may:
- lack differentiability
- lack continuity
- have multiple extrema
⇒ Use gradient-free methods:
↳ when the function is not differentiable
↳ For different start points, would gradient-based methods get the global optimum when there are multiple extrema?
= A solution could be random-start gradient search (multistart approach):
1) Choose a gradient-based optimization method
2) Generate K random start points
3) Perform a local optimization from each start point
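The multistart recipe can be sketched in a few lines. The plain gradient-descent local optimizer, step size, and test function below are illustrative assumptions, not the specific method from Ch. 3:

```python
import random

def multistart(f, grad, bounds, k=20, steps=300, lr=0.01, seed=0):
    """Random-start gradient search (sketch): run a simple gradient
    descent from k random start points and keep the best local optimum."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(k):
        x = rng.uniform(*bounds)      # 2) generate a random start point
        for _ in range(steps):        # 3) local gradient-based search
            x -= lr * grad(x)
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# 1-D multimodal example: x**4 - 3*x**2 + x has a local minimum near
# x ≈ 1.13 and the global minimum near x ≈ -1.30.
f = lambda x: x**4 - 3*x**2 + x
g = lambda x: 4*x**3 - 6*x + 1
x_best, f_best = multistart(f, g, (-2.0, 2.0))
```

With enough random starts, at least one falls into the basin of the global minimum, so the best local result is (with high probability) the global one.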
- How to escape local optima? → more exploration, like random search?
= Yes, but exploration should be balanced with exploitation.
Advantages of gradient-free methods:
↳ easier to set up, more universal
↳ but less efficient, particularly for problems with many design variables
- Some feature a global search → ↑ likelihood of finding the global minimum
= Another reason could be multi-objective problems (e.g., genetic algorithms handle these well), or discrete variables: the derivative of a discrete variable is invalid, so GB methods are invalid for discrete variables.
We will cover the following algorithms, which can be categorized as:
- Deterministic, local optimization: Nelder-Mead
  ↳ Nelder-Mead is a simplex algorithm; very exploitative, uses local information.
- Stochastic, single-state: simulated annealing
  ↳ one function evaluation at a time; motivated to explore first, exploit later.
- Stochastic, population-based: genetic algorithm and particle swarm
  ↳ multiple function evaluations per iteration; the population evolves each iteration.
3) Nelder-Mead
↳ Simplex algorithm: deterministic, direct-search.
↳ Based on a simplex: a geometric figure defined by n_x + 1 points in the design space of n_x variables.
↳ Each iteration corresponds to a different simplex.
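An initial simplex of n_x + 1 points can be built by offsetting a start point along each coordinate in turn; this right-angled construction is one common choice, and the step size here is an illustrative assumption:

```python
import numpy as np

def initial_simplex(x0, step=0.1):
    """Build a simplex of n + 1 points around x0 by offsetting each
    coordinate in turn (one common construction, not the only one)."""
    x0 = np.asarray(x0, dtype=float)
    pts = [x0]
    for i in range(len(x0)):
        p = x0.copy()
        p[i] += step
        pts.append(p)
    return np.array(pts)  # shape (n+1, n): n_x + 1 points in n_x dimensions

s = initial_simplex([1.0, 2.0])  # 3 points in a 2-D design space
```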
~ The algorithm modifies the simplex each iteration using 5 simple operations.
↳ Each iteration aims to replace the worst point with a better one, gaining a new simplex.
** Each iteration starts with reflection → new point using x_r = x_c + α(x_c − x_w), with α = 1 for reflection (x_c is the centroid of the simplex points excluding the worst point x_w).
↳ If the reflected point is better → expansion, with α = 2.
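The reflection/expansion step can be sketched as follows. This covers only two of the five operations (contraction and shrink are omitted), and the acceptance rule is a simplified assumption:

```python
import numpy as np

def reflect_expand(simplex, f, alpha=1.0):
    """One Nelder-Mead reflection step (sketch): reflect the worst point
    through the centroid of the remaining points; if the reflected point
    beats the current best, try expansion (alpha = 2)."""
    # order points from best to worst by objective value
    simplex = simplex[np.argsort([f(p) for p in simplex])]
    x_w = simplex[-1]                    # worst point
    x_c = simplex[:-1].mean(axis=0)      # centroid excluding the worst
    x_r = x_c + alpha * (x_c - x_w)      # reflection (alpha = 1)
    if f(x_r) < f(simplex[0]):           # better than the current best?
        x_e = x_c + 2.0 * (x_c - x_w)    # expansion (alpha = 2)
        if f(x_e) < f(x_r):
            x_r = x_e
    simplex[-1] = x_r                    # replace the worst point
    return simplex

sphere = lambda p: float(np.sum(p**2))
s = reflect_expand(np.array([[3.0], [1.5]]), sphere)  # 1-D simplex
```

In the full algorithm, a reflected point that is worse than the remaining points would instead trigger a contraction toward the centroid, or a shrink of the whole simplex.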