Genetic algorithms:
+ Widely used in every discipline.
+ Effective in complex problems on large search spaces.
+ Works for both discrete and real-valued candidate solutions
+ Easy to understand and extend
+ Supports multi-objective optimization
+ Effective with noisy/non-differentiable problems or non-analytic problems
+ Well suited to MOO since they are robust w.r.t. the complexity of the problem and retain a set of solutions, each of which satisfies the objectives
- Premature convergence
- Loss of diversity
- Assumes a genotype representation and suitable genetic operators
- Can require high computational time
- The simple GA is limited to a binary representation
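The simple GA above can be sketched on a toy one-max problem (maximize the number of 1-bits). The fitness function, tournament selection, one-point crossover, and all parameter values below are illustrative choices, not prescribed by these notes:

```python
import random

def one_max(bits):                       # toy fitness: count of 1-bits
    return sum(bits)

def tournament(pop, fits, k=3):          # pick best of k random individuals
    picks = random.sample(range(len(pop)), k)
    return pop[max(picks, key=lambda i: fits[i])]

def ga(n_bits=20, pop_size=30, generations=60, p_mut=0.05, p_cx=0.9):
    random.seed(0)
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=one_max)
    for _ in range(generations):
        fits = [one_max(ind) for ind in pop]
        new_pop = [max(pop, key=one_max)]          # elitism: carry over the best
        while len(new_pop) < pop_size:
            p1, p2 = tournament(pop, fits), tournament(pop, fits)
            if random.random() < p_cx:             # one-point crossover
                cut = random.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # bit-flip mutation: xor each bit with a Bernoulli(p_mut) draw
            child = [b ^ (random.random() < p_mut) for b in child]
            new_pop.append(child)
        pop = new_pop
        best = max(best, max(pop, key=one_max), key=one_max)
    return best

best = ga()
```

Note how premature convergence and loss of diversity (the cons above) would show up here: with too-strong selection pressure or too-low `p_mut`, the population collapses onto copies of one genotype.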
Differential evolution:
+ Effective and popular
+ Archives of poorly fitting individuals can be used to improve diversity and performance
+ Changing population size can be used to balance exploration and exploitation.
+ Adaptive parameters can help to alter its behavior according to the characteristics of the fitness landscape
+ Can be used for nonlinear and nondifferentiable continuous space functions
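A minimal sketch of the classic DE/rand/1/bin scheme on a sphere function (the objective, bounds, and parameter values `F`, `CR` are illustrative assumptions):

```python
import random

def sphere(x):                       # toy objective: minimize the sum of squares
    return sum(v * v for v in x)

def differential_evolution(f, dim=5, pop_size=20, F=0.5, CR=0.9,
                           gens=200, bounds=(-5.0, 5.0)):
    random.seed(1)
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # pick three distinct individuals, all different from i
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = random.randrange(dim)        # guarantee one mutated gene
            # mutation (a + F * (b - c)) combined with binomial crossover
            trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                     if (random.random() < CR or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:                      # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

best_x, best_f = differential_evolution(sphere)
```

No gradient is used anywhere, which is why DE applies to nondifferentiable continuous functions as the bullet above states.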
Covariance matrix adaptation evolution strategies:
+ Exploits possible correlations between the variables
+ Adapts its search path so as to increase the likelihood of successful steps
+ Largely self-tuning (almost no hyper-parameters to set)
+ Invariant to scaling, rotation, and translation of the search space
+ Restarts and an adaptive population size can strongly improve performance
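To illustrate how CMA-ES exploits correlations between variables: a full implementation also maintains evolution paths and cumulative step-size adaptation, which are omitted here. The heavily simplified sketch below keeps only weighted recombination and a rank-μ covariance update, with an ad hoc learning rate and a fixed step-size decay, so it is an illustration of the idea rather than the real algorithm:

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def simple_cma(f, dim=4, lam=16, sigma=0.5, gens=60, seed=0):
    # Simplified CMA-ES sketch: no evolution paths, no CSA (both dropped here).
    rng = np.random.default_rng(seed)
    mu = lam // 2
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                  # recombination weights
    mean = rng.uniform(-3, 3, dim)
    C = np.eye(dim)                               # covariance matrix of the sampler
    for _ in range(gens):
        pop = rng.multivariate_normal(mean, sigma ** 2 * C, lam)
        order = np.argsort([f(x) for x in pop])
        elite = pop[order[:mu]]                   # best mu of lambda samples
        old_mean = mean
        mean = w @ elite                          # weighted recombination
        y = (elite - old_mean) / sigma
        # rank-mu update: blend C toward the covariance of successful steps
        C = 0.8 * C + 0.2 * (y.T * w) @ y         # 0.8/0.2 is an ad hoc choice
        sigma *= 0.97                             # crude fixed decay instead of CSA
    return mean, f(mean)

m, fm = simple_cma(sphere)
```

The covariance update is what lets the sampling distribution elongate along correlated, successful directions (the first bullet above).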
Swarm intelligence:
+ Based on information exchange about promising solutions.
+ Well suited to real-valued problems (discrete versions exist)
+ Simple and efficient
+ Does not require the problem to be differentiable
Particle swarm optimization:
+ Simple
- Natively limited to real-valued vectors
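The standard global-best PSO update (inertia weight plus cognitive and social pulls) can be sketched as follows; the sphere objective and the parameter values `w`, `c1`, `c2` are common illustrative defaults, not fixed by the notes:

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def pso(f, dim=4, n_particles=20, iters=150, w=0.7, c1=1.5, c2=1.5):
    random.seed(2)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal best positions
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]       # global best (the shared info)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward own best + pull toward swarm best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

gbest, gbest_f = pso(sphere)
```

The shared `gbest` is the "information exchange about promising solutions" mentioned under swarm intelligence above.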
Ant colony optimization:
+ Can be extended with elitism
+ Excellent for constrained discrete problems
+ Adapts to new distances
+ Retains memory of entire colony
+ Less affected by poor initial solutions
+ Suited for combinatorial and constructive problems
- Less intuitive
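A sketch of basic Ant System on a tiny TSP instance shows the constructive nature: each ant builds a tour edge by edge, biased by pheromone and heuristic desirability. The city coordinates and parameters `alpha`, `beta`, `rho`, `Q` below are made up for illustration:

```python
import random, math

# tiny symmetric TSP instance (coordinates are invented for this example)
cities = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3), (2, 8)]
n = len(cities)
dist = [[math.dist(a, b) for b in cities] for a in cities]

def tour_length(tour):
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def ant_colony(n_ants=10, iters=100, alpha=1.0, beta=3.0, rho=0.5, Q=1.0):
    random.seed(3)
    tau = [[1.0] * n for _ in range(n)]            # pheromone trails (colony memory)
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:                   # constructive step-by-step tour
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                # desirability = pheromone^alpha * (1/distance)^beta
                wts = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
                tour.append(random.choices(cand, weights=wts)[0])
            tours.append((tour, tour_length(tour)))
        # evaporate, then deposit pheromone proportional to tour quality
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for tour, length in tours:
            for i in range(n):
                a, b = tour[i], tour[(i + 1) % n]
                tau[a][b] += Q / length
                tau[b][a] += Q / length
        it_best = min(tours, key=lambda t: t[1])
        if it_best[1] < best_len:
            best_tour, best_len = it_best
    return best_tour, best_len

best_tour, best_len = ant_colony()
```

The pheromone matrix `tau` is the "memory of the entire colony" from the bullets above; an elitist variant would additionally reinforce the best-so-far tour each iteration.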
Multi-objective particle swarm optimization:
+ Balances convergence and diversity
+ Can be extended with mutation operators and ε-dominance (yielding OMOPSO) to reduce crowding
- Many parameters to tune
OMOPSO:
+ Better distribution of solutions along the Pareto front
- Additional hyper-parameter tuning
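The ε-dominance used to thin OMOPSO's archive can be sketched with one common additive formulation for minimization (the exact variant and `eps` value are assumptions here; OMOPSO itself uses a box-based scheme):

```python
def eps_dominates(a, b, eps=0.1):
    """Additive epsilon-dominance for minimization:
    a eps-dominates b if, after relaxing each of a's objectives by eps,
    a is no worse than b everywhere and strictly better somewhere."""
    shifted = [ai - eps for ai in a]
    return (all(s <= bi for s, bi in zip(shifted, b))
            and any(s < bi for s, bi in zip(shifted, b)))
```

Relaxing dominance by `eps` lets one archive member "cover" a whole neighborhood of near-equivalent solutions, which is what reduces crowding, at the cost of tuning the extra `eps` hyper-parameter noted above.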
NSGA-II:
+ Introduces elitism
+ Uses fast non-dominated sorting to rank solutions
+ Leverages crowding distance to maintain diversity in the population
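The two NSGA-II ranking mechanisms above can be sketched directly (minimization assumed; the objective vectors in the usage below are made up for illustration):

```python
def dominates(a, b):                       # Pareto dominance (minimization)
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_non_dominated_sort(objs):
    n = len(objs)
    S = [[] for _ in range(n)]             # points each solution dominates
    counts = [0] * n                       # how many points dominate each solution
    fronts = [[]]
    for p in range(n):
        for q in range(n):
            if dominates(objs[p], objs[q]):
                S[p].append(q)
            elif dominates(objs[q], objs[p]):
                counts[p] += 1
        if counts[p] == 0:                 # dominated by nobody: first front
            fronts[0].append(p)
    i = 0
    while fronts[i]:                       # peel off successive fronts
        nxt = []
        for p in fronts[i]:
            for q in S[p]:
                counts[q] -= 1
                if counts[q] == 0:
                    nxt.append(q)
        fronts.append(nxt)
        i += 1
    return fronts[:-1]

def crowding_distance(objs, front):
    d = {i: 0.0 for i in front}
    m = len(objs[0])
    for k in range(m):                     # accumulate per-objective gaps
        order = sorted(front, key=lambda i: objs[i][k])
        d[order[0]] = d[order[-1]] = float("inf")   # boundary points always kept
        span = objs[order[-1]][k] - objs[order[0]][k] or 1.0
        for j in range(1, len(order) - 1):
            d[order[j]] += (objs[order[j + 1]][k] - objs[order[j - 1]][k]) / span
    return d

objs = [(1, 4), (2, 2), (4, 1), (3, 3)]    # illustrative objective vectors
fronts = fast_non_dominated_sort(objs)
dists = crowding_distance(objs, fronts[0])
```

Survivors are chosen front by front; within the last partially admitted front, larger crowding distance wins, which is exactly how NSGA-II combines elitism with diversity preservation.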