Commit 4f03f9c0 authored by Johann Dreo's avatar Johann Dreo

more on algorithms

parent 8b2c7697
......@@ -64,6 +64,139 @@ Algorithmics
> The better you understand it, the better the algorithm will be.
Naive algorithms
----------------
- Enumeration
- Grid search
- Random search
- Low-discrepancy random number generators
- Random walk
- Stochastic convergence: there is a non-zero probability of sampling the optimum
  after a finite number of steps.
- Conventional convergence: a stochastic search will tend to stabilize over time
  on a (better) objective function value.
- Ergodicity
- Quasi-ergodicity
- Necessary condition for stochastic convergence: quasi-ergodicity.
- When is a random walk convergent?
- Example: 2D fixed-step size random walk.
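The 2D fixed-step random walk example might be sketched as follows (the step length, budget, and seed here are arbitrary illustrative choices):

```python
import math
import random

def random_walk_2d(steps, step_size=1.0, seed=None):
    """2D random walk: fixed step length, uniformly random direction."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    path = [(x, y)]
    for _ in range(steps):
        angle = rng.uniform(0, 2 * math.pi)  # uniform random direction
        x += step_size * math.cos(angle)
        y += step_size * math.sin(angle)
        path.append((x, y))
    return path

path = random_walk_2d(100, step_size=0.5, seed=42)
```

Every move has the same length, so only the direction is random; this is the quasi-ergodicity question above made concrete: which points of the plane can such a walk approach arbitrarily closely?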
Descent Algorithms
-----------------
Generic template:
```python
x = None                   # no incumbent yet: assume select keeps p at first
p = uniform(xmin, xmax)    # initial candidate, drawn in the search domain
for i in range(g):
    x = select(x, p)       # keep either the incumbent or the candidate
    p = variation(x)       # derive a new candidate from the incumbent
    p = evaluate(p)        # attach its objective function value
```
Greedy algorithm:
```python
def select(x, p):
    # keep whichever of the incumbent and the candidate is better
    if better(f(x), f(p)):
        return x
    else:
        return p
```
What are the conditions for which a greedy algorithm would converge?
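The generic template with a greedy select can be instantiated end-to-end on a toy problem; a minimal runnable sketch, assuming a 1D sphere objective and a uniform perturbation operator (all names and parameters here are illustrative):

```python
import random

def f(x):
    return x * x  # toy objective: minimize the 1D sphere function

def better(fx, fp):
    return fx < fp  # minimization

def variation(x, step=0.1):
    return x + random.uniform(-step, step)  # small uniform perturbation

def select(x, p):
    # greedy: keep the better of the incumbent and the candidate
    return x if better(f(x), f(p)) else p

random.seed(1)
x = random.uniform(-5, 5)  # initial point, plays the role of the first candidate
for _ in range(1000):
    p = variation(x)
    x = select(x, p)
```

Because only improving moves are accepted, the objective value is non-increasing, but the search can stall on any local optimum the perturbation cannot escape.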
Simulated Annealing
-------------------
```python
def select(x, p):
    # Accept the candidate if it improves on the incumbent,
    # or with the Boltzmann probability exp(-(f(p)-f(x))/T) otherwise.
    if f(p) < f(x) or uniform(0, 1) <= exp(-(f(p) - f(x)) / T):
        return p
    else:
        return x

T = decrease(T)  # cooling schedule, applied once per iteration
```
What occurs when T is high? When it is low?
Relationship to Metropolis-Hastings algorithm.
Simulated annealing can be seen as sampling from a parametrized approximation of the objective function
(i.e. morphing from a uniform distribution towards Dirac distribution(s) located on the optima).
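The effect of the temperature can be illustrated numerically: for a candidate worse by delta = f(p) - f(x) > 0, the acceptance probability exp(-delta/T) tends to 1 when T is high (behavior close to a random walk) and to 0 when T is low (behavior close to a greedy descent). A minimal sketch:

```python
import math

def acceptance(delta, T):
    """Probability of accepting a candidate worse by delta (> 0)."""
    return math.exp(-delta / T)

delta = 1.0
p_hot = acceptance(delta, T=100.0)   # high temperature: almost always accept
p_cold = acceptance(delta, T=0.01)   # low temperature: almost never accept
```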
Evolutionary Algorithms
-----------------------
Generic template:
```python
P = uniform(xmin, xmax, n)   # initial population of n solutions
for i in range(g):
    parents = selection(P)
    offsprings = variation(parents)
    P = replacement(parents, offsprings)
```
More complete template:
```python
def evol(f):
    opt = float('inf')                        # best-so-far
    P = uniform(xmin, xmax, n)
    for i in range(g):
        parents = selection(P)
        offsprings = variation(parents)
        offsprings = evaluate(offsprings, f)
        opt = best(opt, parents, offsprings)  # update the best-so-far
        P = replacement(parents, offsprings)
    return opt
```
Evolution Strategies (ES), numerical space:
```python
def variation(parents):
    P = []
    for x in parents:
        # Gaussian mutation around each parent
        P.append(x + normal(mean, variance))
    return P
```
Genetic Algorithm (GA), boolean space:
```python
def variation(parents):
    crossed = crossover(parents)
    mutated = mutation(crossed)
    return mutated
```
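For the boolean case, the two operators might be sketched as follows (one-point crossover, a per-bit flip rate, and the toy bitstring population are illustrative assumptions):

```python
import random

def crossover(parents):
    """One-point crossover over consecutive pairs of bitstring parents."""
    offspring = []
    for a, b in zip(parents[::2], parents[1::2]):
        cut = random.randrange(1, len(a))  # crossover point
        offspring.append(a[:cut] + b[cut:])
        offspring.append(b[:cut] + a[cut:])
    return offspring

def mutation(pop, rate=0.05):
    """Independent bit-flip mutation with a per-bit probability."""
    return [[1 - bit if random.random() < rate else bit for bit in x]
            for x in pop]

random.seed(0)
parents = [[0] * 8, [1] * 8, [0, 1] * 4, [1, 0] * 4]
offspring = mutation(crossover(parents))
```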
Is ES convergent?
If $$\#P=1$$ (a population reduced to a single individual), what are the differences with a random walk?
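With a population of one and greedy replacement, the selection step is the only thing distinguishing the search from a random walk; a minimal (1+1)-ES sketch (the sphere objective, step size sigma, and budget are illustrative assumptions):

```python
import random

def f(x):
    return sum(xi * xi for xi in x)  # sphere function, to be minimized

def one_plus_one_es(dim=2, sigma=0.3, iters=2000, seed=0):
    """(1+1)-ES: one parent, one Gaussian offspring, greedy replacement.
    A random walk would keep the offspring unconditionally instead."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    for _ in range(iters):
        p = [xi + rng.gauss(0, sigma) for xi in x]
        if f(p) <= f(x):  # selection: remove this line and it is a random walk
            x = p
    return x

x = one_plus_one_es()
```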
Estimation of Distribution Algorithms
-------------------------------------
```python
def variation(parents, law, n):
    parameters = estimate(parents, law)      # fit the law to the selected parents
    offsprings = sample(parameters, law, n)  # draw n new solutions from it
    return offsprings
```
What would be an example for numerical problems?
For boolean problems?
How to ensure convergence?
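For the numerical case, a common instantiation fits a normal law on the selected parents and resamples from it; a self-contained sketch (the 1D sphere objective, truncation selection, and all parameter values are illustrative assumptions):

```python
import random
import statistics

def f(x):
    return x * x  # toy 1D objective, to be minimized

def eda(n=50, g=30, seed=0):
    """Univariate Gaussian EDA: select the best half, fit a normal law, resample."""
    rng = random.Random(seed)
    P = [rng.uniform(-10, 10) for _ in range(n)]
    for _ in range(g):
        parents = sorted(P, key=f)[: n // 2]        # truncation selection
        mu = statistics.mean(parents)               # estimate the law's parameters...
        sigma = statistics.pstdev(parents) + 1e-12  # ...keeping sigma non-degenerate
        P = [rng.gauss(mu, sigma) for _ in range(n)]  # sample n offsprings
    return min(P, key=f)

best = eda()
```

For boolean problems the same scheme applies with a product of Bernoulli laws (one frequency per bit); keeping sigma (or the frequencies) away from degenerate values is one way to address the convergence question above.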
Ant Colony Algorithms
---------------------
TODO
Problem modelization
====================
......@@ -71,7 +204,8 @@ Problem modelization
> Way to model a solution: encoding.
## Main models
Main models
-----------
> Encoding:
>
......@@ -86,7 +220,8 @@ Problem modelization
> - multi-objectives (cf. Pareto optimality).
## Constraints management
Constraints management
----------------------
> Main constraints management tools for operators:
> - penalization,
......@@ -229,8 +364,8 @@ Algorithm Design
> Convergence definition(s):
>
> - strong,
> - weak.
> - conventional,
> - stochastic convergence.
>
> Neighborhood: subset of solutions attainable after an atomic transformation:
>
......
......@@ -12,7 +12,7 @@ The framework implements a simple sensor placement problem
and handle metaheuristics manipulating solutions represented as
numerical vectors or bitstrings.
Author: Johann Dreo <johann.dreo@thalesgroup.com> ⓒ Thales group
Author: Johann Dreo <johann@dreo.fr>.
Executable
----------
......