DrEO / sho · Commits

Commit 4f03f9c0, authored Sep 14, 2021 by Johann Dreo: "more on algorithms"
parent 8b2c7697
Changes: 2
LESSON.md @ 4f03f9c0
@@ -64,6 +64,139 @@ Algorithmics
> The better you understand it, the better the algorithm will be.
Naive algorithms
----------------

Enumeration

Grid search

Random search

Low-discrepancy random number generators

Random walk
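
Two of the strategies above, grid search and random search, can be sketched in
a few lines; the toy objective `f`, the bounds, and the budget are assumptions
for this example:

```python
import random

def f(x):
    # toy objective to minimize (an assumption for this example)
    return (x - 0.3) ** 2

def grid_search(f, xmin, xmax, n):
    # evaluate f on n evenly spaced points, keep the best one
    step = (xmax - xmin) / (n - 1)
    return min((xmin + i * step for i in range(n)), key=f)

def random_search(f, xmin, xmax, n, seed=0):
    # evaluate f on n points drawn uniformly at random
    rng = random.Random(seed)
    return min((rng.uniform(xmin, xmax) for _ in range(n)), key=f)
```

Grid search covers the domain deterministically, while random search only
covers it in expectation, which tends to scale better with the dimension.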

Stochastic convergence: there is a non-zero probability of sampling the optimum
after a finite number of steps.

Conventional convergence: a stochastic search will tend to stabilize over time
on a (better) objective function value.

Ergodicity

Quasi-ergodicity

Necessary condition for stochastic convergence: quasi-ergodicity.

When is a random walk convergent?

Example: a 2D fixed-step-size random walk.
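
This example can be sketched as follows; the step length `delta` and the
uniformly random choice of direction are assumptions for the sketch:

```python
import math
import random

def random_walk_2d(steps, delta=0.1, seed=None):
    # fixed-step-size walk: each move has length delta in a direction
    # drawn uniformly at random
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    path = [(x, y)]
    for _ in range(steps):
        angle = rng.uniform(0.0, 2.0 * math.pi)
        x += delta * math.cos(angle)
        y += delta * math.sin(angle)
        path.append((x, y))
    return path
```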
Descent Algorithms
------------------

Generic template:
```python
x = None
p = uniform(xmin, xmax)
for i in range(g):
    x = select(x, p)
    p = variation(x)
    p = evaluate(p)
```
Greedy algorithm:
```python
def select(x, p):
    if better(f(x), f(p)):
        return x
    else:
        return p
```
What are the conditions for which a greedy algorithm would converge?
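
Putting the generic template and the greedy `select` together gives a runnable
sketch; the objective `f`, the bounds, and the Gaussian neighborhood operator
are assumptions for the example, and evaluation is folded into `select` for
brevity:

```python
import random

def f(x):
    # toy objective to minimize (an assumption for this example)
    return x * x

def better(fx, fp):
    return fx < fp  # minimization

def select(x, p):
    # greedy: keep whichever of the incumbent and the candidate is better
    if x is None:
        return p
    return x if better(f(x), f(p)) else p

def variation(x, sigma=0.1):
    # assumed neighborhood operator: a Gaussian step around x
    return x + random.gauss(0, sigma)

def descent(g=1000, seed=0):
    random.seed(seed)
    x = None
    p = random.uniform(-5, 5)
    for i in range(g):
        x = select(x, p)
        p = variation(x)
    return x
```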
Simulated Annealing
-------------------

```python
def select(x, p):
    # accept the candidate p if it is better, or, with a probability
    # given by the Metropolis criterion, even if it is worse
    if f(p) < f(x) or uniform(0, 1) <= exp(-1/T * (f(p) - f(x))):
        return p
    else:
        return x

T = decrease(T)  # cooling schedule, applied once per iteration
```
What occurs when T is high? When it is low?
Relationship to the Metropolis-Hastings algorithm.
Sampling in a parametrized approximation of the objective function
(i.e. from uniform to Dirac(s)).
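
A runnable sketch of the whole loop; the toy objective, the Gaussian
neighborhood, and the geometric cooling schedule `T <- alpha * T` are
assumptions chosen for this example:

```python
import math
import random

def f(x):
    # toy objective to minimize (an assumption for this example)
    return x * x

def simulated_annealing(f, x0, T0=1.0, alpha=0.99, sigma=0.5, g=2000, seed=0):
    rng = random.Random(seed)
    x, T = x0, T0
    for _ in range(g):
        p = x + rng.gauss(0, sigma)  # candidate in the neighborhood of x
        delta = f(p) - f(x)
        # accept improving moves always, worsening ones with the
        # Metropolis probability exp(-delta / T)
        if delta < 0 or rng.uniform(0, 1) <= math.exp(-delta / T):
            x = p
        T = alpha * T  # geometric cooling
    return x
```

With a high `T` almost every move is accepted and the search behaves like a
random walk; as `T` decreases, worsening moves become rare and the behavior
tends toward a greedy descent.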
Evolutionary Algorithms
-----------------------

Generic template:
```python
P = uniform(xmin, xmax, n)
for i in range(g):
    parents = selection(P)
    offsprings = variation(parents)
    P = replacement(parents, offsprings)
```
More complete template:
```python
def evol(f):
    opt = float('inf')
    P = uniform(xmin, xmax, n)
    for i in range(g):
        parents = selection(P)
        offsprings = variation(parents)
        offsprings = evaluate(offsprings, f)
        opt = best(parents, offsprings)
        P = replacement(parents, offsprings)
    return best(P)
```
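
The template can be instantiated on a toy numerical problem; truncation
selection, Gaussian mutation, and elitist replacement are assumptions chosen
for the example, and evaluation is folded into the `f` calls for brevity:

```python
import random

rng = random.Random(0)
xmin, xmax, n, g = -5.0, 5.0, 20, 100

def f(x):
    # toy objective to minimize (an assumption for this example)
    return x * x

def selection(P):
    # truncation selection: keep the better half as parents
    return sorted(P, key=f)[: len(P) // 2]

def variation(parents):
    # Gaussian mutation, two offspring per parent to restore the size
    return [x + rng.gauss(0, 0.5) for x in parents for _ in range(2)]

def replacement(parents, offsprings):
    # elitist replacement: keep the n best among parents and offspring
    return sorted(parents + offsprings, key=f)[:n]

def evol(f):
    P = [rng.uniform(xmin, xmax) for _ in range(n)]
    for i in range(g):
        parents = selection(P)
        offsprings = variation(parents)
        P = replacement(parents, offsprings)
    return min(P, key=f)
```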
Evolution Strategies (ES), numerical space:
```python
def variation(parents):
    P = []
    for x in parents:
        P.append(x + normal(mean, variance))
    return P
```
Genetic Algorithm (GA), boolean space:
```python
def variation(parents):
    crossed = crossover(parents)
    mutated = mutation(crossed)
    return mutated
```
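
The `crossover` and `mutation` operators left undefined above could look like
this on bitstrings; one-point crossover and independent bit-flips are
assumptions for the sketch, and many other operators exist:

```python
import random

rng = random.Random(0)

def crossover(parents):
    # one-point crossover on consecutive pairs of bitstrings
    offsprings = []
    for a, b in zip(parents[::2], parents[1::2]):
        cut = rng.randrange(1, len(a))  # crossover point
        offsprings.append(a[:cut] + b[cut:])
        offsprings.append(b[:cut] + a[cut:])
    return offsprings

def mutation(P, rate=0.05):
    # flip each bit independently with probability `rate`
    return [[1 - bit if rng.random() < rate else bit for bit in x] for x in P]

def variation(parents):
    crossed = crossover(parents)
    mutated = mutation(crossed)
    return mutated
```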
Is ES convergent?
If $$\#P = 1$$, what are the differences with a random walk?
Estimation of Distribution Algorithms
-------------------------------------

```python
def variation(parents, law, n):
    parameters = estimate(parents, law)
    offsprings = sample(parameters, law, n)
    return offsprings
```
What would be an example for numerical problems?
For boolean problems?
How to ensure convergence?
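
For numerical problems, one answer is to use a normal law: estimate its mean
and standard deviation from the selected parents, then sample the offspring
from it. The toy objective, truncation selection, and population sizes are
assumptions for this sketch:

```python
import random
import statistics

rng = random.Random(0)

def f(x):
    # toy objective to minimize (an assumption for this example)
    return x * x

def estimate(parents):
    # fit a normal law: mean and standard deviation of the parents
    return statistics.mean(parents), statistics.stdev(parents)

def sample(parameters, n):
    mu, sigma = parameters
    return [rng.gauss(mu, sigma) for _ in range(n)]

def eda(f, n=50, g=50):
    P = [rng.uniform(-5, 5) for _ in range(n)]
    for i in range(g):
        parents = sorted(P, key=f)[: n // 2]  # truncation selection
        P = sample(estimate(parents), n)
    return min(P, key=f)
```

Note that re-estimating a single Gaussian tends to collapse the variance
quickly; keeping a lower bound on the variance is one way to help ensure
convergence.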
Ant Colony Algorithms
---------------------

TODO
Problem modelization
====================
@@ -71,7 +204,8 @@ Problem modelization
> Way to model a solution: encoding.
-## Main models
+Main models
+-----------
> Encoding:
>
@@ -86,7 +220,8 @@ Problem modelization
> - multi-objectives (cf. Pareto optimality).
-## Constraints management
+Constraints management
+----------------------
> Main constraints management tools for operators:
> - penalization,
@@ -229,8 +364,8 @@ Algorithm Design
> Convergence definition(s):
>
-> - strong,
-> - weak.
+> - conventional,
+> - stochastic convergence.
>
> Neighborhood: subset of solutions attainable after an atomic transformation:
>
README.md @ 4f03f9c0
@@ -12,7 +12,7 @@ The framework implements a simple sensor placement problem
and handles metaheuristics manipulating solutions represented as
numerical vectors or bitstrings.
-Author: Johann Dreo <johann.dreo@thalesgroup.com>
-ⓒ Thales group
+Author: Johann Dreo <johann@dreo.fr>.
Executable
