This stepwise cooling process gave birth to the Metropolis algorithm of 1953 and then to simulated annealing, introduced at IBM in 1983.

## Introduction

Starting from an initial solution R_{0} and an initial temperature T_{0} that decreases at each iteration, the algorithm successively generates configurations until a stable configuration is obtained. The probability of accepting a new configuration is given by the Metropolis rule:

- 1 if the configuration improves the objective function;
- exp(-diff(E)/T) otherwise, where diff(E) > 0 is the energy difference and T the temperature; the energy is an internal function derived from the objective function.
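The Metropolis rule above can be sketched as a small function. This is a minimal illustration, not taken from the source; the name `metropolis_accept` and the sign convention (diff(E) = E(new) − E(current)) are assumptions:

```python
import math
import random

def metropolis_accept(diff_e, T):
    """Metropolis rule: always accept an improving move; accept a
    worsening move (diff_e > 0) with probability exp(-diff_e / T)."""
    if diff_e <= 0:  # the configuration improves the objective function
        return True
    return random.random() < math.exp(-diff_e / T)
```

Note that at high temperature exp(-diff(E)/T) is close to 1, so most degrading moves are accepted, while at low temperature it tends to 0.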

## The algorithm step by step

- Take an initial solution R = R_{0} and an initial temperature T = T_{0}. The initial state is obtained by a heuristic (descent or greedy).
- Generate a random solution R_{i+1} in the neighborhood of the current solution R_{i}.
- Compare R_{i+1} and R_{i} with the Metropolis rule.
- Repeat until a stable solution is found (or after a certain number of iterations).
- Decrease the temperature T until a threshold temperature T_{min} is reached, or a stable solution is obtained.

```
R ← R_0
T ← T_0
while T > T_min and E > e_max:
    R_{i+1} ← neighborhood(R_i)
    if E(R_{i+1}) < E(R_i) or random() < P(diff(E)/T):
        accept R_{i+1}
    update T
return R_n
```

Since T is large at the beginning, many solutions that degrade the current solution can be accepted. This allows the search to escape from a local optimum. The following figure presents the value of the objective function. The temperature makes it possible to "jump" to less "good" zones in order to exit a "valley". The balls and arrows in the figure show up to what height the balls can "jump", that is, the tolerated negative variation of the objective function when accepting a new solution.

If the temperature is low, it is more difficult to escape from a local optimum: