Simulated Annealing

Annealing is a metallurgical process that produces crystallized solids while avoiding the glassy state. Once the metal has melted, the temperature is lowered gradually until the metal solidifies. To remove the remaining defects, the metal is repeatedly reheated and cooled, at successively lower temperatures, until a stable state is reached.

This system of decreasing temperature plateaus gave birth to the Metropolis algorithm in 1953, and then to simulated annealing, introduced at IBM in 1983.


The goal is to reach a stable state of the objective function – a local or global optimum – starting from a random solution. This state is reached in steps, where acceptance of a mutation depends on the "temperature" of the current plateau.

The algorithm successively generates configurations from an initial solution R0 and an initial temperature T0 that decreases at each iteration, until a stable configuration is obtained. The probability of accepting a new configuration is given by the Metropolis rule:

  • 1 if the configuration improves the objective function;
  • exp(−diff(E)/T) otherwise, with diff(E) the energy difference (positive for a degradation) and T the temperature; the energy is an internal function derived from the objective function.
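The acceptance rule above can be sketched as a small function; a minimal sketch, where the name `accept` and its signature are illustrative choices, not from the original text:

```python
import math
import random

def accept(delta_e: float, temperature: float) -> bool:
    """Metropolis rule: always accept an improvement; otherwise
    accept the degradation with probability exp(-delta_e / T)."""
    if delta_e <= 0:  # the move improves (or preserves) the energy
        return True
    return random.random() < math.exp(-delta_e / temperature)
```

Note that at high temperature even a large degradation has a non-negligible chance of being accepted, while at low temperature `exp(-delta_e / T)` collapses toward zero.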

The algorithm step by step

  1. Take an initial solution R = R0 and an initial temperature T = T0. The initial solution is obtained by a heuristic (descent or greedy).
  2. Generate a random solution R(i+1) in the neighborhood of the current solution Ri:
    1. Compare R(i+1) and Ri with the Metropolis rule.
    2. Repeat until a stable solution is found (or a given number of iterations is reached).
  3. Decrease the temperature T until it reaches a threshold temperature Tmin, or until a stable solution is found.
While T > Tmin and E > Emax
   R(i+1) = neighborhood(Ri)
   If E(R(i+1)) < E(Ri) or random() < exp(−diff(E)/T) Then
      accept R(i+1)
   update T
return Rn
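The pseudocode above can be turned into a runnable sketch. The cooling schedule (geometric, T ← αT), the neighborhood function, and the test function below are illustrative assumptions; they are not prescribed by the text:

```python
import math
import random

def simulated_annealing(energy, neighbor, r0, t0=10.0,
                        t_min=1e-3, alpha=0.95, steps_per_t=100):
    """Minimize `energy` starting from solution r0.
    t0, alpha (geometric cooling) and steps_per_t are tuning choices."""
    current, best, t = r0, r0, t0
    while t > t_min:
        for _ in range(steps_per_t):
            candidate = neighbor(current)
            delta = energy(candidate) - energy(current)
            # Metropolis rule: accept improvements, and degradations
            # with probability exp(-delta / t)
            if delta < 0 or random.random() < math.exp(-delta / t):
                current = candidate
                if energy(current) < energy(best):
                    best = current
        t *= alpha  # move to the next, colder plateau
    return best

# Example: a multimodal 1-D function whose global minimum is near x ≈ -1.3
f = lambda x: x * x + 10 * math.sin(x)
random.seed(42)
result = simulated_annealing(f, lambda x: x + random.uniform(-1, 1), r0=5.0)
```

Keeping track of `best` separately from `current` ensures that a late degrading move cannot lose the best solution seen so far.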


Since T is large at the beginning, many solutions that degrade the current one can be accepted, which makes it possible to escape from a local optimum. The following figure presents the value of the objective function. The temperature makes it possible to "jump" to a less "good" zone in order to exit a "valley". The balls and arrows in the figure show the height up to which the balls can "jump", that is, the tolerated negative variation of the objective function when accepting a new solution.


If the temperature is low, escaping from a local optimum becomes more difficult:
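The effect of temperature on the acceptance probability can be seen numerically. For a fixed degradation diff(E) = 1, the Metropolis probability exp(−diff(E)/T) shrinks rapidly as T drops (the temperature values below are arbitrary examples):

```python
import math

delta_e = 1.0                    # fixed energy degradation
for t in (10.0, 1.0, 0.1):       # decreasing temperature plateaus
    p = math.exp(-delta_e / t)   # Metropolis acceptance probability
    print(f"T = {t:>4}: acceptance probability = {p:.6f}")
# T = 10.0 → ≈ 0.905, T = 1.0 → ≈ 0.368, T = 0.1 → ≈ 0.000045
```

At T = 10 the degrading move is accepted about 9 times out of 10; at T = 0.1 it is virtually never accepted, which is why a cold system stays trapped in its current valley.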