### Dual Annealing Optimization with Python

Dual Annealing is a stochastic global optimization algorithm.

It is an implementation of the generalized simulated annealing algorithm, an extension of simulated annealing, coupled with a local search algorithm that is automatically performed at the end of the simulated annealing procedure.

This combination of effective global and local search procedures provides a powerful algorithm for challenging nonlinear optimization problems.

In this tutorial, you will discover the dual annealing global optimization algorithm.

After completing this tutorial, you will know:

- Dual annealing optimization is a global optimization algorithm that is a modified version of simulated annealing and also makes use of a local search algorithm.
- How to use the dual annealing optimization algorithm API in Python.
- Examples of using dual annealing to solve global optimization problems with multiple optima.

**Tutorial Overview**

This tutorial is divided into three parts; they are:

1. What Is Dual Annealing
2. Dual Annealing API
3. Dual Annealing Example

**What is Dual Annealing**

Dual annealing is a global optimization algorithm.

As such, it is designed for objective functions that have a nonlinear response surface. It is a stochastic optimization algorithm, meaning that it makes use of randomness in the search process, and each run of the search may find a different solution.

Dual Annealing is based on the Simulated Annealing optimization algorithm.

Simulated annealing is a variant of stochastic hill climbing, where a candidate solution is modified in a random way and the modified solution is accepted to replace the current candidate solution probabilistically. This means that it is possible for worse solutions to replace the current candidate solution. The probability of this type of replacement is high at the beginning of the search and decreases with each iteration, controlled by the “temperature” hyperparameter.
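The temperature-controlled acceptance rule described above can be sketched in a few lines of Python. This is an illustrative fragment, not SciPy's implementation; the `accept_worse` function name is made up for this sketch.

```python
from math import exp
from random import random

def accept_worse(current_eval, candidate_eval, temperature):
    # always accept a candidate that is no worse than the current solution
    if candidate_eval <= current_eval:
        return True
    # otherwise accept with a probability that shrinks as the
    # temperature falls and as the candidate gets worse
    return random() < exp(-(candidate_eval - current_eval) / temperature)
```

Early in the search, when the temperature is high, even much worse candidates have a reasonable chance of being accepted; as the temperature decays toward zero, the rule degenerates to accepting only improvements.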

Dual annealing is an implementation of the generalized simulated annealing (GSA) algorithm, itself an extension of classical simulated annealing (CSA), described in the 1997 paper “Generalized Simulated Annealing Algorithm and Its Application to the Thomson Model.”

It brings together the annealing schedule (the rate at which the temperature decreases over algorithm iterations) from “fast simulated annealing” (FSA) and the probabilistic acceptance of an alternate statistical procedure, “Tsallis statistics,” named after the physicist Constantino Tsallis.

Experimental results suggest that this generalized simulated annealing algorithm performs better than the classical or fast versions of the algorithm to which it was compared.

GSA not only converges faster than FSA and CSA, but also has the ability to escape from a local minimum more easily than FSA and CSA.

On top of these modifications of simulated annealing, a local search algorithm can be applied to the solution identified by the simulated annealing search.

This is desirable, as global search algorithms are typically good at locating the basin (the area of the search space) containing the optimal solution, but are often poor at finding the best solution within that basin, whereas local search algorithms excel at finding the optimum of a basin.

Coupling a local search with the simulated annealing procedure ensures that the search gets the most out of the candidate solution that is located.

Now that we are familiar with the dual annealing algorithm at a high level, let’s look at the API for dual annealing in Python.

**Dual Annealing API**

The dual annealing global optimization algorithm is available in Python through the dual_annealing() SciPy function.

The function takes the name of the objective function and the bounds of each input variable as the minimum arguments for the search.

```python
...
# perform the dual annealing search
result = dual_annealing(objective, bounds)
```

There are a number of additional hyperparameters for the search that have default values, although you can configure them to customize the search.

The “maxiter” argument specifies the total number of iterations of the algorithm (not the total number of function evaluations) and defaults to 1,000 iterations. The “maxfun” argument can be specified to limit the total number of function evaluations if desired, and defaults to 10 million.

The initial temperature of the search is specified by the “initial_temp” argument, which defaults to 5,230. The annealing process will restart once the temperature reaches a value equal to or less than (initial_temp * restart_temp_ratio). The ratio defaults to a very small number, 2e-05 (i.e., 0.00002), so the default trigger for re-annealing is a temperature of (5230 * 0.00002), or 0.1046.

The algorithm also provides control over hyperparameters specific to the generalized simulated annealing on which it is based. This includes how far jumps can be made during the search via the “visit” argument, which defaults to 2.62 (values less than 3 are preferred), and the “accept” argument that controls the probability of accepting new solutions, which defaults to -5.0.

The minimize() function is called for the local search with default hyperparameters. The local search can be configured by providing a dictionary of hyperparameter names and values via the “local_search_options” argument (in newer versions of SciPy, this argument is named “minimizer_kwargs”).

The local search aspect of the search can be disabled by setting the “no_local_search” argument to True.
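As a sketch, the hyperparameters above can be made explicit in a single call. The values shown are simply the documented defaults, and the sphere objective is a made-up toy function used here only so the call is runnable.

```python
from scipy.optimize import dual_annealing

# toy sphere objective, used only to make the call runnable
def objective(v):
    return sum(x**2 for x in v)

bounds = [(-5.0, 5.0), (-5.0, 5.0)]

# all hyperparameters set to their documented defaults
result = dual_annealing(objective, bounds,
                        maxiter=1000,              # annealing iterations
                        initial_temp=5230.0,       # starting temperature
                        restart_temp_ratio=2e-05,  # re-annealing trigger
                        visit=2.62,                # controls jump distance
                        accept=-5.0,               # controls acceptance probability
                        no_local_search=False,     # keep the local search step
                        seed=1)                    # for reproducibility
print(result.x, result.fun)
```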

The result of the search is an OptimizeResult object whose attributes can be accessed like a dictionary. The success (or otherwise) of the search can be accessed via the “success” or “message” keys.

The total number of function evaluations is available via the “nfev” key, and the optimal input found by the search is available via the “x” key.
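A minimal sketch of reading these fields (using a toy one-dimensional sphere objective, not part of the tutorial) might look like:

```python
from scipy.optimize import dual_annealing

# toy objective used only for illustration
def objective(v):
    return sum(x**2 for x in v)

result = dual_annealing(objective, [(-5.0, 5.0)], seed=1)

# fields can be read dictionary-style or attribute-style
print(result['message'])  # status message of the search
print(result['nfev'])     # total number of function evaluations
print(result['x'])        # best input found by the search
print(result.fun)         # objective value at the best input
```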

Now that we are acquainted with the dual annealing API in Python, let’s look at a few worked instances.

**Dual Annealing Example**

In this section, we will look at an example of using the dual annealing algorithm on a multimodal objective function.

The Ackley function is an example of a multimodal objective function that has a single global optimum and multiple local optima in which a local search might get stuck.

As such, a global optimization technique is required. It is a two-dimensional objective function that has a global optimum at [0,0], which evaluates to 0.0.

The example below implements the Ackley function and creates a 3D surface plot showing the global optimum and multiple local optima.

```python
# ackley multimodal function
from numpy import arange
from numpy import exp
from numpy import sqrt
from numpy import cos
from numpy import e
from numpy import pi
from numpy import meshgrid
from matplotlib import pyplot

# objective function
def objective(x, y):
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

# define range for input
r_min, r_max = -5.0, 5.0
# sample input range uniformly at 0.1 increments
xaxis = arange(r_min, r_max, 0.1)
yaxis = arange(r_min, r_max, 0.1)
# create a mesh from the axis
x, y = meshgrid(xaxis, yaxis)
# compute targets
results = objective(x, y)
# create a surface plot with the jet color scheme
figure = pyplot.figure()
axis = figure.add_subplot(projection='3d')
axis.plot_surface(x, y, results, cmap='jet')
# show the plot
pyplot.show()
```

Running the example creates a surface plot of the Ackley function showing the large number of local optima.

We can now apply the dual annealing algorithm to the Ackley objective function.

First, we can define the bounds of the search space as the limits of the function in each dimension.

```python
...
# define the bounds on the search
bounds = [[r_min, r_max], [r_min, r_max]]
```

We can then perform the search by specifying the name of the objective function and the bounds of the search.

In this case, we will use the default hyperparameters.

```python
...
# perform the dual annealing search
result = dual_annealing(objective, bounds)
```

Once the search is complete, it will report the status of the search and the number of function evaluations performed, along with the best solution found and its evaluation.

```python
...
# summarize the result
print('Status : %s' % result['message'])
print('Total Evaluations: %d' % result['nfev'])
# evaluate solution
solution = result['x']
evaluation = objective(solution)
print('Solution: f(%s) = %.5f' % (solution, evaluation))
```

Tying this together, the complete example of applying dual annealing to the Ackley objective function is listed below.

```python
# dual annealing global optimization for the ackley multimodal objective function
from scipy.optimize import dual_annealing
from numpy import exp
from numpy import sqrt
from numpy import cos
from numpy import e
from numpy import pi

# objective function
def objective(v):
    x, y = v
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

# define range for input
r_min, r_max = -5.0, 5.0
# define the bounds on the search
bounds = [[r_min, r_max], [r_min, r_max]]
# perform the dual annealing search
result = dual_annealing(objective, bounds)
# summarize the result
print('Status : %s' % result['message'])
print('Total Evaluations: %d' % result['nfev'])
# evaluate solution
solution = result['x']
evaluation = objective(solution)
print('Solution: f(%s) = %.5f' % (solution, evaluation))
```

Running the example performs the optimization, then reports the results.

Note: Your results may vary given the stochastic nature of the algorithm or evaluation procedure, or differences in numerical precision. Consider running the example a few times and comparing the average outcome.
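One way to do this is to repeat the search with different seeds and summarize the best evaluations. This is an illustrative sketch and not part of the original listing.

```python
from scipy.optimize import dual_annealing
from numpy import exp, sqrt, cos, e, pi, mean

# ackley objective function
def objective(v):
    x, y = v
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

bounds = [[-5.0, 5.0], [-5.0, 5.0]]
# repeat the search with different seeds and average the best scores
scores = [dual_annealing(objective, bounds, seed=s).fun for s in range(5)]
print('Mean best evaluation: %.5f' % mean(scores))
```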

In this case, we can see that the algorithm located the optimum with inputs very close to zero and an objective function evaluation that is practically zero.

We can observe that a total of 4,136 function evaluations were carried out.

```
Status : ['Maximum number of iteration reached']
Total Evaluations: 4136
Solution: f([-2.26474440e-09 -8.28465933e-09]) = 0.00000
```

__Further Reading__

This section provides more resources on the topic if you want to go deeper.

*Papers*

Generalized Simulated Annealing Algorithm and Its Application to the Thomson Model, 1997

Generalized Simulated Annealing, 1996

Generalized Simulated Annealing for Global Optimization: The GenSA Package, 2013

*APIs*

scipy.optimize.dual_annealing API

scipy.optimize.OptimizeResult API

*Articles*

Global optimization, Wikipedia

Simulated annealing, Wikipedia

Ackley Function, Wikipedia

__Conclusion__

In this tutorial, you discovered the dual annealing global optimization algorithm.

Specifically, you learned:

- Dual annealing optimization is a global optimization algorithm that is a modified version of simulated annealing and also makes use of a local search algorithm.
- How to use the dual annealing optimization algorithm API in Python.
- Examples of using dual annealing to solve global optimization problems with multiple optima.