### 1D Test Functions for Function Optimisation

Function optimisation is a field of study that seeks the input to a function that results in the maximum or minimum output of the function.

There are a huge number of optimisation algorithms, and it is important to study and develop intuitions about how they behave on simple, easy-to-visualize test functions.

1D functions take a single input value and output a single evaluation of that input. They may be the simplest kind of test function to use when learning about function optimisation.

The advantage of 1D functions is that they can be visualized as a 2D plot, with inputs to the function on the x-axis and outputs of the function on the y-axis. The known optima of the function and any samples of the function can also be drawn on the same plot.

In this guide, you will discover standard 1D test functions you can use when learning about function optimisation.

__Tutorial Overview__

There are many different kinds of simple one-dimensional test functions we could use.

Nonetheless, there are standard test functions commonly used in the field of function optimisation, as well as specific properties of test functions that we may want when comparing different algorithms.

We will look at a small number of simple 1D test functions in this guide and organize them by their properties into five different groups:

1. Convex Unimodal Functions
2. Non-convex Unimodal Functions
3. Multimodal Functions
4. Discontinuous Functions (Non-Smooth)
5. Noisy Functions

Each function will be presented using Python code, with an implementation of the target objective function and a sample of the function shown as a line plot with the function's optima clearly marked.

All functions are presented as minimisation problems, i.e. find the input that results in the minimum (smallest) output of the function. Any maximizing function can be made a minimising function by adding a negative sign to all outputs. Likewise, any minimising function can be made maximizing in the same way.
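The conversion described above is just a sign flip. As a minimal sketch, using a hypothetical function `f` that we want to maximize (not one of the test functions from this guide):

```python
# converting a maximization problem to a minimization problem
# by negating the objective's output

# hypothetical function we want to maximize (peak of 4.0 at x=0.0)
def f(x):
	return 4.0 - x ** 2.0

# minimizing the negated function is equivalent to maximizing f
def objective(x):
	return -f(x)

# the best (smallest) value of objective() occurs at the same input
# as the best (largest) value of f()
print(objective(0.0))  # -4.0, i.e. the negated maximum of f
```

Any minimisation algorithm applied to `objective` will therefore locate the maximum of `f`.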

You can then select and copy-paste the code of one or more functions into your own project to study or compare the behaviour of optimisation algorithms.

__Convex Unimodal Function__

A convex function is one where a straight line can be drawn between any two points on the function and the line never falls below the function.

For a 1D function shown as a 2D plot, this means the function has a bowl shape, and the line between any two points stays on or above the bowl.

Unimodal means that the function has a single optimum. A convex function may or may not be unimodal; likewise, a unimodal function may or may not be convex.
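To build intuition, convexity can be probed numerically using the midpoint form of Jensen's inequality: for a convex function, the value at the midpoint of any two inputs never exceeds the average of the values at those inputs. A rough sketch follows; the `looks_convex` helper, tolerance, and step size are illustrative choices, and passing this check is evidence of convexity over the sampled range, not a proof:

```python
from numpy import arange
from numpy import sin

# numerically probe convexity: for a convex function, the midpoint value
# never exceeds the average of the endpoint values (Jensen's inequality)
def looks_convex(f, r_min, r_max, step=0.1):
	points = arange(r_min, r_max, step)
	for a in points:
		for b in points:
			if f((a + b) / 2.0) > (f(a) + f(b)) / 2.0 + 1e-9:
				return False
	return True

print(looks_convex(lambda x: x**2.0, -5.0, 5.0))  # True: x^2 is convex
print(looks_convex(sin, -5.0, 5.0))  # False: sin has concave regions
```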

The range for the function below is bounded to -5.0 and 5.0 and the optimum input value is 0.0.

```python
# convex unimodal optimization function
from numpy import arange
from matplotlib import pyplot

# objective function
def objective(x):
	return x**2.0

# define range for input
r_min, r_max = -5.0, 5.0
# sample input range uniformly at 0.1 increments
inputs = arange(r_min, r_max, 0.1)
# compute targets
results = objective(inputs)
# create a line plot of input vs result
pyplot.plot(inputs, results)
# define optimal input value
x_optima = 0.0
# draw a vertical line at the optimal input
pyplot.axvline(x=x_optima, ls='--', color='red')
# show the plot
pyplot.show()
```

Running the example creates a line plot of the function and marks the optimum with a red vertical line.

This function can be shifted along the number line by adding or subtracting a constant inside the square, e.g. (x - 5.0)**2.

This can be useful if you want to move the optimal input away from a value of 0.0.
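As a sketch of the shifted version (the shift of 5.0 is an arbitrary illustrative choice), sampling the function confirms the optimal input has moved:

```python
from numpy import arange

# shifted objective: subtracting a constant inside the square moves
# the optimal input from 0.0 to 5.0 without changing the optimal output
def shifted(x):
	return (x - 5.0)**2.0

# sample a range around the new optimum and find the best sampled input
inputs = arange(0.0, 10.0, 0.1)
results = shifted(inputs)
best = inputs[results.argmin()]
print(best)  # approximately 5.0
```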

__Non-convex Unimodal Functions__

A function is said to be non-convex if the straight line between some pair of points on the function falls below the function.

This means it is possible to find two points in the domain where a line between them crosses a line plot of the function.

Typically, if a plot of a 1D function has more than one hill or valley, we know immediately that the function is non-convex. Nonetheless, a non-convex function may or may not be unimodal.

Most real functions that we are interested in optimising are non-convex.

The range for the function below is bounded to -10.0 and 10.0 and the optimal input value is 0.67956.

```python
# non-convex unimodal optimization function
from numpy import arange
from numpy import sin
from numpy import exp
from matplotlib import pyplot

# objective function
def objective(x):
	return -(x + sin(x)) * exp(-x**2.0)

# define range for input
r_min, r_max = -10.0, 10.0
# sample input range uniformly at 0.1 increments
inputs = arange(r_min, r_max, 0.1)
# compute targets
results = objective(inputs)
# create a line plot of input vs result
pyplot.plot(inputs, results)
# define optimal input value
x_optima = 0.67956
# draw a vertical line at the optimal input
pyplot.axvline(x=x_optima, ls='--', color='red')
# show the plot
pyplot.show()
```

Running the example creates a line plot of the function and marks the optimum with a red vertical line.

__Multimodal Functions__

A multimodal function is a function with more than one “mode” or optimum (e.g. more than one valley).

Multimodal functions are non-convex.

There may be a single global optimum and one or more local or deceptive optima. Alternatively, there may be multiple global optima, i.e. multiple different inputs that result in the same minimal output of the function.

Let’s look at a few examples of multimodal functions.

__Multimodal Function 1__

The range is bounded to -2.7 and 7.5 and the optimum input value is 5.145735.

```python
# multimodal function
from numpy import sin
from numpy import arange
from matplotlib import pyplot

# objective function
def objective(x):
	return sin(x) + sin((10.0 / 3.0) * x)

# define range for input
r_min, r_max = -2.7, 7.5
# sample input range uniformly at 0.1 increments
inputs = arange(r_min, r_max, 0.1)
# compute targets
results = objective(inputs)
# create a line plot of input vs result
pyplot.plot(inputs, results)
# define optimal input value
x_optima = 5.145735
# draw a vertical line at the optimal input
pyplot.axvline(x=x_optima, ls='--', color='red')
# show the plot
pyplot.show()
```

Running the example creates a line plot of the function and marks the optimum with a red vertical line.
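The stated optimum of 5.145735 can be sanity-checked numerically: a dense grid sample over the bounded range picks out the global minimum among the several local ones. A minimal sketch (the 0.001 step size is an arbitrary choice):

```python
from numpy import arange
from numpy import sin

# same multimodal objective as above
def objective(x):
	return sin(x) + sin((10.0 / 3.0) * x)

# dense grid sample across the bounded range
inputs = arange(-2.7, 7.5, 0.001)
results = objective(inputs)
# the input with the smallest sampled output approximates the global minimum
best = inputs[results.argmin()]
print(best)  # approximately 5.146
```

Grid search like this is crude but reliable in 1D, which is one reason simple test functions are so convenient for checking an optimisation algorithm's answer.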

__Multimodal Function 2__

The range is bounded to 0.0 and 1.2 and the optimum input value is 0.96609.

```python
# multimodal function
from numpy import sin
from numpy import arange
from matplotlib import pyplot

# objective function
def objective(x):
	return -(1.4 - 3.0 * x) * sin(18.0 * x)

# define range for input
r_min, r_max = 0.0, 1.2
# sample input range uniformly at 0.01 increments
inputs = arange(r_min, r_max, 0.01)
# compute targets
results = objective(inputs)
# create a line plot of input vs result
pyplot.plot(inputs, results)
# define optimal input value
x_optima = 0.96609
# draw a vertical line at the optimal input
pyplot.axvline(x=x_optima, ls='--', color='red')
# show the plot
pyplot.show()
```

Running the example creates a line plot of the function and marks the optimum with a red vertical line.

__Multimodal Function 3__

The range is bounded to 0.0 and 10.0 and the optimal input value is 7.9787.

```python
# multimodal function
from numpy import sin
from numpy import arange
from matplotlib import pyplot

# objective function
def objective(x):
	return -x * sin(x)

# define range for input
r_min, r_max = 0.0, 10.0
# sample input range uniformly at 0.1 increments
inputs = arange(r_min, r_max, 0.1)
# compute targets
results = objective(inputs)
# create a line plot of input vs result
pyplot.plot(inputs, results)
# define optimal input value
x_optima = 7.9787
# draw a vertical line at the optimal input
pyplot.axvline(x=x_optima, ls='--', color='red')
# show the plot
pyplot.show()
```

Running the example creates a line plot of the function and marks the optimum with a red vertical line.

__Discontinuous Functions (Non-Smooth)__

A function may have a discontinuity, meaning that a smooth change in the inputs to the function can result in non-smooth changes in the output.

We may refer to functions with this property as non-smooth or discontinuous functions.

There are many different kinds of discontinuity, although one common example is a jump or a sharp change in direction in the output values of the function, which is easy to see in a plot of the function.

__Discontinuous Function__

The range is bounded to -2.0 and 2.0 and the optimal input value is 1.0.

```python
# non-smooth optimization function
from numpy import arange
from matplotlib import pyplot

# objective function
def objective(x):
	if x > 1.0:
		return x**2.0
	elif x == 1.0:
		return 0.0
	return 2.0 - x

# define range for input
r_min, r_max = -2.0, 2.0
# sample input range uniformly at 0.1 increments
inputs = arange(r_min, r_max, 0.1)
# compute targets
results = [objective(x) for x in inputs]
# create a line plot of input vs result
pyplot.plot(inputs, results)
# define optimal input value
x_optima = 1.0
# draw a vertical line at the optimal input
pyplot.axvline(x=x_optima, ls='--', color='red')
# show the plot
pyplot.show()
```

Running the example creates a line plot of the function and marks the optimum with a red vertical line.

__Noisy Functions__

A function may have noise, meaning that each evaluation has a stochastic component that alters the output of the function slightly every time.

Any non-noisy function can be made noisy by adding small Gaussian random values to the input values.
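A generic way to do this is to wrap an existing objective so that every call perturbs the input before evaluating it. A minimal sketch follows; the `make_noisy` helper and the 0.3 noise scale are illustrative choices, not part of the standard functions:

```python
from numpy.random import randn
from numpy.random import seed

# wrap any deterministic objective so each evaluation perturbs the
# input with small Gaussian noise (scale controls the noise magnitude)
def make_noisy(f, scale=0.3):
	def noisy(x):
		return f(x + randn() * scale)
	return noisy

# deterministic base objective
def objective(x):
	return x ** 2.0

seed(1)
noisy_objective = make_noisy(objective)
# repeated evaluations at the same input now give slightly different outputs
print(noisy_objective(0.0))
print(noisy_objective(0.0))
```

Seeding the random number generator, as above, makes runs repeatable, which is useful when comparing algorithms on noisy objectives.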

The range for the function below is bounded to -5.0 and 5.0 and the optimal input value is 0.0.

```python
# noisy optimization function
from numpy import arange
from numpy.random import randn
from matplotlib import pyplot

# objective function
def objective(x):
	return (x + randn(len(x))*0.3)**2.0

# define range for input
r_min, r_max = -5.0, 5.0
# sample input range uniformly at 0.1 increments
inputs = arange(r_min, r_max, 0.1)
# compute targets
results = objective(inputs)
# create a line plot of input vs result
pyplot.plot(inputs, results)
# define optimal input value
x_optima = 0.0
# draw a vertical line at the optimal input
pyplot.axvline(x=x_optima, ls='--', color='red')
# show the plot
pyplot.show()
```

Running the example creates a line plot of the function and marks the optimum with a red vertical line.

__Conclusion__

In this guide, you discovered standard one-dimensional functions you can use when learning about function optimisation.