## #MonthOfJulia Day 22: Optimisation

Sudoku-as-a-Service is a great illustration of Julia’s integer programming facilities. Julia has several packages which implement various flavours of optimisation: JuMP, JuMPeR, Gurobi, CPLEX, DReal, CoinOptServices and OptimPack. We’re not going to look at anything quite as elaborate as Sudoku today, but focus instead on finding the extrema in some simple (or perhaps not so simple) mathematical functions. At this point you might find it interesting to browse through this catalog of test functions for optimisation.

## Optim

We’ll start out by using the Optim package to find extrema of Himmelblau’s function, f(x, y) = (x² + y − 11)² + (x + y² − 7)².

This function has one local maximum and four local minima. One of the minima is conveniently located at (x, y) = (3, 2), where the function value is zero.

As usual the first step is to load the required package.
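Assuming the package has already been installed, loading it is a one-liner:

```julia
using Optim
```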

Then we set up the objective function along with its gradient and Hessian functions.
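Here's a sketch of those three functions, written against the current Optim.jl API (which uses in-place gradient and Hessian functions; older releases of the package used a slightly different calling convention):

```julia
using Optim

# Himmelblau's function: f(x, y) = (x² + y − 11)² + (x + y² − 7)²
himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

# Gradient, written in place into G.
function himmelblau_gradient!(G, x)
    G[1] = 4x[1] * (x[1]^2 + x[2] - 11) + 2 * (x[1] + x[2]^2 - 7)
    G[2] = 2 * (x[1]^2 + x[2] - 11) + 4x[2] * (x[1] + x[2]^2 - 7)
end

# Hessian, written in place into H.
function himmelblau_hessian!(H, x)
    H[1, 1] = 12x[1]^2 + 4x[2] - 42
    H[1, 2] = 4 * (x[1] + x[2])
    H[2, 1] = H[1, 2]
    H[2, 2] = 12x[2]^2 + 4x[1] - 26
end
```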

There are a number of algorithms at our disposal. We’ll start with the Nelder–Mead method, which uses only the objective function itself. I am very happy with the detailed output provided by the `optimize()` function, and it clearly converges on a result very close to what we expected.
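A minimal sketch of the Nelder–Mead call, using the current Optim.jl API (method names were lowercase symbols in older versions of the package):

```julia
using Optim

himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

# Nelder–Mead needs only the objective itself: no derivatives required.
result = optimize(himmelblau, [2.5, 2.5], NelderMead())
Optim.minimizer(result)   # close to the minimum at (3, 2)
```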

Next we’ll look at the limited-memory version of the BFGS algorithm. This can be applied either with or without an explicit gradient function. In this case we’ll provide the gradient function defined above. Again we converge on the right result, but this time with far fewer iterations required.
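A sketch of the L-BFGS call with an explicit (in-place) gradient function, again assuming the current Optim.jl API:

```julia
using Optim

himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

# In-place gradient of Himmelblau's function.
function himmelblau_gradient!(G, x)
    G[1] = 4x[1] * (x[1]^2 + x[2] - 11) + 2 * (x[1] + x[2]^2 - 7)
    G[2] = 2 * (x[1]^2 + x[2] - 11) + 4x[2] * (x[1] + x[2]^2 - 7)
end

# L-BFGS exploits the gradient, so it needs far fewer iterations.
result = optimize(himmelblau, himmelblau_gradient!, [2.5, 2.5], LBFGS())
```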

Finally we’ll try out Newton’s method, where we’ll provide both gradient and Hessian functions. The result is spot on and we’ve shaved off one iteration. Very nice indeed!
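A sketch of the Newton call, which takes the objective, gradient and Hessian (current Optim.jl API):

```julia
using Optim

himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

function himmelblau_gradient!(G, x)
    G[1] = 4x[1] * (x[1]^2 + x[2] - 11) + 2 * (x[1] + x[2]^2 - 7)
    G[2] = 2 * (x[1]^2 + x[2] - 11) + 4x[2] * (x[1] + x[2]^2 - 7)
end

function himmelblau_hessian!(H, x)
    H[1, 1] = 12x[1]^2 + 4x[2] - 42
    H[1, 2] = 4 * (x[1] + x[2])
    H[2, 1] = H[1, 2]
    H[2, 2] = 12x[2]^2 + 4x[1] - 26
end

# Newton's method uses both first- and second-order information.
result = optimize(himmelblau, himmelblau_gradient!, himmelblau_hessian!,
                  [2.5, 2.5], Newton())
```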

There is also a Simulated Annealing solver in the Optim package.
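For completeness, a sketch of how the simulated annealing solver might be invoked (it's stochastic, so a generous iteration budget helps and the result will vary from run to run):

```julia
using Optim

himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

# Simulated annealing only needs the objective; results are stochastic.
result = optimize(himmelblau, [2.5, 2.5], SimulatedAnnealing(),
                  Optim.Options(iterations = 100_000))
```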

## NLopt

NLopt is an optimisation library with interfaces for a variety of programming languages and a broad selection of algorithms. We’ll apply both a gradient-based and a derivative-free technique to maximise the function

subject to the constraints

and

Before we load the NLopt package, it’s a good idea to restart your Julia session to flush out any remnants of the Optim package.

We’ll need to write the objective function and a generalised constraint function.
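The original objective and constraints haven't survived here, so as a stand-in the sketch below uses the well-known problem from the NLopt tutorial: minimise √x₂ subject to x₂ ≥ (a x₁ + b)³ for (a, b) = (2, 0) and (−1, 1). NLopt passes each function a gradient vector, which should be filled in only when the algorithm asks for it (i.e. when it is non-empty):

```julia
# Objective: returns f(x); fills in grad for gradient-based methods.
function objective(x, grad)
    if length(grad) > 0
        grad[1] = 0.0
        grad[2] = 0.5 / sqrt(x[2])
    end
    return sqrt(x[2])
end

# Generalised constraint (a x₁ + b)³ − x₂ ≤ 0, parameterised by a and b.
function constraint(x, grad, a, b)
    if length(grad) > 0
        grad[1] = 3a * (a * x[1] + b)^2
        grad[2] = -1.0
    end
    return (a * x[1] + b)^3 - x[2]
end
```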

The COBYLA (Constrained Optimization BY Linear Approximations) algorithm is a local optimiser which doesn’t use the gradient function.

We impose generous upper and lower bounds on the solution space and use two inequality constraints. Either `min_objective!()` or `max_objective!()` is used to specify the objective function and whether it is a minimisation or maximisation problem. Constraints can be either inequalities, via `inequality_constraint!()`, or equalities, via `equality_constraint!()`.

After making an initial guess we let the algorithm loose. I’ve purged some of the output to spare you from the floating point deluge.
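Putting the pieces together for COBYLA might look like this (still using the NLopt tutorial problem as a stand-in for the lost original equations):

```julia
using NLopt

function objective(x, grad)
    if length(grad) > 0
        grad[1] = 0.0
        grad[2] = 0.5 / sqrt(x[2])
    end
    return sqrt(x[2])
end

function constraint(x, grad, a, b)
    if length(grad) > 0
        grad[1] = 3a * (a * x[1] + b)^2
        grad[2] = -1.0
    end
    return (a * x[1] + b)^3 - x[2]
end

opt = Opt(:LN_COBYLA, 2)            # derivative-free local optimiser, 2 variables
lower_bounds!(opt, [-Inf, 0.0])     # generous bounds on the solution space
xtol_rel!(opt, 1e-6)
min_objective!(opt, objective)
inequality_constraint!(opt, (x, g) -> constraint(x, g, 2.0, 0.0), 1e-8)
inequality_constraint!(opt, (x, g) -> constraint(x, g, -1.0, 1.0), 1e-8)

(minf, minx, ret) = optimize(opt, [1.234, 5.678])   # initial guess
```

For this stand-in problem the known optimum is at x = (1/3, 8/27).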

It takes a number of iterations to converge, but arrives at a solution which seems eminently reasonable (and which satisfies both of the constraints).

Next we’ll use the MMA (Method of Moving Asymptotes) gradient-based algorithm.

We remove the second inequality constraint and simply confine the solution space appropriately. This is definitely a more efficient approach!
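A sketch of the gradient-based MMA call on the same stand-in problem (here both tutorial constraints are kept, since the stand-in has no equivalent bound trick; only the algorithm identifier changes):

```julia
using NLopt

function objective(x, grad)
    if length(grad) > 0
        grad[1] = 0.0
        grad[2] = 0.5 / sqrt(x[2])
    end
    return sqrt(x[2])
end

function constraint(x, grad, a, b)
    if length(grad) > 0
        grad[1] = 3a * (a * x[1] + b)^2
        grad[2] = -1.0
    end
    return (a * x[1] + b)^3 - x[2]
end

opt = Opt(:LD_MMA, 2)               # gradient-based local optimiser, 2 variables
lower_bounds!(opt, [-Inf, 0.0])
xtol_rel!(opt, 1e-6)
min_objective!(opt, objective)
inequality_constraint!(opt, (x, g) -> constraint(x, g, 2.0, 0.0), 1e-8)
inequality_constraint!(opt, (x, g) -> constraint(x, g, -1.0, 1.0), 1e-8)

(minf, minx, ret) = optimize(opt, [1.234, 5.678])
```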

This algorithm converges more rapidly (because it takes advantage of the gradient function!) and we arrive at the same result.

I’m rather impressed. Both of these packages provide convenient interfaces and I could solve my test problems without too much effort. Have a look at the videos below for more about optimisation in Julia and check out GitHub for the complete code for today’s examples. We’ll kick off next week with a quick look at some alternative data structures.