# #MonthOfJulia Day 27: Distributions

Today I'm looking at the Distributions package. Let's get things rolling by loading it up.

julia> using Distributions


There's some overlap between the functionality in Distributions and what we saw yesterday in the StatsFuns package. So, instead of looking at functions to evaluate various aspects of PDFs and CDFs, we'll focus on sampling from distributions and calculating summary statistics.

Julia has native support for sampling from a uniform distribution. We've seen this before, but here's a reminder.

julia> srand(359)                          # Set random number seed.
julia> rand()                              # Random number on [0, 1)
0.4770241944535658


What if you need to generate samples from a more exotic distribution? The Normal distribution, although not particularly exotic, seems like a natural place to start. The Distributions package exposes a type for each supported distribution. For the Normal distribution the type is appropriately named Normal. It's derived from Distribution with characteristics Univariate and Continuous.

julia> super(Normal)
Distribution{Univariate,Continuous}
julia> names(Normal)
2-element Array{Symbol,1}:
:μ
:σ


The constructor accepts two parameters: mean (μ) and standard deviation (σ). We'll instantiate a Normal object with mean 1 and standard deviation 3.

julia> d1 = Normal(1.0, 3.0)
Normal(μ=1.0, σ=3.0)
julia> params(d1)
(1.0,3.0)
julia> d1.μ
1.0
julia> d1.σ
3.0


Thanks to the wonders of multiple dispatch we are then able to generate samples from this object with the rand() method.

julia> x = rand(d1, 1000);


We'll use Gadfly to generate a histogram to validate that the samples are reasonable. They look pretty good.
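The figure itself appeared in the original post; as a rough sketch (assuming Gadfly is installed alongside Distributions), the plot call might look something like this:

```julia
using Distributions, Gadfly

d1 = Normal(1.0, 3.0)
x = rand(d1, 1000)

# A histogram of the samples: roughly bell-shaped, centred near μ = 1
# with a spread governed by σ = 3.
plot(x = x, Geom.histogram)
```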

There are functions like pdf(), cdf(), logpdf() and logcdf() which allow the density function of our distribution object to be evaluated at particular points. Check those out. We're moving on to truncating a portion of the distribution, leaving a Truncated distribution object.
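Before we truncate, a quick sketch of evaluating those density functions (the values assume the d1 defined above):

```julia
using Distributions

d1 = Normal(1.0, 3.0)

pdf(d1, 1.0)      # density at the mean: 1 / (3 * sqrt(2π)) ≈ 0.133
cdf(d1, 1.0)      # 0.5 -- half the mass lies below the mean
quantile(d1, 0.5) # 1.0 -- and the median recovers the mean
```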

julia> d2 = Truncated(d1, -4.0, 6.0);


Again we can use Gadfly to get an idea of what this looks like. This time we'll plot the actual PDF rather than a histogram of samples.

The Distributions package implements an extensive selection of other distributions: continuous ones like Exponential, Gamma and Weibull, as well as discrete ones like Poisson. The basic interface for each of these is consistent with what we've seen for Normal above, although some methods are specific to particular distributions.
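A small taste of that consistent interface (parameterisations as in the Distributions documentation):

```julia
using Distributions

mean(Exponential(2.0))    # 2.0 -- parameterised by its scale, which is also the mean
rand(Gamma(2.0, 3.0), 5)  # five samples from Gamma(shape = 2, scale = 3)
mean(Weibull(1.0, 2.0))   # 2.0 -- a shape-1 Weibull is just Exponential(2)
```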

Let's look at a discrete distribution, using a Bernoulli distribution with success rate of 25% as an example.

julia> d4 = Bernoulli(0.25)
Bernoulli(p=0.25)
julia> rand(d4, 10)
10-element Array{Int64,1}:
0
1
0
1
1
0
0
0
1
0


What about a Binomial distribution? Suppose that we have a success rate of 25% per trial and want to sample the number of successes in a batch of 100 trials.

julia> d5 = Binomial(100, 0.25)
Binomial(n=100, p=0.25)
julia> rand(d5, 10)
10-element Array{Int64,1}:
22
21
30
23
28
25
26
26
28
21


Finally let's look at an example of fitting a distribution to a collection of samples using Maximum Likelihood.

julia> x = rand(d1, 10000);
julia> fit(Normal, x)
Normal(μ=1.0015796782177036, σ=3.033914550184868)


Yup, those values are in pretty good agreement with the mean and standard deviation we specified for our Normal object originally.
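That agreement is no accident: for the Normal distribution the maximum likelihood estimates have a closed form, namely the sample mean and the (uncorrected) sample standard deviation. A quick sketch:

```julia
using Distributions

x = rand(Normal(1.0, 3.0), 10000)
d = fit(Normal, x)

# The fitted μ coincides with the sample mean (up to floating point).
isapprox(mean(d), mean(x))
```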

That's it for today. There's more to the Distributions package though. Check out my github repository to see other examples which didn't make it into today's post.

# #MonthOfJulia Day 26: Statistics

JuliaStats is a meta-project which consolidates various packages related to statistics and machine learning in Julia. Well worth taking a look if you plan on working in this domain.

julia> x = rand(10);
julia> mean(x)
0.5287191472784906
julia> std(x)
0.2885446536178459


Julia already has some builtin support for statistical operations, so additional packages are not strictly necessary. However, they do increase the scope and ease of possible operations (as we'll see below). Let's kick off by loading all the packages that we'll be looking at today.

julia> using StatsBase, StatsFuns, StreamStats


## StatsBase

The documentation for StatsBase can be found here. As the package name implies, it provides support for basic statistical operations in Julia.

High level summary statistics are generated by summarystats().

julia> summarystats(x)
Summary Stats:
Mean:         0.528719
Minimum:      0.064803
1st Quartile: 0.317819
Median:       0.529662
3rd Quartile: 0.649787
Maximum:      0.974760


Weighted versions of the mean, variance and standard deviation are implemented. There are also geometric and harmonic means.
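The geometric and harmonic means, for example (a quick sketch; the weighted variants are demonstrated just below):

```julia
using StatsBase

v = [1.0, 2.0, 4.0]

geomean(v)  # (1 * 2 * 4)^(1/3) = 2.0
harmmean(v) # 3 / (1/1 + 1/2 + 1/4) ≈ 1.714
```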

julia> w = WeightVec(rand(1:10, 10));      # A weight vector.
julia> mean(x, w)                          # Weighted mean.
0.48819933297961043
julia> var(x, w)                           # Weighted variance.
0.08303843715334995
julia> std(x, w)                           # Weighted standard deviation.
0.2881639067498738
julia> skewness(x, w)
0.11688162715805048
julia> kurtosis(x, w)
-0.9210456851144664
julia> mean_and_std(x, w)
(0.48819933297961043,0.2881639067498738)


There's a weighted median as well as functions for calculating quantiles.

julia> median(x)                           # Median.
0.5296622773635412
julia> median(x, w)                        # Weighted median.
0.5729104703595038
julia> quantile(x)
5-element Array{Float64,1}:
0.0648032
0.317819
0.529662
0.649787
0.97476
julia> nquantile(x, 8)
9-element Array{Float64,1}:
0.0648032
0.256172
0.317819
0.465001
0.529662
0.60472
0.649787
0.893513
0.97476
julia> iqr(x)                              # Inter-quartile range.
0.3319677541313941


Sampling from a population is also catered for, with a range of algorithms which can be applied to the sampling procedure.

julia> sample(['a':'z'], 5)                # Sampling (with replacement).
5-element Array{Char,1}:
'w'
'x'
'e'
'e'
'o'
julia> wsample(['T', 'F'], [5, 1], 10)     # Weighted sampling (with replacement).
10-element Array{Char,1}:
'F'
'T'
'T'
'T'
'F'
'T'
'T'
'T'
'T'
'T'


There's also functionality for empirical estimation of distributions from histograms and a range of other interesting and useful goodies.

## StatsFuns

The StatsFuns package provides constants and functions for statistical computing. The constants are by no means essential but certainly very handy. Take, for example, twoπ and sqrt2.
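For instance (the values shown are just the familiar mathematical constants):

```julia
using StatsFuns

twoπ       # 6.283185307179586
sqrt2      # 1.4142135623730951
invsqrt2π  # 0.3989422804014327 -- the Normal density's normalising constant
```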

There are some mildly exotic mathematical functions available like logistic, logit and softmax.

julia> logistic(-5)
0.0066928509242848554
julia> logistic(5)
0.9933071490757153
julia> logit(0.25)
-1.0986122886681098
julia> logit(0.75)
1.0986122886681096
julia> softmax([1, 3, 2, 5, 3])
5-element Array{Float64,1}:
0.0136809
0.101089
0.0371886
0.746952
0.101089


Finally there is a suite of functions relating to various statistical distributions. The functions for the Normal distribution are illustrated below, but there are also functions for the Beta and Binomial distributions, the Gamma and Hypergeometric distributions, and many others. The function naming convention is consistent across all distributions.

julia> normpdf(0);                         # PDF
julia> normlogpdf(0);                      # log PDF
julia> normcdf(0);                         # CDF
julia> normccdf(0);                        # Complementary CDF
julia> normlogcdf(0);                      # log CDF
julia> normlogccdf(0);                     # log Complementary CDF
julia> norminvcdf(0.5);                    # inverse-CDF
julia> norminvccdf(0.99);                  # inverse-Complementary CDF
julia> norminvlogcdf(-0.693147180559945);  # inverse-log CDF
julia> norminvlogccdf(-0.693147180559945); # inverse-log Complementary CDF


## StreamStats

Finally, the StreamStats package supports calculating online statistics for a stream of data which is being continuously updated.

julia> average = StreamStats.Mean()
Online Mean
* Mean: 0.000000
* N:    0
julia> variance = StreamStats.Var()
Online Variance
* Variance: NaN
* N:        0
julia> for x in rand(10)
update!(average, x)
update!(variance, x)
@printf("x = %.6f: mean = %.3f | variance = %.3f\n", x, state(average),
state(variance))
end
x = 0.928564: mean = 0.929 | variance = NaN
x = 0.087779: mean = 0.508 | variance = 0.353
x = 0.253300: mean = 0.423 | variance = 0.198
x = 0.778306: mean = 0.512 | variance = 0.164
x = 0.566764: mean = 0.523 | variance = 0.123
x = 0.812629: mean = 0.571 | variance = 0.113
x = 0.760074: mean = 0.598 | variance = 0.099
x = 0.328495: mean = 0.564 | variance = 0.094
x = 0.303542: mean = 0.535 | variance = 0.090
x = 0.492716: mean = 0.531 | variance = 0.080


In addition to the mean and variance illustrated above, the package also supports online versions of min() and max(), and can be used to generate incremental confidence intervals for Bernoulli and Poisson processes.
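A sketch of a running maximum, assuming that Max follows the same update!/state pattern as the Mean and Var types above:

```julia
using StreamStats

# Feed a stream of values into an online maximum tracker.
running_max = StreamStats.Max()
for x in [3.0, 1.0, 7.0, 5.0]
    update!(running_max, x)
end
state(running_max) # 7.0
```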

That's it for today. Check out the full code on github and watch the video below.

# #MonthOfJulia Day 25: Interfacing with Other Languages

Julia has native support for calling C and FORTRAN functions. There are also add-on packages which provide interfaces to C++, R and Python. We'll have a brief look at the support for C and R here. Further details on these and the other supported languages can be found on github.

Why would you want to call other languages from within Julia? Here are a couple of reasons:

• to access functionality which is not implemented in Julia;
• to exploit some efficiency associated with another language.

The second reason should apply relatively seldom because, as we saw some time ago, Julia provides performance which rivals native C or FORTRAN code.

## C

C functions are called via ccall(), where the name of the C function and the library it lives in are passed as a tuple in the first argument, followed by the return type of the function and the types of the function arguments, and finally the arguments themselves. It's a bit clunky, but it works!

julia> ccall((:sqrt, "libm"), Float64, (Float64,), 64.0)
8.0


It makes sense to wrap a call like that in a native Julia function.

julia> csqrt(x) = ccall((:sqrt, "libm"), Float64, (Float64,), x);
julia> csqrt(64.0)
8.0


This function will not be vectorised by default (just try calling csqrt() on a vector!), but it's a simple matter to produce a vectorised version using the @vectorize_1arg macro.

julia> @vectorize_1arg Real csqrt;
julia> methods(csqrt)
# 4 methods for generic function "csqrt":
csqrt{T<:Real}(::AbstractArray{T<:Real,1}) at operators.jl:359
csqrt{T<:Real}(::AbstractArray{T<:Real,2}) at operators.jl:360
csqrt{T<:Real}(::AbstractArray{T<:Real,N}) at operators.jl:362
csqrt(x) at none:6


Note that a few extra specialised methods have been introduced and now calling csqrt() on a vector works perfectly.

julia> csqrt([1, 4, 9, 16])
4-element Array{Float64,1}:
1.0
2.0
3.0
4.0


## R

I'll freely admit that I don't dabble in C too often these days. R, on the other hand, is a daily workhorse. So being able to import R functionality into Julia is very appealing. The first thing that we need to do is load up a few packages, the most important of which is RCall. There's great documentation for the package here.

julia> using RCall
julia> using DataArrays, DataFrames


We immediately have access to R's builtin data sets and we can display them using rprint().

julia> rprint(:HairEyeColor)
, , Sex = Male

Eye
Hair    Brown Blue Hazel Green
Black    32   11    10     3
Brown    53   50    25    15
Red      10   10     7     7
Blond     3   30     5     8

, , Sex = Female

Eye
Hair    Brown Blue Hazel Green
Black    36    9     5     2
Brown    66   34    29    14
Red      16    7     7     7
Blond     4   64     5     8


We can also copy those data across from R to Julia.

julia> airquality = DataFrame(:airquality);
julia> head(airquality)                    # First six rows.
6x6 DataFrame
| Row | Ozone | Solar.R | Wind | Temp | Month | Day |
|-----|-------|---------|------|------|-------|-----|
| 1   | 41    | 190     | 7.4  | 67   | 5     | 1   |
| 2   | 36    | 118     | 8.0  | 72   | 5     | 2   |
| 3   | 12    | 149     | 12.6 | 74   | 5     | 3   |
| 4   | 18    | 313     | 11.5 | 62   | 5     | 4   |
| 5   | NA    | NA      | 14.3 | 56   | 5     | 5   |
| 6   | 28    | NA      | 14.9 | 66   | 5     | 6   |


rcopy() provides a high-level interface to function calls in R.

julia> rcopy("runif(3)")
3-element Array{Float64,1}:
0.752226
0.683104
0.290194


However, for some complex objects there is no simple way to translate between R and Julia, and in these cases rcopy() fails. We can see in the case below that the object of class lm returned by lm() does not diffuse intact across the R-Julia membrane.

julia> "fit <- lm(bwt ~ ., data = MASS::birthwt)" |> rcopy
ERROR: rcopy has no method matching rcopy(::LangSxp)
in rcopy at no file
in map_to! at abstractarray.jl:1311
in map_to! at abstractarray.jl:1320
in map at abstractarray.jl:1331
in rcopy at /home/colliera/.julia/v0.3/RCall/src/sexp.jl:131
in rcopy at /home/colliera/.julia/v0.3/RCall/src/iface.jl:35
in |> at operators.jl:178


But the call to lm() was successful and we can still look at the results.

julia> rprint(:fit)

Call:
lm(formula = bwt ~ ., data = MASS::birthwt)

Coefficients:
(Intercept)          low          age          lwt         race
3612.51     -1131.22        -6.25         1.05      -100.90
smoke          ptl           ht           ui          ftv
-174.12        81.34      -181.95      -336.78        -7.58


You can use R to generate plots with either the base functionality or that provided by libraries like ggplot2 or lattice.

julia> reval("plot(1:10)");                # Will pop up a graphics window...
julia> reval("library(ggplot2)");
julia> rprint("ggplot(MASS::birthwt, aes(x = age, y = bwt)) + geom_point() + theme_classic()")
julia> reval("dev.off()")                  # ... and close the window.


Watch the videos below for some other perspectives on multi-language programming with Julia. Also check out the complete code for today (including examples with C++, FORTRAN and Python) on github.

# #MonthOfJulia Day 24: Graphs

If you're not too familiar with Graph Theory, it might be a good idea to take a moment to get the basics. Graphs are an extremely versatile data structure for storing data consisting of linked entities. I'm going to look at two packages for managing graphs in Julia: LightGraphs and Graphs.

## LightGraphs

As usual, the first step is to load the package.

julia> using LightGraphs


LightGraphs has methods which generate a selection of standard graphs like StarGraph(), WheelGraph() and FruchtGraph(). There are also functions for random graphs, for example, erdos_renyi() and watts_strogatz(). We'll start off by creating two small graphs. One will have 10 nodes connected by 20 random edges. The other will be a directed star graph consisting of four nodes, the central node being connected to every other node.

julia> g1 = Graph(10, 20)
{10, 20} undirected graph
julia> g2 = StarDiGraph(4)
{4, 3} directed graph
julia> edges(g2)
Set{Pair{Int64,Int64}}({edge 1 - 2,edge 1 - 4,edge 1 - 3})


It's simple to find the degree and neighbours of a given node.

julia> degree(g1, 4)                       # How many neighbours for vertex 4?
6
julia> neighbors(g1, 4)                    # Find neighbours of vertex 4
6-element Array{Int64,1}:
1
3
6
2
9
7


There's a straightforward means to add and remove edges from the graph.

julia> add_edge!(g1, 4, 8)                 # Add edge between vertices 4 and 8
edge 4 - 8
julia> rem_edge!(g1, 4, 6)                 # Remove edge between vertices 4 and 6
edge 6 - 4


The package has functionality for performing high level tests on the graph (checking, for instance, whether it is cyclic or connected). There's also support for path based algorithms, but we'll dig into those when we look at the Graphs package.
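A quick sketch of the sort of tests available (on the random graph from earlier, connectivity is not guaranteed, but with 20 edges on 10 vertices a cycle is):

```julia
using LightGraphs

g = Graph(10, 20)  # 10 vertices, 20 random edges

is_connected(g) # can every vertex reach every other vertex?
is_cyclic(g)    # true -- 20 edges on 10 vertices forces a cycle
```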

## Graphs

Before we get started with the Graphs package you might want to restart your Julia session to purge all of that LightGraphs goodness. Take a moment to browse the Graphs.jl documentation, which is very comprehensive.

julia> using Graphs


As with LightGraphs, there are numerous options for generating standard graphs.

julia> g1a = simple_frucht_graph()
Undirected Graph (20 vertices, 18 edges)
julia> g1b = simple_star_graph(8)
Directed Graph (8 vertices, 7 edges)
julia> g1c = simple_wheel_graph(8)
Directed Graph (8 vertices, 14 edges)


Graphs uses the GraphViz library to generate plots.

julia> plot(g1a)


Of course, a graph can also be constructed manually.

julia> g2 = simple_graph(4)
Directed Graph (4 vertices, 0 edges)
julia> add_edge!(g2, 1, 2)
edge [1]: 1 -- 2
julia> add_edge!(g2, 1, 3)
edge [2]: 1 -- 3
julia> add_edge!(g2, 2, 3)
edge [3]: 2 -- 3


Individual vertices (a vertex is the same as a node) can be interrogated. Since we are considering a directed graph we look separately at the edges exiting and entering a node.

julia> num_vertices(g2)
4
julia> vertices(g2)
1:4
julia> out_degree(1, g2)
2
julia> out_edges(1, g2)
2-element Array{Edge{Int64},1}:
edge [1]: 1 -- 2
edge [2]: 1 -- 3
julia> in_degree(2, g2)
1
julia> in_edges(2, g2)
1-element Array{Edge{Int64},1}:
edge [1]: 1 -- 2


Vertices can be created with labels and attributes.

julia> V1 = ExVertex(1, "V1");
julia> V1.attributes["size"] = 5.0
5.0
julia> V2 = ExVertex(2, "V2");
julia> V2.attributes["size"] = 3.0
3.0
julia> V3 = ExVertex(3, "V3")
vertex [3] "V3"


Those vertices can then be used to define edges, which in turn can have labels and attributes.

julia> E1 = ExEdge(1, V1, V2)
edge [1]: vertex [1] "V1" -- vertex [2] "V2"
julia> E1.attributes["distance"] = 50
50
julia> E1.attributes["color"] = "green"
"green"


Finally the collection of vertices and edges can be gathered into a graph.

julia> g3 = edgelist([V1, V2], [E1], is_directed = true)
Directed Graph (2 vertices, 1 edges)


It's possible to systematically visit all connected vertices in a graph, applying an operation at every vertex. traverse_graph() performs the graph traversal using either a depth first or breadth first algorithm. In the sample code below the operation applied at each vertex is LogGraphVisitor(), which is a simple logger.

julia> traverse_graph(g1c, DepthFirst(), 1, LogGraphVisitor(STDOUT))
discover vertex: 1
examine neighbor: 1 -> 2 (vertexcolor = 0, edgecolor= 0)
discover vertex: 2
open vertex: 2
examine neighbor: 2 -> 3 (vertexcolor = 0, edgecolor= 0)
discover vertex: 3
open vertex: 3
examine neighbor: 3 -> 4 (vertexcolor = 0, edgecolor= 0)
discover vertex: 4
open vertex: 4
examine neighbor: 4 -> 5 (vertexcolor = 0, edgecolor= 0)
discover vertex: 5
open vertex: 5
examine neighbor: 5 -> 6 (vertexcolor = 0, edgecolor= 0)
discover vertex: 6
open vertex: 6
examine neighbor: 6 -> 7 (vertexcolor = 0, edgecolor= 0)
discover vertex: 7
open vertex: 7
examine neighbor: 7 -> 8 (vertexcolor = 0, edgecolor= 0)
discover vertex: 8
open vertex: 8
examine neighbor: 8 -> 2 (vertexcolor = 1, edgecolor= 0)
close vertex: 8
close vertex: 7
close vertex: 6
close vertex: 5
close vertex: 4
close vertex: 3
close vertex: 2
examine neighbor: 1 -> 3 (vertexcolor = 2, edgecolor= 0)
examine neighbor: 1 -> 4 (vertexcolor = 2, edgecolor= 0)
examine neighbor: 1 -> 5 (vertexcolor = 2, edgecolor= 0)
examine neighbor: 1 -> 6 (vertexcolor = 2, edgecolor= 0)
examine neighbor: 1 -> 7 (vertexcolor = 2, edgecolor= 0)
examine neighbor: 1 -> 8 (vertexcolor = 2, edgecolor= 0)
close vertex: 1


We can use Dijkstra's Algorithm to calculate the distance from a given vertex to all other vertices in the graph. We see, for instance, that the distance from vertex 1 to vertex 4 is three steps. Since vertex 1 and vertex 20 are not connected, the distance between them is infinite. There are a couple of other algorithms available for calculating shortest paths.

julia> distances = ones(num_edges(g1a));   # Assign distance of 1 to each edge.
julia> d = dijkstra_shortest_paths(g1a, distances, 1);
julia> d.dists                             # Vector of distances to all other vertices.
20-element Array{Float64,1}:
0.0
1.0
2.0
3.0
3.0
2.0
1.0
1.0
3.0
4.0
2.0
2.0
Inf
Inf
Inf
Inf
Inf
Inf
Inf
Inf


As with most of the packages that I have looked at already, the functionality summarised above is just a small subset of what's available. Have a look at the home pages for these packages and check out the full code for today (which looks at a number of other features) on github. Some time in the future I plan on looking at the EvolvingGraphs package, which caters for graphs whose structure changes with time.

# #MonthOfJulia Day 23: Data Structures

Although Julia has integrated support for various data structures (arrays, tuples, dictionaries, sets), it doesn't exhaust the full gamut of options. More exotic structures (like queues and deques, stacks, counters, heaps, tries and variations on sets and dictionaries) are implemented in the DataStructures package.

julia> using DataStructures


I won't attempt to illustrate all structures offered by the package (that would make for an absurdly dull post), but focus instead on queues and counters. The remaining types are self-explanatory and well illustrated in the package documentation.

## Queue

Let's start off with a queue. The data type being queued must be specified at instantiation. We'll make a queue which can hold items of Any type. Can't get more general than that.

julia> queue = Queue(Any);


The rules of a queue are such that new items are always added to the back. Adding items is done with enqueue!().

julia> enqueue!(queue, "First in.");
julia> for n in [2:4]; enqueue!(queue, n); end
julia> enqueue!(queue, "Last in.")
Queue{Deque{Any}}(Deque [{"First in.",2,3,4,"Last in."}])
julia> length(queue)
5


The queue now holds five items. We can take a look at the items at the front and back of the queue using front() and back(). Note that indexing does not work on a queue (that would violate the principles of queuing!).

julia> front(queue)
"First in."
julia> back(queue)
"Last in."


Finally we can remove items from the front of the queue using dequeue!(). The queue implements FIFO (which is completely different from the other form of FIFO, which I only discovered today).

julia> dequeue!(queue)
"First in."


## Counter

The counter() function returns an Accumulator object, which is used to assemble item counts.

julia> cnt = counter(ASCIIString)
Accumulator{ASCIIString,Int64}(Dict{ASCIIString,Int64}())


Using a Noah's Ark example we'll count the instances of different types of domestic animals.

julia> push!(cnt, "dog")                   # Add 1 dog
1
julia> push!(cnt, "cat", 3)                # Add 3 cats
3
julia> push!(cnt, "cat")                   # Add another cat (returns current count)
4
julia> push!(cnt, "mouse", 5)              # Add 5 mice
5


Let's see what the counter looks like now.

julia> cnt
Accumulator{ASCIIString,Int64}(["mouse"=>5,"cat"=>4,"dog"=>1])


We can return (and remove) the count for a particular item using pop!().

julia> pop!(cnt, "cat")
4
julia> cnt["cat"]                          # How many cats do we have now? All gone.
0


And simply accessing the count for an item is done using [] indexing notation.

julia> cnt["mouse"]                        # But we still have a handful of mice.
5


I've just finished reading through the second early access version of Learn Julia by Chris von Csefalvay. In the chapter on Strings the author presents a nice example in which he counts the number of times each character speaks in Shakespeare's Hamlet. I couldn't help but think that this would've been even more elegant using an Accumulator.
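Something along these lines (the speeches array below is a made-up stand-in, not the real play text):

```julia
using DataStructures

# Hypothetical data: one entry per speech in the play.
speeches = ["HAMLET", "HORATIO", "HAMLET", "OPHELIA", "HAMLET"]

speakers = counter(ASCIIString)
for s in speeches
    push!(speakers, s)
end
speakers["HAMLET"] # 3
```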

Tomorrow we'll take a look at an extremely useful data structure: a graph. Until then, feel free to check out the full code for today on github.

# #MonthOfJulia Day 22: Optimisation

Sudoku-as-a-Service is a great illustration of Julia's integer programming facilities. Julia has several packages which implement various flavours of optimisation: JuMP, JuMPeR, Gurobi, CPLEX, DReal, CoinOptServices and OptimPack. We're not going to look at anything quite as elaborate as Sudoku today, but focus instead on finding the extrema in some simple (or perhaps not so simple) mathematical functions. At this point you might find it interesting to browse through this catalog of test functions for optimisation.

## Optim

We'll start out by using the Optim package to find extrema in Himmelblau's function:

$f(x, y) = (x^2+y-11)^2 + (x+y^2-7)^2$.

This function has one maximum and four minima. One of the minima is conveniently located at

$(x, y) = (3, 2)$.
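We can confirm that minimum by hand before reaching for any optimiser:

```julia
himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

himmelblau([3.0, 2.0]) # 0.0 -- both squared terms vanish at (3, 2)
```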

As usual the first step is to load the required package.

julia> using Optim


Then we set up the objective function along with its gradient and Hessian functions.

julia> function himmelblau(x::Vector)
(x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2
end
himmelblau (generic function with 1 method)
julia> function himmelblau_gradient!(x::Vector, gradient::Vector)
gradient[1] = 4 * x[1] * (x[1]^2 + x[2] - 11) + 2 * (x[1] + x[2]^2 - 7)
gradient[2] = 2 * (x[1]^2 + x[2] - 11) + 4 * x[2] * (x[1] + x[2]^2 - 7)
end
himmelblau_gradient! (generic function with 1 method)
julia> function himmelblau_hessian!(x::Vector, hessian::Matrix)
hessian[1, 1] = 4 * (x[1]^2 + x[2] - 11) + 8 * x[1]^2 + 2
hessian[1, 2] = 4 * x[1] + 4 * x[2]
hessian[2, 1] = 4 * x[1] + 4 * x[2]
hessian[2, 2] = 4 * (x[1] + x[2]^2 -  7) + 8 * x[2]^2 + 2
end
himmelblau_hessian! (generic function with 1 method)


There are a number of algorithms at our disposal. We'll start with the Nelder-Mead method, which uses only the objective function itself. I am very happy with the detailed output provided by the optimize() function, and clearly it converges on a result which is very close to what we expected.

julia> optimize(himmelblau, [2.5, 2.5], method = :nelder_mead)
Results of Optimization Algorithm
* Starting Point: [2.5,2.5]
* Minimum: [3.0000037281643586,2.0000105449945313]
* Value of Function at Minimum: 0.000000
* Iterations: 35
* Convergence: true
* |x - x'| < NaN: false
* |f(x) - f(x')| / |f(x)| < 1.0e-08: true
* |g(x)| < NaN: false
* Exceeded Maximum Number of Iterations: false
* Objective Function Calls: 69


Next we'll look at the limited-memory version of the BFGS algorithm. This can be applied either with or without an explicit gradient function. In this case we'll provide the gradient function defined above. Again we converge on the right result, but this time with far fewer iterations required.

julia> optimize(himmelblau, himmelblau_gradient!, [2.5, 2.5], method = :l_bfgs)
Results of Optimization Algorithm
* Algorithm: L-BFGS
* Starting Point: [2.5,2.5]
* Minimum: [2.999999999999385,2.0000000000001963]
* Value of Function at Minimum: 0.000000
* Iterations: 6
* Convergence: true
* |x - x'| < 1.0e-32: false
* |f(x) - f(x')| / |f(x)| < 1.0e-08: false
* |g(x)| < 1.0e-08: true
* Exceeded Maximum Number of Iterations: false
* Objective Function Calls: 25


Finally we'll try out Newton's method, where we'll provide both gradient and Hessian functions. The result is spot on and we've shaved off one iteration. Very nice indeed!

julia> optimize(himmelblau, himmelblau_gradient!, himmelblau_hessian!, [2.5, 2.5],
method = :newton)
Results of Optimization Algorithm
* Algorithm: Newton's Method
* Starting Point: [2.5,2.5]
* Minimum: [3.0,2.0]
* Value of Function at Minimum: 0.000000
* Iterations: 5
* Convergence: true
* |x - x'| < 1.0e-32: false
* |f(x) - f(x')| / |f(x)| < 1.0e-08: true
* |g(x)| < 1.0e-08: true
* Exceeded Maximum Number of Iterations: false
* Objective Function Calls: 19


There is also a Simulated Annealing solver in the Optim package.

## NLopt

NLopt is an optimisation library with interfaces for a variety of programming languages and a broad selection of algorithms. We'll apply both a gradient-based and a derivative-free technique to maximise the function

$\sin\alpha \cos\beta$

subject to the constraints

$2 \alpha \leq \beta$

and

$\beta \leq \pi/2$.
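Both constraints can be massaged into the generic linear form $a x_1 + b x_2 - c \leq 0$ (the form taken by the constraint function further down): the first becomes

$2 \alpha - \beta \leq 0 \quad \Rightarrow \quad (a, b, c) = (2, -1, 0)$

and the second

$2 \beta - \pi \leq 0 \quad \Rightarrow \quad (a, b, c) = (0, 2, \pi)$.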

Before we load the NLopt package, it's a good idea to restart your Julia session to flush out any remnants of the Optim package.

julia> using NLopt


We'll need to write the objective function and a generalised constraint function.

julia> count = 0;
julia> function objective(x::Vector, grad::Vector)
if length(grad) > 0
grad[1] = cos(x[1]) * cos(x[2])
grad[2] = - sin(x[1]) * sin(x[2])
end

global count
count::Int += 1
println("Iteration $count: $x")

sin(x[1]) * cos(x[2])
end
objective (generic function with 1 method)
julia> function constraint(x::Vector, grad::Vector, a, b, c)
if length(grad) > 0
grad[1] = a
grad[2] = b
end
a * x[1] + b * x[2] - c
end
constraint (generic function with 1 method)


The COBYLA (Constrained Optimization BY Linear Approximations) algorithm is a local optimiser which doesn't use the gradient function.

julia> opt = Opt(:LN_COBYLA, 2);                       # Algorithm and dimension of problem
julia> ndims(opt)
2
julia> algorithm(opt)
:LN_COBYLA
julia> algorithm_name(opt)                             # Text description of algorithm
"COBYLA (Constrained Optimization BY Linear Approximations) (local, no-derivative)"


We impose generous upper and lower bounds on the solution space and use two inequality constraints. Either min_objective!() or max_objective!() is used to specify the objective function and whether or not it is a minimisation or maximisation problem. Constraints can be either inequalities using inequality_constraint!() or equalities using equality_constraint!().

julia> lower_bounds!(opt, [0., 0.])
julia> upper_bounds!(opt, [pi, pi])
julia> xtol_rel!(opt, 1e-6)
julia> max_objective!(opt, objective)
julia> inequality_constraint!(opt, (x, g) -> constraint(x, g, 2, -1, 0), 1e-8)
julia> inequality_constraint!(opt, (x, g) -> constraint(x, g, 0, 2, pi), 1e-8)


After making an initial guess we let the algorithm loose. I've purged some of the output to spare you from the floating point deluge.

julia> initial = [0, 0];                               # Initial guess
julia> (maxf, maxx, ret) = optimize(opt, initial)
Iteration 1: [0.0,0.0]
Iteration 2: [0.7853981633974483,0.0]
Iteration 3: [0.7853981633974483,0.7853981633974483]
Iteration 4: [0.0,0.17884042066163552]
Iteration 5: [0.17562036827601815,0.3512407365520363]
Iteration 6: [0.5268611048280544,1.053722209656109]
Iteration 7: [0.7853981633974481,1.5707963267948961]
Iteration 8: [0.7526175675681757,0.9963866471510139]
Iteration 9: [0.785398163397448,1.570796326794896]
Iteration 10: [0.35124073655203625,0.7024814731040726]
.
.
.
Iteration 60: [0.42053333513020824,0.8410666702604165]
Iteration 61: [0.42053467500728553,0.8410693500145711]
Iteration 62: [0.4205360148843628,0.8410720297687256]
Iteration 63: [0.4205340050687469,0.8410680101374938]
Iteration 64: [0.4205340249920041,0.8410677994554656]
Iteration 65: [0.42053333513020824,0.8410666702604165]
Iteration 66: [0.42053456716611504,0.8410679945560181]
Iteration 67: [0.42053333513020824,0.8410666702604165]
Iteration 68: [0.42053365382801033,0.8410673076560207]
(0.27216552697496077,[0.420534,0.841067],:XTOL_REACHED)
(0.27216552697496077,[0.420534,0.841067],:XTOL_REACHED)
julia> println("got $maxf at $maxx after $count iterations.")
got 0.27216552697496077 at [0.42053365382801033,0.8410673076560207] after 68 iterations.


It takes a number of iterations to converge, but arrives at a solution which seems eminently reasonable (and which satisfies both of the constraints). Next we'll use the MMA (Method of Moving Asymptotes) gradient-based algorithm.

julia> opt = Opt(:LD_MMA, 2);


We remove the second inequality constraint and simply confine the solution space appropriately. This is definitely a more efficient approach!

julia> lower_bounds!(opt, [0., 0.])
julia> upper_bounds!(opt, [pi, pi / 2])
julia> xtol_rel!(opt, 1e-6)
julia> max_objective!(opt, objective)
julia> inequality_constraint!(opt, (x, g) -> constraint(x, g, 2, -1, 0), 1e-8)


This algorithm converges more rapidly (because it takes advantage of the gradient function!) and we arrive at the same result.

julia> (maxf, maxx, ret) = optimize(opt, initial)
Iteration 1: [0.0,0.0]
Iteration 2: [0.046935706114911574,0.12952531487499092]
Iteration 3: [0.1734128499487191,0.5065804625164063]
Iteration 4: [0.3449211909390502,0.7904095832845456]
Iteration 5: [0.4109653874949588,0.8281977630709889]
Iteration 6: [0.41725447118163134,0.8345944447401356]
Iteration 7: [0.4188068871033356,0.8376261095301502]
Iteration 8: [0.4200799333613666,0.8401670014914709]
Iteration 9: [0.4203495290598476,0.8406993867808531]
Iteration 10: [0.4205138682235357,0.8410278412850836]
Iteration 11: [0.4205289336960578,0.8410578710185219]
Iteration 12: [0.42053231747822034,0.8410646372592685]
Iteration 13: [0.42053444274035756,0.8410688833806734]
Iteration 14: [0.4205343574933894,0.8410687141629858]
Iteration 15: [0.4205343707980632,0.8410687434944638]
Iteration 16: [0.420534312041705,0.8410686169530415]
Iteration 17: [0.4205343317839936,0.8410686604482764]
Iteration 18: [0.42053433111342814,0.8410686565253115]
Iteration 19: [0.42053433035398824,0.8410686525997696]
(0.27216552944315736,[0.420534,0.841069],:XTOL_REACHED)
julia> println("got $maxf at $maxx after $count iterations.")
got 0.27216552944315736 at [0.42053433035398824,0.8410686525997696] after 19 iterations.


I'm rather impressed. Both of these packages provide convenient interfaces and I could solve my test problems without too much effort. Have a look at the videos below for more about optimisation in Julia and check out github for the complete code for today's examples. We'll kick off next week with a quick look at some alternative data structures.

# #MonthOfJulia Day 21: Differential Equations

Yesterday we had a look at Julia's support for Calculus. The next logical step is to solve some differential equations. We'll look at two packages today: Sundials and ODE.

## Sundials

The Sundials package is based on a library which implements a number of solvers for differential equations. First off you'll need to install that library. In Ubuntu this is straightforward using the package manager. Alternatively you can download the source distribution.

sudo apt-get install libsundials-serial-dev


Next install the Julia package and load it.

julia> Pkg.add("Sundials")
julia> using Sundials


To demonstrate we'll look at a standard "textbook" problem: a damped harmonic oscillator (mass on a spring with friction). This is a second order differential equation with general form

$\ddot{x} + a \dot{x} + b x = 0$

where $x$ is the displacement of the oscillator, while $a$ and $b$ characterise the damping coefficient and spring stiffness respectively. To solve this numerically we need to convert it into a system of first order equations:

\begin{aligned} \dot{x} &= v \\ \dot{v} &= - a v - b x \end{aligned}

We'll write a function for those relationships and assign specific values to $a$ and $b$.

julia> function oscillator(t, y, ydot)
           ydot[1] = y[2]
           ydot[2] = - 3 * y[1] - y[2] / 10
       end
oscillator (generic function with 2 methods)
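
Before integrating numerically it's worth noting the analytic behaviour (a standard textbook result, not from the Sundials documentation). Here $a = 1/10$ and $b = 3$, so $a^2 < 4b$ and the oscillator is underdamped, with general solution

$x(t) = e^{-at/2} \left( C_1 \cos \omega t + C_2 \sin \omega t \right) \quad \text{where} \quad \omega = \sqrt{b - a^2/4} \approx 1.732$

So we should expect oscillations of period $2 \pi / \omega \approx 3.6$ inside an envelope that decays with time constant $2 / a = 20$.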


Next the initial conditions and time steps for the solution.

julia> initial = [1.0, 0.0];                   # Initial conditions
julia> t = float([0:0.125:30]);                # Time steps


And finally use cvode() to integrate the system.

julia> xv = Sundials.cvode(oscillator, initial, t);
julia> xv[1:5,:]
5x2 Array{Float64,2}:
1.0        0.0
0.97676   -0.369762
0.908531  -0.717827
0.799076  -1.02841
0.65381   -1.28741


The results for the first few time steps look reasonable: the displacement (left column) is decreasing and the velocity (right column) is becoming progressively more negative. To be sure that the solution has the correct form, have a look at the Gadfly plot below. The displacement (black) and velocity (blue) curves are 90° out of phase, as expected, and both gradually decay with time due to damping. Looks about right to me!
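If you'd like to sanity-check those numbers independently of Sundials, here's a minimal sketch in Python (my own helper names, not anything from the Sundials API): a single hand-rolled classic fourth-order Runge-Kutta step of the same system lands very close to the second row of the output above.

```python
def rk4_step(f, t, y, h):
    # One classic fourth-order Runge-Kutta step for y' = f(t, y).
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h/6 * (a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def oscillator(t, y):
    # x' = v, v' = -3x - v/10 (the same system handed to cvode above).
    return [y[1], -3 * y[0] - y[1] / 10]

y = rk4_step(oscillator, 0.0, [1.0, 0.0], 0.125)
print(y)  # ≈ [0.97675, -0.36975], matching the second Sundials row
```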

## ODE

The ODE package provides a selection of solvers, all of which are implemented with a consistent interface (which differs a bit from Sundials).

julia> using ODE


Again we need to define a function to characterise our differential equations. The form of the function is a little different with the ODE package: rather than passing the derivative vector by reference, it's simply returned as the result. I've considered the same problem as above, but to spice things up I've added a sinusoidal driving force.

julia> function oscillator(t, y)
           [y[2]; - a * y[1] - y[2] / 10 + sin(t)]
       end
oscillator (generic function with 2 methods)


We'll solve this with ode23(), which is a second order adaptive solver with third order error control. Because it's adaptive we don't need to explicitly specify all of the time steps, just the minimum and maximum.

julia> a = 1;                                  # Resonant
julia> T, xv = ode23(oscillator, initial, [0.; 40]);
julia> xv = hcat(xv...).';                     # Vector{Vector{Float}} -> Matrix{Float}


The results are plotted below. Driving the oscillator at the resonant frequency causes the amplitude of oscillation to grow with time as energy is transferred to the oscillating mass.
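
For context (standard driven-oscillator theory, not from the original post): writing the system above as $\ddot{x} + \gamma \dot{x} + \omega_0^2 x = \sin(\omega t)$ with $\gamma = 1/10$ and $\omega_0^2 = a$, the steady-state amplitude is

$A(\omega) = \frac{1}{\sqrt{(\omega_0^2 - \omega^2)^2 + \gamma^2 \omega^2}}$

With $a = 1$ and driving at $\omega = 1$ this gives $A = 1/(\gamma \omega) = 10$, so the amplitude grows towards a large (but finite) steady-state value.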

If we move the oscillator away from resonance the behaviour becomes rather interesting.

julia> a = 3;                                  # Far from resonance


Now, because the oscillation and the driving force aren't synchronised (and there's a non-rational relationship between their frequencies) the displacement and velocity appear to change irregularly with time.

How about a double pendulum (a pendulum with a second pendulum suspended from its end)? This seemingly simple system exhibits a rich range of dynamics. Its behaviour is sensitive to initial conditions, one of the characteristics of chaotic systems.

First we set up the first order equations of motion. The details of this system are explained in the video below.

julia> function pendulum(t, y)
           Y = [
               6 * (2 * y[3] - 3 * cos(y[1] - y[2]) * y[4]) / (16 - 9 * cos(y[1] - y[2])^2);
               6 * (8 * y[4] - 3 * cos(y[1] - y[2]) * y[3]) / (16 - 9 * cos(y[1] - y[2])^2)
           ]
           [
               Y[1];
               Y[2];
               - (Y[1] * Y[2] * sin(y[1] - y[2]) + 3 * sin(y[1])) / 2;
               - (sin(y[2]) - Y[1] * Y[2] * sin(y[1] - y[2])) / 2;
           ]
       end
pendulum (generic function with 1 method)


Define initial conditions and let it run...

julia> initial = [pi / 4, 0, 0, 0];            # Initial conditions -> deterministic behaviour
julia> T, xv = ode23(pendulum, initial, [0.; 40]);


Below are two plots which show the results. The first is a time series showing the angular displacement of the first (black) and second (blue) mass. Next is a phase space plot which shows a different view of the same variables. It's clear to see that there is a regular systematic relationship between them.

Next we'll look at a different set of initial conditions. This time both masses are initially located above the primary vertex of the pendulum. This represents an initial configuration with much more potential energy.

julia> initial = [3/4 * pi, pi, 0, 0];         # Initial conditions -> chaotic behaviour


The same pair of plots now illustrate much more interesting behaviour. Note the larger range of angles, θ2, achieved by the second bob. With these initial conditions the pendulum is sufficiently energetic for it to "flip over". Look at the video below to get an idea of what this looks like with a real pendulum.

It's been a while since I've played with any Physics problems. That was fun. The full code for today is available at github. Come back tomorrow when I'll take a look at Optimisation in Julia.

# #MonthOfJulia Day 20: Calculus

Mathematica is the de facto standard for symbolic differentiation and integration. But many other languages also have great facilities for Calculus. For example, R has the deriv() function in the base stats package as well as the numDeriv, Deriv and Ryacas packages. Python has NumPy and SymPy.

Let's check out what Julia has to offer.

## Numerical Differentiation

julia> using Calculus


The derivative() function will evaluate the numerical derivative at a specific point.

julia> derivative(x -> sin(x), pi)
-0.9999999999441258
julia> derivative(sin, pi, :central)       # Options: :forward, :central or :complex
-0.9999999999441258
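
Under the hood the :central option uses a symmetric difference quotient. The same idea is a one-liner in plain Python (the helper name is my own, just a sketch):

```python
import math

def central_diff(f, x, h=1e-6):
    # Symmetric difference quotient: error is O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

print(central_diff(math.sin, math.pi))  # ≈ -1.0, matching derivative(sin, pi)
```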


There's also a prime notation which does the same thing (and neatly handles higher order derivatives).

julia> f(x) = sin(x);
julia> f'(0.0)                             # cos(x)
0.9999999999938886
julia> f''(0.0)                            # -sin(x)
0.0
julia> f'''(0.0)                           # -cos(x)
-0.9999977482682358


There are functions for second derivatives, gradients (for multivariate functions) and Hessian matrices too. Related packages for derivatives are ForwardDiff and ReverseDiffSource.

## Symbolic Differentiation

Symbolic differentiation works for univariate and multivariate functions expressed as strings.

julia> differentiate("sin(x)", :x)
:(cos(x))
julia> differentiate("sin(x) + exp(-y)", [:x, :y])
2-element Array{Any,1}:
:(cos(x))
:(-(exp(-y)))


It also works for expressions.

julia> differentiate(:(x^2 * y * exp(-x)), :x)
:((2x) * y * exp(-x) + x^2 * y * -(exp(-x)))
julia> differentiate(:(sin(x) / x), :x)
:((cos(x) * x - sin(x)) / x^2)


Have a look at the JuliaDiff project which is aggregating resources for differentiation in Julia.

## Numerical Integration

Numerical integration is a snap.

julia> integrate(x -> 1 / (1 - x), -1, 0)
0.6931471805602638


Compare that with the analytical result. Nice.

julia> diff(map(x -> - log(1 - x), [-1, 0]))
1-element Array{Float64,1}:
0.693147


By default the integral is evaluated using Simpson's Rule. However, we can also use Monte Carlo integration.

julia> integrate(x -> 1 / (1 - x), -1, 0, :monte_carlo)
0.6930203819567551
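
Both methods are easy to sketch from scratch. Here's a minimal plain-Python version of composite Simpson's rule and naive Monte Carlo integration applied to the same integral (the function names are my own; this is not the Calculus package's implementation):

```python
import math
import random

def simpson(f, a, b, n=1000):
    # Composite Simpson's rule over n subintervals (n must be even).
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

def monte_carlo(f, a, b, n=200_000, seed=1):
    # Mean of f at uniform random samples, scaled by the interval width.
    rng = random.Random(seed)
    return (b - a) * sum(f(a + (b - a) * rng.random()) for _ in range(n)) / n

def f(x):
    return 1 / (1 - x)

print(simpson(f, -1, 0))      # ≈ 0.693147 (log 2)
print(monte_carlo(f, -1, 0))  # ≈ 0.693, but noisier
```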


## Symbolic Integration

There is also an interface to the SymPy Python library for symbolic computation. Documentation can be found here. You might want to restart your Julia session before loading the SymPy package.

julia> using SymPy


Revisiting the same definite integral from above we find that we now have an analytical expression as the result.

julia> integrate(1 / (1 - x), (x, -1, 0))
log(2)
julia> convert(Float64, ans)
0.6931471805599453


To perform symbolic integration we need to first define a symbolic object using Sym().

julia> x = Sym("x");                       # Creating a "symbolic object"
julia> typeof(x)
Sym (constructor with 6 methods)
julia> sin(x) |> typeof                    # f(symbolic object) is also a symbolic object
Sym (constructor with 6 methods)


There's more to be said about symbolic objects (they are the basis of pretty much everything in SymPy), but we are just going to jump ahead to constructing a function and integrating it.

julia> f(x) = cos(x) - sin(x) * cos(x);
julia> integrate(f(x), x)
     2
  sin (x)
- ─────── + sin(x)
     2
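
As a quick check, differentiating that result recovers the integrand:

$\frac{d}{dx}\left(-\frac{\sin^2 x}{2} + \sin x\right) = -\sin x \cos x + \cos x = \cos x - \sin x \cos x = f(x)$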


What about an integral with constant parameters? No problem.

julia> k = Sym("k");
julia> integrate(1 / (x + k), x)
log(k + x)


We have really only scratched the surface of SymPy. The capabilities of this package are deep and broad. It's seriously worth checking out the documentation if you are interested in symbolic computation.

I'm not ready to throw away my dated version of Mathematica just yet, but I'll definitely be using this functionality often. Come back tomorrow when I'll take a look at solving differential equations with Julia.

# #MonthOfJulia Day 19: Units

The packages we'll be looking at today should bring joy to the hearts of all Physical Scientists. Actually they should make any flavour of Scientist happy.

It is natural for man to relate the units of distance by which he travels to the dimensions of the globe that he inhabits. Thus, in moving about the earth, he may know by the simple denomination of distance its proportion to the whole circuit of the earth. This has the further advantage of making nautical and celestial measurements correspond. The navigator often needs to determine, one from the other, the distance he has traversed from the celestial arc lying between the zeniths at his point of departure and at his destination. It is important, therefore, that one of these magnitudes should be the expression of the other, with no difference except in the units. But to that end, the fundamental linear unit must be an aliquot part of the terrestrial meridian. ... Thus, the choice of the metre was reduced to that of the unity of angles.
Pierre-Simon Laplace

## SIUnits

The SIUnits package provides unit-checked operations for quantities expressed in SI units.

julia> using SIUnits
julia> using SIUnits.ShortUnits


It supports both long and short forms of units and all the expected arithmetic operations.

julia> 1KiloGram + 2kg
3 kg
julia> 4Meter - 2m
2 m
julia> 4m / 2s
2.0 m s⁻¹


Note that it only recognises the American spelling of "meter" and not the (IMHO correct) "metre"! But this is a small matter. And I don't want to engage in any religious wars.

Speaking of small matters, it's possible to define new units of measure. Below we'll define the micron and Angstrom along with their conversion functions.

julia> import Base.convert
julia> Micron = SIUnits.NonSIUnit{typeof(Meter),:µm}()
µm
julia> convert(::Type{SIUnits.SIQuantity},::typeof(Micron)) = Micro*Meter
convert (generic function with 461 methods)
julia> Angstrom = SIUnits.NonSIUnit{typeof(Meter),:Å}()
Å
julia> convert(::Type{SIUnits.SIQuantity},::typeof(Angstrom)) = Nano/10*Meter
convert (generic function with 462 methods)


And now we can freely use these new units in computations.

julia> 5Micron
5 µm
julia> 1Micron + 1m
1000001//1000000 m
julia> 5200Angstrom                        # Green light
5200 Å


Read on below to find out about the Physical package.

## Physical

The Physical package is documented here. Apparently it's not as performant as SIUnits but it does appear to have a wider scope of functionality. We'll use it to address an issue raised on Day 17: converting between Imperial and Metric units.

julia> using Physical


There's a lot of functionality available, but we are going to focus on just one thing: converting pounds and inches into kilograms and metres. First we define a pair of derived units. To do this, of course, we need to know the appropriate conversion factors!

julia> Inch = DerivedUnit("in", 0.0254*Meter)
1 in
julia> Pound = DerivedUnit("lb", 0.45359237*Kilogram)
1 lb


We can then freely change the average heights and weights that we saw earlier from Imperial to Metric units.

julia> asbase(66Inch)
1.6764 m
julia> asbase(139Pound)
63.04933943 kg
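
Just to verify those figures by hand (a trivial Python sketch; the constant names are made up, but the conversion factors are the exact legal definitions):

```python
INCH_TO_M = 0.0254        # 1 inch = 0.0254 m (exact, by definition)
POUND_TO_KG = 0.45359237  # 1 lb = 0.45359237 kg (exact, by definition)

print(66 * INCH_TO_M)     # ≈ 1.6764 (metres)
print(139 * POUND_TO_KG)  # ≈ 63.04933943 (kilograms)
```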


On a related note I've just put together a package of physical constants for Julia.

julia> using PhysicalConstants
julia> PhysicalConstants.MKS.SpeedOfLight
2.99792458e8
julia> PhysicalConstants.MKS.Teaspoon
4.92892159375e-6


Did you know that a teaspoon was 4.92892 millilitres? There I was, wallowing in my ignorance, thinking that it was 5 millilitres. Pfffft. Silly me.

Units can be a contentious issue. Watch the video below to see what Richard Feynman had to say about the profusion of units used by Physicists to measure energy. Also check out the full code for today along with the index to the entire series of #MonthOfJulia posts on github.

For those who want some proof that physicists are human, the proof is in the idiocy of all the different units which they use for measuring energy.
Richard P. Feynman

# #MonthOfJulia Day 18: Plotting

There's a variety of options for plotting in Julia. We'll focus on those provided by Gadfly, Bokeh and Plotly.

Gadfly is the flavour of the month for plotting in Julia. It's based on the Grammar of Graphics, so users of ggplot2 should find it familiar.

To start using Gadfly we'll first need to load the package. To enable generation of PNG, PS, and PDF output we'll also want the Cairo package.

julia> using Gadfly
julia> using Cairo


You can easily generate plots from data vectors or functions.

julia> plot(x = 1:100, y = cumsum(rand(100) - 0.5), Geom.point, Geom.smooth)
julia> plot(x -> x^3 - 9x, -5, 5)


Gadfly plots are by default rendered onto a new tab in your browser. These plots are mildly interactive: you can zoom and pan across the plot area. You can also save plots directly to files of various formats.

julia> dampedsin = plot([x -> sin(x) / x], 0, 50)
julia> draw(PNG("damped-sin.png", 800px, 400px), dampedsin)


Let's load up some data from the nlschools dataset in R's MASS package and look at the relationship between language test score and IQ for pupils, broken down according to whether or not they are in a mixed-grade class.

julia> using RDatasets
julia> plot(dataset("MASS", "nlschools"), x="IQ", y="Lang", color="COMB",
            Geom.point)

Those two examples just scratched the surface. Gadfly can produce histograms, boxplots, ribbon plots, contours and violin plots. There's detailed documentation with numerous examples on the homepage.

Watch the video below (Daniel Jones at JuliaCon 2014) then read on about Bokeh and Plotly.

## Bokeh

Bokeh is a visualisation library for Python. Like D3, it renders plots as JavaScript, viewable in a web browser. In addition to the examples on the library homepage, more can be found on the homepage for Julia's Bokeh package.

The first thing you'll need to do is install the Bokeh library. If you already have a working Python installation then this is easily done from the command line:

\$ pip install bokeh


Next load up the package and generate a simple plot.

julia> using Bokeh
julia> autoopen(true);
julia> x = linspace(0, pi);
julia> y = cos(2 * x);
julia> plot(x, y, title = "Cosine")
Plot("Cosine" with 1 datacolumns)


The plot will be written to a file bokeh_plot.html in the working directory, which will in turn be opened by the browser. Use plotfile() to change the name of the file. The plot is interactive, with functionality to pan and zoom as well as resize the plot window.

## Plotly

The Plotly package provides a complete interface to plot.ly, an online plotting service with interfaces for Python, R, MATLAB and now Julia. To get an idea of what's possible with plot.ly, check out their feed. The first step towards making your own awesomeness will be loading the package.

julia> using Plotly


Next you should set up your plot.ly credentials using Plotly.set_credentials_file(). You only need to do this once because the values will be cached.

Data series are stored in Julia dictionaries.

julia> p1 = ["x" => 1:10, "y" => rand(0:20, 10), "type" => "scatter", "mode" => "markers"];
julia> p2 = ["x" => 1:10, "y" => rand(0:20, 10), "type" => "scatter", "mode" => "lines"];
julia> p3 = ["x" => 1:10, "y" => rand(0:20, 10), "type" => "scatter", "mode" => "lines+markers"];

julia> Plotly.plot([p1, p2, p3], ["filename" => "basic-line", "fileopt" => "overwrite"])
Dict{String,Any} with 5 entries:
"error"    => ""
"message"  => ""
"warning"  => ""
"filename" => "basic-line"
"url"      => "https://plot.ly/~collierab/17"


You can either open the URL provided in the result dictionary or do it programmatically:

julia> Plotly.openurl(ans["url"])


By jumping through a few similar hoops it's possible to create some rather intricate visualisations, like the 3D scatter plot below. For details of how that was done, check out my code on github.

Obviously plotting and visualisation in Julia are hot topics. Other plotting packages worth checking out are PyPlot, Winston and Gaston. Come back tomorrow when we'll take a look at using physical units in Julia.