Filtering Data with L2 Regularisation
I have just finished reading Momentum Strategies with L1 Filter by Tung-Lam Dao. The smoothing results presented in this paper are interesting and I thought it would be cool to implement the L1 and L2 filtering schemes in R. We’ll start with the L2 scheme here because it has an exact solution and I will follow up with the L1 scheme later on.
Formulation of the Problem
Consider some time series data consisting of points $y_t = x_t + \epsilon_t$, where $x_t$ is a smooth signal, $\epsilon_t$ is noise and $y_t$ is the combined noisy signal. If we have observations of $y_t$, how can we get back to an estimate of $x_t$?
The Hodrick-Prescott filter works by minimising the objective function

$$\frac{1}{2} \sum_{t=1}^{n} (y_t - x_t)^2 + \lambda \sum_{t=2}^{n-1} (x_{t-1} - 2 x_t + x_{t+1})^2.$$

The regularisation parameter, $\lambda$, balances the contributions of the first and second summations, where the first is the sum of squared residuals and the second is the sum of squared curvatures in the filtered signal (curvature being characterised by the central difference approximation to the second derivative). A small value of $\lambda$ causes the residuals to dominate the optimisation problem. A large value of $\lambda$ will result in a solution which minimises curvature.
Implementation and Test Data
Implementing a function to perform the optimisation is pretty simple.
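A minimal sketch of such a function follows; the name `l2filter.optim` and its interface are my own choices for illustration, not taken from the paper.

```r
# L2 filter via direct numerical optimisation of the Hodrick-Prescott
# objective. (Function and argument names are illustrative.)
l2filter.optim <- function(y, lambda) {
  objective <- function(x) {
    # Sum of squared residuals plus penalised squared curvature;
    # diff(x, differences = 2) gives x[t+1] - 2 x[t] + x[t-1].
    0.5 * sum((y - x)^2) + lambda * sum(diff(x, differences = 2)^2)
  }
  # Start the search from the observed data themselves.
  optim(y, objective, method = "BFGS", control = list(maxit = 500))$par
}
```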
It has a nested objective function. The BFGS method is specified for optim() because the default Nelder-Mead scheme converged too slowly.
First we’ll try this out on some test data.
If we use $\lambda = 0$ then regularisation has no effect and the objective function is minimised when $x_t = y_t$. Not surprisingly, in this case the filtered signal is the same as the original signal.
If, on the other hand, we use a large value for the regularisation parameter then the filtered signal is significantly different.
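A self-contained illustration of both cases is sketched below; the noisy sine-wave data are invented for demonstration, and the filter is an optimisation-based one in the spirit of the function described above.

```r
set.seed(42)
# Invented test data: a smooth sine signal plus Gaussian noise.
t <- seq(0, 4 * pi, length.out = 100)
y <- sin(t) + rnorm(length(t), sd = 0.25)

# Optimisation-based L2 filter (names illustrative).
l2filter <- function(y, lambda) {
  objective <- function(x)
    0.5 * sum((y - x)^2) + lambda * sum(diff(x, differences = 2)^2)
  optim(y, objective, method = "BFGS", control = list(maxit = 500))$par
}

fit0   <- l2filter(y, 0)    # no regularisation: recovers y itself
fit100 <- l2filter(y, 100)  # strong regularisation: visibly smoother
```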
A plot is the most sensible way to visualise the effects of $\lambda$. Below the original data (circles) are plotted along with the filtered data for values of $\lambda$ from 0.1 to 100. In the top panel, weak regularisation results in a filtered signal which is not too different from the original. At the other extreme, the bottom panel shows strong regularisation where the filtered signal is essentially a straight line (all curvature has been removed). The other two panels represent intermediate levels of regularisation and it is clear how the original signal is being smoothed to varying degrees.
As it happens there is an exact solution to the Hodrick-Prescott optimisation problem, which involves some simple matrix algebra: setting the gradient of the objective function to zero gives $x = (I + 2 \lambda D^\top D)^{-1} y$. The core of the solution is the band matrix $D$, with a right bandwidth of 2; the non-zero elements on each row are 1, -2 and 1. The function below constructs this matrix in a rather naive way. However, it is simply for illustration: we will look at a better implementation using sparse matrices.
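A naive dense construction might look like this (function names are mine, chosen for illustration):

```r
# Dense (n-2) x n second-difference band matrix with rows (1, -2, 1).
second.difference.matrix <- function(n) {
  D <- matrix(0, nrow = n - 2, ncol = n)
  for (i in 1:(n - 2)) D[i, i:(i + 2)] <- c(1, -2, 1)
  D
}

# Exact minimiser of 0.5 * ||y - x||^2 + lambda * ||D x||^2,
# obtained by setting the gradient to zero.
l2filter.exact <- function(y, lambda) {
  n <- length(y)
  D <- second.difference.matrix(n)
  solve(diag(n) + 2 * lambda * crossprod(D), y)
}
```

Note that a straight line has zero second difference, so it passes through the filter unchanged regardless of $\lambda$.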
Applying this function to the same set of test data, we get results consistent with those from optimisation.
In principle the matrix solution is much more efficient than the optimisation. However, an implementation using a dense matrix (as above) would not be feasible for a data series of any appreciable length due to memory constraints. A sparse matrix implementation does the trick though.
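One way to do this is with the Matrix package, whose bandSparse() function builds the difference matrix without storing any of the zeros (again, the wrapper function name is my own):

```r
library(Matrix)

# Sparse L2 filter: only the three non-zero diagonals of the
# second-difference matrix are stored.
l2filter.sparse <- function(y, lambda) {
  n <- length(y)
  ones <- rep(1, n - 2)
  D <- bandSparse(n - 2, n, k = 0:2,
                  diagonals = list(ones, -2 * ones, ones))
  as.numeric(solve(Diagonal(n) + 2 * lambda * crossprod(D), y))
}
```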
Again we can check that this gives the right results.
Application: S&P500 Data
So, let’s take this out for a drive in the real world. We’ll get our hands on some S&P500 data from Quandl.
Then we systematically apply the filter with a range of regularisation parameters scaling from 0.1 to 100000 in multiples of 10. The results are plotted below. In each panel the grey data reflect the raw daily values for the S&P500 index. Superimposed on top of these are the results of filtering the signal using the specified regularisation parameter. As anticipated, larger values of $\lambda$ result in a smoother curve, since the filter is more heavily penalised for curvature in the filtered signal.
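Since I can't bundle the Quandl data here, the sketch below runs the same sweep over $\lambda$ on an invented random-walk series standing in for the index; everything except the data source mirrors the procedure described above.

```r
library(Matrix)

# Sparse L2 filter, as before.
l2filter.sparse <- function(y, lambda) {
  n <- length(y)
  ones <- rep(1, n - 2)
  D <- bandSparse(n - 2, n, k = 0:2,
                  diagonals = list(ones, -2 * ones, ones))
  as.numeric(solve(Diagonal(n) + 2 * lambda * crossprod(D), y))
}

set.seed(1)
# Invented stand-in for the index series: a random walk with drift.
y <- cumsum(rnorm(1000, mean = 0.05))

lambdas <- 10^seq(-1, 5)  # 0.1, 1, 10, ..., 100000
fits <- lapply(lambdas, function(l) l2filter.sparse(y, l))

# Total squared curvature of each fit: shrinks as lambda grows.
curvature <- sapply(fits, function(x) sum(diff(x, differences = 2)^2))
```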
I think that the results look rather compelling. The only major drawback to this filter seems to be the fact that, if used dynamically, the algorithm can (and most likely will) cause previous states to change. If used, for example, as the basis for an indicator on a chart, this would cause repainting of historical values.