New paper: Learning Energy Conserving Dynamics Efficiently with Hamiltonian Gaussian Processes
I spent the summer of 2022 visiting Markus Heinonen at Aalto University in Finland. Together we worked on energy-conserving GP models, and I am happy to say the resulting paper was recently accepted to TMLR. Check out the paper, the code, or some visualizations.
We place a GP prior over the Hamiltonian and, using a set of inducing points, map function samples through Hamilton’s equations to obtain samples of the system derivatives, to which we apply an ODE solver to obtain sample trajectories.
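The second half of that pipeline can be sketched in a few lines. Here a fixed pendulum Hamiltonian stands in for a function sample drawn from the GP posterior (the actual model and its inducing-point machinery are in the paper's code); the point is just that pushing any sampled `H` through Hamilton’s equations and an off-the-shelf ODE solver yields a trajectory whose energy is approximately conserved.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Stand-in for a Hamiltonian sample from the GP posterior; here a simple
# pendulum energy, purely for illustration.
def H(q, p):
    return 0.5 * p**2 + (1.0 - np.cos(q))

def hamiltonian_field(t, z, eps=1e-6):
    """Hamilton's equations, dq/dt = dH/dp and dp/dt = -dH/dq,
    with the partials taken by central finite differences."""
    q, p = z
    dH_dp = (H(q, p + eps) - H(q, p - eps)) / (2 * eps)
    dH_dq = (H(q + eps, p) - H(q - eps, p)) / (2 * eps)
    return [dH_dp, -dH_dq]

# Integrate from an initial state (q, p) = (1, 0).
sol = solve_ivp(hamiltonian_field, (0.0, 10.0), [1.0, 0.0])
qs, ps = sol.y
energies = H(qs, ps)  # energy drift along the trajectory should be small
```

Because the vector field is built from `H` itself, any trajectory it produces inherits (approximate) energy conservation, which is exactly the inductive bias the model bakes in.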
New paper: Learning Nonparametric Volterra Kernels with Gaussian Processes
Update: This paper was published at NeurIPS 2021; check out the final version or my presentation.
This work, with Mauricio Alvarez and Mike Smith, has been the main focus of the first year of my PhD; check it out on arXiv.
The sampling process for the model described in the paper, which is non-parametric and can represent data generated by non-linear operators. A blog post explaining the key ideas is coming soon!
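To give a feel for the kind of non-linear operator involved, here is a minimal sketch of a truncated (order-2) discrete Volterra series. The kernels `G1` and `G2` are fixed random arrays purely for illustration; in the paper they are the objects given nonparametric GP treatment.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 5                            # kernel memory length (illustrative choice)
G1 = rng.normal(size=T)          # first-order (linear) Volterra kernel
G2 = rng.normal(size=(T, T))     # second-order (quadratic) Volterra kernel

def volterra_response(u):
    """Order-2 discrete Volterra series:
       y[t] = sum_i G1[i] u[t-i] + sum_{i,j} G2[i,j] u[t-i] u[t-j]."""
    y = np.zeros_like(u, dtype=float)
    for t in range(len(u)):
        window = u[max(0, t - T + 1): t + 1][::-1]   # u[t], u[t-1], ...
        k = len(window)
        y[t] = G1[:k] @ window + window @ G2[:k, :k] @ window
    return y

u = rng.normal(size=50)   # input signal
y = volterra_response(u)  # non-linear transformation of u
```

The quadratic term is what makes the input-output map a genuinely non-linear operator rather than a convolution; the first-order term alone would reduce to a standard linear system.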
Simulation is easy, probability is hard…
There are many interesting things that can be learned about probabilistic modelling from the world of trading and finance, where, perhaps due to the direct connection to very large sums of money, attitudes to problems are generally very different to those of the typical statistician or ML practitioner. Two books I have enjoyed recently in the area are The Education of a Speculator by Victor Niederhoffer and Fooled by Randomness by Nassim Nicholas Taleb.
Podcasts about ML
There are a lot of podcasts out there, including loads that are related to machine learning in some way. Over the summer I’ve worked my way through a fair few of them, so I thought I’d compile a list of my favourites. There are a couple of lists like this out there already, but most of them are out of date: they either include podcasts that are now inactive or leave out ones that have started recently.