Good to see GPs still being discussed in 2025!
Here was my attempt at a 'second' introduction a few years ago: https://maximerobeyns.com/second_intro_gps
My take is that the Rasmussen book isn't especially approachable, and that it has actually held back wider adoption of GPs.
The book has been seen as the authoritative source on the topic, so people were hesitant to write anything else. At the same time, the book borders on impenetrable.
This is the definitive reference on the topic! I also have some notes, if you want something concise that doesn't skip the math [1].
[1] https://blog.quipu-strands.com/bayesopt_1_key_ideas_GPs#gaus...
These are very cool, thanks. Do you know what kinds of jobs are more likely to require Gaussian process expertise? I have experience using GPs for surrogate modeling and will be on the job market soon.
Also, a resource I enjoyed is the book by Bobby Gramacy [0] which, among other things, spends a good bit of time on local GP approximation [1] (and has fun exercises); a toy sketch of the local-GP idea follows the links below.
[0] https://bobby.gramacy.com/surrogates/surrogates.pdf
[1] https://arxiv.org/abs/1303.0383
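To make the local-approximation idea concrete, here's a minimal Python sketch (my own toy version: it fits a small GP on the nearest neighbours of each prediction point with scikit-learn, whereas the paper in [1] builds the local design greedily by an ALC criterion, so treat this as the nearest-neighbour baseline only):

    # Toy "local GP": fit a small GP on each prediction point's nearest
    # neighbours instead of one global GP over all n points.
    # (Simplified stand-in for the ALC-based local designs in [1].)
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.neighbors import NearestNeighbors

    def local_gp_predict(X, y, X_new, n_local=50):
        nn = NearestNeighbors(n_neighbors=n_local).fit(X)
        means, stds = [], []
        for x in X_new:
            _, idx = nn.kneighbors(x[None, :])
            gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                          normalize_y=True)
            gp.fit(X[idx[0]], y[idx[0]])
            m, s = gp.predict(x[None, :], return_std=True)
            means.append(m[0]); stds.append(s[0])
        return np.array(means), np.array(stds)

    # 10k training points would be painful for one global GP (O(n^3)),
    # but each 50-point local fit is cheap.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(10_000, 2))
    y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.05 * rng.standard_normal(10_000)
    mu, sd = local_gp_predict(X, y, X_new=np.array([[0.0, 0.0], [1.0, -1.0]]))
    print(mu, sd)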
Aside from Secondmind [1] I don't know of any companies (only because I haven't looked)... But if I were looking for places with a strong research culture around GPs (I don't know if that's what you're after), I would find relevant papers on arXiv and Google Scholar and see whether any come from industry labs. If I had to guess which industries use Bayesian tools at work, I'd look at advertising and healthcare. I would also look out for places that hire econometricians.
Also thank you for the book recommendation!
[1] https://www.secondmind.ai/
Why would you learn Gaussian processes today? Is there any application where they are still leading and have not been superseded by deep neural nets?
I would argue there are more applications overall where Gaussian processes are superior, since most scientific applications have smaller data sets. Not everything has enough data to take advantage of feature learning in NNs. GPs are generally reliable, interpretable, and provide excellent uncertainty estimates for free. They can be made multiscale, achieving higher precision as a function approximator than most other methods. Plus, they can exhibit reversion to the prior when you need that.
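To make that concrete, here is a minimal scikit-learn sketch (toy data, my own parameter choices): a GP fit on a dozen points gives both a mean and a predictive standard deviation, and outside the observed range the uncertainty grows back toward the prior:

    # GP regression on a small data set: calibrated uncertainty for free,
    # and reversion to the prior away from the observations.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 6, size=(12, 1))                 # only 12 observations
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(12)

    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    X_test = np.linspace(-2, 8, 200)[:, None]           # extends beyond the data
    mean, std = gp.predict(X_test, return_std=True)

    # std is large at the left edge (outside the data, reverting to the prior)
    # and small in the middle of the observed range.
    print(std[:3].round(2), std[95:98].round(2))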
Another example is emulating the outputs of an agent-based model for sensitivity analysis.
Bayesian optimization of, say, hyperparameters is the canonical modern usage in my view, and there are other similar optimization problems where it's the preferred approach.
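For anyone who hasn't seen the loop written out, here's a compressed sketch of GP-based Bayesian optimization with an expected-improvement acquisition (the "validation loss" objective and the learning-rate grid are invented for illustration):

    # Bayesian optimisation: GP surrogate + expected improvement (EI),
    # minimising a pretend "validation loss" over a learning-rate grid.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def objective(lr):                                # stand-in for validation loss
        return (np.log10(lr) + 2.0) ** 2 + 0.1 * np.sin(5 * np.log10(lr))

    candidates = np.logspace(-5, 0, 200)[:, None]     # search space: learning rates
    X = np.array([[1e-4], [1e-1]])                    # two initial evaluations
    y = objective(X).ravel()

    for _ in range(10):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(np.log10(X), y)                        # model loss in log10(lr) space
        mu, sd = gp.predict(np.log10(candidates), return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sd, 1e-9)
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
        x_next = candidates[np.argmax(ei)]            # evaluate where EI is highest
        X = np.vstack([X, x_next[None, :]])
        y = np.append(y, objective(x_next[0]))

    print("best lr found:", X[np.argmin(y), 0])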
AFAIK the state of the art is still a mix of newer DNN and old-school techniques. Things like parameter efficiency, data efficiency, runtime performance, and understandability would factor into the decision-making process.
Stationary GPs are just stochastic linear dynamical systems (and not only those with a Matérn covariance kernel).
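To make that concrete, here's a small numerical check (the standard Matérn-3/2 state-space construction; the parameter values are arbitrary). The Matérn case is the nicest to show because its state-space form is exact and two-dimensional, and the same LDS view is what lets Kalman filtering/smoothing do GP regression on a time series in O(n):

    # Matérn-3/2 GP written as a 2-state stochastic linear dynamical system:
    #   ds = F s dt + L dw,  with state s = [f, f']
    # The top-left entry of P_inf @ expm(F*tau).T recovers the analytic kernel.
    import numpy as np
    from scipy.linalg import expm

    sigma2, ell = 1.0, 0.7                      # kernel variance and lengthscale
    lam = np.sqrt(3.0) / ell

    F = np.array([[0.0, 1.0],
                  [-lam**2, -2.0 * lam]])       # drift matrix of the SDE
    P_inf = np.diag([sigma2, sigma2 * lam**2])  # stationary state covariance

    def k_state_space(tau):
        # Cov[f(t), f(t+tau)] implied by the linear dynamical system
        return (P_inf @ expm(F * tau).T)[0, 0]

    def k_matern32(tau):
        # analytic Matérn-3/2 covariance
        return sigma2 * (1.0 + lam * tau) * np.exp(-lam * tau)

    for tau in [0.0, 0.3, 1.0, 2.5]:
        print(tau, round(k_state_space(tau), 6), round(k_matern32(tau), 6))
    # Both columns agree: the stationary GP and the LDS describe the same process.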
For the visually inclined: https://distill.pub/2019/visual-exploration-gaussian-process...
On the HN front page for 16 hours (though with strangely little discussion) just two days ago:
A Visual Exploration of Gaussian Processes (2019) - https://news.ycombinator.com/item?id=44919831 - Aug 2025 (1 comment)