The Gaussian Processes Web Site
This web site aims to provide an overview of resources
concerned with probabilistic modeling, inference and learning based on Gaussian
processes. Although Gaussian processes have a long history in the field of
statistics, they seem to have been employed extensively only in niche
areas. With the advent of kernel machines in the machine learning community,
models based on Gaussian processes have become commonplace for problems of
regression (kriging) and classification as well as a host of more specialized
applications.
Gaussian Processes for Machine Learning, Carl Edward Rasmussen and Chris Williams, the MIT Press, 2006, online version.
Statistical Interpolation of Spatial Data: Some Theory for Kriging, Michael L. Stein, Springer, 1999.
Statistics for Spatial Data (revised edition), Noel A. C. Cressie, Wiley, 1993.
Spline Models for Observational Data, Grace Wahba, SIAM, 1990.
The Bayesian Research Kitchen at The Wordsworth Hotel, Grasmere, Ambleside, Lake District, United Kingdom, 05-07 September 2008.
A tutorial entitled Advances in Gaussian Processes on Dec. 4th at NIPS 2006 in Vancouver; slides, lecture.
The Gaussian Processes in Practice workshop at Bletchley Park, U.K., June 12-13, 2006.
The Open Problems in Gaussian Processes for Machine Learning workshop at NIPS*05 in Whistler, December 10th, 2005.
The Gaussian Process Round Table meeting in Sheffield, June 9-10, 2005.
The kernel-machines web site.
Wikipedia entry on Gaussian processes.
The ai-geostats web site for spatial statistics and geostatistics.
The Bibliography of Gaussian Process Models in Dynamic Systems Modelling web site maintained by Juš Kocijan.
Andreas Geiger has written a simple Gaussian process regression Java applet, illustrating the behaviour of covariance functions and hyperparameters.
package | title | author | implementation | description
bcm | The Bayesian Committee Machine | Anton Schwaighofer | matlab and NETLAB | An extension of the NETLAB implementation for GP regression. It allows large-scale regression based on the BCM approximation; see also the accompanying paper.
fbm | Software for Flexible Bayesian Modeling | Radford M. Neal | C for linux/unix | An extensive and well-documented package implementing Markov chain Monte Carlo methods for Bayesian inference in neural networks, Gaussian processes (regression, binary and multi-class classification), mixture models and Dirichlet diffusion trees.
gp-lvm and fgp-lvm | A (fast) implementation of Gaussian Process Latent Variable Models | Neil D. Lawrence | matlab and C |
gpml | Code from the Rasmussen and Williams book Gaussian Processes for Machine Learning | Carl Edward Rasmussen and Hannes Nickisch | matlab and octave | The GPML toolbox implements approximate inference algorithms for Gaussian processes such as Expectation Propagation, the Laplace approximation and Variational Bayes for a wide class of likelihood functions for both regression and classification. It comes with a large algebra of covariance and mean functions allowing for flexible modeling. The code is fully compatible with Octave 3.2.x. See the JMLR paper describing the toolbox.
c++-ivm | Sparse approximations based on the Informative Vector Machine | Neil D. Lawrence | C++ | IVM software in C++; also includes the null category noise model for semi-supervised learning.
BFD | Bayesian Fisher's Discriminant software | Tonatiuh Peña Centeno | matlab | Implements a Gaussian process interpretation of kernel Fisher's discriminant.
gpor | Gaussian Processes for Ordinal Regression | Wei Chu | C for linux/unix | Software implementation of Gaussian Processes for Ordinal Regression. Provides the Laplace approximation, Expectation Propagation and the variational lower bound.
MCMCstuff | MCMC Methods for MLP and GP and Stuff | Aki Vehtari | matlab and C | A collection of matlab functions for Bayesian inference with Markov chain Monte Carlo (MCMC) methods. The purpose of this toolbox was to port some of the features of fbm to matlab for easier development by matlab users.
ogp | Sparse Online Gaussian Processes | Lehel Csató | matlab and NETLAB | Approximate online learning in sparse Gaussian process models for regression (including several non-Gaussian likelihood functions) and classification.
sogp | Sparse Online Gaussian Process C++ Library | Dan Grollman | C++ | Sparse online Gaussian process C++ library based on the PhD thesis of Lehel Csató.
spgp .tgz or .zip | Sparse Pseudo-input Gaussian Processes | Ed Snelson | matlab | Implements sparse GP regression as described in Sparse Gaussian Processes using Pseudo-inputs and Flexible and efficient Gaussian process models for machine learning. The SPGP uses gradient-based marginal likelihood optimization to find suitable basis points and kernel hyperparameters in a single joint optimization.
tgp | Treed Gaussian Processes | Robert B. Gramacy | C/C++ for R | Bayesian nonparametric and nonstationary regression by treed Gaussian processes with jumps to the limiting linear model (LLM). Special cases also implemented include Bayesian linear models, linear CART, and stationary separable and isotropic Gaussian process regression. Includes 1-d and 2-d plotting functions (with higher-dimension projection and slice capabilities) and tree drawing, designed for visualization of tgp-class output. See also Gramacy 2007.
Tpros | Gaussian Process Regression | David MacKay and Mark Gibbs | C | Tpros is the Gaussian Process program written by Mark Gibbs and David MacKay.
GP Demo | Octave demonstration of Gaussian process interpolation | David MacKay | octave | This demo works with octave-2.0 but did not work with 2.1.33.
GPClass | Matlab code for Gaussian Process Classification | David Barber and C. K. I. Williams | matlab | Implements Laplace's approximation as described in Bayesian Classification with Gaussian Processes for binary and multiclass classification.
VBGP | Variational Bayesian Multinomial Probit Regression with Gaussian Process Priors | Mark Girolami and Simon Rogers | matlab | Implements a variational approximation for Gaussian process based multiclass classification, as described in the paper Variational Bayesian Multinomial Probit Regression.
pyGPs | Gaussian Processes for Regression and Classification | Marion Neumann | Python | pyGPs is a library containing an object-oriented python implementation for Gaussian Process (GP) regression and classification. See also the github repository.
gaussian-process | Gaussian process regression | Anand Patil | Python | Under development.
gptk | Gaussian Process Tool-Kit | Alfredo Kalaitzis | R | The gptk package implements a general-purpose toolkit for Gaussian process regression with an RBF covariance function. Based on a MATLAB implementation written by Neil D. Lawrence.
Other software that may be useful for implementing Gaussian process models:
Below is a collection of papers relevant to learning in Gaussian process
models. The papers are ordered by topic, with occasional papers
occurring under multiple headings.
Tutorials
Several papers provide tutorial material suitable for a first introduction to
learning in Gaussian process models. These range from the very short [Williams 2002], through the intermediate [MacKay 1998] and [Williams 1999],
to the more elaborate [Rasmussen and Williams
2006]. All of these require only a minimum of prerequisites in the form of
elementary probability theory and linear algebra.
Regression
The simplest uses of Gaussian process models are for (the conjugate case of)
regression with Gaussian noise. See the approximation
section for papers which deal specifically with sparse or fast approximation
techniques. O'Hagan 1978 represents an early reference
from the statistics community for the use of a Gaussian process as a prior over
functions, an idea introduced to the machine learning community only later,
by Williams and Rasmussen 1996.
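To make the conjugate case concrete, the posterior mean and variance have closed forms; below is a minimal numpy sketch of the standard Cholesky-based computation (function and variable names are illustrative, not taken from any of the packages listed above):

```python
import numpy as np

def sq_exp(A, B, lengthscale=1.0, signal_var=1.0):
    """Squared-exponential covariance between row-wise inputs A (n x d) and B (m x d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return signal_var * np.exp(-0.5 * d2 / lengthscale**2)

def gp_regression(X, y, Xstar, noise_var=0.1):
    """Exact GP posterior mean and variance of the latent function at Xstar."""
    K = sq_exp(X, X)
    L = np.linalg.cholesky(K + noise_var * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # (K + noise_var*I)^{-1} y
    Ks = sq_exp(X, Xstar)                                 # n x n* cross-covariances
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = sq_exp(Xstar, Xstar).diagonal() - (v * v).sum(0)
    return mean, var                                      # add noise_var to var for new observations

# Toy usage: noisy observations of a sine.
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(20)
mean, var = gp_regression(X, y, np.linspace(0, 5, 100)[:, None])
```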
Classification
Exact inference in Gaussian process models for classification is not tractable.
Several approximation schemes have been suggested, including Laplace's method,
variational approximations, mean field methods, Markov chain Monte Carlo and
Expectation Propagation. See also the approximation
section. Multi-class classification may be treated explicitly, or decomposed
into multiple, binary (one against the rest) problems. For introductions, see
for example Williams and Barber 1998 or Kuss and Rasmussen 2005. Bounds from the
PAC-Bayesian perspective are applied in Seeger 2002.
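As an illustration of one of these schemes, the Laplace approximation locates the mode of the posterior over latent function values by Newton iteration. The following numpy sketch uses a logistic likelihood with labels in {-1, +1}; it is a minimal sketch of the general recipe with illustrative names, not code from any package above:

```python
import numpy as np

def laplace_mode(K, y, n_iter=20):
    """Newton iteration for the Laplace approximation in binary GP classification.

    K: n x n covariance matrix; y: labels in {-1, +1}.
    Returns the mode f of p(f | y) under a logistic likelihood.
    """
    f = np.zeros(len(y))
    for _ in range(n_iter):
        pi = 1.0 / (1.0 + np.exp(-f))       # sigmoid(f)
        W = pi * (1.0 - pi)                 # negative Hessian of log p(y | f)
        sW = np.sqrt(W)
        B = np.eye(len(y)) + sW[:, None] * K * sW[None, :]
        L = np.linalg.cholesky(B)
        grad = (y + 1) / 2 - pi             # gradient of log p(y | f)
        b = W * f + grad
        a = b - sW * np.linalg.solve(L.T, np.linalg.solve(L, sW * (K @ b)))
        f = K @ a                           # Newton update
    return f
```

At convergence f is the posterior mode; the Gaussian approximation centred there then yields approximate predictive probabilities.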
Covariance Functions and Properties of Gaussian Processes
The properties of Gaussian processes are controlled by the (mean function and)
covariance function. Some references here describe different covariance
functions, while others give mathematical characterizations; see e.g. Abrahamsen 1997 for a review. Some references
describe non-standard covariance functions leading to non-stationarity etc.
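For orientation, two standard stationary examples (textbook forms, with signal variance $\sigma_f^2$ and lengthscale $\ell$) are the squared exponential and the Matérn covariance with $\nu = 3/2$:

```latex
k_{\mathrm{SE}}(r) = \sigma_f^2 \exp\!\left(-\frac{r^2}{2\ell^2}\right),
\qquad
k_{\nu=3/2}(r) = \sigma_f^2 \left(1 + \frac{\sqrt{3}\,r}{\ell}\right)
                 \exp\!\left(-\frac{\sqrt{3}\,r}{\ell}\right),
\qquad r = \lVert x - x' \rVert .
```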
Model Selection
Approximations
There are two main reasons for making approximations in Gaussian process models:
either because of analytical intractability, as arises in classification
and regression with non-Gaussian noise, or in order to gain a computational
advantage when using large datasets, through the use of sparse
approximations. Some methods address both issues simultaneously. The
approximation methods and approximate inference algorithms are quite diverse,
see Quiñonero-Candela and
Rasmussen 2005 for a unifying framework for sparse approximations in the
Gaussian regression model.
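To make the computational gain concrete, here is a minimal numpy sketch of the classical subset-of-regressors predictive mean, one of the sparse schemes covered by the unifying framework; the names and the fixed grid of inducing inputs are illustrative choices, not a specific paper's method:

```python
import numpy as np

def sq_exp(A, B, lengthscale=1.0):
    """Squared-exponential covariance between row-wise inputs A (n x d) and B (m x d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def sor_mean(X, y, Xstar, Xm, noise_var=0.1):
    """Subset-of-regressors predictive mean using m inducing inputs Xm.

    Replaces the O(n^3) exact solve with O(n m^2) low-rank algebra.
    """
    Kmn = sq_exp(Xm, X)                    # m x n
    Kmm = sq_exp(Xm, Xm)                   # m x m
    Ksm = sq_exp(Xstar, Xm)                # n* x m
    A = noise_var * Kmm + Kmn @ Kmn.T      # m x m system instead of n x n
    A += 1e-8 * np.eye(len(Xm))            # jitter for numerical stability
    return Ksm @ np.linalg.solve(A, Kmn @ y)

# Toy usage: 2000 training points handled through 20 inducing inputs.
X = np.random.uniform(0, 10, (2000, 1))
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(2000)
Xm = np.linspace(0, 10, 20)[:, None]
mu = sor_mean(X, y, np.linspace(0, 10, 200)[:, None], Xm)
```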
References from the Statistics Community
Gaussian processes have a long history in the statistics community. They have
been particularly well developed in geostatistics under the name of
kriging. These papers are grouped here because they are written in a
common terminology and have a slightly different focus from typical machine
learning papers.
Consistency, Learning Curves and Bounds
The papers in this section give theoretical results on learning
curves, which describe the expected generalization performance as a
function of the number of training cases. Consistency addresses the question
of whether the solution approaches the true data-generating process in the limit
of infinitely many training examples.
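As a concrete anchor for these results: for GP regression with a correctly specified prior, the expected squared error at a test point equals the posterior variance there, so the learning curve is commonly written as the posterior variance averaged over test inputs and training sets,

```latex
\epsilon(n) = \mathbb{E}_{X_n}\!\left[\int \sigma_n^2(x_*)\, p(x_*)\, dx_*\right],
```

where $\sigma_n^2(x_*)$ is the predictive variance after $n$ training inputs $X_n$ and $p(x_*)$ is the test input density.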
Reproducing Kernel Hilbert Spaces
Reinforcement Learning
Gaussian Process Latent Variable Models (GP-LVM)
Applications
Other Topics
This section contains a very diverse collection of other uses of inference
in Gaussian processes, which don't fit well in any of the above categories.