Gaussian Processes (GPs) can conveniently be used for Bayesian supervised learning, such as regression and classification. In its simplest form, GP inference can be implemented in a few lines of code. However, in practice, things typically get a little more complicated: you might want to use expressive covariance and mean functions, learn good values for hyperparameters, use non-Gaussian likelihood functions (rendering exact inference intractable), use approximate inference algorithms, or combinations of many or all of the above.
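To make "a few lines of code" concrete, here is a minimal NumPy sketch of exact GP regression with an RBF covariance — written from scratch for illustration, independent of pyGPs' own API:

```python
import numpy as np

def rbf(a, b, length_scale=1.0, signal_var=1.0):
    """Squared-exponential (RBF) covariance between 1-D input arrays a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def gp_regression(x, y, x_star, noise_var=0.1):
    """Exact GP posterior mean and variance at test inputs x_star."""
    K = rbf(x, x) + noise_var * np.eye(len(x))   # training covariance plus noise
    L = np.linalg.cholesky(K)                    # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    K_s = rbf(x, x_star)
    mean = K_s.T @ alpha                         # posterior mean
    v = np.linalg.solve(L, K_s)
    var = rbf(x_star, x_star).diagonal() - np.sum(v**2, axis=0)  # posterior variance
    return mean, var

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(x)
mu, var = gp_regression(x, y, np.array([0.5]))
```

This is the Gaussian-likelihood case, where the posterior is available in closed form; it is precisely when the likelihood is non-Gaussian that approximate inference methods such as EP or Laplace become necessary.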
A comprehensive introduction is provided in the book *Gaussian Processes for Machine Learning* (GPML) by Rasmussen and Williams (2006).
The table below lists the functionality implemented in pyGPs; note that only certain combinations of inference methods and likelihood functions are possible.
|              | Covariance         | Mean        | Likelihood                | Inference          | Optimizer |
|--------------|--------------------|-------------|---------------------------|--------------------|-----------|
|              | Linear             | Linear      | Cumulative Gaussian (Erf) | EP                 | CG        |
| Composite    | Sum (+)            | Sum (+)     |                           |                    |           |
|              | Product (*)        | Product (*) |                           |                    |           |
|              | Scale (*)          | Scale (*)   |                           |                    |           |
| Sparse GP    |                    |             |                           | FITC Exact         |           |
|              |                    |             |                           | Pseudo Inv Laplace |           |
| Graph kernel | p-step Random Walk |             |                           |                    |           |
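The composite entries (Sum, Product, Scale) build new valid covariance functions out of simpler ones. A NumPy sketch of the underlying idea — not the pyGPs API, which wraps covariances in objects and composes them for you:

```python
import numpy as np

def linear_k(a, b):
    """Linear covariance k(x, x') = x * x'."""
    return np.outer(a, b)

def rbf_k(a, b, ell=1.0):
    """RBF covariance with length scale ell."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

x = np.linspace(-1, 1, 4)

K_sum = linear_k(x, x) + rbf_k(x, x)    # Sum (+): sum of kernels is a kernel
K_prod = linear_k(x, x) * rbf_k(x, x)   # Product (*): elementwise (Schur) product is a kernel
K_scaled = 2.0 * rbf_k(x, x)            # Scale (*): positive scalar times a kernel is a kernel

# each combination remains symmetric and positive semi-definite
for K in (K_sum, K_prod, K_scaled):
    assert np.allclose(K, K.T)
    assert np.all(np.linalg.eigvalsh(K) > -1e-9)
```

The same closure properties hold for mean functions, which is why Sum, Product, and Scale appear in both the Covariance and Mean columns.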
pyGPs also provides cross-validation and several built-in evaluation measures.
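As a generic illustration of what k-fold cross-validation with an evaluation measure looks like — a NumPy sketch of the concept, not pyGPs' own validation module:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error, a common regression evaluation measure."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def k_fold_indices(n, k):
    """Yield (train, test) index splits for k-fold cross-validation."""
    folds = np.array_split(np.arange(n), k)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

x = np.linspace(0, 1, 10)
y = 2 * x

scores = []
for train, test in k_fold_indices(len(x), k=5):
    # toy "model": predict the training-set mean (a real run would fit a GP here)
    pred = np.full(len(test), y[train].mean())
    scores.append(rmse(y[test], pred))
```

Each fold is held out once, the model is fit on the remainder, and the per-fold scores are averaged to estimate generalization performance.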