Likelihood Design
See models.ipynb (2/2); another option is to use heteroscedastic noise.
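As a minimal illustration of what heteroscedastic noise means for a Gaussian likelihood (a plain-NumPy sketch, not the GPflow API; the function name is hypothetical), each observation gets its own noise variance rather than one shared value:

```python
import numpy as np

def heteroscedastic_gaussian_log_density(y, f, noise_var):
    """Log density of y under N(f, noise_var), where noise_var may be
    a separate value per observation (heteroscedastic noise)."""
    y, f, noise_var = np.broadcast_arrays(y, f, noise_var)
    return -0.5 * (np.log(2 * np.pi * noise_var) + (y - f) ** 2 / noise_var)

# Two points with identical residuals but different noise variances
# receive different log densities, unlike the homoscedastic case.
y = np.array([0.1, 0.1])
f = np.array([0.0, 0.0])
noise_var = np.array([0.01, 1.0])
print(heteroscedastic_gaussian_log_density(y, f, noise_var))
```

In a homoscedastic model, `noise_var` would be a single scalar shared by all points; making it input-dependent is what the heteroscedastic option refers to.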