```@meta
CurrentModule = KernelFunctions
```
## [Base Kernels](@id base_kernels)

These are the basic kernels without any transformation of the data. They are the building blocks of KernelFunctions.
```@docs
ZeroKernel
ConstantKernel
WhiteKernel
EyeKernel
CosineKernel
ExponentialKernel
GibbsKernel
LaplacianKernel
SqExponentialKernel
SEKernel
GaussianKernel
RBFKernel
GammaExponentialKernel
ExponentiatedKernel
FBMKernel
gaborkernel
MaternKernel
Matern12Kernel
Matern32Kernel
Matern52Kernel
NeuralNetworkKernel
PeriodicKernel
PeriodicKernel(::DataType, ::Int)
PiecewisePolynomialKernel
LinearKernel
PolynomialKernel
RationalKernel
RationalQuadraticKernel
GammaRationalKernel
spectral_mixture_kernel
spectral_mixture_product_kernel
WienerKernel
```
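As a quick illustration (a minimal sketch, assuming KernelFunctions is installed), a base kernel is a callable object: evaluating it on a pair of inputs returns their covariance, and `kernelmatrix` evaluates it over a whole collection of inputs.

```julia
using KernelFunctions

# Construct a base kernel and evaluate it on a pair of inputs.
k = SqExponentialKernel()
k(0.0, 1.0)  # exp(-‖x - x′‖² / 2) = exp(-1/2) ≈ 0.61

# Build the kernel (Gram) matrix over a collection of inputs.
x = range(0.0, 1.0; length=5)
K = kernelmatrix(k, x)  # 5×5 positive semi-definite matrix
```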
## Composite Kernels

The modular design of KernelFunctions uses [base kernels](@ref base_kernels) as building blocks for more complex kernels. There are a variety of composite kernels implemented, including those which [transform the inputs](@ref input_transforms) to a wrapped kernel to implement length scales, scale the variance of a kernel, and sum or multiply collections of kernels together.
```@docs
TransformedKernel
∘(::Kernel, ::Transform)
ScaledKernel
KernelSum
KernelProduct
KernelTensorProduct
NormalizedKernel
```
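The composition patterns above can be sketched as follows (a minimal example, assuming KernelFunctions is installed):

```julia
using KernelFunctions

# Transform the inputs of a wrapped kernel (here x ↦ 2x), giving a TransformedKernel.
k1 = SqExponentialKernel() ∘ ScaleTransform(2.0)

# Scale the variance of a kernel, giving a ScaledKernel.
k2 = 3.0 * Matern32Kernel()

# Convenience constructor for a kernel with a length scale.
k3 = with_lengthscale(SqExponentialKernel(), 0.5)

# Sum and multiply kernels, giving a KernelSum and a KernelProduct.
ksum = k1 + k2
kprod = k1 * k3
```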
## Multi-output Kernels

KernelFunctions implements multi-output kernels as scalar kernels on an extended output domain. For more details, see [the section on inputs for multi-output GPs](@ref Inputs-for-Multiple-Outputs).
For a function ``f(x) \rightarrow y``, denote the inputs as ``x, x'``, such that we compute the covariance between output components ``y_{p}`` and ``y_{p'}``. The total number of outputs is ``m``.
```@docs
MOKernel
IndependentMOKernel
LatentFactorMOKernel
IntrinsicCoregionMOKernel
LinearMixingModelKernel
```
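A short sketch of the extended-domain approach (assumes KernelFunctions is installed): each extended input pairs a data point `x` with an output index `p`, and a multi-output kernel is then an ordinary scalar kernel over these pairs.

```julia
using KernelFunctions

# Four inputs, two outputs: MOInput wraps x into 4 × 2 = 8 extended
# inputs of the form (x, p), where p is the output index.
x = range(0.0, 1.0; length=4)
xMO = MOInput(x, 2)

# Treat the outputs as independent, each governed by the same base kernel.
k = IndependentMOKernel(SqExponentialKernel())

# 8×8 covariance matrix over all (input, output) pairs.
K = kernelmatrix(k, xMO)
```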