gammy.models.bayespy.GAM

class GAM(formula, tau=None, theta=None)
Bases: object

Generalized additive model with BayesPy backend
Currently tau is fixed to a Gamma distribution, i.e., it is not possible to define the noise level manually. Note, though, that one can set tight values for α and β in Gamma(α, β), recalling that mean = α / β and variance = α / β². The upside is that by estimating the noise level one gets a prediction uncertainty estimate.
The Gaussian requirement is currently deeply built in: tau being Gamma implies, by conjugacy, that theta must be Gaussian.
Scalar-valued Gaussian random variables are not supported. They could be implemented using GaussianARD, but that would require a lot of refactoring for such a small feature; after all, one can define an auxiliary bias term with a very tight prior.
TODO: Statistics for basis function evaluations at grid points.
FIXME: BayesPy fit fails with the following example:
    gammy.bayespy.GAM(gammy.Scalar()).fit(np.array([0]), np.array([1]))
Parameters
- formula : gammy.formulae.Formula
  Formula object containing the terms and prior
- theta : bp.nodes.Gaussian
  Model parameters vector
- tau : bp.nodes.Gamma
  Observation noise precision (inverse variance)
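
A minimal construction sketch. It assumes the top-level formula helper gammy.Scalar (used in the FIXME above) and the array mapper gammy.arraymapper.x, and that terms are combined with + and *; the simulated data and term choices are illustrative only:

    import numpy as np

    import gammy
    from gammy.arraymapper import x
    from gammy.models.bayespy import GAM

    # Simulated data: a noisy quadratic trend (illustrative only)
    input_data = np.linspace(0.01, 1, 50)
    y = 1.3 - 2.0 * input_data + 5.0 * input_data ** 2 + 0.1 * np.random.randn(50)

    # Formula: bias + slope * x + curvature * x**2
    formula = gammy.Scalar() + gammy.Scalar() * x + gammy.Scalar() * x ** 2

    # The constructor builds the theta (Gaussian) and tau (Gamma) nodes from the formula
    model = GAM(formula)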

property covariance_theta
Covariance estimate of model parameters

fit(input_data, y, repeat=1000, verbose=False, **kwargs) → gammy.models.bayespy.GAM
Update the BayesPy nodes and construct a GAM predictor.

WARNING: Currently mutates the original object's theta and tau. One way to "reset" the original object to its prior is the initialize_from_prior() method of the BayesPy nodes, as in the sketch below.

Parameters
- input_data : np.ndarray
  Input data
- y : np.ndarray
  Observations
- repeat : int
  Maximum number of update iterations allowed in BayesPy variational Bayes learning
- verbose : bool
  Enable BayesPy logging
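
A fitting sketch continuing the construction example above; the reset at the end relies on the BayesPy node method initialize_from_prior() mentioned in the warning:

    # fit mutates model.theta and model.tau in place and returns the model
    model = model.fit(input_data, y, repeat=1000)

    # Posterior summaries become available after fitting
    print(model.mean_theta)    # mean estimate of the model parameters
    print(model.inv_mean_tau)  # additive observation noise variance estimate

    # "Reset" the model to its prior by re-initializing the BayesPy nodes
    model.theta.initialize_from_prior()
    model.tau.initialize_from_prior()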

formula
Model formula

property inv_mean_tau
Additive observation noise variance estimate

load(filepath: str, **kwargs) → gammy.models.bayespy.GAM
Load model from a file on disk

marginal_residual(input_data, y, i: int) → numpy.ndarray
Calculate the marginal residual for a given term

Parameters
- input_data : np.ndarray

marginal_residuals(input_data, y) → List[numpy.ndarray]
Marginal (partial) residuals

Parameters
- input_data : np.ndarray
  Input data
- y : np.ndarray
  Observations
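
For example, one might inspect the per-term residuals of a fitted model (a sketch; there is one residual array per additive term):

    # One residual array per additive term, useful for term-wise diagnostics
    residuals = model.marginal_residuals(input_data, y)
    for i, r in enumerate(residuals):
        print(f"term {i}: residual std {r.std():.3f}")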

property mean_theta
Mean estimate of model parameters

Posterior if the model is fitted, otherwise prior.

predict(input_data) → numpy.ndarray
Calculate the mean of the posterior predictive at the inputs

Parameters
- input_data : np.ndarray
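
A minimal prediction sketch on a dense grid of new inputs, continuing the example above:

    # Mean of the posterior predictive distribution at new inputs
    x_new = np.linspace(0.01, 1, 200)
    y_mean = model.predict(x_new)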

predict_marginal(input_data, i: int) → numpy.ndarray
Predict a term separately

Parameters
- input_data : np.ndarray

predict_marginals(input_data) → List[numpy.ndarray]
Predict all terms separately

Parameters
- input_data : np.ndarray
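
A sketch of splitting the prediction into per-term contributions (one array per additive term):

    # Predicted contribution of each additive term at the new inputs
    term_predictions = model.predict_marginals(x_new)
    for i, contribution in enumerate(term_predictions):
        print(f"term {i}: contribution range [{contribution.min():.2f}, {contribution.max():.2f}]")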

predict_variance(input_data) → Tuple[numpy.ndarray]
Predict mean and variance

Parameters
- input_data : np.ndarray
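
A sketch of turning the predicted mean and variance into an approximate uncertainty band; the two-sigma rule is an illustrative choice, not part of the API:

    # Posterior predictive mean and variance at the new inputs
    mean, variance = model.predict_variance(x_new)

    # Approximate band from a two-sigma rule
    lower = mean - 2 * np.sqrt(variance)
    upper = mean + 2 * np.sqrt(variance)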

predict_variance_marginal(input_data, i: int) → Tuple[numpy.ndarray]
Predict the mean and variance of a given term's marginal distribution

Parameters
- input_data : np.ndarray

predict_variance_marginals(input_data) → List[Tuple[numpy.ndarray]]
Predict the (theta-based) variance for the marginal parameter distributions

NOTE: Analogous to self.predict_variance_theta but for the marginal distributions. Adding observation noise does not make sense, as we don't know how it is split among the model terms.

Parameters
- input_data : np.ndarray
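
A sketch of per-term uncertainty, assuming each list element is a (mean, variance) pair; as noted above, these variances reflect parameter uncertainty only, without observation noise:

    # Per-term means and variances from the marginal parameter distributions
    marginals = model.predict_variance_marginals(x_new)
    for i, (m, v) in enumerate(marginals):
        print(f"term {i}: max posterior std {np.sqrt(v).max():.3f}")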

predict_variance_theta(input_data) → Tuple[numpy.ndarray]
Predict observations with variance from model parameters

Parameters
- input_data : np.ndarray

tau
Node for additive noise precision

theta
Node for model parameters

theta_marginal(i: int) → bayespy.inference.vmp.nodes.gaussian.Gaussian
Extract the marginal distribution for a specific term

property theta_marginals
Nodes for the basis-specific marginal distributions