sklearn.gaussian_process.kernels.Exponentiation

class sklearn.gaussian_process.kernels.Exponentiation(kernel, exponent)
Exponentiate a kernel by a given exponent.
The resulting kernel is defined as k_exp(X, Y) = k(X, Y) ** exponent
New in version 0.18.
Parameters
- kernel : Kernel object
The base kernel
- exponent : float
The exponent for the base kernel
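A minimal usage sketch (the RBF base kernel below is chosen purely for illustration): the composite kernel can be constructed directly, or via the ** operator defined on kernel objects.

import numpy as np
from sklearn.gaussian_process.kernels import Exponentiation, RBF

# Square an RBF kernel: k_exp(X, Y) = rbf(X, Y) ** 2
base = RBF(length_scale=1.0)
kernel = Exponentiation(base, exponent=2)

# The ** operator on kernel objects builds the same composite kernel
kernel_alt = base ** 2

Either form can then be passed as the kernel argument of GaussianProcessRegressor or GaussianProcessClassifier.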
Attributes
- bounds
Returns the log-transformed bounds on theta.
- hyperparameters
Returns a list of all hyperparameters.
- n_dims
Returns the number of non-fixed hyperparameters of the kernel.
- theta
Returns the (flattened, log-transformed) non-fixed hyperparameters.
Methods
- __call__(self, X[, Y, eval_gradient])
Return the kernel k(X, Y) and optionally its gradient.
- clone_with_theta(self, theta)
Returns a clone of self with given hyperparameters theta.
- diag(self, X)
Returns the diagonal of the kernel k(X, X).
- get_params(self[, deep])
Get parameters of this kernel.
- is_stationary(self)
Returns whether the kernel is stationary.
- set_params(self, **params)
Set the parameters of this kernel.
__init__(self, kernel, exponent)
Initialize self. See help(type(self)) for accurate signature.
__call__(self, X, Y=None, eval_gradient=False)
Return the kernel k(X, Y) and optionally its gradient.
Parameters
- X : array, shape (n_samples_X, n_features)
Left argument of the returned kernel k(X, Y)
- Y : array, shape (n_samples_Y, n_features), optional (default=None)
Right argument of the returned kernel k(X, Y). If None, k(X, X) is evaluated instead.
- eval_gradient : bool, optional (default=False)
Determines whether the gradient with respect to the kernel hyperparameters is computed. Only supported when Y is None.
Returns
- K : array, shape (n_samples_X, n_samples_Y)
Kernel k(X, Y)
- K_gradient : array (optional), shape (n_samples_X, n_samples_X, n_dims)
The gradient of the kernel k(X, X) with respect to the hyperparameters of the kernel. Only returned when eval_gradient is True.
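A short sketch of calling a composed kernel (the RBF base kernel and the toy data are illustrative assumptions):

import numpy as np
from sklearn.gaussian_process.kernels import Exponentiation, RBF

kernel = Exponentiation(RBF(length_scale=1.0), exponent=2)
X = np.random.RandomState(0).rand(5, 3)    # 5 samples, 3 features (toy data)

K_cross = kernel(X, X[:2])                 # cross-kernel k(X, Y), shape (5, 2)
K, K_grad = kernel(X, eval_gradient=True)  # gradient requires Y=None
# K has shape (5, 5); K_grad has shape (5, 5, kernel.n_dims)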
property bounds
Returns the log-transformed bounds on theta.
Returns
- bounds : array, shape (n_dims, 2)
The log-transformed bounds on the kernel’s hyperparameters theta
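For instance (again assuming an RBF base kernel for illustration), the bounds live on the log scale of the base kernel's hyperparameters:

import numpy as np
from sklearn.gaussian_process.kernels import Exponentiation, RBF

kernel = Exponentiation(RBF(length_scale=1.0), exponent=2)
log_bounds = kernel.bounds        # shape (kernel.n_dims, 2), log-transformed
orig_bounds = np.exp(log_bounds)  # bounds on the original (non-log) scale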
clone_with_theta(self, theta)
Returns a clone of self with given hyperparameters theta.
Parameters
- theta : array, shape (n_dims,)
The hyperparameters
diag(self, X)
Returns the diagonal of the kernel k(X, X).
The result of this method is identical to np.diag(self(X)); however, it can be evaluated more efficiently since only the diagonal is evaluated.
Parameters
- X : array, shape (n_samples_X, n_features)
Left argument of the returned kernel k(X, Y)
Returns
- K_diag : array, shape (n_samples_X,)
Diagonal of kernel k(X, X)
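A small sketch of the equivalence noted above (the RBF base kernel and toy data are assumptions made for illustration):

import numpy as np
from sklearn.gaussian_process.kernels import Exponentiation, RBF

kernel = Exponentiation(RBF(length_scale=1.0), exponent=2)
X = np.random.RandomState(0).rand(4, 2)

K_diag = kernel.diag(X)                         # shape (4,), no full matrix built
assert np.allclose(K_diag, np.diag(kernel(X)))  # matches the naive computation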
get_params(self, deep=True)
Get parameters of this kernel.
Parameters
- deep : boolean, optional
If True, will return the parameters for this estimator and contained subobjects that are estimators.
Returns
- params : mapping of string to any
Parameter names mapped to their values.
property hyperparameters
Returns a list of all hyperparameters.

property n_dims
Returns the number of non-fixed hyperparameters of the kernel.
set_params(self, **params)
Set the parameters of this kernel.
The method works on simple kernels as well as on nested kernels. The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.
Returns
- self
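A sketch of the nested-parameter naming (the RBF base kernel and the chosen values are illustrative):

from sklearn.gaussian_process.kernels import Exponentiation, RBF

kernel = Exponentiation(RBF(length_scale=1.0), exponent=2)
kernel.get_params()   # includes nested keys such as 'kernel__length_scale'

# Update the exponent directly and the base kernel via <component>__<parameter>
kernel.set_params(exponent=3, kernel__length_scale=0.5)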
property theta
Returns the (flattened, log-transformed) non-fixed hyperparameters.
Note that theta typically contains the log-transformed values of the kernel’s hyperparameters, as this representation of the search space is more amenable to hyperparameter search; hyperparameters such as length-scales naturally live on a log scale.
Returns
- theta : array, shape (n_dims,)
The non-fixed, log-transformed hyperparameters of the kernel
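For instance (assuming an RBF base kernel for illustration), theta holds the log of the base kernel's length-scale:

import numpy as np
from sklearn.gaussian_process.kernels import Exponentiation, RBF

kernel = Exponentiation(RBF(length_scale=2.0), exponent=2)
log_theta = kernel.theta        # array([log(2.0)]): the base kernel's log length-scale
orig_theta = np.exp(log_theta)  # back on the original scale: array([2.0])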