
Implement additional kernels #6

Closed
edaub opened this issue Apr 23, 2019 · 3 comments

edaub commented Apr 23, 2019

The code could be extended to use kernels other than the squared exponential.


edaub commented Jul 19, 2019

Made some significant progress on this front (in the kernel branch). I wrote separate functions in Kernels.py for the squared exponential and Matérn 5/2 kernels and their gradient calculations. As of right now (19 July 2019), the code still only uses the squared exponential kernel in the default GP implementation, but the kernel computations are done in separate functions, so it should be straightforward to modify the GP class to offer a choice of kernel at initialization (or the option to change it afterwards). However, we need to decide on an interface for setting this choice. There are a few ways to do it, but the interface for creating a GP is getting a bit clunky and could use some re-work anyway, so I plan to tackle this then.
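
As a rough illustration of what one of these standalone functions might look like (the name and hyperparameter conventions here are assumed for the sketch, not the actual Kernels.py API):

```python
import numpy as np

def squared_exponential(x1, x2, params):
    """Hypothetical standalone squared exponential kernel.

    Assumes params = (corr_length, variance); the actual Kernels.py
    signatures and hyperparameter conventions may differ.
    """
    x1, x2 = np.atleast_2d(x1), np.atleast_2d(x2)
    corr_length, variance = params
    # squared Euclidean distance between every pair of input points
    r2 = np.sum((x1[:, np.newaxis, :] - x2[np.newaxis, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * r2 / corr_length ** 2)
```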

For the moment, you can use other kernels by importing the desired kernel functions from Kernels.py and then manually changing the kernel_f and kernel_deriv attributes of the initialized GaussianProcess object. The individual kernel functions are covered by a full suite of unit tests; however, I have not yet written unit tests for the Matérn kernel as used within the GP class.
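
Concretely, the manual swap might look like this (the Matérn function names, import paths, and constructor signature are assumed for illustration):

```python
import numpy as np
from Kernels import matern_5_2, matern_5_2_deriv  # hypothetical function names
from GaussianProcess import GaussianProcess       # assumed import path

# assumed constructor: training inputs and targets
gp = GaussianProcess(np.random.rand(20, 3), np.random.rand(20))

# manually swap in the Matérn 5/2 kernel and its gradient
gp.kernel_f = matern_5_2
gp.kernel_deriv = matern_5_2_deriv
```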

More broadly, we need to decide on a uniform way of handling kernel functions, as these are likely to be used in multiple parts of the implementation. One option would be to pair up the main kernel function and its derivative into a single object, though there may be other considerations.
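
A minimal sketch of what such a paired object could look like, purely as a design idea:

```python
class KernelPair:
    """Hypothetical container bundling a kernel with its derivative."""

    def __init__(self, kernel_f, kernel_deriv):
        self.kernel_f = kernel_f          # K(x1, x2, params) -> (n, n) array
        self.kernel_deriv = kernel_deriv  # dK/dtheta -> (D, n, n) array

    def __call__(self, x1, x2, params):
        return self.kernel_f(x1, x2, params)

    def deriv(self, x1, x2, params):
        return self.kernel_deriv(x1, x2, params)
```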

One final performance note: the current implementation, where the kernel derivative is computed separately from the log-likelihood gradient, is inefficient in space, as it forms the entire (D x n x n) array in the kernel derivative function. Each of the D dimensions is used independently in the log-likelihood gradient computation, so the full kernel derivative never needs to be held in memory at once. One improvement would be to return one component of the gradient at a time and re-use the storage of the (n x n) matrix for each component. This is only likely to matter for very large problems at the moment, but it is something to consider in future design decisions.
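
A sketch of the memory-saving idea, using a hypothetical kernel_deriv_comp helper that returns a single (n x n) slice rather than the full (D x n x n) array:

```python
import numpy as np

def neg_loglike_gradient(x, params, invK, invK_r, kernel_deriv_comp):
    """Sketch of a component-wise gradient computation.

    kernel_deriv_comp(x, params, d) is a hypothetical function returning
    only the (n, n) derivative of K with respect to hyperparameter d,
    so at most one (n, n) slice is alive at a time instead of (D, n, n).
    """
    grad = np.zeros(len(params))
    for d in range(len(params)):
        dK = kernel_deriv_comp(x, params, d)
        # standard GP negative log-likelihood gradient term,
        # with invK = K^{-1} and invK_r = K^{-1} @ residuals
        grad[d] = 0.5 * np.trace(invK @ dK) - 0.5 * invK_r @ dK @ invK_r
    return grad
```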

@edaub edaub self-assigned this Jul 19, 2019

edaub commented Sep 11, 2019

Full kernel functionality is implemented. Any stationary kernel (i.e. one that depends only on the distance between points, via a radial function of that distance) can now be implemented easily by subclassing the base Kernel class. The user only needs to define the radial function, plus its first and second derivatives if gradients are desired. Different distance metrics can be specified in the same way, by subclassing and modifying the distance function and, if derivatives are needed, the derivatives of the distance with respect to the hyperparameters.
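
As an illustration of the subclassing pattern (the method names here are assumed, not necessarily the actual base class hooks):

```python
import numpy as np
from Kernels import Kernel  # assumed import of the base class

class Exponential(Kernel):
    """Hypothetical stationary kernel K(r) = exp(-r), defined purely by
    its radial function; method names are illustrative and may not match
    the actual base class."""

    def radial_f(self, r):
        return np.exp(-r)

    def radial_deriv(self, r):   # first derivative with respect to r
        return -np.exp(-r)

    def radial_deriv2(self, r):  # second derivative with respect to r
        return np.exp(-r)
```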

The only thing still missing is a nice interface for selecting the kernel in the main GP code. The interface to the GP class has gotten a bit clunky, and I plan to integrate kernel selection when re-working the interface code in the init method. In the meantime, a kernel can be selected manually by setting the kernel attribute of an instantiated GP object to an instance of one of the kernel classes.
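
For example, assuming a Matern52 kernel class exists in Kernels.py (the class name, import paths, and constructor signature are guesses for illustration):

```python
import numpy as np
from Kernels import Matern52                  # assumed class name
from GaussianProcess import GaussianProcess   # assumed import path

gp = GaussianProcess(np.random.rand(20, 3), np.random.rand(20))
gp.kernel = Matern52()  # replace the kernel on the instantiated GP object
```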

@edaub edaub mentioned this issue Mar 3, 2020

edaub commented Apr 7, 2020

The GP interface now allows selection of a kernel as of PR #81, so I'm closing this.

@edaub edaub closed this as completed Apr 7, 2020