
add krr page in cookbook #3078

Merged: 1 commit into shogun-toolbox:develop, Mar 16, 2016
Conversation

@sanuj (Contributor) commented Mar 15, 2016

@karlnapf looking forward to your comments :)

Kernel Ridge Regression
=======================

Kernel ridge regression is a kernel-based regularized form of regression which learns a function in the space induced by the respective kernel and the data by minimizing a squared error loss with :math:`L_2` regularization. As in linear ridge regression, the solution boils down to solving a linear system:
@karlnapf (Member) reviewed:
Kernel ridge regression is a non-parametric form of ridge regression. The aim is to learn a function in the space induced by the respective kernel $k$ by minimizing a squared loss with a squared norm regularization term
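
For reference, the objective this sentence describes is the standard kernel ridge regression problem (standard formulation, not text quoted from the patch):

.. math::

    \min_{f}\ \sum_{i=1}^N \left(y_i - f(x_i)\right)^2 + \tau \left\Vert f \right\Vert^2

where the norm is taken in the space induced by :math:`k` and :math:`\tau` is the regularization parameter.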

@karlnapf (Member) commented:

I think there will also be a description file for KRR, which can be removed

@sanuj (Contributor, Author) commented Mar 15, 2016

@karlnapf I don't fully understand your last comment.
Rest is updated :)

@karlnapf (Member) commented:

Check the descriptions folder in the examples dir


@sanuj (Contributor, Author) commented Mar 16, 2016

@karlnapf removed the description file.

@karlnapf (Member) commented:

Cool, this is good to merge.
Just a thought I had: Can you also explain how to extract the alpha vector (and the w vector in linear regression)? This will (annoyingly) also change the data files, so we can do it in a separate patch (for linear regression).

But I think it is good to explain such things too.
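
For illustration, a minimal Python sketch of what that could look like with shogun's modular interface. The module name `modshogun`, the constructor signatures, and the accessors are assumptions here (`get_alphas()` inherited from KernelMachine, `get_w()` from LinearMachine) and may differ across versions:

```python
import numpy as np
# Assumption: 2016-era modular Python bindings; verify names against your build.
from modshogun import (RealFeatures, RegressionLabels, GaussianKernel,
                       KernelRidgeRegression, LinearRidgeRegression)

X = np.random.randn(2, 50)    # shogun expects features x samples
y = np.random.randn(50)

feats = RealFeatures(X)
labels = RegressionLabels(y)

# Kernel ridge regression: the dual weights alpha live on the kernel machine.
krr = KernelRidgeRegression(0.001, GaussianKernel(feats, feats, 1.0), labels)
krr.train()
alpha = krr.get_alphas()      # assumed accessor from KernelMachine

# Linear ridge regression: the primal weight vector w.
lrr = LinearRidgeRegression(0.001, feats, labels)
lrr.train()
w = lrr.get_w()               # assumed accessor from LinearMachine
```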

.. math::

    {\bf \alpha} = \left({\bf K}+\tau{\bf I}\right)^{-1}{\bf y}

where :math:`{\bf K}` is the kernel matrix and :math:`{\bf \alpha}` is the vector of weights in the space induced by the kernel.
The learned function can then be evaluated as :math:`f(x)=\sum_{i=1}^Nk(x,x_i)`.
@karlnapf (Member) reviewed:
The alpha_i are missing here ... sorry, I forgot that above.

@sanuj (Contributor, Author) replied:
Yeah, I was thinking the same but was not sure. Updating.
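
To make the corrected formula concrete, here is a small self-contained NumPy sketch (illustration only, not code from this patch; the Gaussian kernel width convention is an assumption) that solves :math:`({\bf K}+\tau{\bf I}){\bf \alpha}={\bf y}` and evaluates :math:`f(x)=\sum_{i=1}^N\alpha_i k(x,x_i)`:

```python
import numpy as np

def gaussian_kernel(A, B, width=1.0):
    # Pairwise k(x, x') = exp(-||x - x'||^2 / width) over rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / width)

def krr_fit(X, y, tau=1e-3, width=1.0):
    # Solve (K + tau*I) alpha = y; a linear solve beats forming the inverse.
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + tau * np.eye(len(y)), y)

def krr_predict(X_train, alpha, X_test, width=1.0):
    # f(x) = sum_i alpha_i * k(x, x_i)
    return gaussian_kernel(X_test, X_train, width) @ alpha

rng = np.random.RandomState(0)
X = rng.randn(50, 2)
y = np.sin(X[:, 0]) + 0.1 * rng.randn(50)
alpha = krr_fit(X, y)
print(krr_predict(X, alpha, X[:5]))
```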

@sanuj (Contributor, Author) commented Mar 16, 2016

@karlnapf Updated

karlnapf added a commit that referenced this pull request Mar 16, 2016
@karlnapf merged commit 603245b into shogun-toolbox:develop on Mar 16, 2016