
Added new train_folds parameter for xgb.cv() #5064

Closed

Conversation


@iblumin commented Nov 24, 2019

The new parameter allows the user to select which indices will be used for training in each cross-validation fold, if desired. It is intended to be used in conjunction with the existing folds parameter.

Currently, xgb.cv() assumes that if folds is not NULL, the given indices are to be used for the validation sets for each cross-validation fold, and all remaining indices should be used for training in each respective fold. Adding a train_folds parameter to specify which indices are to be used for the training set for each fold allows the implementation of a walk-forward or rolling window approach to cross-validation.
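To illustrate the intended usage, here is a minimal sketch of a walk-forward (expanding-window) split using the proposed train_folds parameter alongside the existing folds parameter; the toy data, window sizes, and hyperparameters below are made up for illustration only:

```r
library(xgboost)

# Toy time-ordered data (made up for illustration): 100 observations, 5 features.
set.seed(42)
x <- matrix(rnorm(100 * 5), nrow = 100)
y <- rnorm(100)
dtrain <- xgb.DMatrix(data = x, label = y)

# Walk-forward scheme: each fold trains on an expanding window of past
# observations and validates on the block that immediately follows it.
train_folds <- list(1:40, 1:60, 1:80)       # training indices per fold (proposed parameter)
folds       <- list(41:60, 61:80, 81:100)   # validation indices per fold (existing parameter)

cv <- xgb.cv(
  params      = list(objective = "reg:squarederror", max_depth = 3, eta = 0.1),
  data        = dtrain,
  nrounds     = 50,
  folds       = folds,
  train_folds = train_folds,   # without this, training would use all indices not in folds[[k]]
  verbose     = FALSE
)
print(cv$evaluation_log)
```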

I came across this code on Stack Overflow, where it was originally contributed by @RolandASc:
https://stackoverflow.com/questions/32433458/how-to-specify-train-and-test-indices-for-xgb-cv-in-r-package-xgboost/51412073

@trivialfis (Member) commented:

@iblumin Sorry, you used your master branch and somehow I evened it out. I created a new branch for your PR in #5114.

@trivialfis (Member) commented:

And added your commit there.

The lock bot locked this as resolved and limited conversation to collaborators on Mar 11, 2020.