Support hyperparameter tuning #287
Being able to automatically optimize hyperparameters directly in DIGITS would be a great feature. The two obvious methods would be random search and Bayesian optimization.

I've been playing for the past week with using spearmint (Bayesian optimization) with DIGITS. Here is how I do it for now (a sketch of the loop follows this list):

- grab the JSON description of an existing model job, using the `json_dict` method of `ModelJob`
- substitute the hyperparameter values proposed by spearmint into that description and create a new model job from it
- report back to spearmint the best accuracy computed during training on the validation set

We end up with a bunch of model jobs in DIGITS, and the optimized parameter values output by spearmint.

I'm now wondering what would be the best way to integrate random search or spearmint support into DIGITS. I imagine it could be in the form of a special `ModelJob` with training tasks generated by spearmint, for instance. What do you think?
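To make the loop concrete, here is a minimal sketch of what the glue code could look like, written as a spearmint experiment module (spearmint typically imports such a module and calls `main(job_id, params)` once per proposed setting, minimizing the returned value). Everything that touches DIGITS below (the server URL, the `/models/...` endpoints, the field names, the status strings, and the base job id) is a hypothetical placeholder for however you fetch a job's `json_dict` and submit a modified copy; it is not DIGITS' documented API.

```python
# digits_spearmint.py -- sketch of the loop described above, written as a
# spearmint experiment module. All DIGITS endpoints and field names below
# are assumptions for illustration, not DIGITS' actual REST API.

import time
import requests

DIGITS_URL = 'http://localhost:5000'   # assumed DIGITS server
BASE_JOB = '20150911-aabbcc-1234'      # assumed id of an existing ModelJob


def create_job(learning_rate, batch_size):
    """Hypothetical: clone the base job's JSON description with new
    hyperparameters and submit it as a new model job."""
    desc = requests.get('%s/models/%s.json' % (DIGITS_URL, BASE_JOB)).json()
    desc['learning_rate'] = learning_rate
    desc['batch_size'] = int(batch_size)
    reply = requests.post(
        '%s/models/images/classification.json' % DIGITS_URL, data=desc)
    return reply.json()['id']


def best_validation_accuracy(job_id):
    """Hypothetical: poll the job until it finishes, then return the best
    accuracy it reached on the validation set."""
    while True:
        status = requests.get(
            '%s/models/%s.json' % (DIGITS_URL, job_id)).json()
        if status['status'] in ('Done', 'Error', 'Aborted'):
            break
        time.sleep(30)
    return status.get('best_val_accuracy', 0.0)


def main(job_id, params):
    # spearmint passes each parameter as an array; it minimizes the return
    # value, so we negate the validation accuracy to maximize it.
    lr = float(params['learning_rate'][0])
    bs = int(params['batch_size'][0])
    digits_job = create_job(lr, bs)
    return -best_validation_accuracy(digits_job)
```

Returning the negated accuracy is what makes spearmint maximize it, since its convention is minimization.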
I have just found this code: https://github.com/kuz/caffe-with-spearmint
/cc @jmancewicz - you've been looking into parameter-sweep stuff, right?
Yes, but I won't be able to look at it for a little while longer.
With #708, you can now sweep through many values of learning rate and batch size.
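For readers who want the semantics spelled out: a sweep like this amounts to a plain grid search, one training job per combination of the listed values. The enumeration below is an illustrative assumption, not the exact behavior of #708:

```python
# Illustrative only: assumes the sweep creates one job per
# (learning_rate, batch_size) combination.
from itertools import product

learning_rates = [0.01, 0.001, 0.0001]
batch_sizes = [32, 64, 128]

for lr, bs in product(learning_rates, batch_sizes):
    print('job with learning_rate=%g, batch_size=%d' % (lr, bs))
```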
Thank you for the follow-up. I will try to use it.
Closing (enhancement implemented in #708).