Hyperparameter Tuning in Lightning CLI #9108
-
I wonder how people do hyperparameter tuning with the Lightning CLI. Thanks!
-
Personally, when I tune hyperparameters (e.g. with optuna or nevergrad), I don't use the Lightning CLI much; instead I inject the arguments programmatically, since it's easier to communicate across different Python libraries directly in Python rather than leaving it to OS calls.
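For illustration, here is a minimal sketch of that programmatic approach with optuna. Everything here is an assumption, not part of the thread: `MyModel`, `MyDataModule`, the `my_project` import, the hyperparameter names, and a model that logs a `val_loss` metric.

```python
import optuna
from pytorch_lightning import Trainer

# Placeholders: substitute your own LightningModule / LightningDataModule
from my_project import MyModel, MyDataModule


def objective(trial):
    # Sample hyperparameters directly in Python; no CLI or sys.argv involved
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    hidden_size = trial.suggest_int("hidden_size", 32, 512)

    model = MyModel(lr=lr, hidden_size=hidden_size)
    trainer = Trainer(max_epochs=10)
    trainer.fit(model, datamodule=MyDataModule())

    # Assumes the model logs a "val_loss" metric during validation
    return trainer.callback_metrics["val_loss"].item()


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```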
-
Dear @tshu-w, I was a bit confused by your question. Are you talking about using [...]? In the latter case, here is something you could try out:

```python
from unittest import mock

import optuna
from pytorch_lightning import LightningModule
from pytorch_lightning.utilities.cli import LightningCLI

config_path = ...


class MyModel(LightningModule):
    def __init__(self, num_layers):
        ...


def objective(trial):
    # num_layers is an integer, so suggest_int is the appropriate sampler
    num_layers = trial.suggest_int("num_layers", 10, 100)
    # Patch sys.argv so LightningCLI parses the trial's value as if it were
    # passed on the command line, on top of the base config file
    with mock.patch("sys.argv", ["any.py", "--config", str(config_path), "--model.num_layers", str(num_layers)]):
        # MyDataModule is the user's own LightningDataModule, defined elsewhere
        cli = LightningCLI(MyModel, MyDataModule)
    # Use the best score tracked by the ModelCheckpoint callback as the objective
    return cli.trainer.checkpoint_callback.best_model_score


study = optuna.create_study()
study.optimize(objective, n_trials=100)
study.best_params
```

@carmocca Any thoughts?
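A nice property of patching `sys.argv` like this is that each trial still goes through the CLI's config-file parsing and argument validation, so trials behave exactly like command-line runs; the trade-off versus a fully programmatic setup is that every tuned value has to round-trip through strings.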