
lr_scheduler config issue #647

Open
GeonHyeock opened this issue Jul 14, 2024 · 0 comments · May be fixed by #648

GeonHyeock commented Jul 14, 2024

```python
if self.hparams.scheduler is not None:
    scheduler = self.hparams.scheduler(optimizer=optimizer)
    return {
        "optimizer": optimizer,
        "lr_scheduler": {
            "scheduler": scheduler,
            "monitor": "val/loss",
            "interval": "epoch",
            "frequency": 1,
        },
    }
return {"optimizer": optimizer}
```

There is an issue: the `monitor`, `interval`, and `frequency` values of the `lr_scheduler` are hard-coded, so they cannot be changed through the config.

If I need the learning rate to update at every step instead of every epoch, I have to modify the module file directly.
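One possible shape of a fix (a sketch only, not the approach taken in the linked PR): let the module accept an extra dict of scheduler settings that the config can populate, and fall back to the current hard-coded defaults when it is absent. The class, the `scheduler_extras` parameter name, and the dummy optimizer/scheduler stand-ins below are all hypothetical.

```python
class DummyOptimizer:
    """Stand-in for a torch optimizer, to keep the sketch self-contained."""


class DummyScheduler:
    """Stand-in for a torch lr scheduler; accepts the optimizer like the real ones."""

    def __init__(self, optimizer):
        self.optimizer = optimizer


class LitModule:
    """Minimal stand-in for the template's LightningModule."""

    def __init__(self, optimizer, scheduler=None, scheduler_extras=None):
        # scheduler_extras mirrors what a config could inject,
        # e.g. {"monitor": "val/loss", "interval": "step", "frequency": 1}
        self.optimizer = optimizer
        self.scheduler = scheduler
        self.scheduler_extras = scheduler_extras or {}

    def configure_optimizers(self):
        optimizer = self.optimizer
        if self.scheduler is not None:
            scheduler = self.scheduler(optimizer=optimizer)
            return {
                "optimizer": optimizer,
                "lr_scheduler": {
                    "scheduler": scheduler,
                    # fall back to the previously hard-coded defaults
                    "monitor": self.scheduler_extras.get("monitor", "val/loss"),
                    "interval": self.scheduler_extras.get("interval", "epoch"),
                    "frequency": self.scheduler_extras.get("frequency", 1),
                },
            }
        return {"optimizer": optimizer}
```

With this, a config entry like `scheduler_extras: {interval: step}` would switch to per-step updates without touching the module file:

```python
module = LitModule(
    optimizer=DummyOptimizer(),
    scheduler=DummyScheduler,
    scheduler_extras={"interval": "step"},
)
cfg = module.configure_optimizers()
# cfg["lr_scheduler"]["interval"] is now "step"
```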

@GeonHyeock GeonHyeock linked a pull request Jul 14, 2024 that will close this issue