
Update LR scheduler configuration #5846

Merged (9 commits) on Aug 14, 2024

Conversation

@xiyang-aads-lilly (Contributor) commented Aug 6, 2024

This PR is based on #5726.

The current LR scheduler initialization always prioritizes the config over a scheduler defined manually in code. However, optimizer initialization does the opposite: a manually defined optimizer takes precedence over the config. This PR makes the initialization behavior consistent for the optimizer and the LR scheduler: if an LR scheduler is defined in code, it overrides the config.
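The precedence rule described above can be sketched as follows. This is a minimal illustration, not DeepSpeed's actual internals: the function name `resolve_lr_scheduler` and the argument name `client_lr_scheduler` are hypothetical, chosen only to show the "code wins over config" rule the PR introduces.

```python
def resolve_lr_scheduler(client_lr_scheduler, config):
    """Pick the LR scheduler: a code-defined object first, the config second.

    This mirrors the consistent behavior the PR aims for (hypothetical sketch):
    - if the caller passed a scheduler object in code, use it and ignore config;
    - otherwise, build one from the "scheduler" section of the JSON config;
    - if neither exists, return None (no scheduler).
    """
    if client_lr_scheduler is not None:
        # Manually defined scheduler overrides the config, matching how a
        # manually defined optimizer already overrides the optimizer config.
        return client_lr_scheduler
    scheduler_cfg = config.get("scheduler")
    if scheduler_cfg is not None:
        # Placeholder for constructing the scheduler named in the config.
        return f"built-from-config:{scheduler_cfg['type']}"
    return None
```

For example, with both a code-defined scheduler and a `"scheduler"` config section present, the code-defined one is returned; with only the config section, the config-built one is returned.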

@xiyang-aads-lilly (Contributor, Author) commented:

@microsoft-github-policy-service agree

@xiyang-aads-lilly (Contributor, Author) commented:

Added line 339 to fix the test failure.

@loadams loadams added this pull request to the merge queue Aug 14, 2024
Merged via the queue into deepspeedai:master with commit f994fb2 Aug 14, 2024
13 checks passed
@xiyang-aads-lilly xiyang-aads-lilly deleted the update_lr_init_#5726 branch February 4, 2025 20:02