Update train.py #13497

Open · wants to merge 2 commits into master

Conversation

@ShubhamPhapale commented Jan 22, 2025

updated deprecated call

🛠️ PR Summary

Made with ❤️ by Ultralytics Actions

🌟 Summary

Updated AMP (Automatic Mixed Precision) autocast usage for compatibility with PyTorch 2.0+.

📊 Key Changes

  • Replaced torch.cuda.amp.autocast(amp) with torch.amp.autocast('cuda', amp) in the training script.
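For context, a minimal sketch of the call-site change; the model, tensors, and amp flag below are placeholders rather than actual YOLOv5 code, and enabled= is written as a keyword because the second positional argument of torch.amp.autocast is dtype:

```python
import torch
import torch.nn as nn

amp = torch.cuda.is_available()          # stand-in for train.py's AMP flag
device = "cuda" if amp else "cpu"
model = nn.Linear(8, 2).to(device)       # stand-in for the YOLOv5 model
imgs = torch.randn(4, 8, device=device)  # stand-in for a batch of images

# Old call, deprecated in recent PyTorch releases:
#   with torch.cuda.amp.autocast(amp):
# New device-agnostic call ('enabled' passed by keyword, since the second
# positional argument of torch.amp.autocast is dtype):
with torch.amp.autocast("cuda", enabled=amp):
    pred = model(imgs)  # forward pass runs in mixed precision when enabled
```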

🎯 Purpose & Impact

  • Purpose: Aligns the code with PyTorch 2.0+ updates, making YOLOv5 compatible with newer PyTorch versions.
  • Impact: Prevents potential errors or deprecation warnings, ensuring smoother training experiences for users upgrading to the latest PyTorch versions. 🚀
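If compatibility with older PyTorch builds that predate torch.amp.autocast were also a concern, one option (not part of this PR; amp_autocast is a hypothetical helper name) would be a small guarded wrapper along these lines:

```python
import torch

def amp_autocast(enabled: bool):
    """Hypothetical helper: return a CUDA autocast context, preferring the
    newer torch.amp.autocast API and falling back to the legacy
    torch.cuda.amp.autocast on PyTorch builds that lack it."""
    if hasattr(torch, "amp") and hasattr(torch.amp, "autocast"):
        return torch.amp.autocast("cuda", enabled=enabled)
    return torch.cuda.amp.autocast(enabled)

# Usage in a training step would look like:
#   with amp_autocast(amp):
#       pred = model(imgs)
```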

updated deprecated call

Signed-off-by: Shubham Phapale <94707673+ShubhamPhapale@users.noreply.github.com>

github-actions bot commented Jan 22, 2025

All Contributors have signed the CLA. ✅
Posted by the CLA Assistant Lite bot.

@UltralyticsAssistant added the dependencies and enhancement labels on Jan 22, 2025
@UltralyticsAssistant (Member) commented

👋 Hello @ShubhamPhapale, thank you for submitting an ultralytics/yolov5 🚀 PR! To ensure a seamless integration of your work, please review the following checklist:

  • ✅ Define a Purpose: Clearly explain the purpose of your fix or feature in your PR description, and link to any relevant issues. Ensure your commit messages are clear, concise, and adhere to the project's conventions.
  • ✅ Synchronize with Source: Confirm your PR is synchronized with the ultralytics/yolov5 main branch. If it's behind, update it by clicking the 'Update branch' button or by running git pull and git merge main locally.
  • ✅ Ensure CI Checks Pass: Verify all Ultralytics Continuous Integration (CI) checks are passing. If any checks fail, please address the issues.
  • ✅ Update Documentation: Update the relevant documentation for any new or modified features.
  • ✅ Add Tests: If applicable, include or update tests to cover your changes, and confirm that all tests are passing.
  • ✅ Sign the CLA: Please ensure you have signed our Contributor License Agreement if this is your first Ultralytics PR by writing "I have read the CLA Document and I sign the CLA" in a new message.
  • ✅ Minimize Changes: Limit your changes to the minimum necessary for your bug fix or feature addition. "It is not daily increase but daily decrease, hack away the unessential. The closer to the source, the less wastage there is." – Bruce Lee

For more guidance, please refer to our Contributing Guide. Don't hesitate to leave a comment if you have any questions. Thank you for contributing to Ultralytics! 🚀

🛠️ Notes

It looks like your PR updates AMP autocast usage for compatibility with PyTorch 2.0+, which is an important improvement.

If applicable, please include a minimum reproducible example (MRE) so we can fully understand and test the impact of this change. For example, providing specific training scenarios where the prior implementation failed due to autograd issues with PyTorch 2.0+ would help validate this fix.

An Ultralytics engineer will also review this PR shortly. Stay tuned for additional feedback! 🚀

Made with ❤️ by Ultralytics Actions

@ShubhamPhapale (Author) commented

I have read the CLA Document and I sign the CLA
