[Fix] Fix the bug that vit cannot load pretrain properly when using init_cfg to specify the pretrain scheme #999
Codecov Report
@@            Coverage Diff             @@
##           master     #999      +/-   ##
==========================================
+ Coverage   89.70%   89.76%   +0.06%
==========================================
  Files         119      119
  Lines        6603     6605       +2
  Branches     1028     1029       +1
==========================================
+ Hits         5923     5929       +6
+ Misses        477      475       -2
+ Partials      203      201       -2
Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>
Please resolve some comments.
LGTM
(open-mmlab#999)
* [Fix] Fix the bug that vit cannot load pretrain properly when using init_cfg to specify the pretrain scheme
* [Fix] fix the coverage problem
* Update mmseg/models/backbones/vit.py
* [Fix] make the predicate more concise and clearer
* [Fix] Modified the judgement logic
* Update tests/test_models/test_backbones/test_vit.py
* add comments

Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>
Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just open the pull request and ask the maintainers for help.
Motivation
Fix the bug that ViT cannot load pretrained weights properly when init_cfg is used to specify the pretraining scheme.
Modification
Adjust the predicate in the weight-initialization logic of mmseg/models/backbones/vit.py so that pretrained weights specified via init_cfg are loaded correctly, and update tests/test_models/test_backbones/test_vit.py accordingly.
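For context, the scheme this PR fixes is the OpenMMLab convention of specifying pretrained weights on the backbone through `init_cfg` rather than a top-level `pretrained` argument. Below is a minimal config sketch of that usage; the checkpoint path is a placeholder and the ViT hyperparameters are illustrative values, not taken from this PR.

```python
# Hedged sketch: a backbone config that requests pretrained weights via
# init_cfg (the loading path this PR fixes). The checkpoint path is a
# placeholder; numeric values are illustrative ViT-Base-style settings.
backbone = dict(
    type='VisionTransformer',
    img_size=(512, 512),
    patch_size=16,
    embed_dims=768,
    num_layers=12,
    num_heads=12,
    # Pretrained weights are specified here instead of a top-level
    # `pretrained=` argument; init_weights() should honor this entry.
    init_cfg=dict(type='Pretrained', checkpoint='path/to/vit_checkpoint.pth'),
)
```

With this config, the backbone's `init_weights()` is expected to load the checkpoint named in `init_cfg`; before this fix, ViT could fail to pick it up properly.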