
translate tutorial 0_config #819

Merged (6 commits into open-mmlab:master, Aug 6, 2021)
Conversation

liqikai9 (Collaborator)

No description provided.

CLAassistant commented Jul 28, 2021

CLA assistant check
All committers have signed the CLA.

@jin-s13 jin-s13 requested a review from ly015 July 28, 2021 08:30
codecov bot commented Jul 28, 2021

Codecov Report

Merging #819 (8a036d3) into master (77d78f0) will decrease coverage by 0.00%.
The diff coverage is n/a.

❗ Current head 8a036d3 differs from pull request most recent head 5451456. Consider uploading reports for the commit 5451456 to get more accurate results

@@            Coverage Diff             @@
##           master     #819      +/-   ##
==========================================
- Coverage   83.59%   83.59%   -0.01%     
==========================================
  Files         176      176              
  Lines       14145    14145              
  Branches     2364     2364              
==========================================
- Hits        11825    11824       -1     
  Misses       1713     1713              
- Partials      607      608       +1     
Flag Coverage Δ
unittests 83.52% <ø> (ø)

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmpose/datasets/pipelines/shared_transform.py 88.00% <0.00%> (-0.50%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 77d78f0...5451456.

Commits:
* …cs_zh-CN0
* first try to update current branch from remote branch:master
jin-s13 (Collaborator) commented Aug 4, 2021

@liqikai9 please check the linting

@ly015 ly015 mentioned this pull request Aug 4, 2021
@@ -1,3 +1,235 @@
# 教程 0: 模型配置文件

内容建设中……
我们使用python文件作为配置文件,将模块化设计和继承设计结合到配置系统中,便于进行各种实验。
Contributor commented:

add space between cn and en characters

我们使用 python 文件作...

Collaborator (author) commented:

I see. Thanks for reminding me!
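The modular-plus-inheritance design that the translated intro describes works through a `_base_` field in OpenMMLab-style configs: a derived config lists its base files and overrides only the keys that differ, with dict fields merged recursively. A minimal hypothetical sketch (file names and values invented for illustration):

```python
# base_pose_config.py -- hypothetical base config file
optimizer = dict(type='Adam', lr=5e-4)
total_epochs = 210

# A derived config file would then contain only the differences, e.g.:
# _base_ = ['./base_pose_config.py']
# optimizer = dict(lr=1e-4)  # merged over the base, so type stays 'Adam'
```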


## 通过脚本参数修改配置

当使用 "tools/train.py" 或 "tools/test.py" 提交作业时,您可以指定 `--cfg-options` 来就地修改配置。
Contributor commented:

就地
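The `--cfg-options` mechanism described in this section turns dotted keys (for example `optimizer.lr=1e-4`) into overrides on the nested config dict. A simplified sketch of that merging, not mmcv's actual implementation (names and values are illustrative):

```python
def apply_override(cfg, dotted_key, value):
    """Walk a nested dict along a dotted key and set the leaf value in place."""
    keys = dotted_key.split('.')
    node = cfg
    for k in keys[:-1]:
        node = node.setdefault(k, {})  # create intermediate dicts if missing
    node[keys[-1]] = value
    return cfg

# Example: mimic `--cfg-options optimizer.lr=1e-4` (key name illustrative)
cfg = {'optimizer': {'type': 'Adam', 'lr': 5e-4}}
apply_override(cfg, 'optimizer.lr', 1e-4)
```

Note that only the targeted leaf changes; sibling keys such as `type` are untouched, which mirrors how dict fields are merged rather than replaced.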

```python
# 运行设置
log_level = 'INFO' # 日志记录级别
load_from = None # 从给定路径将模型作为预训练的模型加载。这将不会重新开始训练
```

Contributor commented:

从给定路径加载预训练模型。

Contributor commented:

What does "这将不会重新开始训练" mean?

Collaborator (author) commented:

I think it means that the loaded model will not be trained from scratch, so I used the wording "不会重新开始训练."
Actually, I was wondering how to translate the word "resume." Can I change the whole translation to "从给定路径加载预训练模型,这样,模型不会重头开始训练"? Is that appropriate?

Member commented:

I don't think the sentence "这将不会重新开始训练" is needed; it reads as confusing. "加载预训练模型" already expresses the meaning clearly.

Collaborator (author) commented:

OK.
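On the "resume" question in the thread above: in MMCV-style runtime configs, loading pretrained weights and resuming an interrupted run are separate fields. A minimal illustrative fragment (the checkpoint path is hypothetical):

```python
# Illustrative only -- the two related runtime fields discussed above.
# `load_from` initializes model weights from a checkpoint but training still
# starts from epoch 0; `resume_from` additionally restores optimizer state
# and the epoch counter, continuing an interrupted run.
load_from = 'checkpoints/pretrained_pose.pth'  # hypothetical path
resume_from = None  # set to a work_dir checkpoint path to resume instead
```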

```python
# 学习率调整策略
lr_config = dict( # 用于注册 LrUpdater 钩子的学习率调度器的配置
    policy='step', # 调整策略, 还支持 CosineAnnealing, Cyclic, 等等,请参阅 https://github.com/open-mmlab/mmcv/blob/master/mmcv/runner/hooks/lr_updater.py#L9 获取支持的 LrUpdater 细节
    warmup='linear', # 使用的预热类型,它可以是None (不使用预热), 'constant', 'linear' 或者 'exp'.
```
Contributor commented:

space before None
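To make the warmup comment above concrete, here is a rough sketch of how a 'step' policy with linear warmup shapes the learning rate over iterations. This is illustrative only, not mmcv's actual LrUpdaterHook code, and all numbers are invented:

```python
def lr_at(iteration, base_lr=0.01, warmup_iters=500, warmup_ratio=0.1,
          step_epochs=(170, 200), iters_per_epoch=100, gamma=0.1):
    """Illustrative LR curve: linear warmup, then step decay by `gamma`."""
    if iteration < warmup_iters:
        # ramp linearly from base_lr * warmup_ratio up to base_lr
        k = iteration / warmup_iters
        return base_lr * (warmup_ratio + (1 - warmup_ratio) * k)
    epoch = iteration // iters_per_epoch
    drops = sum(1 for s in step_epochs if epoch >= s)  # past milestones
    return base_lr * (gamma ** drops)
```

So with these invented numbers the rate climbs from 0.001 to 0.01 over the first 500 iterations, then drops by a factor of 10 at epochs 170 and 200.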

@ly015 ly015 merged commit f9540ad into open-mmlab:master Aug 6, 2021
shuheilocale pushed a commit to shuheilocale/mmpose that referenced this pull request May 6, 2023
* add docs zh-CN tutorial0
* translate tutorial 0_config
HAOCHENYE added a commit to HAOCHENYE/mmpose that referenced this pull request Jun 27, 2023
* [Feature] Add ReduceOnPlateauParamScheduler and change ParamSchedulerHook

* [Feature] add ReduceOnPlateauLR and ReduceOnPlateauMomentum

* pre-commit check

* add a little docs

* change position

* fix the conflict between isort and yapf

* fix ParamSchedulerHook after_val_epoch execute without train_loop and param_schedulers built

* Apply suggestions from code review

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>

* update ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ParamSchedulerHook

* fix get need_step_args attribute error in ParamSchedulerHook

* fix load_state_dict error for rule in ReduceOnPlateauParamScheduler

* add docs for ParamSchedulerHook and fix a few codes

* [Docs] add ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ReduceOnPlateauLR docs

* [Refactor] adjust the order of import

* [Fix] add init check for threshold in ReduceOnPlateauParamScheduler

* [Test] add test for ReduceOnPlateauParamScheduler, ReduceOnPlateauLR and ReduceOnPlateauMomentum

* [Fix] fix no attribute self.min_value

* [Fix] fix numerical problem in tests

* [Fix] fix error in tests

* [Fix] fix ignore first param in tests

* [Fix] fix bug in tests

* [Fix] fix bug in tests

* [Fix] fix bug in tests

* [Fix] increase coverage

* [Fix] fix count self._global_step bug and docs

* [Fix] fix tests

* [Fix] modified ParamSchedulerHook test

* Update mmengine/optim/scheduler/param_scheduler.py

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>

* [Fix] modified something according to commented

* [Docs] add api for en and zh_cn

* [Fix] fix bug in test_param_scheduler_hook.py

* [Test] support more complicated test modes(less, greater, rel, abs) for ReduceOnPlateauParamScheduler

* [Docs] add docs for rule

* [Fix] fix pop from empty list bug in test

* [Fix] fix check param_schedulers is not built bug

* [Fix] fix step_args bug and without runner._train_loop bug

* [Fix] fix step_args bug and without runner._train_loop bug

* [Fix] fix scheduler type bug

* [Test] rename step_args to step_kwargs

* [Fix] remove redundancy check

* [Test] remove redundancy check

* Apply suggestions from code review

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>

* [Test] fix some defects

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
ajgrafton pushed a commit to ajgrafton/mmpose that referenced this pull request Mar 6, 2024
* add docs zh-CN tutorial0
* translate tutorial 0_config
Labels: none yet
5 participants