
Code changes to finetune PointBERT or ACT #5

yhxu022 opened this issue Oct 19, 2023 · 1 comment
yhxu022 commented Oct 19, 2023

What changes should be made to the code if I want to fine-tune the PointBERT or ACT pre-trained models?

@zyh16143998882 (Owner)

IDPT can easily be migrated to Point-BERT or other models. Only the following modifications are required:

1. Add "part: only_new," to the .yaml file in the cfg directory, for example:

optimizer : {
  type: AdamW,
  part: only_new,
  kwargs: {
    lr : 0.0005,
    weight_decay : 0.05
  }
}

2. Modify the add_weight_decay function used in the build_opti_sche function with reference to here, so that only the newly added parameters are trained (a sketch follows this list).
3. Modify the Transformer backbone to include the code of the Dynamic Prompt Generation Module (here); a sketch of the insertion point also follows below.
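For step 2, here is a minimal sketch of what the modified add_weight_decay function could look like. The "prompt" substring used to identify the newly added IDPT parameters, and the exact signature, are assumptions; adapt them to the actual parameter names in the repository:

def add_weight_decay(model, weight_decay=1e-5, skip_list=(), part=None):
    decay, no_decay = [], []
    for name, param in model.named_parameters():
        # Assumption: newly added IDPT modules contain "prompt" in their names.
        if part == 'only_new' and 'prompt' not in name:
            param.requires_grad = False  # freeze the pre-trained backbone
            continue
        if not param.requires_grad:
            continue  # frozen weights are excluded from the optimizer
        if len(param.shape) == 1 or name.endswith('.bias') or name in skip_list:
            no_decay.append(param)  # no weight decay on biases and norm weights
        else:
            decay.append(param)
    return [{'params': no_decay, 'weight_decay': 0.0},
            {'params': decay, 'weight_decay': weight_decay}]

# Inside build_opti_sche, the optimizer could then be built as:
# param_groups = add_weight_decay(base_model, weight_decay=0.05, part='only_new')
# optimizer = torch.optim.AdamW(param_groups, lr=0.0005)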
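For step 3, a minimal sketch of how a dynamic prompt generation module might be attached to the Transformer backbone. The module below (its name, the max-pooled instance descriptor, and the choice to attach prompts before the last block) is an illustrative assumption, not the exact IDPT implementation; see the linked code for the real module:

import torch
import torch.nn as nn

class DynamicPromptGenerator(nn.Module):
    # Pools the current token features into an instance descriptor and maps
    # it to a small set of prompt tokens (illustrative stand-in for IDPT's
    # Dynamic Prompt Generation Module).
    def __init__(self, dim, num_prompts=1):
        super().__init__()
        self.num_prompts = num_prompts
        self.proj = nn.Sequential(
            nn.Linear(dim, dim),
            nn.GELU(),
            nn.Linear(dim, num_prompts * dim),
        )

    def forward(self, tokens):               # tokens: (B, N, C)
        desc = tokens.max(dim=1).values      # instance-aware descriptor (B, C)
        prompts = self.proj(desc)            # (B, num_prompts * C)
        return prompts.view(-1, self.num_prompts, tokens.shape[-1])

# Inside the backbone's forward pass (block/attribute names are assumptions):
# for i, block in enumerate(self.blocks):
#     if i == len(self.blocks) - 1:            # attach prompts before the last block
#         prompts = self.prompt_generator(x)   # (B, P, C)
#         x = torch.cat([prompts, x], dim=1)
#     x = block(x)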

zyh16143998882 mentioned this issue Nov 22, 2023