
Is the code complete yet? #2

Open
zzzqqqyyy opened this issue Nov 14, 2024 · 4 comments

Comments

@zzzqqqyyy

Hello author, I would like to reproduce your experiment, but I'm still getting errors when I run it, so I wanted to make sure the code is complete.
Error content:
AttributeError: Can't pickle local object 'ResidualBlock.__init__.<locals>....'
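For context, this error usually means a callable defined inside `__init__` (a "local object") ends up in the object's state; pickle cannot import such a function by qualified name, so DataLoader workers or checkpointing fail. A minimal sketch of the failure mode and one common fix (the `scale` helper here is hypothetical, not from the repo):

```python
import pickle

class BrokenResidualBlock:
    def __init__(self):
        def scale(x):          # nested function -> local object, unpicklable
            return x * 0.5
        self.scale = scale

# Fix: define the callable at module level so pickle can
# locate it by its qualified name.
def scale(x):
    return x * 0.5

class ResidualBlock:
    def __init__(self):
        self.scale = scale     # module-level function -> picklable

try:
    pickle.dumps(BrokenResidualBlock())
except AttributeError as e:
    print("broken:", e)        # Can't pickle local object '...<locals>.scale'

data = pickle.dumps(ResidualBlock())  # succeeds
```

The same applies to lambdas stored on `self`; replacing them with module-level functions or bound methods makes the model picklable for multi-worker data loading.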

@AkaliKong
Owner

Thank you for following our work. Please provide your complete error report, and I'll see if I can help resolve the issue.

@zzzqqqyyy
Author

Thanks for your reply. After installing the environment from the requirement.txt file, I first got bitsandbytes errors:
CUDA Setup failed despite GPU being available. Inspect the CUDA SETUP outputs above to fix your environment!
If you cannot find any issues and suspect a bug, please open an issue with details about your environment:
https://github.com/TimDettmers/bitsandbytes/issues
So I upgraded bitsandbytes to the latest version, and when I reran the script, the following error was reported:
[screenshot of the error]
PS: my Python version is 3.9.20, CUDA version is 12.4

@AkaliKong
Owner

Thank you again for your attention. Installing the version of PyTorch with CUDA 11.8 along with its corresponding toolkit may help resolve your issue.
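A sketch of the suggested environment change, assuming the standard PyTorch cu118 wheel index; exact version pins are left to the repo's requirement.txt:

```shell
# Sketch: install a CUDA 11.8 build of PyTorch from the official cu118
# wheel index, then reinstall bitsandbytes against it.
pip install --force-reinstall torch --index-url https://download.pytorch.org/whl/cu118
pip install --upgrade bitsandbytes

# Verify which CUDA version this PyTorch build was compiled against:
python -c "import torch; print(torch.version.cuda)"
```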

@betoobusy

  File "/root/iLoRA_init/main.py", line 149, in <module>
    main(args)
  File "/root/iLoRA_init/main.py", line 68, in main
    trainer.fit(model=model, datamodule=data_module)
  File "/root/iLoRA_init/model/model_interface.py", line 160, in validation_step
    generate_output = self.generate(batch)
  File "/root/iLoRA_init/model/model_interface.py", line 74, in generate
    generate_ids = self.llama_model.generate(
  File "/root/iLoRA_init/model/peft/peft_model.py", line 921, in generate
    outputs = self.base_model.generate(**kwargs)
  File "/root/iLoRA_init/model/peft/tuners/moelora.py", line 217, in generate
    return self.self_model_generate(**kwargs)
ValueError: The following model_kwargs are not used by the model: ['user_embeds', 'gate_weights'] (note: typos in the generate arguments will also show up in this list)
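For context on this ValueError: transformers' generate() validates that every extra model_kwarg is accepted by the model's forward(), and raises exactly this message otherwise. A self-contained mimic of that check and the usual workaround of stripping custom kwargs before calling generate (fake_forward and the kwarg names here are illustrative, taken only from the traceback, not from the iLoRA code):

```python
import inspect

def fake_forward(input_ids=None, attention_mask=None, inputs_embeds=None):
    # Stand-in for a model's forward(); only these kwargs are accepted.
    return input_ids

def generate(**model_kwargs):
    # Mimic of transformers' kwarg validation: any kwarg forward()
    # does not accept triggers the ValueError from the traceback.
    accepted = set(inspect.signature(fake_forward).parameters)
    unused = [k for k in model_kwargs if k not in accepted]
    if unused:
        raise ValueError(
            f"The following model_kwargs are not used by the model: {unused}"
        )
    return fake_forward(**model_kwargs)

# Workaround sketch: pop the custom arguments before generate() sees
# them, and route them into the model some other way (e.g. via
# inputs_embeds or a patched forward).
kwargs = {"input_ids": [1, 2], "user_embeds": "...", "gate_weights": "..."}
custom = {k: kwargs.pop(k) for k in ("user_embeds", "gate_weights")}
out = generate(**kwargs)  # no longer raises
```

In the real codebase the equivalent fix is either to remove 'user_embeds' and 'gate_weights' from the kwargs passed down to llama_model.generate, or to make the wrapped model's forward() accept them so validation passes.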
