
ChatGLM2-6B load error #3013

Closed
1 task done
melodyee opened this issue Jul 5, 2023 · 1 comment
Labels
bug (Something isn't working)

Comments


melodyee commented Jul 5, 2023

Describe the bug

  1. The trust-remote-code option is already checked.
  2. The model fails to load.

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

Just load the model.
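For reference, the load path is roughly equivalent to the minimal sketch below; the model path and the load_in_8bit flag are assumptions inferred from the bitsandbytes frames in the log, not confirmed settings.

```python
# Hedged reproduction sketch, not the exact webui call.
from transformers import AutoModel, AutoTokenizer

model_path = "THUDM/chatglm2-6b"  # assumed; substitute the local checkpoint path

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(
    model_path,
    trust_remote_code=True,  # ChatGLM2-6B ships custom modeling code
    load_in_8bit=True,       # assumed: routes weights through bitsandbytes, as in the traceback
    device_map="auto",
)
```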

Screenshot

No response

Logs

Traceback (most recent call last):
  File "/data/git/text-generation-webui/server.py", line 67, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "/data/git/text-generation-webui/modules/models.py", line 74, in load_model
    output = load_func_map[loader](model_name)
  File "/data/git/text-generation-webui/modules/models.py", line 206, in huggingface_loader
    model = LoaderClass.from_pretrained(checkpoint, **params)
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 479, in from_pretrained
    return model_class.from_pretrained(
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2881, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3228, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/modeling_utils.py", line 728, in _load_state_dict_into_meta_model
    set_module_quantized_tensor_to_device(
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/utils/bitsandbytes.py", line 89, in set_module_quantized_tensor_to_device
    new_value = bnb.nn.Int8Params(new_value, requires_grad=False, **kwargs).to(device)
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/nn/modules.py", line 227, in to
    return self.cuda(device)
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/nn/modules.py", line 191, in cuda
    CB, CBt, SCB, SCBt, coo_tensorB = bnb.functional.double_quant(B)
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/functional.py", line 1642, in double_quant
    row_stats, col_stats, nnz_row_ptr = get_colrow_absmax(
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/functional.py", line 1531, in get_colrow_absmax
    lib.cget_col_row_stats(ptrA, ptrRowStats, ptrColStats, ptrNnzrows, ct.c_float(threshold), rows, cols)
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/ctypes/__init__.py", line 387, in __getattr__
    func = self.__getitem__(name)
  File "/home/melodyli/miniconda3/envs/textgen/lib/python3.10/ctypes/__init__.py", line 392, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
AttributeError: /home/melodyli/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cget_col_row_stats

System Info

Linux, 2080ti GPU

melodyee added the bug (Something isn't working) label on Jul 5, 2023

melodyee commented Jul 5, 2023

melodyee closed this as completed on Jul 5, 2023