
ggjt v2 models don't load (or error gracefully) #1559

Closed
akx opened this issue May 22, 2023 · 1 comment

akx (Contributor) commented May 22, 2023

I freshly pulled 7e4ea5b, ran make clean && make, and it fails to load a model that was converted from PyTorch using the tools from revision 63d2046 (via https://github.com/akx/ggify):

llama.cpp: loading model from models/ausboss-llama-30b-supercot-q8_0.bin
error loading model: llama.cpp: tensor '�+� ��s��93:�a-�%��Y��8Ɓ0�&�M,�9�4������"/�@�չ�"*+c�5�������9�>+n��!������O...' should not be 2563577093-dimensional
llama_init_from_file: failed to load model
llama_init_from_gpt_params: error: failed to load model 'models/ausboss-llama-30b-supercot-q8_0.bin'
main: error: unable to load model

I re-converted the model with 7e4ea5b; apparently the old file's format was

llama_model_load_internal: format     = ggjt v2 (latest)

while the new one reports

llama_model_load_internal: format     = ggjt v3 (latest)

(and 6% smaller!)

It would be nice if there were an error saying that ggjt v2 is not supported, instead of dumping out garbage tensor names and mind-bendingly large tensor dimensionalities 😁 but I suppose this doesn't necessarily need any action right now.
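For context on why the output looks like garbage: a reader built for the newer layout interprets the v2 file's bytes at the wrong offsets, so random data gets read as a tensor-name length and dimension count. A minimal sketch of the kind of up-front header check that would fail cleanly instead (the magic/version constants and function names here are illustrative, not the actual llama.cpp ones):

```cpp
#include <cstdint>
#include <cstdio>
#include <stdexcept>
#include <string>

// Illustrative constants; the real values live in llama.cpp/ggml and may differ.
constexpr uint32_t MAGIC_GGJT        = 0x67676a74; // "ggjt"
constexpr uint32_t VERSION_SUPPORTED = 3;

// Validate the file header before parsing any tensor metadata, so an old
// format version produces a descriptive error rather than garbage names
// and absurd dimension counts read from misaligned offsets.
void check_header(std::FILE *f) {
    uint32_t magic = 0, version = 0;
    if (std::fread(&magic,   sizeof(magic),   1, f) != 1 ||
        std::fread(&version, sizeof(version), 1, f) != 1) {
        throw std::runtime_error("file too short to contain a header");
    }
    if (magic != MAGIC_GGJT) {
        throw std::runtime_error("not a ggjt file (bad magic)");
    }
    if (version != VERSION_SUPPORTED) {
        throw std::runtime_error(
            "unsupported ggjt version " + std::to_string(version) +
            " (this build reads version " +
            std::to_string(VERSION_SUPPORTED) + "); please re-convert the model");
    }
}
```

With a check like this, loading a v2 file would stop at "unsupported ggjt version 2" rather than after misreading tensor records.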

This seems to be related to

akx (Contributor, Author) commented May 22, 2023

EDIT: Right, this is obviously just #1508 🤦

Closing.

@akx akx closed this as completed May 22, 2023