Manticore (Wizard Mega) unsupported #369

Closed
Travistyse opened this issue May 22, 2023 · 3 comments

Comments

@Travistyse

E:\Projects\Ai LLM\privateGPT>python privateGPT.py
Using embedded DuckDB with persistence: data will be stored in: db
llama.cpp: loading model from models/Manticore-13B.ggmlv3.q4_1.bin
error loading model: unknown (magic, version) combination: 67676a74, 00000003; is this really a GGML file?
llama_init_from_file: failed to load model
Traceback (most recent call last):
File "E:\Projects\Ai LLM\privateGPT\privateGPT.py", line 75, in
main()
File "E:\Projects\Ai LLM\privateGPT\privateGPT.py", line 33, in main
llm = LlamaCpp(model_path=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)
File "pydantic\main.py", line 341, in pydantic.main.BaseModel.init
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
Could not load Llama model from path: models/Manticore-13B.ggmlv3.q4_1.bin. Received error (type=value_error)
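
For what it's worth, the magic 67676a74 decodes to the ASCII string "ggjt", i.e. the newer llama.cpp container format, and the version reported is 3. A quick way to inspect what a model file actually contains is to read its header directly; a minimal sketch in Python, using the path from the log above:

import struct

# Quick check (sketch): read the container magic and, for ggjt files,
# the version field of a GGML-family model. Path taken from the log above.
MODEL_PATH = "models/Manticore-13B.ggmlv3.q4_1.bin"

with open(MODEL_PATH, "rb") as f:
    magic_bytes = f.read(4)
    version_bytes = f.read(4)

magic = struct.unpack("<I", magic_bytes)[0]
version = struct.unpack("<I", version_bytes)[0]

# 0x67676d6c ("ggml") is the old container with no version field;
# 0x67676a74 ("ggjt") is the newer one, and version 3 is what the
# error above is complaining about.
print(f"magic:   {magic:#010x} ({magic_bytes[::-1].decode('ascii', errors='replace')})")
print(f"version: {version}")

If this prints 0x67676a74 and 3, the file itself is fine and it is the bindings that are too old to read it.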

Side note, and this may be valuable information:

I created a symlink to my Obsidian vault
I installed some C++ libraries from Microsoft (search keywords: visual c downloads and 2977003)
I reinstalled Pillow via pip
I used VS Code to run:
import nltk

# punkt (sentence tokenizer) and averaged_perceptron_tagger (POS tagger)
# data used by the document loaders
nltk.download('punkt')
nltk.download('averaged_perceptron_tagger')

Travistyse added the enhancement (New feature or request) label on May 22, 2023
bmarquismarkail commented May 22, 2023

I believe that is due to llama.cpp changing their code: ggerganov/llama.cpp#1508

Edit requirements.txt; find the line with llama-cpp-python and change it to this:

llama-cpp-python==0.1.53
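
To check which version is currently installed (and whether it predates that llama.cpp change), a quick standard-library check, e.g.:

import importlib.metadata

# Print the installed llama-cpp-python version. Releases older than the
# pinned 0.1.53 predate the format change linked above and fail with the
# "unknown (magic, version)" error when given a ggjt v3 file.
print(importlib.metadata.version("llama-cpp-python"))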

@Crimsonfart

> I believe that is due to llama.cpp changing their code: ggerganov/llama.cpp#1508
>
> find the line with llama-cpp-python and change it to this:
>
> llama-cpp-python==0.1.53

In requirements.txt?

@bmarquismarkail

Yes. Afterwards run python -m pip install -r requirements.txt to update.
Alternatively, you can manually update llama-cpp-python by executing python -m pip install --upgrade llama-cpp-python
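
After reinstalling, a minimal load test against the llama_cpp bindings directly can confirm the format is now recognized; a sketch (model path copied from the traceback above, n_ctx is just an example value):

from llama_cpp import Llama

# If the installed llama-cpp-python understands the model's ggjt v3
# container, this constructor succeeds instead of raising the
# "Could not load Llama model from path" error shown above.
llm = Llama(model_path="models/Manticore-13B.ggmlv3.q4_1.bin", n_ctx=1000)

# Tiny completion to confirm the weights actually loaded.
out = llm("Hello,", max_tokens=8)
print(out["choices"][0]["text"])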

imartinez added the primordial (Related to the primordial version of PrivateGPT, which is now frozen in favour of the new PrivateGPT) label on Oct 19, 2023