
std::runtime_error: unexpectedly reached end of file #3

Closed
hindfelt opened this issue May 24, 2023 · 6 comments

Comments

@hindfelt

Hi,

I get an error saying that there's an unexpected end to the ggml q4_0.bin file.

(.venv) ➜ GPT4all-mh python my_knowledge_qna.py

llama.cpp: loading model from ./models/ggml-model-q4_0.bin
libc++abi: terminating due to uncaught exception of type std::runtime_error: unexpectedly reached end of file
[1] 99899 abort python my_knowledge_qna.py
(.venv) ➜ GPT4all-mh

I used the link in your Medium post. After some reading up, could this have anything to do with the breaking update in ggerganov/llama.cpp#1508?

@fabiomatricardi
Owner

Ciao,
what is your Python version?
Can you share the output of `pip freeze > requirements.txt`?

@afathonih

The following version will work:

llama-cpp-python==0.1.49
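The constraint here is an exact pin: the thread reports that 0.1.54 aborts with "unexpectedly reached end of file" on this ggml file, while 0.1.49 works. A small, hypothetical sanity check (not part of the project's code) could compare an installed version string against the 0.1.49 pin before loading the model:

```python
def parse_version(version: str) -> tuple:
    """Split a dotted version string like '0.1.49' into comparable integers."""
    return tuple(int(part) for part in version.split("."))

def is_compatible(installed: str, pin: str = "0.1.49") -> bool:
    """Return True if the installed version is at or below the 0.1.49 pin.

    Versions above the pin expect the newer ggml file format introduced by
    ggerganov/llama.cpp#1508 and fail on older ggml-model-q4_0.bin files.
    """
    return parse_version(installed) <= parse_version(pin)

print(is_compatible("0.1.49"))  # True  - the pinned, working version
print(is_compatible("0.1.54"))  # False - the version from the pip freeze below
```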

@fabiomatricardi
Owner

As pointed out in "Update and bug fixes - 2023.05.23":

Cannot install llama-cpp-python

This usually happens only to Windows users. When running the installation of llama-cpp-python, required by
LangChain with the llamaEmbeddings, on Windows the CMake C compiler is not installed by default, so you cannot build from source.
On Mac (with Xtools) and on Linux, the C compiler is usually already available on the OS.
To avoid the issue you MUST use a pre-compiled wheel.
Go to https://github.com/abetlen/llama-cpp-python/releases
and look for the compiled wheel for your architecture and Python version - you MUST take wheel version 0.1.49,
because higher versions are not compatible.

In my case I have Windows 10, 64 bit, Python 3.10,
so my file is llama_cpp_python-0.1.49-cp310-cp310-win_amd64.whl
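Assuming that wheel has been downloaded from the releases page into the current directory, the install commands would look like the following (the wheel filename is the Windows 10 / Python 3.10 example above; adjust it for your platform):

```shell
# Windows: install the pre-compiled wheel directly (no C compiler needed).
pip install llama_cpp_python-0.1.49-cp310-cp310-win_amd64.whl

# Mac/Linux (where a C compiler is available to build from source):
# pin the exact version instead of letting pip pick a newer, incompatible one.
pip install llama-cpp-python==0.1.49
```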

@hindfelt
Author

Hi Fabio! Sorry for the slow response. Here is my `pip freeze > requirements.txt`:

aiohttp==3.8.4
aiosignal==1.3.1
altair==4.2.2
anyio==3.6.2
argilla==1.7.0
async-timeout==4.0.2
attrs==23.1.0
backoff==2.2.1
blinker==1.6.2
cachetools==5.3.0
certifi==2023.5.7
cffi==1.15.1
charset-normalizer==3.1.0
click==8.1.3
commonmark==0.9.1
cryptography==40.0.2
dataclasses-json==0.5.7
decorator==5.1.1
Deprecated==1.2.13
entrypoints==0.4
et-xmlfile==1.1.0
faiss-cpu==1.7.4
filelock==3.12.0
frozenlist==1.3.3
gitdb==4.0.10
GitPython==3.1.31
h11==0.14.0
httpcore==0.16.3
httpx==0.23.3
idna==3.4
importlib-metadata==6.6.0
Jinja2==3.1.2
joblib==1.2.0
jsonschema==4.17.3
langchain==0.0.149
llama-cpp-python==0.1.54
lxml==4.9.2
Markdown==3.4.3
markdown-it-py==2.2.0
MarkupSafe==2.1.2
marshmallow==3.19.0
marshmallow-enum==1.5.1
mdurl==0.1.2
monotonic==1.6
mpmath==1.3.0
msg-parser==1.2.0
multidict==6.0.4
mypy-extensions==1.0.0
networkx==3.1
nltk==3.8.1
numexpr==2.8.4
numpy==1.23.5
olefile==0.46
openapi-schema-pydantic==1.2.4
openpyxl==3.1.2
packaging==23.1
pandas==1.5.3
pdf2image==1.16.3
pdfminer.six==20221105
Pillow==9.5.0
protobuf==3.20.3
pyarrow==12.0.0
pycparser==2.21
pydantic==1.10.8
pydeck==0.8.1b0
Pygments==2.15.1
pygpt4all==1.0.1
pygptj==2.0.3
pyllamacpp==1.0.6
Pympler==1.0.1
pypandoc==1.11
pypdf==3.8.1
pyrsistent==0.19.3
pytesseract==0.3.1
python-dateutil==2.8.2
python-docx==0.8.11
python-magic==0.4.27
python-pptx==0.6.21
pytz==2023.3
PyYAML==6.0
regex==2023.5.5
requests==2.31.0
rfc3986==1.5.0
rich==13.0.1
sentencepiece==0.1.99
six==1.16.0
smmap==5.0.0
sniffio==1.3.0
SQLAlchemy==2.0.15
streamlit==1.22.0
streamlit-ace==0.1.1
sympy==1.12
tenacity==8.2.2
toml==0.10.2
toolz==0.12.0
torch==2.0.1
tornado==6.3.2
tqdm==4.65.0
typer==0.9.0
typing-inspect==0.8.0
typing_extensions==4.6.1
tzdata==2023.3
tzlocal==5.0.1
unstructured==0.6.5
urllib3==2.0.2
validators==0.20.0
wrapt==1.14.1
XlsxWriter==3.1.1
yarl==1.9.2
zipp==3.15.0

@hindfelt
Author

That looks like it works! Thx @fabiomatricardi 👍🙏

@fabiomatricardi
Owner

Ciao, please refer to the comment above: this usually happens only to Windows users, because on Windows the CMake C compiler is not installed by default and llama-cpp-python (required by LangChain with the llamaEmbeddings) cannot be built from source. To avoid the issue you MUST use the pre-compiled wheel from https://github.com/abetlen/llama-cpp-python/releases for your architecture and Python version, and you MUST take wheel version 0.1.49, because higher versions are not compatible.

In my case I have Windows 10, 64 bit, Python 3.10,
so my file is llama_cpp_python-0.1.49-cp310-cp310-win_amd64.whl
