This repository has been archived by the owner on Sep 12, 2024. It is now read-only.

Ggml v3 support in Llama.cpp #84

Closed
synw opened this issue May 28, 2023 · 4 comments
Labels
sync deps: Regular update and synchronisation of upstream dependencies

Comments

synw commented May 28, 2023

Hi, thanks for this nice package. llama.cpp recently made a breaking change to its quantization formats (PR ref).

Would it be possible to update the llama-node package so it can load ggml v3 models? All the new ggml models being released now use this format.
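
(For reference, once v3 support lands, loading one of these models through the package's llama.cpp binding would look roughly like the sketch below; it assumes the LLama.load signature quoted later in this thread, and the modelPath option and file name are illustrative guesses, not confirmed API.)

import { LLama } from "@llama-node/llama-cpp";

async function main() {
  // Load a ggml v3 quantized model through the native llama.cpp binding;
  // the second argument toggles the backend logger.
  const llama = await LLama.load(
    { modelPath: "./models/example.ggmlv3.q4_0.bin" }, // assumed option name, hypothetical path
    true
  );
  console.log("model loaded:", llama);
}

main();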

hlhr202 (Member) commented May 29, 2023

Will update in the next few days.

hlhr202 added the sync deps label May 29, 2023

hlhr202 (Member) commented May 29, 2023

Resolved in v0.1.6.

hlhr202 closed this as completed May 29, 2023

synw (Author) commented May 29, 2023

I've just installed 0.1.6 to test it, but it refuses to compile the TypeScript code with this error:

$ tsc -p .
node_modules/@llama-node/llama-cpp/index.d.ts:137:31 - error TS2304: Cannot find name 'LoadModel'.

137   static load(params: Partial<LoadModel>, enableLogger: boolean): Promise<LLama>

synw (Author) commented May 29, 2023

If I change line 137 of llama-cpp/index.d.ts from static load(params: Partial<LoadModel> to static load(params: Partial<ModelLoad>, it works. And I can confirm that the compiled code runs ggml v3 models: nice 👍
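
(For anyone applying the same workaround by hand, the edit amounts to this one-line change in the generated type declarations; ModelLoad is the type the file actually declares, per the comment above.)

// node_modules/@llama-node/llama-cpp/index.d.ts, line 137
// before: LoadModel is not declared anywhere in the file, so tsc fails with TS2304
static load(params: Partial<LoadModel>, enableLogger: boolean): Promise<LLama>
// after: reference the declared ModelLoad type instead
static load(params: Partial<ModelLoad>, enableLogger: boolean): Promise<LLama>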
