This repository has been archived by the owner on May 31, 2023. It is now read-only.

Bump llama-cpp-python[server] from 0.1.50 to 0.1.54 #7

Merged
merged 1 commit into main from dependabot/pip/llama-cpp-python-server--0.1.54
May 24, 2023

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github May 24, 2023

Bumps llama-cpp-python[server] from 0.1.50 to 0.1.54.

Commits
  • e5d596e Bump version
  • c41b1eb Update llama.cpp
  • aa3d7a6 Merge pull request #263 from abetlen/dependabot/pip/mkdocs-material-9.1.14
  • 2240b94 Bump mkdocs-material from 9.1.12 to 9.1.14
  • 01c79e7 Merge pull request #258 from Pipboyguy/main
  • c3e80b1 Merge pull request #262 from abetlen/dependabot/pip/httpx-0.24.1
  • 8e41d72 Bump httpx from 0.24.0 to 0.24.1
  • e6639e6 Change docker build dynamic param to image instead of cuda version
  • 4f7a6da Merge pull request #248 from localagi/main
  • 0adb9ec Use model_name and index in response
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [llama-cpp-python[server]](https://github.com/abetlen/llama-cpp-python) from 0.1.50 to 0.1.54.
- [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
- [Commits](abetlen/llama-cpp-python@v0.1.50...v0.1.54)

---
updated-dependencies:
- dependency-name: llama-cpp-python[server]
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
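The `update-type: version-update:semver-patch` trailer above comes from comparing the semver components of the old and new versions. As a rough illustration (this is a hypothetical helper, not Dependabot's actual implementation), the classification can be sketched as:

```python
def classify_update(old: str, new: str) -> str:
    """Classify a version bump as semver major/minor/patch,
    mirroring the update-type trailer Dependabot emits."""
    # Parse dotted versions into integer tuples, e.g. "0.1.50" -> (0, 1, 50)
    o, n = (tuple(map(int, v.split("."))) for v in (old, new))
    if n[0] != o[0]:
        return "version-update:semver-major"
    if n[1] != o[1]:
        return "version-update:semver-minor"
    return "version-update:semver-patch"

print(classify_update("0.1.50", "0.1.54"))  # version-update:semver-patch
```

For this PR, only the patch component changes (50 to 54), hence `semver-patch`.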
@dependabot dependabot bot added the "dependencies" label (Pull requests that update a dependency file) May 24, 2023
@ChaoticByte
Owner

llama.cpp changed its model file format again, so existing models have to be requantized. See ggerganov/llama.cpp#1508
-> new models are in the ggml v3 format
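Whether a model file predates that format change can be read off its header: llama.cpp model files of that era start with a little-endian uint32 magic, and the versioned `ggjt` magic is followed by a uint32 version number. A minimal sketch (the magic constants are copied from llama.cpp sources of that period and should be checked against the version you actually run; the headers below are synthetic, not real model files):

```python
import struct

# llama.cpp file magics of that era (hedged: verify against your llama.cpp checkout).
MAGIC_GGML = 0x67676D6C  # 'ggml' - oldest, unversioned format
MAGIC_GGMF = 0x67676D66  # 'ggmf' - versioned
MAGIC_GGJT = 0x67676A74  # 'ggjt' - versioned, mmap-able

def check_header(blob: bytes):
    """Return (format_name, version) for the start of a model file."""
    magic, = struct.unpack_from("<I", blob, 0)
    if magic == MAGIC_GGML:
        return "ggml", None  # unversioned format has no version field
    version, = struct.unpack_from("<I", blob, 4)
    name = {MAGIC_GGMF: "ggmf", MAGIC_GGJT: "ggjt"}.get(magic, "unknown")
    return name, version

# Synthetic 8-byte headers for demonstration only:
old = struct.pack("<II", MAGIC_GGJT, 2)  # a pre-#1508 quantized model
new = struct.pack("<II", MAGIC_GGJT, 3)  # a model requantized to ggml v3

print(check_header(old))  # ('ggjt', 2)
print(check_header(new))  # ('ggjt', 3)
```

A model reporting a `ggjt` version below 3 would need to be requantized before the llama.cpp bundled with llama-cpp-python 0.1.54 can load it.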

@ChaoticByte ChaoticByte merged commit 060d522 into main May 24, 2023
@dependabot dependabot bot deleted the dependabot/pip/llama-cpp-python-server--0.1.54 branch May 24, 2023 19:06