
Bump llama-cpp-python from 0.2.57 to 0.2.64 #91

Closed
dependabot[bot] wants to merge 1 commit from the dependabot/pip/llama-cpp-python-0.2.64 branch

Conversation

dependabot[bot] (Contributor) commented on behalf of GitHub on Apr 23, 2024

Bumps llama-cpp-python from 0.2.57 to 0.2.64.
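
Since this is a pip-managed dependency (the branch prefix is dependabot/pip/), the bump amounts to moving the pinned version in the project's Python dependency manifest from 0.2.57 to 0.2.64; the exact file (requirements.txt, pyproject.toml, etc.) is not shown here. As a minimal sketch, the upgrade can be confirmed after installing by checking the package's version string:

    import llama_cpp

    # The installed package exposes its version string; after this bump it
    # should report 0.2.64 rather than 0.2.57.
    print(llama_cpp.__version__)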

Changelog

Sourced from llama-cpp-python's changelog.

[0.2.64]

  • feat: Update llama.cpp to ggml-org/llama.cpp@4e96a81
  • feat: Add llama-3 chat format by @andreabak in #1371 (see the usage sketch after this changelog excerpt)
  • feat: Use new llama_token_is_eog in create_completions by @abetlen in d40a250ef3cfaa8224d12c83776a2f1de96ae3d1
  • feat(server): Provide ability to dynamically allocate all threads if desired using -1 by @sean-bailey in #1364
  • ci: Build arm64 wheels by @gaby in 611781f5319719a3d05fefccbbf0cc321742a026
  • fix: Update scikit-build-core build dependency to avoid bug in 0.9.1 by @evelkey in #1370

[0.2.63]

  • feat: Update llama.cpp to ggml-org/llama.cpp@0e4802b
  • feat: Add stopping_criteria to ChatFormatter, allow stopping on arbitrary token ids, fixes llama3 instruct by @abetlen in cc81afebf04d26ca1ac3cf72f23f18da6ab58588

[0.2.62]

[0.2.61]

  • feat: Update llama.cpp to ggml-org/llama.cpp@ba5e134
  • fix: pass correct type to chat handlers for chat completion logprobs by @abetlen in bb65b4d76411112c6fb0bf759efd746f99ef3c6b
  • feat: Add support for yaml based server configs by @abetlen in 060bfa64d529ade2af9b1f4e207a3937bbc4138f
  • feat: Add typechecking for ctypes structure attributes by @abetlen in 1347e1d050fc5a9a32ffe0bb3e22858da28003bd

[0.2.60]

  • feat: Update llama.cpp to ggml-org/llama.cpp@75cd4c7
  • fix: Always embed metal library by @abetlen in b3bfea6dbfb6ed9ce18f9a2723e0a9e4bd1da7ad
  • fix: missing logprobs in response, incorrect response type for functionary by @abetlen in 1ae3abbcc3af7f4a25a3ffc40b246f18039565e8
  • fix(docs): incorrect tool_choice example by @CISC in #1330

[0.2.59]

[0.2.58]

... (truncated)
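
Of the entries above, the llama-3 chat format added in 0.2.64 (#1371) is the most visible user-facing change. A minimal sketch of how it might be exercised after this bump; the model path is a placeholder and the keyword values are illustrative, not taken from this repository:

    from llama_cpp import Llama

    # Load a Llama 3 instruct GGUF model using the chat format registered
    # in 0.2.64. The path below is a placeholder.
    llm = Llama(
        model_path="models/Meta-Llama-3-8B-Instruct.Q4_K_M.gguf",
        chat_format="llama-3",
        n_ctx=8192,
    )

    response = llm.create_chat_completion(
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize this pull request in one sentence."},
        ],
        max_tokens=128,
    )
    print(response["choices"][0]["message"]["content"])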

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
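
The ignore commands above act per pull request. The same effect can also be kept permanently in the repository's .github/dependabot.yml; the sketch below is an illustration of ignoring future patch-level bumps of llama-cpp-python, where the directory and schedule values are assumptions, not taken from this repository:

    version: 2
    updates:
      - package-ecosystem: "pip"
        directory: "/"
        schedule:
          interval: "weekly"
        ignore:
          - dependency-name: "llama-cpp-python"
            # Skip semver-patch bumps such as this one; major/minor PRs still open.
            update-types: ["version-update:semver-patch"]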

Bumps [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) from 0.2.57 to 0.2.64.
- [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
- [Changelog](https://github.com/abetlen/llama-cpp-python/blob/main/CHANGELOG.md)
- [Commits](abetlen/llama-cpp-python@v0.2.57...v0.2.64)

---
updated-dependencies:
- dependency-name: llama-cpp-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Apr 23, 2024
dependabot[bot] (Contributor, Author) commented on behalf of GitHub on Apr 26, 2024

Superseded by #93.

dependabot[bot] closed this on Apr 26, 2024
dependabot[bot] deleted the dependabot/pip/llama-cpp-python-0.2.64 branch on April 26, 2024 at 17:03