Actions: Pints-AI/llama.cpp

flake8 Lint

12 workflow run results

(wip) add Flash decoding
flake8 Lint #12: Pull request #5 synchronize by FSSRepo
March 7, 2024 22:00 23s flash-decoding
optional flash-decoding
flake8 Lint #11: Commit eecf7ee pushed by FSSRepo
March 7, 2024 22:00 20s flash-decoding
March 7, 2024 22:00 20s
(wip) add Flash decoding
flake8 Lint #10: Pull request #5 synchronize by FSSRepo
March 7, 2024 20:02 21s flash-decoding
enable flash decoding
flake8 Lint #9: Commit f490812 pushed by FSSRepo
March 7, 2024 20:02 23s flash-decoding
(wip) add Flash decoding
flake8 Lint #8: Pull request #5 synchronize by FSSRepo
March 7, 2024 17:23 26s flash-decoding
fix NaNs when context reset
flake8 Lint #7: Commit 936cea0 pushed by FSSRepo
March 7, 2024 17:22 22s flash-decoding
(wip) add Flash decoding
flake8 Lint #6: Pull request #5 opened by FSSRepo
March 6, 2024 05:29 21s flash-decoding
cleanup FA implementation + flash decoding kernel (wip)
flake8 Lint #5: Commit 888a724 pushed by FSSRepo
March 6, 2024 05:23 21s flash-decoding
Merge branch 'master' of https://github.com/Pints-App/llama.cpp
flake8 Lint #4: Commit c42f7cb pushed by FSSRepo
March 6, 2024 03:38 20s master
compare-llama-bench.py : remove mul_mat_q (#5892)
flake8 Lint #2: Commit 652ca2b pushed by FSSRepo
March 6, 2024 03:27 27s master
ggml : fix IQ3_S AVX implementation (#5834)
flake8 Lint #1: Commit 494c870 pushed by FSSRepo
March 2, 2024 19:14 24s master