Actions: Pints-AI/llama.cpp

Code Coverage

12 workflow run results

(wip) add Flash decoding
Code Coverage #12: Pull request #5 synchronize by FSSRepo
March 7, 2024 22:00 · 2m 9s · flash-decoding

optional flash-decoding
Code Coverage #11: Commit eecf7ee pushed by FSSRepo
March 7, 2024 22:00 · 2m 7s · flash-decoding

(wip) add Flash decoding
Code Coverage #10: Pull request #5 synchronize by FSSRepo
March 7, 2024 20:02 · 2m 5s · flash-decoding

enable flash decoding
Code Coverage #9: Commit f490812 pushed by FSSRepo
March 7, 2024 20:02 · 2m 4s · flash-decoding

(wip) add Flash decoding
Code Coverage #8: Pull request #5 synchronize by FSSRepo
March 7, 2024 17:23 · 2m 12s · flash-decoding

fix NaNs when context reset
Code Coverage #7: Commit 936cea0 pushed by FSSRepo
March 7, 2024 17:22 · 2m 4s · flash-decoding

(wip) add Flash decoding
Code Coverage #6: Pull request #5 opened by FSSRepo
March 6, 2024 05:29 · 2m 21s · flash-decoding

cleanup FA implementation + flash decoding kernel (wip)
Code Coverage #5: Commit 888a724 pushed by FSSRepo
March 6, 2024 05:23 · 2m 6s · flash-decoding

Merge branch 'master' of https://github.com/Pints-App/llama.cpp
Code Coverage #4: Commit c42f7cb pushed by FSSRepo
March 6, 2024 03:38 · 2m 6s · master

compare-llama-bench.py : remove mul_mat_q (#5892)
Code Coverage #2: Commit 652ca2b pushed by FSSRepo
March 6, 2024 03:27 · 2m 12s · master

ggml : fix IQ3_S AVX implementation (#5834)
Code Coverage #1: Commit 494c870 pushed by FSSRepo
March 2, 2024 19:14 · 2m 7s · master