Actions: Pints-AI/llama.cpp

CI

6 workflow run results

(wip) add Flash decoding
CI #6: Pull request #5 synchronize by FSSRepo
Branch: flash-decoding · March 7, 2024 22:00 · 24m 43s

(wip) add Flash decoding
CI #5: Pull request #5 synchronize by FSSRepo
Branch: flash-decoding · March 7, 2024 20:02 · 20m 27s

(wip) add Flash decoding
CI #4: Pull request #5 synchronize by FSSRepo
Branch: flash-decoding · March 7, 2024 17:23 · 20m 31s

(wip) add Flash decoding
CI #3: Pull request #5 opened by FSSRepo
Branch: flash-decoding · March 6, 2024 05:29 · 25m 30s

compare-llama-bench.py : remove mul_mat_q (#5892)
CI #2: Commit 652ca2b pushed by FSSRepo
Branch: master · March 6, 2024 03:27 · 22m 21s

ggml : fix IQ3_S AVX implementation (#5834)
CI #1: Commit 494c870 pushed by FSSRepo
Branch: master · March 2, 2024 19:14 · 22m 17s