
Actions: Pints-AI/llama.cpp

Nix CI

7 workflow run results

Nix CI #7: (wip) add Flash decoding
  Pull request #5 synchronize by FSSRepo
  flash-decoding | March 7, 2024 22:00 | 13m 27s

Nix CI #6: (wip) add Flash decoding
  Pull request #5 synchronize by FSSRepo
  flash-decoding | March 7, 2024 20:02 | 13m 18s

Nix CI #5: (wip) add Flash decoding
  Pull request #5 synchronize by FSSRepo
  flash-decoding | March 7, 2024 17:23 | 13m 6s

Nix CI #4: (wip) add Flash decoding
  Pull request #5 opened by FSSRepo
  flash-decoding | March 6, 2024 05:29 | 13m 34s

Nix CI #3: Merge branch 'master' of https://github.com/Pints-App/llama.cpp
  Commit c42f7cb pushed by FSSRepo
  master | March 6, 2024 03:38 | 4m 23s

Nix CI #2: compare-llama-bench.py : remove mul_mat_q (#5892)
  Commit 652ca2b pushed by FSSRepo
  master | March 6, 2024 03:27 | 4m 52s

Nix CI #1: ggml : fix IQ3_S AVX implementation (#5834)
  Commit 494c870 pushed by FSSRepo
  master | March 2, 2024 19:14 | 7m 53s