Actions: Pints-AI/llama.cpp

EditorConfig Checker

7 workflow run results

(wip) add Flash decoding
EditorConfig Checker #7: Pull request #5 synchronize by FSSRepo
March 7, 2024 22:00 25s flash-decoding
(wip) add Flash decoding
EditorConfig Checker #6: Pull request #5 synchronize by FSSRepo
March 7, 2024 20:02 22s flash-decoding
(wip) add Flash decoding
EditorConfig Checker #5: Pull request #5 synchronize by FSSRepo
March 7, 2024 17:23 22s flash-decoding
(wip) add Flash decoding
EditorConfig Checker #4: Pull request #5 opened by FSSRepo
March 6, 2024 05:29 21s flash-decoding
Merge branch 'master' of https://github.com/Pints-App/llama.cpp
EditorConfig Checker #3: Commit c42f7cb pushed by FSSRepo
March 6, 2024 03:38 17s master
compare-llama-bench.py : remove mul_mat_q (#5892)
EditorConfig Checker #2: Commit 652ca2b pushed by FSSRepo
March 6, 2024 03:27 23s master
ggml : fix IQ3_S AVX implementation (#5834)
EditorConfig Checker #1: Commit 494c870 pushed by FSSRepo
March 2, 2024 19:14 22s master