Actions: Pints-AI/llama.cpp

flake8 Lint

24 workflow run results


Run · Commit message · Event · Branch · Date · Duration

#24 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 23, 2024 17:46 · 20s
#23 · flash original · Commit 7c28e0f pushed by FSSRepo · flash-decoding · March 23, 2024 17:46 · 20s
#22 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 14, 2024 17:59 · 22s
#21 · flash decoding load tile data in sram · Commit 9d3b57e pushed by FSSRepo · flash-decoding · March 14, 2024 17:59 · 22s
#20 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 9, 2024 01:21 · 22s
#19 · update test · Commit 653b257 pushed by FSSRepo · flash-decoding · March 9, 2024 01:21 · 21s
#18 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 9, 2024 00:22 · 25s
#17 · fix mixtral models - flash decoding · Commit 7b979d1 pushed by FSSRepo · flash-decoding · March 9, 2024 00:22 · 20s
#16 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 8, 2024 17:29 · 27s
#15 · add more debug mods · Commit 82374d0 pushed by FSSRepo · flash-decoding · March 8, 2024 17:29 · 20s
#14 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 8, 2024 02:54 · 24s
#13 · fix bug + debug prints · Commit be9ecd6 pushed by FSSRepo · flash-decoding · March 8, 2024 02:54 · 20s
#12 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 7, 2024 22:00 · 23s
#11 · optional flash-decoding · Commit eecf7ee pushed by FSSRepo · flash-decoding · March 7, 2024 22:00 · 20s
#10 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 7, 2024 20:02 · 21s
#9 · enable flash decoding · Commit f490812 pushed by FSSRepo · flash-decoding · March 7, 2024 20:02 · 23s
#8 · (wip) add Flash decoding · Pull request #5 synchronize by FSSRepo · flash-decoding · March 7, 2024 17:23 · 26s
#7 · fix NaNs when context reset · Commit 936cea0 pushed by FSSRepo · flash-decoding · March 7, 2024 17:22 · 22s
#6 · (wip) add Flash decoding · Pull request #5 opened by FSSRepo · flash-decoding · March 6, 2024 05:29 · 21s
#5 · cleanup FA implementation + flash decoding kernel (wip) · Commit 888a724 pushed by FSSRepo · flash-decoding · March 6, 2024 05:23 · 21s
#4 · Merge branch 'master' of https://github.com/Pints-App/llama.cpp · Commit c42f7cb pushed by FSSRepo · master · March 6, 2024 03:38 · 20s
#2 · compare-llama-bench.py : remove mul_mat_q (#5892) · Commit 652ca2b pushed by FSSRepo · master · March 6, 2024 03:27 · 27s
#1 · ggml : fix IQ3_S AVX implementation (#5834) · Commit 494c870 pushed by FSSRepo · master · March 2, 2024 19:14 · 24s