# Run llama.cpp with tea – without the installation pain! #487
Replies: 2 comments 1 reply
-
Where do I run this? In cmd? Please...
-
Hi. Found you. Got this:

```
04/19 23:33:07 [NOTICE] Downloading 1 item(s)
04/19 23:33:07 [ERROR] Exception caught while loading DHT routing table from /home/bla/.cache/aria2/dht.dat
04/19 23:33:07 [NOTICE] IPv4 DHT: listening on UDP port 6997
04/19 23:33:07 [NOTICE] IPv4 BitTorrent: listening on TCP port 6957
04/19 23:33:07 [NOTICE] IPv6 BitTorrent: listening on TCP port 6957
```

I don't know if it actually does something, though.

Edit: CTRL+C, running the command again... and the error message vanished. Any info would be greatly appreciated.
-
First of all, on behalf of open-source developers and users, thank you so much for porting LLaMA to C++ ❤️
### Running open-source made easy
At tea[^1], we love open source, so we packaged up `llama.cpp` and download the 7B model for you via torrents. All you need to get started is:

```sh
sh <(curl https://tea.xyz) llama.cpp -p "Getting paid to write open source can be accomplished in 3 simple steps:"
```
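If you want to try your own prompt, swap out the string passed to llama.cpp's `-p` (prompt) flag. A minimal sketch, assuming the one-liner above simply forwards its arguments to `llama.cpp` (the prompt text below is just an illustration):

```sh
# Same tea one-liner as above; only the prompt string passed to
# llama.cpp's -p (prompt) flag has changed.
sh <(curl https://tea.xyz) llama.cpp -p "The easiest way to run a language model on your laptop is"
```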
### How it works
By running the package command `llama.cpp -p "..."`, tea automagically downloads all required dependencies and runs `llama.cpp` for you. All dependencies get downloaded into an isolated `~/.tea` directory, which doesn't mess with your system installations. If you are curious, you can take a look at the package file for llama.cpp here.
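Because everything tea fetches lives under that single prefix, you can inspect or remove it without touching system paths. A minimal sketch, assuming only that `~/.tea` holds the downloaded packages (the exact directory layout inside it is not described here):

```sh
# These commands only touch the isolated ~/.tea prefix; nothing in
# /usr/local or other system locations is affected.
ls ~/.tea        # see which packages and dependencies were downloaded
du -sh ~/.tea    # check how much disk the isolated prefix is using
rm -rf ~/.tea    # optional: wipe everything tea downloaded for a clean slate
```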
[^1]: tea is the next-generation cross-platform package manager from the creator of Homebrew.