Chat

A simple natural language to shell command translator using Ollama with the codestral LLM.

Currently it supports only zsh and macOS.

Demo

Demo of chat in action

Prerequisites

Ollama must be installed and running, and the codestral model must be available.

Run Ollama in the background and pull the codestral LLM before using the chatbot:

ollama serve &

ollama pull codestral
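To confirm the model is actually available before starting the chatbot, you can inspect the output of `ollama list`. A minimal sketch (the `check_model` helper is hypothetical, not part of this project):

```shell
# Sketch: verify the prerequisites before launching the chatbot.
# `ollama list` prints the locally installed models; check_model is a
# hypothetical helper, not part of this project.
check_model() {
  # succeed if a model whose name starts with $1 is installed
  ollama list 2>/dev/null | grep -q "^$1"
}

if check_model codestral; then
  echo "codestral is available"
else
  echo "codestral missing; run: ollama pull codestral"
fi
```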

Build and Run

Build the project by running:

cargo build --release

Run the chat from cargo:

cargo run

Install the chatbot by placing the binary from target/release/chat on your PATH.
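One way to sketch that install step, assuming $HOME/.local/bin is a directory on your PATH (the `install_chat` helper is hypothetical, not part of the project; adjust the destination to your setup):

```shell
# Sketch: copy the release binary to a directory on your PATH.
# install_chat is a hypothetical helper, not part of this project.
install_chat() {
  bin=target/release/chat
  dest=${1:-"$HOME/.local/bin"}    # assumption: ~/.local/bin is on PATH
  if [ -x "$bin" ]; then
    mkdir -p "$dest"
    install -m 755 "$bin" "$dest/chat" && echo "installed to $dest/chat"
  else
    echo "build first: cargo build --release"
    return 1
  fi
}

# install_chat                 # installs to ~/.local/bin
# install_chat /usr/local/bin  # or pick another directory on your PATH
```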

Usage

For now, chat is a simple terminal chatbot. Start it by running:

chat

Future Work

  • Add a non-interactive mode that executes the generated command directly
  • Add chat history
  • Add a feature which explains the generated command
  • Make it configurable to use different LLMs
  • Make it configurable to take additional tools installed on the machine into account
  • Add support for more shells
  • Add support for other OSs

Known Issues

These are known issues that will be fixed in the future:

  • Sometimes the model returns not only the command but also an explanation.
  • For very complex workflows, the generated command may be incorrect.
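For the first issue, a simple workaround while it remains unfixed is to keep only the first non-empty line of the model's reply, assuming the command precedes any explanation. A sketch (`first_line` is a hypothetical helper, not part of chat):

```shell
# Workaround sketch for the first known issue: keep only the first
# non-empty line of the model's reply, assuming the command comes
# before any explanation. first_line is a hypothetical helper.
first_line() {
  printf '%s\n' "$1" | sed -n '/./{p;q;}'
}

first_line "ls -la
This lists all files, including hidden ones."   # prints: ls -la
```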
