Ollama

Chat with local Ollama models using Textadept.

Requires Ollama and curl to be installed. Ollama must be running in server mode with one or more local models available.

Install this module by copying it into your ~/.textadept/modules/ directory or Textadept's modules/ directory, and then putting the following in your ~/.textadept/init.lua:

local ollama = require('ollama')

Start a chat session from the "Tools > Ollama > Chat..." menu.

Pressing Enter prompts the model with the current or selected lines. Pressing Shift+Enter adds a new line without prompting the model. Typing @ prompts you to choose an open file whose contents are inlined as context for the model prompt.

Chatting with external models

You can configure this module to talk to external models that do not use the Ollama API. Here is a sample configuration to talk to an OpenAI-compatible model (tested with LiteLLM):

local ollama = require('ollama')
ollama.url = 'https://example.com'
ollama.models_endpoint = '/models'
ollama.model_name_key = 'id'
ollama.chat_endpoint = '/chat/completions'
ollama.chat_message = function(response) return response.choices[1].message end
ollama.curl_headers = {['Content-Type'] = 'application/json'}
ollama.api_key = 'API_KEY'

ollama.MARK_PROMPT

The marker number for prompt lines.

ollama.MARK_PROMPT_COLOR

The color of prompt markers.
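
For example, the prompt marker color can be overridden from ~/.textadept/init.lua. This is a minimal sketch; it assumes MARK_PROMPT_COLOR accepts a numeric 0xBBGGRR color value like other Textadept view colors, and the value shown is just an arbitrary example:

local ollama = require('ollama')
ollama.MARK_PROMPT_COLOR = 0x4D99E6 -- assumed 0xBBGGRR format; a warm orange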

ollama.api_key

API authorization key when chatting with external models.

The default value is nil since Ollama does not need this.

ollama.chat([model])

Opens a new chat session with a model.

Parameters:

  • model: String model name to chat with. If nil, the user is prompted for one.
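
For example, a chat session can also be started from a key binding in ~/.textadept/init.lua. This is a sketch only; the key chosen and the 'llama3.2' model name are assumptions, and the named model must already be pulled locally:

local ollama = require('ollama')
-- Example binding: start a chat with a specific model (key and model name are illustrative).
keys['ctrl+alt+o'] = function() ollama.chat('llama3.2') end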

ollama.chat_endpoint

REST endpoint for chatting with a model.

The default value is '/api/chat' and should only be changed if you are not using Ollama.

ollama.chat_message(response)

Function to extract the message from the REST response for chat_endpoint.

This should only be changed if you are not using Ollama.

Parameters:

  • response: the decoded response table returned by chat_endpoint.

ollama.curl_headers

Optional map of HTTP headers to send with curl requests to an external model.

The default value is an empty map since Ollama does not need any headers.

ollama.model_name_key

The key whose value is the model name for each model in the REST response for models_endpoint.

The default value is 'name' and should only be changed if you are not using Ollama.

ollama.model_options

Map of model names with their options.

Options are tables that will be encoded into JSON before being sent to Ollama.
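
For example, options can be set per model like this. This is a minimal sketch; the model name and option values are arbitrary, though temperature and num_ctx are standard Ollama model options:

local ollama = require('ollama')
-- Assumed example: lower temperature and a larger context window for one model.
ollama.model_options['llama3.2'] = {temperature = 0.2, num_ctx = 8192}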

ollama.models_endpoint

REST endpoint for fetching a list of available models.

The default value is '/api/tags' and should only be changed if you are not using Ollama.

ollama.prompt(input)

Prompts the current chat model with input.

A model's response will be printed when it finishes thinking.

Parameters:

  • input: String input to prompt with. Any '@filename' references are replaced with their file's contents.
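
For example, the current chat model can be prompted programmatically. This is a sketch; the question text and the '@init.lua' reference are illustrative, and it assumes a chat session has already been opened with ollama.chat():

-- Assumes a chat session is already open.
ollama.prompt('Summarize what this file configures: @init.lua')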

ollama.url

The URL Ollama is running on (http://host:port).

The default value is http://localhost:11434 and should only be changed if Ollama is running on a different host or port, or if you are not using Ollama.
