Add Chatting within your Editor #45

Merged · 9 commits · Aug 6, 2024
1 change: 1 addition & 0 deletions Cargo.lock

Some generated files are not rendered by default.

52 changes: 36 additions & 16 deletions README.md
@@ -1,16 +1,24 @@
<picture>
<source media="(prefers-color-scheme: dark)" srcset="/logos/logo-white-no-background-1024x1024.png">
<source media="(prefers-color-scheme: light)" srcset="/logos/logo-white-black-background-1024x1024.png">
<img alt="Logo" src="/logos/logo-white-black-background-1024x1024.png" width="128em">
</picture>
<div align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://github.com/user-attachments/assets/7849b743-a3d5-4fde-8ac7-960205c1b019">
<source media="(prefers-color-scheme: light)" srcset="https://github.com/user-attachments/assets/7903b3c2-a5ac-47e0-ae23-bd6a47b864ee">
<img alt="Logo" src="" width="650em">
</picture>
</div>

# LSP-AI
<p align="center">
<p align="center"><b>Empowering not replacing programmers.</b></p>
</p>

[![Discord](https://img.shields.io/badge/Discord-%235865F2.svg?style=for-the-badge&logo=discord&logoColor=white)](https://discord.gg/vKxfuAxA6Z)
<p align="center">
| <a href="https://github.com/SilasMarvin/lsp-ai/wiki"><b>Documentation</b></a> | <a href="https://silasmarvin.dev"><b>Blog</b></a> | <a href="https://discord.gg/vKxfuAxA6Z"><b>Discord</b></a> |
</p>

LSP-AI is an open source [language server](https://microsoft.github.io/language-server-protocol/) that serves as a backend for performing completion with large language models and soon other AI powered functionality. Because it is a language server, it works with any editor that has LSP support.
---

**The goal of LSP-AI is to assist and empower software engineers by integrating with the tools they already know and love not replace software engineers.**
LSP-AI is an open source [language server](https://microsoft.github.io/language-server-protocol/) that serves as a backend for AI-powered functionality in your favorite code editors. It offers features like in-editor chatting with LLMs and code completions. Because it is a language server, it works with any editor that has LSP support.

**The goal of LSP-AI is to assist and empower software engineers by integrating with the tools they already know and love, not replace software engineers.**

A short list of a few of the editors it works with:
- VS Code
@@ -21,13 +29,15 @@ A short list of a few of the editors it works with:

It works with many many many more editors.

See the wiki for instructions on:
- [Getting Started](https://github.com/SilasMarvin/lsp-ai/wiki)
- [Installation](https://github.com/SilasMarvin/lsp-ai/wiki/Installation)
- [Configuration](https://github.com/SilasMarvin/lsp-ai/wiki/Configuration)
- [Plugins](https://github.com/SilasMarvin/lsp-ai/wiki/Plugins)
- [Server Capabilities](https://github.com/SilasMarvin/lsp-ai/wiki/Server-Capabilities-and-Functions)
- [and more](https://github.com/SilasMarvin/lsp-ai/wiki)
# Features

## In-Editor Chatting

Chat directly in your codebase with your favorite local or hosted models.

*Chatting with Claude Sonnet in Helix*

## Code Completions

LSP-AI can work as an alternative to Github Copilot.

@@ -37,6 +47,16 @@ https://github.com/SilasMarvin/lsp-ai/assets/19626586/59430558-da23-4991-939d-57

**Note that speed for completions is entirely dependent on the backend being used. For the fastest completions we recommend using either a small local model or Groq.**

# Documentation

See the wiki for instructions on:
- [Getting Started](https://github.com/SilasMarvin/lsp-ai/wiki)
- [Installation](https://github.com/SilasMarvin/lsp-ai/wiki/Installation)
- [Configuration](https://github.com/SilasMarvin/lsp-ai/wiki/Configuration)
- [Plugins](https://github.com/SilasMarvin/lsp-ai/wiki/Plugins)
- [Server Capabilities](https://github.com/SilasMarvin/lsp-ai/wiki/Server-Capabilities-and-Functions)
- [and more](https://github.com/SilasMarvin/lsp-ai/wiki)

# The Case for LSP-AI

**tl;dr LSP-AI abstracts complex implementation details from editor specific plugin authors, centralizing open-source development work into one shareable backend.**
1 change: 1 addition & 0 deletions crates/lsp-ai/Cargo.toml
@@ -41,6 +41,7 @@ rayon = { version = "1.1.0", optional = true }
md5 = "0.7.0"
fxhash = "0.2.1"
ordered-float = "4.2.1"
futures = "0.3"

[build-dependencies]
cc="*"
20 changes: 20 additions & 0 deletions crates/lsp-ai/src/config.rs
@@ -340,12 +340,26 @@ pub(crate) struct Completion {
pub(crate) post_process: PostProcess,
}

#[derive(Clone, Debug, Deserialize)]
pub struct Chat {
// The trigger text
pub(crate) trigger: String,
// The name to display in the editor
pub(crate) action_display_name: String,
// The model key to use
pub(crate) model: String,
// Args are deserialized by the backend using them
#[serde(default)]
pub(crate) parameters: Kwargs,
}

#[derive(Clone, Debug, Deserialize)]
#[serde(deny_unknown_fields)]
pub(crate) struct ValidConfig {
pub(crate) memory: ValidMemoryBackend,
pub(crate) models: HashMap<String, ValidModel>,
pub(crate) completion: Option<Completion>,
pub(crate) chat: Option<Vec<Chat>>,
}

#[derive(Clone, Debug, Deserialize, Default)]
@@ -382,6 +396,10 @@ impl Config {
// Helpers for the backends ///////////
///////////////////////////////////////

pub fn get_chat(&self) -> Option<&Vec<Chat>> {
self.config.chat.as_ref()
}

pub fn is_completions_enabled(&self) -> bool {
self.config.completion.is_some()
}
@@ -428,6 +446,7 @@ impl Config {
memory: ValidMemoryBackend::FileStore(FileStore { crawl: None }),
models: HashMap::new(),
completion: None,
chat: None,
},
client_params: ValidClientParams { root_uri: None },
}
@@ -439,6 +458,7 @@
memory: ValidMemoryBackend::VectorStore(vector_store),
models: HashMap::new(),
completion: None,
chat: None,
},
client_params: ValidClientParams { root_uri: None },
}
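For reference, a minimal standalone sketch of how a `chat` entry from the editor's configuration could deserialize into the new `Chat` struct. The field names come from the struct in this diff; the JSON shape, the `!C` trigger value, and the use of a plain JSON map in place of `Kwargs` are assumptions for illustration only.

```rust
// Standalone sketch (not part of the PR). `Kwargs` is approximated by a plain
// JSON map; the real type lives in lsp-ai's config module.
use serde::Deserialize;
use serde_json::{json, Map, Value};

#[derive(Debug, Deserialize)]
struct Chat {
    trigger: String,
    action_display_name: String,
    model: String,
    #[serde(default)]
    parameters: Map<String, Value>, // stand-in for Kwargs
}

fn main() -> anyhow::Result<()> {
    // Hypothetical editor configuration: keys mirror the struct fields above.
    let raw = json!([{
        "trigger": "!C",
        "action_display_name": "Chat",
        "model": "model1",
        "parameters": { "max_tokens": 1024 }
    }]);
    let chats: Vec<Chat> = serde_json::from_value(raw)?;
    println!("{chats:#?}");
    Ok(())
}
```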
30 changes: 29 additions & 1 deletion crates/lsp-ai/src/main.rs
@@ -2,7 +2,8 @@ use anyhow::Result;

use lsp_server::{Connection, ExtractError, Message, Notification, Request, RequestId};
use lsp_types::{
request::Completion, CompletionOptions, DidChangeTextDocumentParams, DidOpenTextDocumentParams,
request::{CodeActionRequest, CodeActionResolveRequest, Completion},
CodeActionOptions, CompletionOptions, DidChangeTextDocumentParams, DidOpenTextDocumentParams,
RenameFilesParams, ServerCapabilities, TextDocumentSyncKind,
};
use std::{
@@ -73,6 +74,12 @@ fn main() -> Result<()> {
text_document_sync: Some(lsp_types::TextDocumentSyncCapability::Kind(
TextDocumentSyncKind::INCREMENTAL,
)),
code_action_provider: Some(lsp_types::CodeActionProviderCapability::Options(
CodeActionOptions {
resolve_provider: Some(true),
..Default::default()
},
)),
..Default::default()
})?;
let initialization_args = connection.initialize(server_capabilities)?;
@@ -152,6 +159,27 @@ fn main_loop(connection: Connection, args: serde_json::Value) -> Result<()> {
}
Err(err) => error!("{err:?}"),
}
} else if request_is::<CodeActionRequest>(&req) {
match cast::<CodeActionRequest>(req) {
Ok((id, params)) => {
let code_action_request =
transformer_worker::CodeActionRequest::new(id, params);
transformer_tx
.send(WorkerRequest::CodeActionRequest(code_action_request))?;
}
Err(err) => error!("{err:?}"),
}
} else if request_is::<CodeActionResolveRequest>(&req) {
match cast::<CodeActionResolveRequest>(req) {
Ok((id, params)) => {
let code_action_request =
transformer_worker::CodeActionResolveRequest::new(id, params);
transformer_tx.send(WorkerRequest::CodeActionResolveRequest(
code_action_request,
))?;
}
Err(err) => error!("{err:?}"),
}
} else {
error!("lsp-ai currently only supports textDocument/completion, textDocument/generation and textDocument/generationStream")
}
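The dispatch above relies on `request_is` and `cast`, which are defined earlier in main.rs and not shown in this hunk. A sketch of what they typically look like, following the standard `lsp_server` extraction pattern (the exact bodies in lsp-ai may differ):

```rust
// Sketch of the helpers used in the dispatch above; the real definitions are
// elsewhere in main.rs. This is the usual lsp_server pattern: compare the
// request's method string, then extract its typed params.
use lsp_server::{ExtractError, Request, RequestId};

fn request_is<R: lsp_types::request::Request>(req: &Request) -> bool {
    req.method == R::METHOD
}

fn cast<R>(req: Request) -> Result<(RequestId, R::Params), ExtractError<Request>>
where
    R: lsp_types::request::Request,
    R::Params: serde::de::DeserializeOwned,
{
    req.extract(R::METHOD)
}
```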
34 changes: 33 additions & 1 deletion crates/lsp-ai/src/memory_backends/file_store.rs
@@ -1,6 +1,6 @@
use anyhow::Context;
use indexmap::IndexSet;
use lsp_types::TextDocumentPositionParams;
use lsp_types::{Range, TextDocumentIdentifier, TextDocumentPositionParams};
use parking_lot::{Mutex, RwLock};
use ropey::Rope;
use serde_json::Value;
@@ -318,6 +318,38 @@ impl MemoryBackend for FileStore {
Ok(line)
}

#[instrument(skip(self))]
fn code_action_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
_range: &Range,
trigger: &str,
) -> anyhow::Result<bool> {
Ok(self
.file_map
.read()
.get(text_document_identifier.uri.as_str())
.context("Error file not found")?
.rope
.chunks()
.find(|x| x.contains(trigger))
.is_some())
}

#[instrument(skip(self))]
fn file_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
) -> anyhow::Result<String> {
Ok(self
.file_map
.read()
.get(text_document_identifier.uri.as_str())
.context("Error file not found")?
.rope
.to_string())
}

#[instrument(skip(self))]
async fn build_prompt(
&self,
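The new `code_action_request` answers "does this document contain the chat trigger?" by walking the document's rope chunk by chunk. A standalone sketch of that check, assuming the `ropey` crate (note that a per-chunk `contains` cannot see a trigger that straddles a chunk boundary):

```rust
// Standalone sketch of the trigger scan: report whether any chunk of the rope
// contains the trigger text. Equivalent to the `.find(..).is_some()` check in
// the diff above.
use ropey::Rope;

fn contains_trigger(rope: &Rope, trigger: &str) -> bool {
    rope.chunks().any(|chunk| chunk.contains(trigger))
}

fn main() {
    let rope = Rope::from_str("fn main() {}\n// !C how do I open a file?\n");
    println!("trigger found: {}", contains_trigger(&rope, "!C"));
}
```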
14 changes: 12 additions & 2 deletions crates/lsp-ai/src/memory_backends/mod.rs
@@ -1,6 +1,6 @@
use lsp_types::{
DidChangeTextDocumentParams, DidOpenTextDocumentParams, RenameFilesParams,
TextDocumentPositionParams,
DidChangeTextDocumentParams, DidOpenTextDocumentParams, Range, RenameFilesParams,
TextDocumentIdentifier, TextDocumentPositionParams,
};
use serde_json::Value;

@@ -115,6 +115,16 @@ pub trait MemoryBackend {
Ok(())
}
fn opened_text_document(&self, params: DidOpenTextDocumentParams) -> anyhow::Result<()>;
fn code_action_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
range: &Range,
trigger: &str,
) -> anyhow::Result<bool>;
fn file_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
) -> anyhow::Result<String>;
fn changed_text_document(&self, params: DidChangeTextDocumentParams) -> anyhow::Result<()>;
fn renamed_files(&self, params: RenameFilesParams) -> anyhow::Result<()>;
fn get_filter_text(&self, position: &TextDocumentPositionParams) -> anyhow::Result<String>;
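The `MemoryBackend` trait gains two methods that every backend must now provide. A reduced, standalone illustration of that contract and of the delegation pattern the PostgresML and VectorStore backends use in the hunks below (the real trait has many more methods, so this is not the actual trait):

```rust
// Reduced illustration only: a two-method trait standing in for the new part
// of MemoryBackend, plus a backend that simply forwards to its file store.
use anyhow::Result;
use lsp_types::{Range, TextDocumentIdentifier};

trait ChatMemory {
    fn code_action_request(
        &self,
        doc: &TextDocumentIdentifier,
        range: &Range,
        trigger: &str,
    ) -> Result<bool>;
    fn file_request(&self, doc: &TextDocumentIdentifier) -> Result<String>;
}

// A backend that owns an inner store and delegates both calls to it, the same
// shape as the PostgresML and VectorStore impls that follow.
struct DelegatingBackend<S: ChatMemory> {
    file_store: S,
}

impl<S: ChatMemory> ChatMemory for DelegatingBackend<S> {
    fn code_action_request(
        &self,
        doc: &TextDocumentIdentifier,
        range: &Range,
        trigger: &str,
    ) -> Result<bool> {
        self.file_store.code_action_request(doc, range, trigger)
    }

    fn file_request(&self, doc: &TextDocumentIdentifier) -> Result<String> {
        self.file_store.file_request(doc)
    }
}
```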
21 changes: 20 additions & 1 deletion crates/lsp-ai/src/memory_backends/postgresml/mod.rs
@@ -1,5 +1,5 @@
use anyhow::Context;
use lsp_types::TextDocumentPositionParams;
use lsp_types::{Range, TextDocumentIdentifier, TextDocumentPositionParams};
use parking_lot::Mutex;
use pgml::{Collection, Pipeline};
use rand::{distributions::Alphanumeric, Rng};
@@ -470,11 +470,30 @@ impl PostgresML {

#[async_trait::async_trait]
impl MemoryBackend for PostgresML {
#[instrument(skip(self))]
fn code_action_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
range: &Range,
trigger: &str,
) -> anyhow::Result<bool> {
self.file_store
.code_action_request(text_document_identifier, range, trigger)
}

#[instrument(skip(self))]
fn get_filter_text(&self, position: &TextDocumentPositionParams) -> anyhow::Result<String> {
self.file_store.get_filter_text(position)
}

#[instrument(skip(self))]
fn file_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
) -> anyhow::Result<String> {
self.file_store.file_request(text_document_identifier)
}

#[instrument(skip(self))]
async fn build_prompt(
&self,
23 changes: 21 additions & 2 deletions crates/lsp-ai/src/memory_backends/vector_store.rs
@@ -1,8 +1,8 @@
use anyhow::Context;
use fxhash::FxBuildHasher;
use lsp_types::{
DidChangeTextDocumentParams, DidOpenTextDocumentParams, RenameFilesParams,
TextDocumentPositionParams,
DidChangeTextDocumentParams, DidOpenTextDocumentParams, Range, RenameFilesParams,
TextDocumentIdentifier, TextDocumentPositionParams,
};
use ordered_float::OrderedFloat;
use parking_lot::{Mutex, RwLock};
@@ -612,6 +612,25 @@ impl VectorStore {

#[async_trait::async_trait]
impl MemoryBackend for VectorStore {
#[instrument(skip(self))]
fn code_action_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
range: &Range,
trigger: &str,
) -> anyhow::Result<bool> {
self.file_store
.code_action_request(text_document_identifier, range, trigger)
}

#[instrument(skip(self))]
fn file_request(
&self,
text_document_identifier: &TextDocumentIdentifier,
) -> anyhow::Result<String> {
self.file_store.file_request(text_document_identifier)
}

#[instrument(skip(self))]
fn opened_text_document(&self, params: DidOpenTextDocumentParams) -> anyhow::Result<()> {
let uri = params.text_document.uri.to_string();