LSP-AI for Ace-linters
The LSP-AI language server provides advanced AI-assisted code analysis features like autocompletion, which enhance the Ace Editor environment for various programming languages.
Prerequisites:
- Ensure you have Rust installed on your system. You can install Rust using rustup; this will include Cargo, the Rust package manager.
- ace-linters ^1.2.3
Install the LSP-AI Language Server: use Cargo to install the lsp-ai language server. Depending on your system requirements, you can install it with different features:

Basic installation:
cargo install lsp-ai
With llama_cpp support (needed if you want to leverage AI models for code completion):
cargo install lsp-ai --features llama_cpp

With metal and llama_cpp for macOS (recommended for users on macOS with Metal support):
cargo install lsp-ai --features "llama_cpp metal"

With cuda and llama_cpp for Nvidia GPUs (recommended for users with Nvidia GPUs on Linux):
cargo install lsp-ai --features "llama_cpp cuda"
For detailed installation options and troubleshooting, visit the installation guide.
Running the Language Server: Ensure that the LSP-AI language server is running on the specified WebSocket port. For setting up the WebSocket server for LSP-AI, you can use tools like Language Server WebSocket Bridge for integrating multiple language servers.
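Under the hood, such a bridge relays JSON-RPC traffic between the editor's WebSocket and lsp-ai's stdio transport. On stdio, each LSP message is prefixed with a Content-Length header counting the bytes of the JSON body. As a rough illustration of that framing (the helper name is hypothetical, not part of ace-linters or lsp-ai):

```javascript
// Frame a JSON-RPC message with the Content-Length header that LSP
// requires on stdio transports (illustrative helper).
function frameLspMessage(message) {
    const body = JSON.stringify(message);
    // Content-Length counts bytes, not characters
    return `Content-Length: ${Buffer.byteLength(body, "utf8")}\r\n\r\n${body}`;
}
```

A WebSocket bridge adds or strips this framing when shuttling messages between the two transports, which is why a dedicated bridge tool is used rather than connecting the sockets directly.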
To integrate the lsp-ai language server with Ace Linters, configure your Ace Editor environment to connect to the lsp-ai language server via WebSocket. Here’s an example configuration:
// defaultGenerationConfiguration - (https://github.com/SilasMarvin/lsp-ai/wiki/Configuration)
// you will need OPENAI_API_KEY in env, or use "auth_token" instead of "auth_token_env_var_name"
const defaultServerConfiguration = {
    "memory": {
        "file_store": {}
    },
    "models": {
        "model1": {
            "type": "open_ai",
            "chat_endpoint": "https://api.openai.com/v1/chat/completions",
            "model": "gpt-3.5-turbo-0125",
            "auth_token_env_var_name": "OPENAI_API_KEY"
        }
    },
    "completion": {
        ...defaultGenerationConfiguration
    }
};

const serverData = {
    module: () => import("ace-linters/build/language-client"),
    modes: "javascript",
    type: "socket",
    socket: new WebSocket("ws://localhost:3030/lsp-ai"),
    initializationOptions: defaultServerConfiguration
};
Refer to the lsp-ai configuration documentation for details on defaultGenerationConfiguration and defaultServerConfiguration, or see the full example.
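If you need to vary only the generation settings between setups, the configuration object above can be produced by a small factory. The helper below is purely illustrative: the field names are taken from the example configuration, and the argument stands in for whatever defaultGenerationConfiguration you use.

```javascript
// Hypothetical factory for the server configuration shown above; the
// memory and model sections stay fixed while generation options vary.
function buildServerConfiguration(generationConfiguration) {
    return {
        memory: { file_store: {} },
        models: {
            model1: {
                type: "open_ai",
                chat_endpoint: "https://api.openai.com/v1/chat/completions",
                model: "gpt-3.5-turbo-0125",
                auth_token_env_var_name: "OPENAI_API_KEY"
            }
        },
        // completion receives the generation settings, as in the example
        completion: { ...generationConfiguration }
    };
}
```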
- Connection Issues: Ensure the WebSocket server for the lsp-ai language server is set up correctly and running. Verify the port and endpoint configurations.
- Feature Limitations: Some advanced features may experience delays or reduced performance due to network latency or server configuration.
- Error Logs: Check the language server's logs for any error messages or warnings that might indicate misconfiguration or compatibility issues.
- lsp-ai GitHub Repository - Access the source code, documentation, and community support for the lsp-ai language server.
- Ace Editor Documentation - Learn more about Ace Editor configurations and capabilities.
- Language Server Protocol (LSP) - Understand the protocol standards and implementations.
This guide provides the necessary steps and configuration details for integrating the lsp-ai language server with Ace Linters. Adjust the configuration parameters to match your specific environment and requirements.