chore:(refactor) - plugin-local-ai-final #3663
Conversation
…e if the model produces data or not. Also fixed a previous issue in the vision.
… Ollama performance is very bad compared to StudioLM.
Walkthrough: This pull request updates local AI configurations by adding new environment variables.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant LocalAIManager
    participant StudioLMManager
    participant OllamaManager
    Client->>LocalAIManager: Request generateTextOllamaStudio(params)
    LocalAIManager->>LocalAIManager: Validate environment config
    alt StudioLM selected
        LocalAIManager->>StudioLMManager: Initialize/check status & fetch models
        StudioLMManager-->>LocalAIManager: Return generated text
    else Ollama selected
        LocalAIManager->>OllamaManager: Initialize/check server status & fetch models
        OllamaManager-->>LocalAIManager: Return generated text
    end
    LocalAIManager->>Client: Return generated text response
```
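The routing in the diagram above can be sketched roughly as follows. This is an illustrative sketch only: the names (`selectProvider`, the `USE_STUDIOLM_TEXT_MODELS` flag, and the manager internals) are assumptions, not code taken from the PR.

```typescript
// Sketch of the provider routing shown in the sequence diagram.
// All identifiers here are illustrative assumptions, not PR code.

interface TextGenParams {
  prompt: string;
}

interface ModelManager {
  generateText(params: TextGenParams): Promise<string>;
}

class StudioLMManager implements ModelManager {
  async generateText(params: TextGenParams): Promise<string> {
    // A real manager would check server status and fetch models first.
    return `studiolm:${params.prompt}`;
  }
}

class OllamaManager implements ModelManager {
  async generateText(params: TextGenParams): Promise<string> {
    return `ollama:${params.prompt}`;
  }
}

// Validate the environment config and pick the provider.
function selectProvider(env: Record<string, string | undefined>): ModelManager {
  return env.USE_STUDIOLM_TEXT_MODELS === "true"
    ? new StudioLMManager()
    : new OllamaManager();
}

async function generateTextOllamaStudio(
  params: TextGenParams,
  env: Record<string, string | undefined>
): Promise<string> {
  return selectProvider(env).generateText(params);
}
```

The point of the indirection is that both managers satisfy one `ModelManager` interface, so the caller never branches on the provider after selection.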
@wtfsayo: Based on my analysis of the PR, here's a comprehensive summary:

PR Summary: Refactor - plugin-local-ai-final

This PR enhances the Local AI plugin by adding support for multiple AI model providers while maintaining backward compatibility with existing local implementations. The key features include:

Core Enhancements
Architecture
Capabilities
Configuration
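The walkthrough mentions new environment variables, but this excerpt does not show their names. A hypothetical `.env` sketch of what such a configuration might look like (every variable name below is an assumption, not taken from the PR):

```dotenv
# Hypothetical illustration only; names are not confirmed by the PR.
USE_STUDIOLM_TEXT_MODELS=true           # route text generation to StudioLM
USE_OLLAMA_TEXT_MODELS=false            # or route it to Ollama instead
STUDIOLM_SERVER_URL=http://localhost:1234
OLLAMA_SERVER_URL=http://localhost:11434
```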
All features have been tested and are functional except for the TTS (Text-to-Speech) component.
Going to merge before things pile up; will report testing later in Discord.
The Local AI runs on R1 models from DeepSeek.
All the features are tested except the TTS.
Added the manager for StudioLM.
Added the manager for Ollama (very poor performance).
Summary by CodeRabbit
New Features
Refactor