feat(provider_engine): add DeepSeek integration #501
Conversation
Walkthrough

The changes add support for a new AI provider named DeepSeek. The README is updated to include DeepSeek's API link and to correct an existing Gemini link. In the provider module, a new Provider instance for DeepSeek is registered with its configuration details. Additionally, a new engine class, DeepSeekAIEngine, is introduced with methods to define supported models and to handle both streaming and non-streaming completion responses.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Engine as DeepSeekAIEngine
    participant API as DeepSeek API
    Client->>Engine: Request streaming completion
    Engine->>API: Initiate streaming request
    API-->>Engine: Return ChatCompletionChunk(s)
    Engine->>Engine: Process each chunk (check reasoning_content)
    Engine->>Client: Yield processed content
```

```mermaid
sequenceDiagram
    participant Client
    participant Engine as DeepSeekAIEngine
    participant API as DeepSeek API
    Client->>Engine: Request non-streaming completion
    Engine->>Engine: Convert message (prepare_message_response)
    Engine->>API: Send formatted completion request
    API-->>Engine: Return ChatCompletion response
    Engine->>Engine: Process and format response (extract reasoning)
    Engine->>Client: Return final message block
```
Actionable comments posted: 4
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (3)

- README.md (1 hunks)
- basilisk/provider.py (1 hunks)
- basilisk/provider_engine/deepseek_engine.py (1 hunks)
🔇 Additional comments (1)
README.md (1)
69-69: DeepSeek addition looks good.
Your new link is properly added and consistent with the existing format.
```python
models = [
    ProviderAIModel(
        id="deepseek-chat",
        name="DeepSeek-V3",
        # Translators: This is a model description
        description="",
        context_window=64000,
        max_temperature=2.0,
        default_temperature=1.0,
        max_output_tokens=8000,
    ),
    ProviderAIModel(
        id="deepseek-reasoner",
        name="DeepSeek-R1",
        # Translators: This is a model description
        description="",
        context_window=64000,
        max_temperature=2.0,
        default_temperature=1.0,
        max_output_tokens=8000,
    ),
]
return models
```
🧹 Nitpick (assertive)
Consider providing a meaningful description for each model.
Currently, the `description` fields in your `ProviderAIModel` instances are empty strings. Including brief descriptions can greatly assist users in understanding each model's purpose and differences.
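For instance, a minimal sketch of a filled-in entry (the description wording below is illustrative, and the use of gettext's `_()` is an assumption based on the existing `# Translators:` comments):

```python
ProviderAIModel(
    id="deepseek-chat",
    name="DeepSeek-V3",
    # Translators: This is a model description
    # _() is assumed to be the project's gettext helper; wording is illustrative.
    description=_("General-purpose chat model for everyday conversation tasks"),
    context_window=64000,
    max_temperature=2.0,
    default_temperature=1.0,
    max_output_tokens=8000,
),
```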
````python
def completion_response_with_stream(
    self, stream: Generator[ChatCompletionChunk, None, None]
):
    reasoning_content_tag_sent = False
    for chunk in stream:
        delta = chunk.choices[0].delta
        if delta:
            if (
                hasattr(delta, "reasoning_content")
                and delta.reasoning_content
            ):
                if not reasoning_content_tag_sent:
                    reasoning_content_tag_sent = True
                    yield f"```think\n{delta.reasoning_content}"
                else:
                    yield delta.reasoning_content
            if delta.content:
                if reasoning_content_tag_sent:
                    reasoning_content_tag_sent = False
                    yield f"\n```\n\n{delta.content}"
                else:
                    yield delta.content
````
🧹 Nitpick (assertive)
Ensure the code block is properly closed if the stream ends after reasoning content.
If streaming ends immediately after yielding reasoning content, the "think" code block remains unclosed. Consider automatically closing it at the end of the loop, for example by yielding a final `` "\n```" `` if no more content arrives.
Below is an example fix for the end of your loop:
````diff
 def completion_response_with_stream(
     self, stream: Generator[ChatCompletionChunk, None, None]
 ):
     reasoning_content_tag_sent = False
     for chunk in stream:
         ...
+    # After the loop, close any open code block
+    if reasoning_content_tag_sent:
+        yield "\n```"
````
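For context, a minimal sketch of the failure mode (the `SimpleNamespace` objects below are hypothetical stand-ins for real `ChatCompletionChunk` instances, used only for illustration):

````python
from types import SimpleNamespace

# Hypothetical stand-in for a chunk whose delta carries only reasoning_content.
def reasoning_chunk(text):
    delta = SimpleNamespace(reasoning_content=text, content=None)
    return SimpleNamespace(choices=[SimpleNamespace(delta=delta)])

chunks = [reasoning_chunk("step 1... "), reasoning_chunk("step 2.")]
# Feeding these to completion_response_with_stream would yield:
#   "```think\nstep 1... ", "step 2."
# The stream then ends, so without the post-loop fix the ```think
# fence is never closed.
````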
```python
def prepare_message_response(
    self, message: Message
) -> ChatCompletionUserMessageParam:
    return ChatCompletionUserMessageParam(
        role=message.role.value, content=message.content
    )
```
🧹 Nitpick (assertive)
Add a docstring to clarify the purpose and parameters of `prepare_message_response`.
A concise docstring can help maintainers and users quickly understand how the method transforms the `Message` object and what is returned.
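For example, a sketch of what such a docstring might look like (the wording is illustrative, not the author's):

```python
def prepare_message_response(
    self, message: Message
) -> ChatCompletionUserMessageParam:
    """Convert an internal Message into an OpenAI-style user message param.

    Args:
        message: The message whose role and content are forwarded to the API.

    Returns:
        A ChatCompletionUserMessageParam built from the message's role value
        and content.
    """
    return ChatCompletionUserMessageParam(
        role=message.role.value, content=message.content
    )
```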
```python
Provider(
    id="deepseek",
    name="DeepSeek",
    api_type=ProviderAPIType.OPENAI,
    base_url="https://api.deepseek.com/v1",
    organization_mode_available=False,
    require_api_key=True,
    env_var_name_api_key="DEEPSEEK_API_KEY",
    env_var_name_organization_key=None,
    engine_cls_path="basilisk.provider_engine.deepseek_engine.DeepSeekAIEngine",
),
```
🧹 Nitpick (assertive)
Verify whether `api_type=ProviderAPIType.OPENAI` is intentional for DeepSeek.
If DeepSeek's API is structurally compatible with OpenAI endpoints, this is good. Otherwise, consider creating a dedicated `ProviderAPIType` enum member for clarity.
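In practice, DeepSeek documents its API as OpenAI-compatible, so reusing `ProviderAPIType.OPENAI` is defensible. If a dedicated member were preferred, the change might look like this sketch (the real `ProviderAPIType` definition in basilisk may differ):

```python
from enum import Enum

# Sketch only: the actual ProviderAPIType enum likely has other members.
class ProviderAPIType(Enum):
    OPENAI = "openai"
    DEEPSEEK = "deepseek"  # hypothetical dedicated member

# The registration would then read:
#   api_type=ProviderAPIType.DEEPSEEK,
```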
Summary by CodeRabbit

New Features
- Added DeepSeek as a supported AI provider, including a new DeepSeekAIEngine with streaming and non-streaming completion support.

Bug Fixes
- Corrected an existing Gemini link in the README.