
feat(provider_engine): add DeepSeek integration #501

Merged · 3 commits · Feb 1, 2025
Conversation

@AAClause (Member) commented Jan 27, 2025

Summary by CodeRabbit

  • New Features

    • Introduced DeepSeek as a new AI provider in the account configuration, expanding available AI integration options with enhanced processing capabilities.
  • Bug Fixes

    • Corrected the Gemini provider hyperlink in the documentation for accurate configuration guidance.

@AAClause AAClause added this to the 0.1a10 milestone Jan 27, 2025
@AAClause AAClause marked this pull request as ready for review February 1, 2025 10:12
coderabbitai bot (Contributor) commented Feb 1, 2025

Walkthrough

The changes add support for a new AI provider named DeepSeek. The README is updated to include DeepSeek’s API link and correct an existing Gemini link. In the provider module, a new Provider instance for DeepSeek is registered with its configuration details. Additionally, a new engine class, DeepSeekAIEngine, is introduced with methods to define supported models and to handle both streaming and non-streaming completion responses.

Changes

| File(s) | Change Summary |
| --- | --- |
| README.md | Added a DeepSeek provider entry linking to https://www.deepseek.com/ in the account configuration, and corrected the Gemini hyperlink from `htps://...` to `https://...`. |
| basilisk/provider.py | Added a new `Provider` instance for DeepSeek with attributes including id, name, API type, base URL, and environment variable names for API key configuration. |
| basilisk/provider_engine/deepseek_engine.py | Added the `DeepSeekAIEngine` class extending `OpenAIEngine`. It defines the available models and implements streaming (`completion_response_with_stream`) and non-streaming (`completion_response_without_stream`) responses, plus a message preparation method (`prepare_message_response`). |
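
The `engine_cls_path` string suggests the engine class is resolved dynamically from its dotted path. As an illustration only (this mirrors a common importlib pattern, not necessarily basilisk's actual loader):

```python
import importlib

def load_engine_class(dotted_path: str):
    """Resolve a dotted 'package.module.ClassName' string to the class object."""
    module_path, _, class_name = dotted_path.rpartition(".")
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

# e.g. load_engine_class("basilisk.provider_engine.deepseek_engine.DeepSeekAIEngine")
```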

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Engine as DeepSeekAIEngine
    participant API as DeepSeek API

    Client->>Engine: Request streaming completion
    Engine->>API: Initiate streaming request
    API-->>Engine: Return ChatCompletionChunk(s)
    Engine->>Engine: Process each chunk (check reasoning_content)
    Engine->>Client: Yield processed content
```
```mermaid
sequenceDiagram
    participant Client
    participant Engine as DeepSeekAIEngine
    participant API as DeepSeek API

    Client->>Engine: Request non-streaming completion
    Engine->>Engine: Convert message (prepare_message_response)
    Engine->>API: Send formatted completion request
    API-->>Engine: Return ChatCompletion response
    Engine->>Engine: Process and format response (extract reasoning)
    Engine->>Client: Return final message block
```
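
The non-streaming path presumably wraps any `reasoning_content` the same way the streaming path does. A minimal sketch under that assumption (`FakeMessage` is a stand-in for the OpenAI response message object, not one of basilisk's actual types):

````python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FakeMessage:
    # Stand-in for a ChatCompletion message; real responses come from the
    # openai client and expose .content and (for R1) .reasoning_content.
    content: str
    reasoning_content: Optional[str] = None

def format_non_stream_response(message: FakeMessage) -> str:
    """Wrap reasoning in a ```think fenced block, then append the answer."""
    if getattr(message, "reasoning_content", None):
        return f"```think\n{message.reasoning_content}\n```\n\n{message.content}"
    return message.content
````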
coderabbitai bot (Contributor) left a comment
Actionable comments posted: 4

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0715f19 and 8c22760.

📒 Files selected for processing (3)
  • README.md (1 hunks)
  • basilisk/provider.py (1 hunks)
  • basilisk/provider_engine/deepseek_engine.py (1 hunks)
🔇 Additional comments (1)
README.md (1)

69-69: DeepSeek addition looks good.
Your new link is properly added and consistent with the existing format.

Comment on lines +29 to +51
```python
models = [
    ProviderAIModel(
        id="deepseek-chat",
        name="DeepSeek-V3",
        # Translators: This is a model description
        description="",
        context_window=64000,
        max_temperature=2.0,
        default_temperature=1.0,
        max_output_tokens=8000,
    ),
    ProviderAIModel(
        id="deepseek-reasoner",
        name="DeepSeek-R1",
        # Translators: This is a model description
        description="",
        context_window=64000,
        max_temperature=2.0,
        default_temperature=1.0,
        max_output_tokens=8000,
    ),
]
return models
```

🧹 Nitpick (assertive)

Consider providing a meaningful description for each model.
Currently, the description fields in your ProviderAIModel instances are empty strings. Including brief descriptions can greatly assist users in understanding each model’s purpose and differences.
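
For instance (the wording below is only a placeholder suggestion; `ProviderAIModel` here is a minimal stand-in for basilisk's class, kept runnable for illustration):

```python
from dataclasses import dataclass

@dataclass
class ProviderAIModel:
    # Minimal stand-in with just the fields relevant to this suggestion.
    id: str
    name: str
    description: str

models = [
    ProviderAIModel(
        id="deepseek-chat",
        name="DeepSeek-V3",
        description="General-purpose chat model for everyday tasks.",
    ),
    ProviderAIModel(
        id="deepseek-reasoner",
        name="DeepSeek-R1",
        description="Reasoning model that emits its chain of thought before the final answer.",
    ),
]
```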

Comment on lines +53 to +75
````python
def completion_response_with_stream(
    self, stream: Generator[ChatCompletionChunk, None, None]
):
    reasoning_content_tag_sent = False
    for chunk in stream:
        delta = chunk.choices[0].delta
        if delta:
            if (
                hasattr(delta, "reasoning_content")
                and delta.reasoning_content
            ):
                if not reasoning_content_tag_sent:
                    reasoning_content_tag_sent = True
                    yield f"```think\n{delta.reasoning_content}"
                else:
                    yield delta.reasoning_content
            if delta.content:
                if reasoning_content_tag_sent:
                    reasoning_content_tag_sent = False
                    yield f"\n```\n\n{delta.content}"
                else:
                    yield delta.content
````


🧹 Nitpick (assertive)

Ensure the code block is properly closed if the stream ends after reasoning content.
If streaming ends immediately after yielding reasoning content, the “think” code block remains unclosed. Consider automatically closing it at the end, or yield a final “\n” if no more content arrives.

Below is an example fix for the end of your loop:

````diff
 def completion_response_with_stream(
     self, stream: Generator[ChatCompletionChunk, None, None]
 ):
     reasoning_content_tag_sent = False
     for chunk in stream:
         ...
+    # After the loop, close any open code block
+    if reasoning_content_tag_sent:
+        yield "\n```"
````

Comment on lines +93 to +98
```python
def prepare_message_response(
    self, message: Message
) -> ChatCompletionUserMessageParam:
    return ChatCompletionUserMessageParam(
        role=message.role.value, content=message.content
    )
```

🧹 Nitpick (assertive)

Add docstring to clarify the purpose and parameters of prepare_message_response.
A concise docstring can help maintainers and users quickly understand how the method transforms the Message object and what is returned.
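
One possible docstring, shown on a standalone copy of the method (the wording is only a suggestion):

```python
def prepare_message_response(self, message):
    """Convert an internal Message into an OpenAI user-message param.

    Args:
        message: Message whose role and content are forwarded unchanged.

    Returns:
        A ChatCompletionUserMessageParam built from message.role.value
        and message.content.
    """
    ...
```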

Comment on lines +71 to +81
```python
Provider(
    id="deepseek",
    name="DeepSeek",
    api_type=ProviderAPIType.OPENAI,
    base_url="https://api.deepseek.com/v1",
    organization_mode_available=False,
    require_api_key=True,
    env_var_name_api_key="DEEPSEEK_API_KEY",
    env_var_name_organization_key=None,
    engine_cls_path="basilisk.provider_engine.deepseek_engine.DeepSeekAIEngine",
),
```

🧹 Nitpick (assertive)

Verify if api_type=ProviderAPIType.OPENAI is intentional for DeepSeek.
If DeepSeek’s API is structurally compatible with OpenAI endpoints, this is good. Otherwise, consider creating a dedicated ProviderAPIType enum member for clarity.
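
If the team prefers an explicit member, a sketch of what that could look like (the member values here are assumptions for illustration, not basilisk's actual `ProviderAPIType` definition):

```python
from enum import Enum

class ProviderAPIType(Enum):
    # Hypothetical values; only illustrates adding a dedicated member.
    OPENAI = "openai"
    DEEPSEEK = "deepseek"
```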

@AAClause AAClause merged commit 7e7c32b into master Feb 1, 2025
9 checks passed
@AAClause AAClause deleted the DeepSeek branch February 1, 2025 10:30
@coderabbitai coderabbitai bot mentioned this pull request Feb 15, 2025