
[Core] Streamline stream termination in AsyncLLMEngine #7336

Merged 2 commits on Aug 9, 2024

Commits on Aug 9, 2024

  1. [Core] Streamline stream termination in AsyncLLMEngine

    Follow-on from vllm-project#7111: avoid unnecessarily enqueuing a final message after an exception, and avoid aborting requests in the engine that were never started.
    njhill committed Aug 9, 2024 · 18fbd1e
  2. Update test

    njhill committed Aug 9, 2024 · edd4768
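The termination pattern described in the first commit message can be sketched as follows. This is an illustrative sketch only: the `RequestStream` class, its method names, and the sentinel mechanism are assumptions for demonstration, not vLLM's actual `AsyncLLMEngine` internals. The idea is that a per-request stream terminates exactly once, and on error the exception itself is enqueued as the terminal item, so no separate final message needs to follow it.

```python
import asyncio


class RequestStream:
    """Hypothetical per-request output stream (not vLLM's actual code)."""

    _FINISH = object()  # sentinel marking normal end of stream

    def __init__(self):
        self._queue = asyncio.Queue()
        self._finished = False

    def put(self, item):
        # Outputs arriving after termination are dropped, not enqueued.
        if not self._finished:
            self._queue.put_nowait(item)

    def finish(self, exc=None):
        # Terminate exactly once. On error, the exception is the terminal
        # item, so no extra final message is enqueued after it.
        if not self._finished:
            self._finished = True
            self._queue.put_nowait(exc if exc is not None else self._FINISH)

    async def __aiter__(self):
        while True:
            item = await self._queue.get()
            if item is self._FINISH:
                return
            if isinstance(item, BaseException):
                raise item
            yield item


async def demo():
    # Normal completion: a put() after finish() is silently dropped.
    ok = RequestStream()
    ok.put("tok1")
    ok.finish()
    ok.put("tok2")  # dropped: stream already terminated
    tokens = [t async for t in ok]

    # Error path: the exception terminates the stream; a second
    # finish() is a no-op, so no redundant final message appears.
    bad = RequestStream()
    bad.put("tok1")
    bad.finish(RuntimeError("engine died"))
    bad.finish()
    errored = []
    try:
        async for t in bad:
            errored.append(t)
    except RuntimeError:
        errored.append("<error>")
    return tokens, errored


print(asyncio.run(demo()))
```

A similar guard applies to the second half of the commit message: before forwarding an abort to the engine, the caller would check that the request was ever actually started there, and skip the abort otherwise.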