This repository has been archived by the owner on Oct 11, 2024. It is now read-only.

Use random port for backend #390

Merged
merged 3 commits into from
Jul 31, 2024

Conversation

joerunde
Collaborator

Picks an open port to use and boots both the client and server with it
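A common way to pick an open port is to bind a socket to port 0 and let the OS assign a free ephemeral port. The sketch below illustrates that trick; it is an assumption about the approach, not necessarily the exact code in this PR, and the helper name `find_open_port` is hypothetical.

```python
import socket

def find_open_port() -> int:
    """Ask the OS for a free port by binding to port 0.

    Hypothetical helper illustrating the bind-to-port-0 trick;
    not necessarily the implementation used in this PR.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("", 0))  # port 0 tells the OS to assign any free port
        return sock.getsockname()[1]  # (host, port) tuple; take the port
```

Note there is an inherent race: the port is released when the socket closes, so another process could grab it before the server binds it. For a test or dev server boot this is usually an acceptable trade-off.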

Signed-off-by: Joe Runde <Joseph.Runde@ibm.com>

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, which consists of a small, essential subset of CI tests to catch errors quickly. You can run other CI tests on top of the default ones by unblocking the steps in your fastcheck build in the Buildkite UI.

Once the PR is approved and ready to go, please make sure to run full CI as it is required to merge (or just use auto-merge).

To run full CI, you can do one of these:

  • Comment /ready on the PR
  • Add ready label to the PR
  • Enable auto-merge.

🚀


@njhill njhill left a comment


Maybe need to run format

Signed-off-by: Joe Runde <Joseph.Runde@ibm.com>
@joerunde
Collaborator Author

The formatter was run; looks like it just did an odd thing or two 🤷

@joerunde
Collaborator Author

(going to merge this so I can put the health checks on top of it too)

@joerunde joerunde merged commit 5362952 into neuralmagic:isolate-oai-server-process Jul 31, 2024
1 check passed
@joerunde joerunde deleted the open-port branch July 31, 2024 17:59
2 participants