
[mypy] Enable mypy type checking for vllm/core #7229

Merged (2 commits into vllm-project:main, Aug 27, 2024)

Conversation

jberkhahn (Contributor) commented Aug 6, 2024

Partially addresses #3680.

Hopefully I did this right.

github-actions bot commented Aug 6, 2024

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs would not trigger full CI run by default. Instead, it would only run fastcheck CI which consists a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of default ones by unblocking the steps in your fast-check build on Buildkite UI.

Once the PR is approved and ready to go, please make sure to run full CI as it is required to merge (or just use auto-merge).

To run full CI, you can do one of these:

  • Comment /ready on the PR
  • Add ready label to the PR
  • Enable auto-merge.

🚀

jberkhahn (Contributor, Author)

So for the mypy stuff, I think the failure is because it's not picking up the changes in the CI config files? The yapf formatting stuff I have no idea about; it looks like a bunch of stuff in the repo doesn't adhere to it. I added the formatting changes just for vllm/core, but that doesn't seem to have satisfied it.

(2 resolved review comments on vllm/core/block_manager_v1.py, now outdated)
DarkLight1337 (Member) commented Aug 7, 2024

> So for the mypy stuff, I think the failure is because it's not picking up the changes in the CI config files? The yapf formatting stuff I have no idea about; it looks like a bunch of stuff in the repo doesn't adhere to it. I added the formatting changes just for vllm/core, but that doesn't seem to have satisfied it.

format.sh has recently been updated to skip all imports when run locally because otherwise it takes too long to run. You can manually run mypy (without --follow-imports=skip) to perform the stricter checks locally.
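For reference, here is a minimal sketch of running that stricter check from Python via mypy's API (mypy.api.run mirrors the command line; the vllm/core target comes from this PR, and running it from the repository root is an assumption):

    # Sketch: run the check without --follow-imports=skip, so that
    # imports outside vllm/core are type-checked too. This is
    # equivalent to running `mypy vllm/core` from the repo root.
    from mypy import api

    # api.run returns (normal_report, error_report, exit_status).
    stdout, stderr, exit_status = api.run(["vllm/core"])
    print(stdout)
    if exit_status != 0:
        print("mypy reported type errors")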

DarkLight1337 changed the title from "[CI/Build]enable mypy type checking for vllm/core" to "[mypy] Enable mypy type checking for vllm/core" on Aug 7, 2024
jberkhahn (Contributor, Author)

> format.sh has recently been updated to skip all imports when run locally because otherwise it takes too long to run. You can manually run mypy (without --follow-imports=skip) to perform the stricter checks locally.

Right, but when I run it locally with just mypy vllm/core, it passes. Also, the errors in CI refer to line numbers that don't match up with my edited files, so I think something is wrong with how CI is configured.

I'll take a look at the suggested changes. Thanks for the review!

(Resolved review comment on vllm/block.py, now outdated)
jberkhahn force-pushed the mypy_vllm_core branch 2 times, most recently from f936cac to 6ab603f (August 15, 2024)
jberkhahn force-pushed the mypy_vllm_core branch 3 times, most recently from 73c6eea to 1fc1edc (August 22, 2024)
jberkhahn (Contributor, Author)

Think this is good to go now?

(Resolved review comments on vllm/core/embedding_model_block_manager.py and vllm/block.py, now outdated)
jberkhahn force-pushed the mypy_vllm_core branch 3 times, most recently from 8353df4 to b683f87 (August 23, 2024)
Review thread on vllm/block.py (outdated, resolved), on these lines:

    block_table: BlockTable = BlockTable()
    assert seq is not None
A Member commented:

I don't think this is required, since num_prompt_blocks = 0 if seq is None anyway.

jberkhahn (Contributor, Author) replied on Aug 27, 2024:

The issue is that seq is Optional and is dereferenced directly on lines 328-330:

                block = self.gpu_allocator.allocate(
                    seq.hash_of_block(logical_idx),
                    seq.num_hashed_tokens_of_block(logical_idx))

It's not technically possible to get there if seq is None, but mypy still complains because it doesn't account for the check inside _get_seq_num_required_blocks(seq). Do you have a more straightforward suggestion?

The Member replied:

I see, thanks for the explanation.
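For readers following the exchange above, here is a minimal, self-contained sketch of the narrowing problem (the helper name _get_seq_num_required_blocks comes from the thread; the Sequence stand-in and the method bodies are simplified assumptions, not vLLM's actual code):

    from typing import Optional

    class Sequence:
        # Simplified stand-in for vLLM's Sequence class, not the real one.
        def hash_of_block(self, logical_idx: int) -> int:
            return logical_idx

    def _get_seq_num_required_blocks(seq: Optional[Sequence]) -> int:
        # The None check lives inside this helper, so it does not
        # narrow the type of `seq` at the call site.
        return 0 if seq is None else 8

    def allocate(seq: Optional[Sequence]) -> None:
        num_prompt_blocks = _get_seq_num_required_blocks(seq)
        # The loop body is unreachable when seq is None (the range is
        # empty), but mypy cannot see that across the helper call, so
        # without the assert it reports an error like:
        #   Item "None" of "Optional[Sequence]" has no attribute
        #   "hash_of_block"
        assert seq is not None
        for logical_idx in range(num_prompt_blocks):
            seq.hash_of_block(logical_idx)

An explicit "if seq is not None:" guard around the loop would narrow the type the same way; the assert is simply the more compact fix.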

njhill added the "ready" label (ONLY add when PR is ready to merge/full CI is needed) on Aug 27, 2024
njhill (Member) commented Aug 27, 2024

@DarkLight1337 this one good with you to merge now?

DarkLight1337 (Member) replied:

> @DarkLight1337 this one good with you to merge now?

Yeah, let's merge this

DarkLight1337 merged commit 9c71c97 into vllm-project:main on Aug 27, 2024
53 checks passed
kushanam pushed 2 commits to kushanam/vllm that referenced this pull request (Aug 28, 2024)
triple-Mu pushed a commit to triple-Mu/vllm_official that referenced this pull request (Sep 4, 2024)
Jeffwan pushed a commit to aibrix/vllm that referenced this pull request (Sep 19, 2024)
Alvant pushed a commit to compressa-ai/vllm that referenced this pull request (Oct 26, 2024)
KuntaiDu pushed a commit to KuntaiDu/vllm that referenced this pull request (Nov 20, 2024)