
[Bug]: Incorrect Data Type Conversion for MultiModalData #4054

Closed
caiqi opened this issue Apr 13, 2024 · 1 comment · Fixed by #4197
Labels
bug Something isn't working

Comments


caiqi commented Apr 13, 2024

Your current environment

The output of `python collect_env.py`

🐛 Describe the bug

I've encountered an issue in the vllm project. At line 170 of `vllm/entrypoints/llm.py` (https://github.com/vllm-project/vllm/blob/main/vllm/entrypoints/llm.py#L170), `multi_modal_data.data` is unconditionally converted to `torch.float16` via `multi_modal_data.data = multi_modal_data.data.to(torch.float16)`. This automatic conversion may not be suitable for all use cases, especially when the model is configured to run in bfloat16 or another precision. If this is an issue that needs to be addressed, I would be happy to submit a pull request with a fix.
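To illustrate the kind of fix being suggested, here is a minimal sketch of casting multi-modal input to the model's configured dtype instead of hardcoding `torch.float16`. The helper name `convert_multi_modal_data` and the `model_dtype` parameter are hypothetical, not vLLM's actual API:

```python
import torch


def convert_multi_modal_data(data: torch.Tensor,
                             model_dtype: torch.dtype) -> torch.Tensor:
    """Hypothetical helper: cast multi-modal input to the model's configured
    dtype, rather than hardcoding torch.float16 as in the reported code."""
    # Only recast floating-point tensors whose dtype differs from the model's;
    # integer inputs (e.g. token IDs) are passed through unchanged.
    if data.is_floating_point() and data.dtype != model_dtype:
        return data.to(model_dtype)
    return data


# Example: a float32 image tensor fed to a bfloat16 model is cast to bfloat16.
pixels = torch.randn(2, 3, dtype=torch.float32)
converted = convert_multi_modal_data(pixels, torch.bfloat16)
```

With the hardcoded `to(torch.float16)` in `llm.py`, a bfloat16 model would instead receive float16 inputs, which is the mismatch this issue describes.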

@caiqi caiqi added the bug Something isn't working label Apr 13, 2024
Member

DarkLight1337 commented Apr 16, 2024

I have refactored the processing logic in #3978, which removes the assumption of a float16 data type. Could you suggest a test case to determine whether the conversion problem still exists?
