
Is vllm willing to support models based on other AI frameworks, such as mindspore? #11507

Closed
wxsIcey opened this issue Dec 26, 2024 · 3 comments
Labels
usage How to use vllm

Comments


wxsIcey commented Dec 26, 2024

Hello,

We are MindSpore eco-developers! MindSpore is a new open-source deep learning training/inference framework that can be used in mobile, edge, and cloud scenarios. (You can find a more detailed introduction to MindSpore at https://www.mindspore.cn/en.)

We want to add a MindSpore backend to vllm to expand the MindSpore ecosystem and, at the same time, make the vllm ecosystem better.

Our team has several software engineers with years of development experience, and we are supported by the team of MindSpore framework developers.

We would like to know whether your community would accept models based on MindSpore. If so, we can discuss the technical details further.

@wxsIcey wxsIcey added the usage How to use vllm label Dec 26, 2024
@WangErXiao (Contributor)

If vllm supported these frameworks, vllm wouldn't go far. (Translated from Chinese.)

@ywang96 (Member)

ywang96 commented Dec 27, 2024

Personally, I think it makes more sense for support for alternative frameworks to live in a fork of vLLM. For now, the scope of vLLM is still limited to PyTorch, and we don't have plans to expand to other frameworks.

#3620, which you mentioned, adds support for TPUs via PyTorch XLA rather than supporting JAX directly, and generally speaking, supporting different hardware is a very different story from supporting other DL frameworks.

@wxsIcey wxsIcey changed the title from "[Usage]: Is vllm willing to support models based on other AI frameworks, such as mindspore, paddlepaddle" (originally in Chinese) to "Is vllm willing to support models based on other AI frameworks, such as mindspore?" Dec 27, 2024
@wxsIcey (Author)
Author

wxsIcey commented Dec 27, 2024

> Personally, I think it makes more sense for support for alternative frameworks to live in a fork of vLLM. For now, the scope of vLLM is still limited to PyTorch, and we don't have plans to expand to other frameworks.
>
> #3620, which you mentioned, adds support for TPUs via PyTorch XLA rather than supporting JAX directly, and generally speaking, supporting different hardware is a very different story from supporting other DL frameworks.

Thank you very much for your patient reply. We will continue to follow the technical development of vllm.

@wxsIcey wxsIcey closed this as completed Dec 27, 2024
3 participants