Replies: 3 comments
-
I followed the steps too, but I didn't encounter the issue while building. Instead, I encountered the same issue when I tried to run a model.
It's an Ubuntu machine with an AMD CPU.
-
Hi @spaceater, thanks for validating. Yes, my bad, the issue was when running my Python script. Did you enable prefix caching? I realized the error goes away if I remove that line, which works for me for now, but I would like to cache the prompt, especially for batch processing, to improve throughput.
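For context, the prefix-caching line being discussed is presumably something like the one in this minimal sketch; the model name is a placeholder, not the one from the original script:

```python
from vllm import LLM, SamplingParams

# Minimal sketch; the model name is a placeholder, not the actual model being used.
llm = LLM(
    model="meta-llama/Llama-3.1-8B-Instruct",
    enable_prefix_caching=True,  # removing this argument is the workaround described above
)

# Prompts sharing a common prefix are what benefit from prefix caching in batch processing.
outputs = llm.generate(
    ["Summarize the following report: report A", "Summarize the following report: report B"],
    SamplingParams(max_tokens=64),
)
```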
-
I first used pip to install intel_extension_for_pytorch, but it didn't work, so I just rebuilt the whole environment. Then I figured out that only version 2.5.0 of intel_extension_for_pytorch was usable in the current environment, and that solved the problem.
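In case it helps anyone hitting the same mismatch: after pinning the package (for example with `pip install intel-extension-for-pytorch==2.5.0`), a quick sanity check of the installed versions looks something like this sketch:

```python
import torch
import intel_extension_for_pytorch as ipex

# The two versions should line up (e.g. torch 2.5.x with intel_extension_for_pytorch 2.5.0);
# a mismatch here is a common cause of import/runtime errors like the one discussed above.
print("torch:", torch.__version__)
print("intel_extension_for_pytorch:", ipex.__version__)
```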
-
Hi, I followed the steps on https://docs.vllm.ai/en/latest/getting_started/installation/cpu/index.html?device=arm#build-wheel-from-source, but encountered the following error.
How can I avoid building PyTorch in a way that requires intel_extension_for_pytorch?