
Why does it always complete the prompt before answering? #1404

Closed
XzhWei opened this issue Oct 18, 2023 · 4 comments

Comments


XzhWei commented Oct 18, 2023

For example, given the input 开发票 ("issue an invoice"), the output is 后,电子发票多久能收到?\n\n回答:xxxx (", how long until I receive the electronic invoice?\n\nAnswer: xxxx"). That is, the model first continues the prompt and only then produces an answer.

yanxiyue (Contributor) commented

This situation is typically related to the model you're using.
It may require a specific concatenation format for the input.
You can refer to the implementation in https://github.com/lm-sys/FastChat/blob/main/fastchat/conversation.py to find commonly used concatenation templates.
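
For reference, here is a minimal sketch of building the prompt with a FastChat conversation template before handing it to vLLM for offline generation. The template name "baichuan2-chat" and the model path are assumptions; check fastchat/conversation.py for the name actually registered for your model.

```python
# Minimal sketch: build a Baichuan2-style prompt with a FastChat conversation
# template, then generate with vLLM offline.
# The template name "baichuan2-chat" and the model path are assumptions.
from fastchat.conversation import get_conv_template
from vllm import LLM, SamplingParams

conv = get_conv_template("baichuan2-chat")   # assumed template name
conv.append_message(conv.roles[0], "开发票")  # user turn
conv.append_message(conv.roles[1], None)     # empty assistant turn to be completed
prompt = conv.get_prompt()                   # concatenated prompt string

llm = LLM(model="baichuan-inc/Baichuan2-13B-Chat", trust_remote_code=True)
outputs = llm.generate([prompt], SamplingParams(temperature=0.3, max_tokens=256))
print(outputs[0].outputs[0].text)
```

Because get_prompt() returns a plain string, this also sidesteps the "should be a string" ValueError discussed below.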


XzhWei commented Oct 18, 2023


Yes, I am using Baichuan2-13B-Chat. I tried passing the input as a chat-style message like {"role": "user", "content": prompt}, but it raised: ValueError: Input {'role': 'user', 'content': 'xxx'} is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers.

Harrison-cc commented
When using vllm.entrypoints.openai.api_server as the server, and messages is a list, each message's role can only be one of [system, user, assistant]. You can find this in https://github.com/vllm-project/vllm/blob/main/vllm/entrypoints/openai/api_server.py
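
For reference, a minimal sketch of calling the OpenAI-compatible chat endpoint exposed by vllm.entrypoints.openai.api_server, which applies the chat template on the server side. The host, port, and model name are assumptions; adjust them to your deployment.

```python
# Minimal sketch: query the OpenAI-compatible chat endpoint started with
#   python -m vllm.entrypoints.openai.api_server \
#       --model baichuan-inc/Baichuan2-13B-Chat --trust-remote-code
# Host, port, and model name are assumptions -- adjust to your deployment.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "baichuan-inc/Baichuan2-13B-Chat",
        "messages": [
            # each message's role must be one of "system", "user", "assistant"
            {"role": "user", "content": "开发票"},
        ],
        "temperature": 0.3,
        "max_tokens": 256,
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```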


XzhWei commented Oct 18, 2023

Thanks! I found the answer in issue #1022, and it works!

hmellor closed this as completed Mar 13, 2024