Describe the need of your request
Right now, when I connect to a Custom: OpenAI API provider that serves DeepSeek R1 models (for example deepseek-r1-distill-qwen-32b), the response starts with a <think> block, and when I use the ✨Edit Code menu this block is inserted at the cursor before the actual code response is delivered.
Example:
Request:
Result:
Proposed solution
The CodeGPT UI should be able to ignore the <think> part of the response (or perhaps display it somewhere else) and only apply the actual code response from the model.
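A minimal sketch of what such filtering could look like, assuming the reasoning output is always wrapped in a single literal <think>...</think> pair at the start of the response (the function name and regex here are illustrative, not CodeGPT's actual implementation):

```python
import re

# Matches the first <think>...</think> block, including its trailing
# whitespace; DOTALL lets "." span the multi-line reasoning text.
THINK_BLOCK = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_reasoning(response: str) -> str:
    """Drop the leading <think> reasoning block emitted by DeepSeek R1
    models, leaving only the actual answer to insert at the cursor."""
    return THINK_BLOCK.sub("", response, count=1).lstrip()
```

For streamed responses the filter would instead need to buffer tokens until the closing </think> tag has been seen, but the same idea applies.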
Additional context
The code responses from reasoning models are pretty good, and since the Edit Code menu is very useful and easy to use, this feature seems essential to me.