fixed no max output tokens #175
Conversation
Nice spot. Would you add a line to … that validates everything is covered?
Not sure if that's what you meant, but I've extended …
src/MicrosoftAi/AbstractionMapper.cs (outdated)
TryAddOllamaOption<bool?>(options, OllamaOption.UseMmap, v => request.Options.UseMmap = (bool?)v);
TryAddOllamaOption<bool?>(options, OllamaOption.VocabOnly, v => request.Options.VocabOnly = (bool?)v);
}
if (!(options?.AdditionalProperties?.Any() ?? false)) return request;
I understand you prefer jumping out here. I am okay with that, but the combination of the negation and the null-coalescing operator is not that easy to recognize at first glance. Also, I personally don't like having the return on the same line.
I would prefer something like this, what do you think?
var hasAdditionalProperties = options?.AdditionalProperties?.Any() ?? false;
if (!hasAdditionalProperties)
    return request;
I totally agree with you! Fixed!
Thanks, Jakub!
Hey, thanks a lot for creating this project! I think I found a bug related to the missing MaxOutputTokens field mapping; I fixed it both in the code and in the unit test.
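Roughly, the mapping looks along these lines (a sketch only, not the literal diff; the target property NumPredict, i.e. Ollama's num_predict option, is an assumption on my part):

// Sketch: forward Microsoft.Extensions.AI's ChatOptions.MaxOutputTokens to
// Ollama's num_predict request option (assumed to be RequestOptions.NumPredict).
if (options?.MaxOutputTokens is not null)
{
    request.Options ??= new RequestOptions();
    request.Options.NumPredict = options.MaxOutputTokens;
}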
I think I found another one, but it's within the Semantic Kernel connector library. The OllamaPromptExecutionSettings class doesn't contain such a setting, so you cannot send max output tokens; I will try to report it to them.