Releases: awaescher/OllamaSharp
Release 3.0.3
Update README.md
Release 3.0.2
Improved tool chat demos
Release 3.0.1
Fixed tool chat demo and added more unit tests
Release 3.0.0
As announced in discussion #63, this release went through major code changes (see pull request #64) and drops a lot of unnecessary code.
OllamaSharp now uses the `IAsyncEnumerable` syntax. The following sections list the breaking changes from version 2 to 3.
Breaking changes
Methods on `IOllamaApiClient`

- `Chat()` now returns `IAsyncEnumerable<ChatResponseStream?>` (prefer the `Chat` class to build interactive chats)
- `SendChat()` / `StreamChat()` (sync/async) were removed in favor of the new `Chat()` method
- `CreateModel()` overload accepting streaming callbacks was removed, use the `IAsyncEnumerable<>` syntax
- `PullModel()` overload accepting streaming callbacks was removed, use the `IAsyncEnumerable<>` syntax
- `PushModel()` overload accepting streaming callbacks was removed, use the `IAsyncEnumerable<>` syntax
- `GenerateEmbeddings()` is now `Embed()` to follow the Ollama API naming
- `ShowModelInformation()` is now `ShowModel()` to follow the Ollama API naming
- `StreamCompletion()` is now `Generate()` to follow the Ollama API naming
- `GetCompletion()` (sync) was removed in favor of the `Generate()` method
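The removed callback overloads map onto the new `IAsyncEnumerable<>` pattern. A minimal sketch of what the version 3 calls might look like (the endpoint, model name, and exact response property names such as `Percent`, `Status`, and `Response` are assumptions for illustration):

```csharp
using OllamaSharp;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));
ollama.SelectedModel = "llama3";

// PullModel() no longer takes a progress callback; it streams progress items
await foreach (var status in ollama.PullModel("llama3"))
    Console.WriteLine($"{status?.Percent}% {status?.Status}");

// StreamCompletion() is now Generate() and streams its tokens the same way
await foreach (var token in ollama.Generate("Why is the sky blue?"))
    Console.Write(token?.Response);
```
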
Chat class

- `Send()` & `SendAs()` overloads accepting streaming callbacks were removed, use the `IAsyncEnumerable<>` syntax
- Constructor cannot use streamer callbacks anymore, as `Send()` & `SendAs()` stream directly
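In practice, the `Chat` class is now constructed without a streamer callback and its `Send()` method streams directly. A minimal sketch, assuming a local Ollama server at the default endpoint and a placeholder model name:

```csharp
using OllamaSharp;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));
ollama.SelectedModel = "llama3";

// the constructor no longer accepts a streamer callback
var chat = new Chat(ollama);

// Send() streams its answer tokens directly
await foreach (var answerToken in chat.Send("How are you today?"))
    Console.Write(answerToken);
```
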
Classes

- `GenerateEmbeddingRequest` → `EmbedRequest`
- `GenerateEmbeddingResponse` → `EmbedResponse`
- `GenerateCompletionRequest` → `GenerateRequest`
- `GenerateCompletionResponseStream` → `GenerateResponseStream`
- `GenerateCompletionDoneResponseStream` → `GenerateDoneResponseStream`
- `ChatResponse` was removed as it was only used by the old signature of the `Chat()` method
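The renamed embedding classes pair with the renamed `Embed()` method. A minimal sketch of the new names in use (the property names `Model`, `Input`, and `Embeddings` are assumptions for illustration, as is the model name):

```csharp
using OllamaSharp;
using OllamaSharp.Models;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));

// GenerateEmbeddingRequest is now EmbedRequest,
// GenerateEmbeddingResponse is now EmbedResponse
var request = new EmbedRequest
{
    Model = "all-minilm",
    Input = new List<string> { "hello world" }
};

EmbedResponse response = await ollama.Embed(request);
Console.WriteLine(response.Embeddings?.Count);
```
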
Streaming callbacks
A special note on the streaming callbacks that have been removed. Before version 3, it was possible to do something like this:

```csharp
var answer = await chat.Send("hi!", r => Console.Write(r.Message));
```
Since version 3, all the method overloads accepting these streaming callbacks have been removed in favor of the `IAsyncEnumerable` syntax:

```csharp
await foreach (var answerToken in chat.Send("hi!"))
    Console.Write(answerToken);
```
The second approach, the `IAsyncEnumerable` syntax, is the modern one and easier to read. However, one tiny detail is missing: you can no longer stream the responses and get the whole answer as a result at the same time.
Theoretically, you would have to do something like this if you wanted the whole result value:

```csharp
var builder = new StringBuilder();

await foreach (var answerToken in chat.Send("hi!"))
    builder.Append(answerToken);

Console.WriteLine(builder.ToString());
```
StreamToEnd()
To make this easier, OllamaSharp provides an extension method for `IAsyncEnumerable` that streams its items into a single result value: `StreamToEnd()`.
With this, you can simply stream the responses into a single value:

```csharp
var answer = await chat.Send("hi!").StreamToEnd();
```

If you still want to do something with each item in the `IAsyncEnumerable`, you can even pass an item callback to the method:

```csharp
var answer = await chat.Send("hi!").StreamToEnd(token => { ... });
```
This way, the `OllamaApiClient` can stream its results as `IAsyncEnumerable`, and the stream processing is done in an extension method instead of dozens of confusing method overloads.
Release 2.1.3
Improve console demos
Release 2.1.2
Added API to set the message history
Release 2.1.1
This release adds improved tool support for Ollama and extends the API console with a tool chat demo.
Release 2.0.16
Fix readme.md (fixes #55)
Release 2.0.15
Merge pull request #60 from milbk/main: Update Embedding for /api/embed
Release 2.0.14
Merge pull request #59 from milbk/main