Releases · kevinthedang/discord-ollama
v0.3.4
What's Changed
- Shutoff Discord Bot by @kevinthedang in #30
- Workflows Fix by @kevinthedang in #32
Notes
- Shutoff command added, and toggle-chat is now fully implemented as well!
- Fix for the Node-Build job coming in the next release
- Build shield will be added to the README in the next release
Issues
- Reviewed the workflows and found that Node-Build is not running because it was never added to that job... it will be added in the next release.
- As in v0.3.3, the team plans to continue work on open issues.
Full Changelog: v0.3.3...v0.3.4
v0.3.3
What's Changed
- Auto-Generate Config by @kevinthedang in #29
Notes
- As in v0.3.2, the team plans to continue work on open issues.
Full Changelog: v0.3.2...v0.3.3
v0.3.2
What's Changed
- CI for Application Builds by @kevinthedang in #27
Notes
- Workflows are currently CI-only; more Docker- and CD-related workflows will be added in the future.
- As in v0.3.1, the team plans to continue work on open issues.
Full Changelog: v0.3.1...v0.3.2
v0.3.1
What's Changed
- Nvidia Container Toolkit by @kevinthedang in #26
Update
- Nvidia Container Toolkit works! WSL is just finicky with the NCT installation; any normal Linux machine should install it correctly.
Notes
- Local machines using WSL must comment out the `runtime` and `devices` entries in `docker-compose.yml` for this to work.
- As in v0.3.0, the team plans to continue work on open issues.
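For reference, the entries WSL users would comment out likely look something like the following. This is a hypothetical excerpt, not the repository's actual `docker-compose.yml`; the service name and image are assumptions.

```yaml
# Hypothetical docker-compose.yml excerpt -- the real file may differ.
services:
  ollama:
    image: ollama/ollama
    # WSL users: comment out the runtime and devices sections below
    # if the Nvidia Container Toolkit installation is misbehaving.
    runtime: nvidia
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```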
Full Changelog: v0.3.0...v0.3.1
v0.3.0
What's Changed
- Small Documentation and Refactoring by @kevinthedang in #18
- User Preferences and Setup Docs by @kevinthedang in #20
New Issue: #21
Notes
- The local and docker versions are working fine!
- Still investigating the `nvidia` runtime issue that enables Nvidia GPU support for Docker containers.
- This should not be an issue for local machines.
- The team plans to continue work on open issues.
Screenshots
Nvidia runtime (Local Machine/Non-container):
Full Changelog: v0.2.0...v0.3.0
v0.2.0
What's Changed
- Docker Container Setup by @kevinthedang in #15
- Docker Setup Instructions by @kevinthedang in #16
Notes
- With the app containerized, we hope to improve Ollama's performance through Nvidia GPU utilization.
- Docker may not work properly on Windows servers/devices without Docker Desktop.
Full Changelog: v0.1.4...v0.2.0
v0.1.4
What's Changed
- Formatting and Contributing by @kevinthedang in #11
Notes
- A slash command skeleton has been created so that other slash commands can be built on top of it (issue #10)
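The skeleton idea can be sketched as a small command registry: each command implements one interface and registers itself, so adding a new slash command is just one more entry. This is an illustrative sketch with hypothetical names, not the bot's actual code.

```typescript
// Illustrative slash-command skeleton -- names are hypothetical,
// not the bot's actual implementation.
interface SlashCommand {
  name: string;
  description: string;
  execute: (input: string) => string;
}

const registry = new Map<string, SlashCommand>();

// New commands only need to implement SlashCommand and register themselves.
function registerCommand(cmd: SlashCommand): void {
  registry.set(cmd.name, cmd);
}

// Dispatch an incoming interaction to the matching command.
function runCommand(name: string, input: string): string {
  const cmd = registry.get(name);
  if (!cmd) return `Unknown command: ${name}`;
  return cmd.execute(input);
}

registerCommand({
  name: "shutoff",
  description: "Shut the bot down",
  execute: () => "Shutting down...",
});

console.log(runCommand("shutoff", "")); // prints "Shutting down..."
```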
Full Changelog: v0.1.3...v0.1.4
v0.1.2
Added
- Embedded Messages through the bot
- Data stream parser that will parse a stream response as if the model was talking in real time (sorta)
- The `ollama-js` library works now since the developers added ESM compatibility
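The stream parser described above can be sketched as follows: Ollama streams newline-delimited JSON chunks, each carrying a piece of the reply, so appending them as they arrive makes the bot appear to type in real time. The helper below is a hypothetical sketch, not the bot's actual parser; the chunk field names mirror Ollama's chat-stream responses.

```typescript
// Hypothetical stream-parser sketch -- not the bot's actual implementation.
// Each streamed line is a JSON chunk holding a fragment of the reply.
interface ChatChunk {
  message: { role: string; content: string };
  done: boolean;
}

function parseStream(raw: string): string {
  let reply = "";
  for (const line of raw.split("\n")) {
    if (!line.trim()) continue;          // skip blank lines between chunks
    const chunk = JSON.parse(line) as ChatChunk;
    reply += chunk.message.content;      // accumulate partial tokens
    if (chunk.done) break;               // final chunk signals end of stream
  }
  return reply;
}

const sample =
  '{"message":{"role":"assistant","content":"Hel"},"done":false}\n' +
  '{"message":{"role":"assistant","content":"lo!"},"done":true}\n';
console.log(parseStream(sample)); // prints "Hello!"
```

In the real bot this would run per network chunk (editing the Discord message as tokens arrive) rather than on a complete string, but the accumulation logic is the same.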
Notes
- Redid the release tag so it no longer points to a non-existent commit. This happened because of an amend made after tagging on `master`
Full Changelog: v0.1.1...v0.1.2
v0.1.3
Notes
- Now that mentions can be used to interact with the bot, we need to add the following features
- Implement user preferences for whether or not to use mentions
- Splice out the uid from the query
Full Changelog: v0.1.2...v0.1.3
v0.1.1
Notes
- Refer to PR #6 for Persistence changes
- Persistence is only working on `/api/chat` for Ollama
- The next update will include `/api/generate` if it is possible with embedding
- Streaming the model's response will be coming as well!
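The persistence approach can be sketched simply: keep the running message history and resend the whole list to `/api/chat` on every turn, since that endpoint accepts full conversation context. The helper below is illustrative (its name and the model choice are assumptions), though the `model`/`messages`/`stream` fields match the `/api/chat` request shape.

```typescript
// Illustrative persistence sketch -- helper names are hypothetical.
// /api/chat accepts the full message history, so "memory" is just
// resending every prior turn along with the new prompt.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build an /api/chat request body from the running history plus a new prompt.
function buildChatRequest(history: ChatMessage[], prompt: string) {
  const messages = [...history, { role: "user" as const, content: prompt }];
  return { model: "llama2", messages, stream: true };
}

const history: ChatMessage[] = [
  { role: "user", content: "Hi" },
  { role: "assistant", content: "Hello! How can I help?" },
];
const body = buildChatRequest(history, "Tell me a joke");
console.log(body.messages.length); // prints 3
```

`/api/generate`, by contrast, is a single-prompt endpoint, which is why carrying context there would need a different mechanism such as embedding.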
Full Changelog: v0.1.0...v0.1.1