feat(blocks): Add Open Router integration with a large selection of new models #8653
Conversation
- Added support for Open Router integration credentials in the Supabase integration credentials store.
- Updated the LLM provider field to include "open_router" as a valid provider option.
- Added an Open Router API key field to the backend settings.
- Updated the profile page to display the Open Router integration credentials.
- Updated the credentials input and provider components to include Open Router as a provider option.
- Updated the autogpt-server-api types to include "open_router" as a provider name.
- Updated the LLM provider schema to include "open_router" as a valid provider name.
- Added GEMINI_FLASH_1_5_8B as the first Open Router LLM.
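The provider-registration changes above can be sketched as a string-enum extension plus validation (a minimal illustration; the actual AutoGPT provider types and schema differ, and the other provider names shown are illustrative):

```python
from enum import Enum

class ProviderName(str, Enum):
    # Illustrative subset of existing providers, not the full list.
    OPENAI = "openai"
    ANTHROPIC = "anthropic"
    OPEN_ROUTER = "open_router"  # new provider name added by this PR

def parse_provider(name: str) -> ProviderName:
    """Validate a provider string against the known provider names."""
    try:
        return ProviderName(name)
    except ValueError:
        raise ValueError(f"Unsupported LLM provider: {name!r}")
```

With this pattern, any credentials store or schema that validates against `ProviderName` accepts "open_router" automatically.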
This PR targets a branch other than `dev`. Automatically setting the base branch to `dev`.
✅ Deploy Preview for auto-gpt-docs canceled.
The code looks good and appears to touch all the relevant places. Still needs testing on my side before approval.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@            Coverage Diff            @@
##              dev    #8653       +/- ##
===========================================
+ Coverage   34.07%   58.16%   +24.09%
===========================================
  Files          22      106       +84
  Lines        1893     5766     +3873
  Branches      330      720      +390
===========================================
+ Hits          645     3354     +2709
- Misses       1234     2306     +1072
- Partials       14      106       +92

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Added Perplexity here too :) Very useful for live-info Q&A and getting sources. For the sources specifically we'll need to add another block which outputs them, or dynamic output pins when we have that feature.
One small question on extra headers, other than that let's go!
This PR adds support for Open Router as an LLM provider. The platform lets us easily support any LLM without further integration effort.
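Open Router exposes an OpenAI-compatible chat completions API, which is what makes new models cheap to support. A hedged sketch of how a request to it could be assembled (the helper name and model slug are illustrative, not from this PR; only the base URL and payload shape follow the OpenAI-style schema):

```python
# Illustrative request builder for Open Router's OpenAI-compatible API.
# Builds the request without sending it, so it can be inspected or tested
# offline. The function name and defaults are hypothetical.
OPEN_ROUTER_BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(model: str, prompt: str, api_key: str):
    """Return (url, headers, payload) for a chat completion call."""
    url = f"{OPEN_ROUTER_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, payload
```

Because the payload mirrors the OpenAI schema, an existing OpenAI client path can typically be reused by swapping the base URL, API key, and model identifier.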
Changes:
New LLMs in this PR:
Thanks @OpenRouterTeam <3