
Allow OpenAI to be used with Terminal Chat #17540

Merged: 42 commits merged into feature/llm from dev/pabhoj/featurellm_openai on Oct 28, 2024

Conversation

@PankajBhojwani (Contributor) commented Jul 9, 2024

  • Implements OpenAILLMProvider, an implementation of ILMProvider that uses OpenAI
  • Implements an AIConfig on the settings model side, allowing the user to specify which AI provider to use as their current active one (in case they have configured more than one LMProvider)

The OpenAI implementation is largely the same as the Azure OpenAI one. The newer change in this PR is the AIConfig struct on the settings model side, which lets the user specify which provider is the active one, along with the logic in TerminalPage for updating the current active provider when settings change.
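The described settings-model change might look roughly like the following standalone sketch. It uses std::optional in place of the real TerminalSettingsModel/WinRT machinery; AIConfig and LLMProvider follow the PR's naming, but everything else here is a hypothetical stand-in, not the actual implementation.

```cpp
#include <cassert>
#include <optional>

// Hypothetical sketch of the AIConfig "active provider" setting described
// above. The real code lives in the TerminalSettingsModel project; this
// version uses plain std:: types so it compiles on its own.
enum class LLMProvider
{
    None, // explicit zero value, so "unset" does not alias a real provider
    AzureOpenAI,
    OpenAI,
};

class AIConfig
{
public:
    // Returns the explicitly configured provider, or None if unset.
    LLMProvider ActiveProvider() const
    {
        if (_activeProvider)
        {
            return *_activeProvider;
        }
        return LLMProvider::None;
    }

    // Records the user's choice of active provider.
    void ActiveProvider(LLMProvider provider)
    {
        _activeProvider = provider;
    }

private:
    std::optional<LLMProvider> _activeProvider;
};
```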

Validation Steps Performed

  • Able to set OpenAI as the active provider
  • OpenAI works in Terminal Chat

@@ -56,6 +56,9 @@
<ClInclude Include="AzureLLMProvider.h">
<DependentUpon>AzureLLMProvider.idl</DependentUpon>
</ClInclude>
<ClInclude Include="OpenAILLMProvider.h">

Check failure (Code scanning / check-spelling): "AILLM" is not a recognized word (unrecognized-spelling)

_NotifyChanges(L"IsOpenAIKeySet");
}

bool AISettingsViewModel::AzureOpenAIIsActive()

Check failure (Code scanning / check-spelling): "AIIs" is not a recognized word (unrecognized-spelling)

@@ -0,0 +1,129 @@
// Copyright (c) Microsoft Corporation.

Check failure (Code scanning / check-spelling): "AILLM" is not a recognized word in the file path (check-file-path)

@@ -0,0 +1,48 @@
// Copyright (c) Microsoft Corporation.

Check failure (Code scanning / check-spelling): "AILLM" is not a recognized word in the file path (check-file-path)

@@ -0,0 +1,17 @@
// Copyright (c) Microsoft Corporation.

Check failure (Code scanning / check-spelling): "AILLM" is not a recognized word in the file path (check-file-path)

@@ -96,6 +102,9 @@
<Midl Include="AzureLLMProvider.idl">
<SubType>Code</SubType>
</Midl>
<Midl Include="OpenAILLMProvider.idl">

Check failure (Code scanning / check-spelling): "AILLM" is not a recognized word (unrecognized-spelling)

</Button>
<Button Click="SetAzureOpenAIActive_Click"
HorizontalAlignment="Center"
Visibility="{x:Bind mtu:Converters.InvertedBooleanToVisibility(ViewModel.AzureOpenAIIsActive), Mode=OneWay}">

Check failure (Code scanning / check-spelling): "AIIs" is not a recognized word (unrecognized-spelling)

</Button>
<Button Click="SetOpenAIActive_Click"
HorizontalAlignment="Center"
Visibility="{x:Bind mtu:Converters.InvertedBooleanToVisibility(ViewModel.OpenAIIsActive), Mode=OneWay}">

Check failure (Code scanning / check-spelling): "AIIs" is not a recognized word (unrecognized-spelling)

void AISettingsViewModel::SetAzureOpenAIActive()
{
_Settings.GlobalSettings().AIInfo().ActiveProvider(Model::LLMProvider::AzureOpenAI);
_NotifyChanges(L"AzureOpenAIIsActive");

Check failure (Code scanning / check-spelling): "AIIs" is not a recognized word (unrecognized-spelling)

{
_Settings.GlobalSettings().AIInfo().ActiveProvider(Model::LLMProvider::AzureOpenAI);
_NotifyChanges(L"AzureOpenAIIsActive");
_NotifyChanges(L"OpenAIIsActive");

Check failure (Code scanning / check-spelling): "AIIs" is not a recognized word (unrecognized-spelling)

Comment on lines 180 to 185
const auto val{ _getActiveProviderImpl() };
if (val)
{
// an active provider was explicitly set, return that
return *val;
}
Contributor Author (@PankajBhojwani):

This does mean that you can set an active provider even if you have not configured that provider; in that case, when you try to use the Chat, the provider will return the appropriate error. For example, the OpenAI provider will still make the HTTP request, and we get back an error message that no API key was provided.
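The fallback behavior in the snippet above can be sketched as follows. This is a minimal, hypothetical stand-in (the Settings struct and the key flags are illustrative, not the real settings-model code): an explicitly set provider wins even if unconfigured, and only otherwise do the configured credentials decide.

```cpp
#include <cassert>
#include <optional>

enum class LLMProvider
{
    None,
    AzureOpenAI,
    OpenAI,
};

// Hypothetical stand-in for the settings model's active-provider lookup.
struct Settings
{
    std::optional<LLMProvider> explicitProvider; // analog of _getActiveProviderImpl()
    bool azureKeySet = false;
    bool openAIKeySet = false;

    LLMProvider ActiveProvider() const
    {
        if (const auto val = explicitProvider)
        {
            // An active provider was explicitly set; return it unchecked.
            // Misconfiguration surfaces later as an error from the provider
            // itself (e.g. OpenAI rejecting a request with no API key).
            return *val;
        }
        // Otherwise fall back to whichever provider has credentials.
        if (azureKeySet)
        {
            return LLMProvider::AzureOpenAI;
        }
        if (openAIKeySet)
        {
            return LLMProvider::OpenAI;
        }
        return LLMProvider::None;
    }
};
```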

@PankajBhojwani PankajBhojwani marked this pull request as ready for review July 9, 2024 23:54

@zadjii-msft zadjii-msft added this to the Terminal v1.23 milestone Aug 22, 2024
Base automatically changed from dev/pabhoj/llm_provider_interface to feature/llm September 6, 2024 03:01
@PankajBhojwani PankajBhojwani mentioned this pull request Oct 11, 2024
@lhecker (Member) left a comment:

Biggest concern: The strong-this issue. Otherwise, it seems fine - just some smaller issues.
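The "strong-this" concern is about callback lifetime in async code. A hedged sketch of the idea, using the std::enable_shared_from_this analog (real C++/WinRT code would use get_strong()/get_weak(); the Provider type here is hypothetical): a callback that captures a strong reference keeps its object alive until the callback itself is released.

```cpp
#include <cassert>
#include <functional>
#include <memory>

// Hypothetical provider type illustrating the strong-this pattern.
struct Provider : std::enable_shared_from_this<Provider>
{
    int completed = 0;

    std::function<void()> MakeCallback()
    {
        // Capture a strong reference: the Provider stays alive for as long
        // as the callback does, even if the original owner releases it
        // before the async work finishes.
        return [strongThis = shared_from_this()] { strongThis->completed++; };
    }
};
```

Without the strong capture, the lambda's raw `this` could dangle if the owner destroyed the Provider while the async operation was still in flight.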

Review threads:
  • src/cascadia/QueryExtension/OpenAILLMProvider.cpp (outdated, resolved)
  • src/cascadia/QueryExtension/OpenAILLMProvider.cpp (outdated, resolved)
  • src/cascadia/TerminalApp/TerminalPage.cpp (resolved)
  • src/cascadia/TerminalSettingsEditor/AISettings.cpp (outdated, resolved)
  • src/cascadia/TerminalSettingsModel/AIConfig.cpp (outdated, resolved)
{
enum LLMProvider
{
AzureOpenAI,
Member:

you need a None option to return in case none are set - otherwise, the zero value is AzureOpenAI

Contributor Author:

Yeah, I realized this in #18095 and added it there; I'll bring that change forward to here.

}
else
{
return LLMProvider{};
Member:

(The note about having a zero value applies here!)
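The zero-value hazard the reviewers describe can be demonstrated directly. A minimal sketch with hypothetical enum names (WithoutNone, WithNone, standing in for the two shapes of LLMProvider): value-initializing an enum yields its underlying zero value, so without an explicit None enumerator, `LLMProvider{}` silently means AzureOpenAI.

```cpp
#include <cassert>

// Without a None enumerator, the zero value is the first real provider.
enum class WithoutNone
{
    AzureOpenAI, // underlying value 0: this is what WithoutNone{} yields
    OpenAI,
};

// With a None enumerator first, value-initialization means "unset".
enum class WithNone
{
    None, // underlying value 0: WithNone{} yields this
    AzureOpenAI,
    OpenAI,
};
```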

@PankajBhojwani PankajBhojwani merged commit 5c7ba82 into feature/llm Oct 28, 2024
20 checks passed
@PankajBhojwani PankajBhojwani deleted the dev/pabhoj/featurellm_openai branch October 28, 2024 19:34

6 participants