
Create an ILMProvider interface and have our current implementation use it #17394

Merged — 23 commits merged into feature/llm on Sep 6, 2024

Conversation

@PankajBhojwani (Contributor) commented Jun 7, 2024

Summary of the Pull Request

  • Creates an ILMProvider interface
  • The current implementation that supports Azure OpenAI now uses this interface
  • Separates the code that handles the conversation with the AI from the code that handles the UI
  • TerminalPage is now responsible for initializing an LMProvider and passing that into ExtensionPalette upon initialization

Validation Steps Performed

Everything still works

PR Checklist

  • Closes #xxx
  • Tests added/passed
  • Documentation updated
    • If checked, please file a pull request on our docs repo and link it here: #xxx
  • Schema updated (if necessary)

src/cascadia/QueryExtension/ILLMProvider.idl (2 fixed annotations)

@PankajBhojwani marked this pull request as ready for review June 7, 2024 23:46
@zadjii-msft (Member) left a comment:

Okay I'm not gonna block over any of these, since this is clearly an in-progress commit on the way to supporting other providers. But I do feel like we may have one too many layers of abstraction. (maybe it's because I haven't seen the other provider implementations yet)

src/cascadia/QueryExtension/ILLMProvider.idl (outdated, resolved)
src/cascadia/QueryExtension/ExtensionPalette.idl (outdated, resolved)
src/cascadia/QueryExtension/AzureLLMProvider.h (outdated, resolved)
src/cascadia/QueryExtension/ExtensionPalette.cpp (outdated, resolved)

- // If the AI key and endpoint are still empty, tell the user to fill them out in settings
- if (_AIKey.empty() || _AIEndpoint.empty())
+ if (_llmProvider)
Member:

Yea this does feel bodgy currently, but there's only one provider for now so it seems totally fine. When there are other providers, it seems like it'd make more sense to have separate errors for "you haven't set up any LLM backend" (generic, from the Extension palette itself) vs "you didn't set up an API key" (from the individual providers)

@PankajBhojwani marked this pull request as draft June 25, 2024 18:16
@PankajBhojwani changed the title from "Create an ILLMProvider interface and have our current implementation use it" to "Create an ILMProvider interface and have our current implementation use it" Jul 9, 2024
@PankajBhojwani marked this pull request as ready for review July 9, 2024 23:38


@lhecker (Member) left a comment:

Approving since it's pretty much just moving stuff. The bugs I found also exist in the old version. 😅

src/cascadia/QueryExtension/AzureLLMProvider.h (outdated, resolved)
// Send the request
try
{
const auto response = _httpClient.SendRequestAsync(request).get();
Member:

FWIW all these get()s could use co_await in the future instead. That avoids the resume_background hassle.

src/cascadia/QueryExtension/AzureLLMProvider.cpp (outdated, resolved)
src/cascadia/QueryExtension/ExtensionPalette.cpp (outdated, resolved)

@@ -57,6 +59,7 @@ namespace Microsoft.Terminal.Settings.Model

String AIEndpoint;
String AIKey;
static event AzureOpenAISettingChangedHandler AzureOpenAISettingChanged;
@PankajBhojwani (author):

This is going to be moved into the new AIConfig struct in the follow-up PR that introduces OpenAI.

Member:

A static event? That's weird to me, but okay. It's moving soon so I'll look at it there

@PankajBhojwani (author) commented Jul 23, 2024:

> A static event

Yep, that's because these strings aren't actually saved in our settings; they're saved in the credential manager. Which means that if there are multiple instances of terminal open (each with their own TerminalChat) and the auth is changed in one of them, this is what we have to emit to get all instances to update.


// auth related functions
void SetAuthentication(Windows.Foundation.Collections.ValueSet authValues);
event Windows.Foundation.TypedEventHandler<ILMProvider, String> AuthChanged;
@PankajBhojwani (author):

AzureOpenAILLMProvider doesn't actually emit this event, but the GitHub one is going to, so putting it in already


@zadjii-msft (Member) left a comment:

Sorry it took so long to loop back on this. Let's start merging down!


@@ -156,6 +156,9 @@ namespace winrt::Microsoft::Terminal::Settings::Model::implementation

void ExpandCommands();

static winrt::event_token AzureOpenAISettingChanged(const AzureOpenAISettingChangedHandler& handler);
Member:

Can this be a static til::event? I dunno if I've ever done a static event so I genuinely don't know

@PankajBhojwani (author):

Check out ConptyConnection::NewConnection!

{
const auto model = jsonResponse.GetNamedString(L"model");
bool modelIsAccepted{ false };
for (const auto acceptedModel : acceptedModels)
Member:

I'm sure if we didn't just use a string[] we could do some sort of smarter .contains(), but also, eh. We accept like 5 models, this is fine.

src/cascadia/QueryExtension/ILMProvider.idl (1 resolved thread)
src/cascadia/QueryExtension/ExtensionPalette.idl (3 resolved threads)
src/cascadia/QueryExtension/ExtensionPalette.h (1 resolved thread)
src/cascadia/QueryExtension/AzureLLMProvider.idl (2 resolved threads)
src/cascadia/QueryExtension/AzureLLMProvider.h (1 resolved thread)
src/cascadia/QueryExtension/AzureLLMProvider.cpp (1 resolved thread)
@microsoft-github-policy-service bot added the Needs-Author-Feedback label Aug 15, 2024
@microsoft-github-policy-service bot removed the Needs-Author-Feedback label Aug 20, 2024
@zadjii-msft added this to the Terminal v1.23 milestone Aug 22, 2024
@DHowett (Member) left a comment:

🦵🏻

🍗

🦿

@DHowett merged commit e1e3a82 into feature/llm Sep 6, 2024
20 checks passed
@DHowett deleted the dev/pabhoj/llm_provider_interface branch September 6, 2024 03:01