This repository contains demo applications showing how you can build a ChatGPT-like experience using Azure OpenAI Service.
There are three demo applications:
- ConsoleGPT.OpenAISdk - This is a demo application using the Azure OpenAI SDK directly
- ConsoleGPT.OpenAISdk.Streaming - Same as the first demo but uses a streaming response (with simulated network latency)
- ConsoleGPT.SemanticKernelDemo - This is a demo application using Semantic Kernel to orchestrate the Azure OpenAI Service
Note: While these demos refer to Azure OpenAI Service, OpenAI can also be used directly.
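To give a feel for the "SDK directly" approach, here is a minimal sketch of a chat completion call. It assumes the Azure.AI.OpenAI 1.0.0-beta-era API surface (the client type, method names, and message shapes may differ in the exact package version the demos target), and the endpoint, key, and deployment name are placeholders.

```csharp
using Azure;
using Azure.AI.OpenAI;

// Placeholder values - replace with your Azure OpenAI Service resource details
var endpoint = new Uri("https://<your resource>.openai.azure.com/");
var credential = new AzureKeyCredential("<your key>");
var client = new OpenAIClient(endpoint, credential);

// Build a chat request; "chat" matches the model (deployment) name in appsettings.json
var options = new ChatCompletionsOptions
{
    Messages =
    {
        new ChatMessage(ChatRole.System, "You are a helpful assistant."),
        new ChatMessage(ChatRole.User, "Hello, what can you do?")
    }
};

Response<ChatCompletions> response = await client.GetChatCompletionsAsync("chat", options);
Console.WriteLine(response.Value.Choices[0].Message.Content);
```

The Semantic Kernel demo wraps the same service behind Semantic Kernel's orchestration layer rather than calling the client directly, and the streaming demo consumes the response as it arrives instead of waiting for the full completion.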
Clone the repository:
```bash
git clone https://github.com/aaronpowell/ConsoleGPT.git
```
Open in Visual Studio, VS Code, or your favourite editor.
You need to add your connection information for either Azure OpenAI Service or OpenAI to `appsettings.json`. Here's a sample for Azure OpenAI Service:
```json
{
  "settings": {
    "model": "chat",
    "endpoint": "https://<your resource>.openai.azure.com/",
    "key": "<your key>",
    "type": "azure"
  }
}
```
Note: For Azure OpenAI Service, `model` is the name of the model deployment created from the foundation model. For OpenAI it would be the GPT model name, such as `gpt-3.5-turbo`.
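For comparison, an OpenAI configuration might look like the following. This is a sketch assuming the same settings schema; the `"openai"` value for `type` is an assumption, and whether `endpoint` is needed when calling OpenAI directly depends on how the demos read the configuration.

```json
{
  "settings": {
    "model": "gpt-3.5-turbo",
    "key": "<your OpenAI API key>",
    "type": "openai"
  }
}
```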
Set the application you want to run as the Startup Project and start a debugging session. Alternatively, navigate to the project folder on the command line and run `dotnet run`.
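For example, to run the first demo from the command line (assuming each project lives in a folder matching its name):

```bash
cd ConsoleGPT/ConsoleGPT.OpenAISdk
dotnet run
```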
This project is licensed under the MIT license.
Thanks to Jim Bennett for the original demo inspiration.