Hono + Bun + OpenAI Chat Example

A real-time AI chat example that leverages Server-Sent Events (SSE) to stream responses from OpenAI's API. Built with the Hono framework and powered by the Bun runtime.

Features

  • 🚀 Real-time streaming responses using Server-Sent Events (SSE)
  • ⚡️ High-performance backend powered by Bun runtime
  • 🔄 Seamless integration with OpenAI API
  • 🛠️ Built with Hono - a lightweight, ultrafast web framework
  • 📱 Modern and responsive chat interface

Screenshots

screenshots.jpg

Prerequisites

Before you begin, make sure you have the following:

  • Bun (latest version)
  • An OpenAI API key

Usage

Install the dependencies:

bun install

Create a .env file in the project root and add your OpenAI API key:

OPENAI_BASE_URL=openai_base_url
OPENAI_API_KEY=your_api_key_here

Start the development server:

bun run dev

Open your browser and navigate to http://localhost:3000
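
Bun loads .env automatically on startup, so no extra dotenv setup is needed and both values are available on process.env. As a small optional sanity check (a suggestion, not part of the repository), the entry point could fail fast when the key is missing:

// Optional: fail fast if the API key was not provided in .env
if (!process.env.OPENAI_API_KEY) {
  throw new Error("OPENAI_API_KEY is not set; add it to your .env file");
}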

Project Structure

project-root/
├── src/
│   └── index.ts        # Entry point
├── index.html          # Website template
└── package.json
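
For orientation, here is a minimal sketch of what src/index.ts might look like. It assumes the official openai npm package and Hono's streamSSE helper, and the model name is purely illustrative, so details in the actual repository may differ:

import { Hono } from "hono";
import { streamSSE } from "hono/streaming";
import OpenAI from "openai";

// Client configured from the .env values described above.
const openai = new OpenAI({
  baseURL: process.env.OPENAI_BASE_URL,
  apiKey: process.env.OPENAI_API_KEY,
});

const app = new Hono();

// Serve the chat UI.
app.get("/", async (c) => c.html(await Bun.file("index.html").text()));

// Stream the model's reply back as Server-Sent Events.
app.post("/chat", async (c) => {
  const { prompt } = await c.req.json<{ prompt: string }>();

  return streamSSE(c, async (stream) => {
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini", // illustrative; use whichever model you prefer
      messages: [{ role: "user", content: prompt }],
      stream: true,
    });

    // Forward each token as its own SSE event.
    for await (const chunk of completion) {
      const delta = chunk.choices[0]?.delta?.content ?? "";
      if (delta) await stream.writeSSE({ data: delta });
    }
  });
});

// Bun serves the default export; the default port is 3000.
export default app;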

API Endpoints

POST /chat

Initiates a chat session with the AI model.

Request

{
  "prompt": "Your prompt here"
}

Response

Server-Sent Events stream with AI responses.
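
Each chunk of the model's reply arrives in the data field of an SSE event. Because the endpoint is a POST, the browser cannot use EventSource (which only supports GET); below is a hedged sketch of how a client such as index.html might read the stream with fetch (the function and callback names are illustrative, not taken from the repository):

// Read the POST /chat SSE stream from the browser.
async function chat(prompt: string, onToken: (token: string) => void) {
  const res = await fetch("/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by a blank line; data lines start with "data: ".
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? ""; // keep any incomplete event for the next read
    for (const event of events) {
      for (const line of event.split("\n")) {
        if (line.startsWith("data: ")) onToken(line.slice("data: ".length));
      }
    }
  }
}

You can also inspect the raw stream from the command line with curl -N http://localhost:3000/chat -H "Content-Type: application/json" -d '{"prompt":"Hello"}' (the -N flag disables output buffering so events appear as they arrive).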

Acknowledgments

  • Hono - The ultrafast web framework
  • Bun - The fast all-in-one JavaScript runtime
  • OpenAI - For providing the AI capabilities
