This repository contains a set of use cases demonstrating how to build AI-powered applications.
This demo shows how to create a simple AI agent using LangGraph and integrate it into a NextJS application. LangGraph is a robust framework for building agent and multi-agent workflows. It provides flexibility to build complex logic and has great tooling for debugging (LangGraph Studio) and monitoring (LangSmith). NextJS is a popular framework for building web applications.
The demo includes the following capabilities:
- Streaming. Agent streams LLM tokens to the client application.
- Generative UI. Renders components based on agent state, for example a weather widget.
- Human in the loop. The agent can ask the user for clarification before proceeding with a task, for example confirming a reminder before it is created (see the sketch after this list).
- Persistence. LangGraph has a built-in persistence layer that can be used to persist agent state between sessions. In the demo app, state is persisted in memory (also shown in the sketch below); see LangGraph Persistence for how to use PostgreSQL or MongoDB instead.
- Replay and Fork. The agent can be replayed or forked from any checkpoint.
- Agent state replication. Agent state is fully replicated on the client side based on the graph checkpoints.
- Error handling. The app displays global agent errors, such as when an agent is not accessible, as well as errors that occur at the graph node level.
- Stop agent. Agent execution can be stopped and resumed later.
- No dependencies. The integration code does not rely on third-party libraries, so you can adjust it to your needs.
- Clean UI. The app is based on shadcn components and has dark and light theme support.
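For orientation, the snippet below is a minimal sketch of the LangGraph primitives behind streaming, human in the loop, and persistence. It is not the demo's actual `graph.py`; the state shape, node names, and model are illustrative assumptions.

```python
from typing import TypedDict

from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.types import Command, interrupt


class AgentState(TypedDict):
    question: str
    answer: str
    approved: bool


llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model


def answer_node(state: AgentState) -> dict:
    # LLM tokens produced here are streamed to the caller when the graph
    # is run with stream_mode="messages".
    response = llm.invoke(state["question"])
    return {"answer": response.content}


def confirm_node(state: AgentState) -> dict:
    # interrupt() pauses the graph at a checkpoint and surfaces a payload
    # to the client; execution resumes when the client sends a Command.
    decision = interrupt({"question": "Create this reminder?", "draft": state["answer"]})
    return {"approved": bool(decision)}


builder = StateGraph(AgentState)
builder.add_node("answer", answer_node)
builder.add_node("confirm", confirm_node)
builder.add_edge(START, "answer")
builder.add_edge("answer", "confirm")
builder.add_edge("confirm", END)

# MemorySaver keeps checkpoints in process memory; the thread_id identifies
# the conversation whose state is checkpointed after every step.
graph = builder.compile(checkpointer=MemorySaver())
config = {"configurable": {"thread_id": "demo-thread"}}

# Stream LLM tokens until the graph pauses at the interrupt...
for chunk, metadata in graph.stream(
    {"question": "Remind me to water the plants"}, config, stream_mode="messages"
):
    print(chunk.content, end="", flush=True)

# ...then resume from the checkpoint with the user's decision.
graph.invoke(Command(resume=True), config)
```

Because every step is checkpointed under the `thread_id`, the same mechanism is what makes it possible to stop a run, resume it later, or replay and fork it from an earlier checkpoint.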
There are some features that are not implemented yet:
- Graph interruption (Human in the loop) in parallel nodes.
- Sending custom events from the same node running in parallel. E.g., when checking the weather for multiple cities at the same time, it is not possible to distinguish between the events on the client side.
You can use this project as a starting point for your own projects:
- Clone the repository
- Adjust the AI agent logic in the `graph.py` file or create a brand new one
- Adjust the agent state in the `agent-types.ts` file
- In the client app, call the agent using the `useLangGraphAgent` hook in your components
- Add a `.env` file to the `/agent` directory and set your `OPENAI_API_KEY` (see `.env.example`)
- Run the agent service:

  ```bash
  cd agent/
  poetry install
  poetry run server
  ```

- Run the client:

  ```bash
  cd client/
  npm install
  npm run dev
  ```

The application will start at http://localhost:3000 by default.
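If you adapt this starting point beyond local development, the in-memory checkpointer mentioned in the persistence note above can be swapped for a database-backed one. Below is a minimal sketch, assuming the `langgraph-checkpoint-postgres` package, a reachable PostgreSQL instance, and a placeholder graph standing in for the real agent logic in `graph.py`.

```python
from typing import TypedDict

from langgraph.checkpoint.postgres import PostgresSaver
from langgraph.graph import StateGraph, START, END


class AgentState(TypedDict):
    question: str
    answer: str


def answer_node(state: AgentState) -> dict:
    # Stand-in for the real agent logic from graph.py.
    return {"answer": f"echo: {state['question']}"}


builder = StateGraph(AgentState)
builder.add_node("answer", answer_node)
builder.add_edge(START, "answer")
builder.add_edge("answer", END)

DB_URI = "postgresql://user:password@localhost:5432/agent_db"  # placeholder

# PostgresSaver stores checkpoints in PostgreSQL instead of process memory,
# so agent state survives restarts of the agent service.
with PostgresSaver.from_conn_string(DB_URI) as checkpointer:
    checkpointer.setup()  # creates the checkpoint tables on first use
    graph = builder.compile(checkpointer=checkpointer)
    config = {"configurable": {"thread_id": "demo-thread"}}
    print(graph.invoke({"question": "hello"}, config))
```

An equivalent checkpointer is available for MongoDB; see the LangGraph persistence documentation for details.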