LangGraph Agent Chat UI

A React/Vite Interface for LangGraph Agents

Your Gateway to Seamless Agent Interaction with LangGraph

Overview

The Agent Chat UI is a React/Vite application that provides a clean, chat-based interface for interacting with your LangGraph agents. It’s designed to work seamlessly with LangGraph’s core features, including checkpoints, thread management, and human-in-the-loop capabilities.

Here’s why it’s a valuable tool:

  • Easy Connection: Connect to local or deployed LangGraph agents with a simple URL and graph ID.
  • Intuitive Chat: Interact naturally with your agents in a familiar chat format.
  • Visualize Agent Actions: See tool calls and their results rendered directly in the UI.
  • Human-in-the-Loop Made Easy: Seamlessly integrate human input using LangGraph’s interrupt feature.
  • Explore Execution Paths: Travel through time, inspect checkpoints, and fork conversations.
  • Debug and Understand: Inspect the full state of your LangGraph thread at any point.

Get Started

1. Try the Deployed Version (No Setup Required!)

  • Visit: agentchat.vercel.app
  • Connect: Enter your LangGraph deployment URL and graph ID (the path registered with langserve.add_routes, as in the examples below). For production deployments, also provide your LangSmith API key.
  • Chat! Start interacting with your agent.

2. Run Locally (Development & Customization)

Option A: Clone the Repository

git clone https://github.com/langchain-ai/agent-chat-ui.git
cd agent-chat-ui
pnpm install # Or npm install/yarn install
pnpm dev     # Or npm run dev/yarn dev

Option B: Quickstart with npx

npx create-agent-chat-app
cd agent-chat-app
pnpm install # Or npm install/yarn install
pnpm dev     # Or npm run dev/yarn dev

Open your browser to http://localhost:5173.

Features

  • Easy Connection: Connect to local or production LangGraph deployments with a URL and graph ID.
  • Chat Interface: Real-time messaging with automatic checkpoint persistence.
  • Tool Call Rendering: Visualize tool calls and results, compatible with LangGraph’s tool calling.
  • Human-in-the-Loop Support: Review, edit, and respond to interrupts using LangGraph’s interrupt function.
  • Thread History: Navigate past interactions with checkpointing.
  • Time Travel and Forking: Explore execution paths with LangGraph’s checkpointing.
  • State Inspection: Debug by examining the full LangGraph thread state.
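The tool-call rendering described above can be sketched with plain data: given a message history, the UI surfaces any tool calls attached to assistant messages. This is a minimal illustration only; the message shape here is a simplified dict stand-in for the LangChain message objects the real UI consumes, and all field names are illustrative:

```python
# Simplified sketch: collect tool calls from a chat history for rendering.
def extract_tool_calls(message):
    """Return the tool calls attached to a message, if any."""
    return message.get("tool_calls", [])

history = [
    {"role": "human", "content": "Email Bob the Q3 numbers"},
    {"role": "ai", "content": "", "tool_calls": [
        {"name": "write_email", "args": {"to": "bob@example.com"}},
    ]},
]

# The UI would render each of these as a tool-call card with its arguments.
renderable = [tc for msg in history for tc in extract_tool_calls(msg)]
```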

LangGraph Setup Examples

Basic LangGraph Agent

# agent.py — a minimal sketch of an agent served with langserve
# (assumes langchain-openai, langgraph, langserve, fastapi, and uvicorn are
# installed, and OPENAI_API_KEY is set in the environment)
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_agent_executor
from langserve import add_routes
from fastapi import FastAPI

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant"),
    MessagesPlaceholder(variable_name="messages"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])
model = ChatOpenAI(temperature=0)

# Map the incoming state onto the prompt, then call the model.
agent = {
    "messages": lambda x: x["messages"],
    "agent_scratchpad": lambda x: [],  # no tools yet, so the scratchpad stays empty
} | prompt | model

app = create_agent_executor(agent, [])  # empty tool list for this basic example
fastapi_app = FastAPI(title="LangGraph Agent")
add_routes(fastapi_app, app, path="/chat")

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(fastapi_app, host="localhost", port=2024)

Run at http://localhost:2024/chat with graph ID chat.
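The served endpoint can also be called outside the UI. langserve exposes the chain registered at /chat under /chat/invoke, with the chain's input wrapped in an "input" key. A minimal sketch of the request body (the server above must be running before you actually POST it):

```python
import json

# langserve's REST convention wraps the chain input under "input";
# messages are (role, content) pairs.
body = {"input": {"messages": [["human", "What can you do?"]]}}
request_json = json.dumps(body)

# POST request_json to http://localhost:2024/chat/invoke with
# Content-Type: application/json once the server is up.
```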

Human-in-the-Loop Example

# agent.py — human-in-the-loop sketch: pause for review before "sending" email
from fastapi import FastAPI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_agent_executor
from langgraph.types import interrupt  # interrupt lives in langgraph.types
from langserve import add_routes

@tool
def write_email(subject: str, body: str, to: str):
    """Draft an email; a human reviews it before it is sent."""
    return f"Draft email to {to} with subject {subject} sent."

tools = [write_email]
model = ChatOpenAI(model="gpt-4-turbo-preview").bind_tools(tools)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful email assistant"),
    MessagesPlaceholder(variable_name="messages"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

def handle_interrupt(state):
    messages = state["messages"]
    last_message = messages[-1]
    # Tool calls live on the message's tool_calls attribute, not in its content.
    for tool_call in getattr(last_message, "tool_calls", []):
        if tool_call["name"] == "write_email":
            # Pause the run and surface the draft arguments to the UI; the
            # value returned by interrupt is the human's response on resume.
            # (interrupt requires running under a checkpointed LangGraph graph.)
            human_response = interrupt({"type": "response", "args": tool_call["args"]})
            return {"messages": messages + [human_response]}
    return {"messages": messages}

agent = {"messages": lambda x: x["messages"], "agent_scratchpad": lambda x: []} | prompt | model | handle_interrupt
app = create_agent_executor(agent, tools)
fastapi_app = FastAPI(title="LangGraph Agent")
add_routes(fastapi_app, app, path="/email_agent")

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(fastapi_app, host="localhost", port=2024)

Run at http://localhost:2024/email_agent with graph ID email_agent.
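When the interrupt fires, the UI shows the drafted tool arguments and lets the reviewer edit them before the run resumes. The merge step can be sketched with plain dictionaries (all names and values here are illustrative, not part of the Agent Chat UI API):

```python
# Sketch of the review step: the human edits the agent's draft arguments.
draft_args = {
    "subject": "Quarterly report",
    "body": "Numbers attached.",
    "to": "boss@example.com",
}

def apply_human_edits(draft, edits):
    """Merge reviewer edits over the agent's draft tool arguments."""
    merged = dict(draft)
    merged.update(edits)
    return merged

# The reviewer tweaks the subject; untouched fields pass through unchanged.
final_args = apply_human_edits(draft_args, {"subject": "Q3 report (reviewed)"})
```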

BibTeX

@misc{langgraph2023agentchat,
  title = {LangGraph Agent Chat UI},
  author = {LangChain Team},
  year = {2023},
  url = {https://github.com/langchain-ai/agent-chat-ui}
}