
A full-stack boilerplate using the Next.js App Router with assistant-ui for a polished chat interface and AI SDK v5 as the backend library for AI orchestration. The LLM connection is configured in `lib/openai` using your `OPENAI_API_KEY`, while the chat API is exposed via `/api/v1/chat` (a Next.js App Route). The project demonstrates a production-ready chat loop with streaming and configurable system prompts.
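A minimal sketch of what the `lib/openai` configuration might look like, assuming the `@ai-sdk/openai` provider package from AI SDK v5; the file path and export names are illustrative, not taken from the repo:

```typescript
// lib/openai.ts — hypothetical sketch; export names are assumptions.
import { createOpenAI } from "@ai-sdk/openai";

// Provider instance authenticated with OPENAI_API_KEY. The provider reads
// the env var by default; passing it explicitly keeps the dependency visible.
export const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Default chat model used by the /api/v1/chat route (model id is a placeholder).
export const chatModel = openai("gpt-4o-mini");
```

Centralizing provider setup here keeps model and credential choices out of the route handlers.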
## Features

- Next.js App Router for modern full-stack development.
- assistant-ui components in `packages/ui` (thread, composer, loading states).
- AI SDK v5 as the backend library handling the AI runtime and streaming.
- `lib/openai` for LLM configuration using `OPENAI_API_KEY` (model, provider setup).
- Chat API at `/api/v1/chat` (POST, streaming) with SSE-compatible streaming responses.
- System and user prompts with a conversation state persistence example.
- TypeScript end-to-end.
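The streaming chat endpoint described above can be sketched with AI SDK v5's `streamText` helper; the model id and system prompt below are placeholder assumptions, not values from the repo:

```typescript
// app/api/v1/chat/route.ts — minimal sketch assuming AI SDK v5.
import { openai } from "@ai-sdk/openai";
import { streamText, convertToModelMessages, type UIMessage } from "ai";

export async function POST(req: Request) {
  // assistant-ui sends the conversation as UI messages.
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"), // placeholder model id
    system: "You are a helpful assistant.", // configurable system prompt
    messages: convertToModelMessages(messages),
  });

  // Stream tokens back as an SSE-compatible UI message stream.
  return result.toUIMessageStreamResponse();
}
```

Returning `toUIMessageStreamResponse()` is what makes the endpoint directly consumable by assistant-ui's runtime on the client.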
## Project Structure

- `packages/ui`: Reusable assistant-ui components and theme tokens.
- `app/api/v1/chat/route.ts`: Chat endpoint backed by the AI SDK v5 runtime.
- `lib/openai`: LLM access configuration using `OPENAI_API_KEY`.
- `app/(app)/(chat)/*`: Page using assistant-ui primitives for the chat experience.
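Wiring the page to the endpoint might look like the following, assuming `useChatRuntime` from `@assistant-ui/react-ai-sdk` and a `Thread` component exported from `packages/ui`; the import paths and component name are hypothetical:

```typescript
// app/(app)/(chat)/page.tsx — illustrative sketch; import paths are assumptions.
"use client";

import { AssistantRuntimeProvider } from "@assistant-ui/react";
import { useChatRuntime } from "@assistant-ui/react-ai-sdk";
import { Thread } from "@/packages/ui"; // hypothetical export from packages/ui

export default function ChatPage() {
  // Connect the assistant-ui runtime to the streaming chat endpoint.
  const runtime = useChatRuntime({ api: "/api/v1/chat" });

  return (
    <AssistantRuntimeProvider runtime={runtime}>
      <Thread />
    </AssistantRuntimeProvider>
  );
}
```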
## Use Case
Ideal for teams building AI chat experiences—product copilots, support assistants, research agents—who want a clean separation between UI (assistant-ui), LLM access configuration (lib/openai), and AI orchestration logic (AI SDK v5). This boilerplate is a scalable starting point for SaaS dashboards and embedded assistants with streaming, tools, and strong TypeScript guarantees. 🚀