A full-stack boilerplate using Next.js App Router with a production-ready chat UI built on shadcn/ui, integrating with Langflow to orchestrate LLM flows. It demonstrates a complete chat loop—sending user messages to a Langflow flow and streaming model responses back to the UI.
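The chat loop described above can be sketched as a small server-side helper plus an App Router route handler. This is a minimal sketch, not the boilerplate's actual code: the handler path (`app/api/chat/route.ts`), the helper name `buildLangflowRequest`, and the `POST /api/v1/run/{flowId}` endpoint shape are assumptions based on Langflow's REST API and may need adjusting to your deployment.

```typescript
// Builds the fetch request for a Langflow run call (endpoint shape assumed;
// verify against your Langflow version's REST API).
export function buildLangflowRequest(
  baseUrl: string,
  flowId: string,
  message: string,
  apiKey?: string
): { url: string; init: RequestInit } {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (apiKey) headers["x-api-key"] = apiKey; // Langflow API key header
  return {
    url: `${baseUrl.replace(/\/$/, "")}/api/v1/run/${flowId}`,
    init: {
      method: "POST",
      headers,
      body: JSON.stringify({
        input_value: message, // the user's chat message
        input_type: "chat",
        output_type: "chat",
      }),
    },
  };
}

// Hypothetical App Router handler (app/api/chat/route.ts) wiring the
// builder to fetch and proxying the Langflow response back to the UI.
export async function POST(req: Request): Promise<Response> {
  const { message } = await req.json();
  const { url, init } = buildLangflowRequest(
    process.env.LANGFLOW_BASE_URL ?? "http://localhost:7860",
    process.env.LANGFLOW_FLOW_ID ?? "",
    message,
    process.env.LANGFLOW_API_KEY
  );
  const upstream = await fetch(url, init);
  // Stream the body through unchanged so the client can read it incrementally.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: { "Content-Type": "application/json" },
  });
}
```

The client then POSTs `{ message }` to `/api/chat` and renders the reply in the shadcn/ui message list.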
Features
- Next.js App Router for modern full-stack development.
- Langflow integration (REST/WebSocket) to run and version conversational flows.
- shadcn/ui chat interface: message bubbles, input, loading/typing states, and error toasts.
- Message history & roles (system/user/assistant) with a simple store for session context.
- Env-driven config (e.g., LANGFLOW_BASE_URL, LANGFLOW_FLOW_ID, LANGFLOW_API_KEY).
- Streaming support (when enabled in your Langflow deployment) for token-by-token replies.
- TypeScript end-to-end for safer refactors.
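The message-history feature above can be modeled with a small typed store. The shapes below are a sketch under assumed names (`ChatMessage`, `SessionStore`), not the boilerplate's actual store; a real app might back this with React state or a persistence layer.

```typescript
// Roles mirror the system/user/assistant split listed in the features.
type Role = "system" | "user" | "assistant";

interface ChatMessage {
  role: Role;
  content: string;
  createdAt: number; // epoch millis
}

// Minimal in-memory store keyed by session id (illustrative only).
class SessionStore {
  private sessions = new Map<string, ChatMessage[]>();

  // Append a message to a session's history, creating the session if needed.
  append(sessionId: string, role: Role, content: string): ChatMessage {
    const msg: ChatMessage = { role, content, createdAt: Date.now() };
    const history = this.sessions.get(sessionId) ?? [];
    history.push(msg);
    this.sessions.set(sessionId, history);
    return msg;
  }

  // Return the full ordered history for a session (empty if unknown).
  history(sessionId: string): ChatMessage[] {
    return this.sessions.get(sessionId) ?? [];
  }
}
```

Keeping the history ordered by role makes it straightforward to replay the session as context on each Langflow run.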
Use Case
Ideal for teams prototyping or shipping AI chat experiences—support assistants, internal copilots, and workflow-driven bots—while keeping flows maintainable in Langflow and the UI consistent with shadcn/ui. This boilerplate is a solid base for SaaS dashboards, help desks, and embedded product assistants. 🚀
Boilerplate details
Last update: 2 weeks ago
Boilerplate age: 2 weeks ago