A visual LLM flow builder. Create node-based prompt chains that execute sequentially against Claude and other LLM providers. All data stays client-side in IndexedDB.
```bash
# Install dependencies
npm install

# Start dev server
npm run dev
```

Open http://localhost:3000 in your browser.
You'll need an Anthropic API key. Enter it in the app settings — it's stored in localStorage (never sent to any server other than Anthropic's API).
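A minimal sketch of how client-side key storage like this can work. The storage key name `anthropic-api-key` and the helper names are assumptions for illustration, not the app's actual identifiers; a small `KeyStore` interface stands in for `window.localStorage` so the same code is testable outside a browser.

```typescript
// Storage-like interface; in the browser, pass window.localStorage.
interface KeyStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Hypothetical storage key name -- the app may use a different one.
const API_KEY_STORAGE_KEY = "anthropic-api-key";

function saveApiKey(store: KeyStore, key: string): void {
  store.setItem(API_KEY_STORAGE_KEY, key);
}

function loadApiKey(store: KeyStore): string | null {
  return store.getItem(API_KEY_STORAGE_KEY);
}
```

Because the key never leaves the browser except in requests to Anthropic, clearing site data (or localStorage) removes it entirely.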
```bash
npm run dev    # Dev server on localhost:3000
npm run build  # Production build
npm run start  # Serve production build
npm run lint   # ESLint check
```

- Create a new flow from the sidebar
- Add nodes to the canvas: user-input, agent, structured-output, router, output
- Connect nodes by dragging edges between them
- Configure each node (system prompts, templates, schemas, routing conditions)
- Use `{{variable}}` syntax in templates to reference upstream node outputs
- Run the flow — nodes execute sequentially from root to leaf
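The `{{variable}}` templating step can be sketched as a simple interpolation pass. This assumes upstream outputs arrive as a plain name-to-string map; the app's actual resolution logic may differ.

```typescript
// Replace each {{name}} placeholder with the matching upstream output.
// Unknown variables are left untouched rather than erased, which makes
// missing connections easy to spot in the rendered prompt.
function renderTemplate(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match: string, name: string) => {
    const value = vars[name];
    return value !== undefined ? value : match;
  });
}
```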
| Node | Purpose |
|---|---|
| user-input | Entry point — passes user input forward |
| agent | LLM call with system prompt + message template (optionally with VFS tool-call loop) |
| structured-output | LLM call returning JSON matching a defined schema |
| router | Conditional branching (equals, contains, gt, lt) |
| output | Terminal node for displaying final results |
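The router node's four condition types listed above can be sketched as a small evaluator. The function and type names here are illustrative assumptions, not the app's actual API.

```typescript
type RouterOp = "equals" | "contains" | "gt" | "lt";

// Evaluate one routing condition against a node's (string) output.
// gt/lt coerce both sides to numbers, since node outputs are text.
function evaluateCondition(op: RouterOp, value: string, operand: string): boolean {
  switch (op) {
    case "equals":
      return value === operand;
    case "contains":
      return value.includes(operand);
    case "gt":
      return Number(value) > Number(operand);
    case "lt":
      return Number(value) < Number(operand);
  }
}
```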
- Next.js 14 (App Router)
- ReactFlow (canvas)
- Zustand + IndexedDB (state & persistence)
- shadcn/ui + Tailwind CSS (UI)
- Anthropic SDK (LLM provider)
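The root-to-leaf sequential execution described earlier can be derived from the canvas graph with a plain topological sort. This is a sketch, not the app's actual scheduler; it only assumes ReactFlow's usual shape of nodes with an `id` and edges with `source`/`target`.

```typescript
interface Edge { source: string; target: string; }

// Kahn-style topological sort: start from root nodes (no incoming edges)
// and peel off nodes as their upstream dependencies complete.
function executionOrder(nodeIds: string[], edges: Edge[]): string[] {
  const indegree = new Map<string, number>(nodeIds.map((id) => [id, 0]));
  for (const e of edges) indegree.set(e.target, (indegree.get(e.target) ?? 0) + 1);

  const queue = nodeIds.filter((id) => indegree.get(id) === 0);
  const order: string[] = [];
  while (queue.length > 0) {
    const id = queue.shift()!;
    order.push(id);
    for (const e of edges) {
      if (e.source !== id) continue;
      const remaining = (indegree.get(e.target) ?? 0) - 1;
      indegree.set(e.target, remaining);
      if (remaining === 0) queue.push(e.target);
    }
  }
  return order; // shorter than nodeIds if the graph contains a cycle
}
```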
See docs/CHANGELOG.md for version history.