A research data processing prototype built on DeepAgents, FastAPI, skill integration, and a React frontend.
- Streaming agent chat via `/chat/stream`
- Conversation history management: create, switch, delete, and load threads
- Workspace file handling: upload inputs and download outputs
- Message-level attachments: persisted for both uploaded files and preset example files
- Example pipelines: create a new thread and start a task in one click
- Skills center: list skills and categories
```
flow-deepagents-0408/
├── server.py              # FastAPI entrypoint
├── main_cli.py            # CLI entrypoint
├── requirements.txt       # Python dependencies
├── README.md
├── README.zh-CN.md
├── docs/
│   └── workspace_file_api.md   # Workspace file API notes
├── config/
│   ├── app.yaml
│   ├── database.yaml
│   ├── llm.yaml
│   └── mcp_servers.yaml
├── infra/                 # config, env, logging
├── runtime/               # engine, chat store, workspace runtime
├── agents/                # agent factory, prompts, middleware, tools
├── tools/                 # tool abstraction and adapters
├── mcp_runtime/           # MCP lifecycle and registration
├── workspace/             # workspace root
├── storage/               # static storage mount
├── test/                  # Python tests
└── vue-web/
    ├── package.json
    ├── README.md
    └── src/
        ├── components/
        ├── lib/
        └── pages/
```
Python 3.10+ is recommended.

Install backend dependencies:

```bash
pip install -r requirements.txt
```

Node.js 18+ is recommended for the frontend. Install frontend dependencies:

```bash
cd vue-web
npm install
```

The project uses PostgreSQL for:
- chat threads
- messages
- message attachments
- skill metadata
Tables are created automatically on startup, but you must prepare the database first and configure config/database.yaml.
Example:

```yaml
host: "127.0.0.1"
port: 5432
user: "postgres"
password: "123456"
name: "flow_agent"
```

Model provider settings come from config files and environment variables.
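For reference, the `config/database.yaml` fields map directly onto a standard PostgreSQL connection URL. A minimal sketch; the `build_dsn` helper is illustrative, not part of the repository, which reads the config through its own infra layer:

```python
def build_dsn(cfg: dict) -> str:
    """Build a PostgreSQL connection URL from database.yaml-style fields.

    Hypothetical helper for illustration only.
    """
    return (
        f"postgresql://{cfg['user']}:{cfg['password']}"
        f"@{cfg['host']}:{cfg['port']}/{cfg['name']}"
    )

# With the example values above:
print(build_dsn({
    "host": "127.0.0.1", "port": 5432,
    "user": "postgres", "password": "123456", "name": "flow_agent",
}))
# → postgresql://postgres:123456@127.0.0.1:5432/flow_agent
```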
Check:
- `config/llm.yaml`
- `.env` for provider API keys
Example:

```env
DASHSCOPE_API_KEY=your_key
OPENAI_API_KEY=your_key
```

Install pandoc if you need more complete document conversion support.
```bash
python -m uvicorn server:app --host 0.0.0.0 --port 8080 --reload
```

Or:

```bash
python server.py
```

Default API base: `http://localhost:8080`
```bash
cd vue-web
npm run dev
```

Default frontend URL: `http://localhost:5173`
If you need a different backend base URL, create `vue-web/.env`:

```env
VITE_API_BASE=http://localhost:8080
```

Run the CLI entrypoint with:

```bash
python main_cli.py
```

This is useful for quick engine validation, but not a replacement for the web workflow.
- Configure PostgreSQL and the LLM provider
- Start `server.py` / uvicorn
- Start `vue-web`
- Validate:
  - plain message sending
  - file upload and send
  - example pipeline trigger
  - thread switching
  - artifact download
The workspace root is configured in `config/app.yaml`, typically with:

- `workspace/temp`
- `workspace/outputs`
- `workspace/artifacts`
- `workspace/logs`
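The subdirectory layout above can be ensured at startup with a few lines; a sketch (the `ensure_workspace` helper is illustrative, not the repo's actual bootstrap code):

```python
from pathlib import Path

# Directory names taken from the list above.
WORKSPACE_SUBDIRS = ("temp", "outputs", "artifacts", "logs")

def ensure_workspace(root: str) -> list[Path]:
    """Create the standard workspace subdirectories if they are missing."""
    created = []
    for name in WORKSPACE_SUBDIRS:
        path = Path(root) / name
        path.mkdir(parents=True, exist_ok=True)
        created.append(path)
    return created
```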
There are currently two attachment paths:

- User-uploaded files
  - frontend calls `/workspace/upload`
  - backend stores them under `/temp/<thread_id>/...`
  - backend records them in `chat_files`
- Preset example files
  - frontend creates the message first
  - frontend calls `/message/attach`
  - backend records the file paths as message attachments
  - frontend includes those paths in `/chat/stream`
This means:
- attachment badges are not just UI-only
- attachments can still be restored after a page refresh through `/thread/messages`
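Restoring badges after a refresh amounts to reading attachments back out of the `/thread/messages` payload. A sketch assuming a plausible response shape; the `id` / `attachments` / `path` field names here are assumptions, not the actual schema:

```python
def collect_attachments(messages: list[dict]) -> dict[str, list[str]]:
    """Map message id -> attachment paths from a /thread/messages-style payload.

    Field names are illustrative; check the backend response for the
    real schema.
    """
    badges = {}
    for msg in messages:
        paths = [a["path"] for a in msg.get("attachments", [])]
        if paths:
            badges[msg["id"]] = paths
    return badges
```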
The current frontend depends on:
- POST `/chat/stream`
- POST `/chat`
- POST `/threads/getTitles`
- POST `/thread/messages`
- POST `/thread/delete`
- POST `/message/create`
- POST `/message/attach`
- POST `/workspace/upload`
- GET `/workspace/download`
- GET `/skills/list`
- GET `/skills/types`
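On the wire, `/chat/stream` delivers server-sent events. A minimal parser for the `data:` lines of a raw SSE body — a generic SSE sketch, not tied to this backend's exact event format:

```python
def parse_sse(body: str) -> list[str]:
    """Extract data payloads from a raw SSE response body.

    Events are separated by blank lines; each event's data is the
    concatenation of its `data:` lines.
    """
    events, current = [], []
    for line in body.splitlines():
        if line.startswith("data:"):
            current.append(line[5:].lstrip())
        elif not line.strip() and current:
            events.append("\n".join(current))
            current = []
    if current:  # flush a trailing event with no final blank line
        events.append("\n".join(current))
    return events

# parse_sse("data: hello\n\ndata: a\ndata: b\n\n") → ["hello", "a\nb"]
```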
Current pages include:

- Home
  - chat input
  - file upload
  - 3 example pipelines
  - thread history
  - assistant streaming state
- Skills
  - skill list and types
Current example pipeline behavior:
- creates a new thread
- creates a user message
- attaches preset example files
- starts a streaming task automatically
Main runtime entry responsible for:
- database initialization
- env loading
- MCP runtime initialization
- agent creation
- sync and streaming execution
Persistence layer for:
- `chat_threads`
- `messages`
- `chat_files`
Handles workspace path validation and virtual path resolution.
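Path validation here typically means rejecting any virtual path that resolves outside the workspace root. A sketch of that check, illustrative rather than the repo's actual implementation:

```python
from pathlib import Path

def resolve_workspace_path(root: str, virtual: str) -> Path:
    """Resolve a virtual path against the workspace root, rejecting escapes.

    Illustrative sketch; the real runtime layer may differ.
    """
    base = Path(root).resolve()
    target = (base / virtual.lstrip("/")).resolve()
    if not target.is_relative_to(base):  # Python 3.9+
        raise ValueError(f"path escapes workspace: {virtual}")
    return target
```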
Contains:
- prompts
- middleware
- local tools
- agent factory
Handles MCP server lifecycle:
- client initialization
- health checks
- auto reconnect
- tool schema registration
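Auto reconnect is usually driven by a bounded exponential backoff between attempts. A sketch of such a schedule; the base and cap values are illustrative, not the project's actual settings:

```python
def backoff_schedule(attempts: int, base: float = 1.0, cap: float = 30.0) -> list[float]:
    """Exponential backoff delays in seconds, doubling up to an upper cap."""
    return [min(base * (2 ** i), cap) for i in range(attempts)]

# backoff_schedule(6) → [1.0, 2.0, 4.0, 8.0, 16.0, 30.0]
```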
Run tests with:

```bash
pytest
```
- Backend fails to start
  - verify database config
  - verify LLM API key
  - verify MCP server config
- Frontend cannot reach backend
  - verify `VITE_API_BASE`
  - verify backend is listening on `8080`
- Attachment badge appears but download fails
  - verify the file exists in the workspace
  - verify the path registered through `/message/attach`
- No SSE output
  - inspect `/chat/stream` in browser devtools
  - inspect backend logs for `chat stream request`
This README reflects the current repository state and focuses on:
- how to run the project
- what the main modules do
- how the frontend and backend integrate today
For more detailed implementation references, see `docs/workspace_file_api.md`. It still contains some historical content, so treat the code as the source of truth.