TemuCowork Agent is an AI-powered file management assistant with a cyberpunk aesthetic. Built with .NET 8 Blazor and Microsoft Semantic Kernel, it connects to locally-running Ollama models to provide an intelligent agent that can read, write, search, and manage files through natural language conversation. Features include real-time streaming responses, @ file mentions with autocomplete, multi-format file reading (PDF, Excel, code files), persistent chat sessions, and a stunning neon-glow UI theme.
- Natural Language Commands - Ask the agent to create, read, edit, or delete files using plain English
- Multi-Format Support - Reads `.txt`, `.md`, `.json`, `.xml`, `.csv`, `.pdf`, `.xlsx`, and 15+ code file formats
- Smart File Search - Recursive file search with real-time filtering
- Sandboxed Operations - All file operations are restricted to a configurable root directory
- Real-Time Streaming - Watch responses stream in as the AI generates them
- Agent Status Indicators - Visual feedback showing thinking, function calls, and writing states
- Function Execution Tracking - See exactly which operations the agent performs
- @ Mention Files - Type `@` to get autocomplete suggestions for referencing files in your messages
- File Attachments - Attach files from the file browser to include their contents in your message
- Persistent Conversations - All chats are saved and can be resumed later
- Session History - Browse, rename, and delete past conversations
- LiteDB Storage - Lightweight embedded database for reliable storage
- Neon Glow Effects - Matrix-inspired green and cyan color palette
- Custom Fonts - Orbitron, Rajdhani, and Share Tech Mono
- Smooth Animations - Pulse, glow, and slide effects throughout
- Three-Panel Layout - File browser, chat, and sessions in one view
```
┌──────────────────────────────────────────────────────────────────┐
│ [Files]               [Chat]                       [Sessions]    │
│ ├── TemuCowork/       ┌─────────────────────────┐  ├── Chat 1    │
│ │   ├── Services/     │ You: Read the config    │  ├── Chat 2    │
│ │   ├── Models/       │                         │  └── Chat 3    │
│ │   └── ...           │ Agent: [Thinking...]    │                │
│ └── ...               │  ► ReadFile executed    │                │
│                       │                         │                │
│ [+ New] [Refresh]     │ Here's the content...   │  [+ New Chat]  │
│                       └─────────────────────────┘                │
│                       [@mention] [Send] [Attach]                 │
└──────────────────────────────────────────────────────────────────┘
```
- .NET 8.0 SDK
- Ollama running locally
- A compatible LLM model (default: `qwen2.5:14b`)

```bash
# Install Ollama from https://ollama.ai/
# Then pull a model:
ollama pull qwen2.5:14b
```

Clone the repository and run the app:

```bash
git clone https://github.com/yourusername/TemuCowork.git
cd TemuCowork/TemuCowork
dotnet run
```

Navigate to https://localhost:5001 or http://localhost:5000.
Edit `appsettings.json` to customize:

```json
{
  "Ollama": {
    "Endpoint": "http://localhost:11434",
    "DefaultModel": "qwen2.5:14b",
    "Temperature": 0.7,
    "MaxTokens": 4096
  },
  "FileSystem": {
    "RootPath": "C:\\Users\\YourName\\Projects"
  },
  "LiteDb": {
    "DatabasePath": "Data/conversations.db"
  }
}
```

Or use the in-app Configuration page to:
- Change the Ollama endpoint
- Select from available models
- Adjust temperature and token limits
- Test the connection
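However the values are set, they land in the standard .NET configuration system and can be handed to the services through the options pattern. A minimal sketch, assuming hypothetical `OllamaOptions`/`FileSystemOptions` classes (the project may bind its settings differently):

```csharp
// Program.cs (excerpt): bind the "Ollama" and "FileSystem" sections so they can be
// injected anywhere as IOptions<OllamaOptions> / IOptions<FileSystemOptions>.
var builder = WebApplication.CreateBuilder(args);
builder.Services.Configure<OllamaOptions>(builder.Configuration.GetSection("Ollama"));
builder.Services.Configure<FileSystemOptions>(builder.Configuration.GetSection("FileSystem"));

// Hypothetical strongly typed settings classes mirroring appsettings.json above.
public sealed class OllamaOptions
{
    public string Endpoint { get; set; } = "http://localhost:11434";
    public string DefaultModel { get; set; } = "qwen2.5:14b";
    public double Temperature { get; set; } = 0.7;
    public int MaxTokens { get; set; } = 4096;
}

public sealed class FileSystemOptions
{
    public string RootPath { get; set; } = string.Empty;
}
```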
"Read the Program.cs file and explain what it does"
"Create a new file called notes.md with a summary of today's meeting"
"Find all JSON files in the project"
"Add a comment at the top of FileSystemService.cs explaining its purpose"
Type @ in the chat to get autocomplete suggestions:
"Compare @config.json with @config.backup.json and tell me the differences"
```
TemuCowork/
├── Components/
│   ├── Pages/
│   │   ├── Home.razor            # Main chat interface
│   │   ├── Configuration.razor   # Settings page
│   │   └── Sessions.razor        # Session history
│   └── Layout/
│       └── MainLayout.razor      # Navigation layout
├── Services/
│   ├── FileSystemService.cs      # File operations
│   ├── FileReaderService.cs      # Multi-format file reading
│   ├── ConversationService.cs    # Session persistence
│   └── SemanticKernelService.cs  # AI orchestration
├── Models/
│   ├── ChatMessage.cs            # Chat message model
│   ├── Conversation.cs           # Session model
│   ├── FileItem.cs               # File tree item
│   └── AgentStatus.cs            # Agent state tracking
├── Plugins/
│   └── FileSystemPlugin.cs       # SK functions for file ops
├── wwwroot/
│   └── app.css                   # Cyberpunk theme styles
├── Program.cs                    # App configuration
└── appsettings.json              # Settings
```
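ConversationService.cs is where persistence lives. As a rough idea of what saving and loading sessions with LiteDB can look like (the collection name and the shape of the Conversation model here are assumptions, not the project's actual schema):

```csharp
using System;
using System.Collections.Generic;
using LiteDB;

// Illustrative store; the real ConversationService.cs may differ.
public sealed class ConversationStore : IDisposable
{
    private readonly LiteDatabase _db;
    private readonly ILiteCollection<Conversation> _conversations;

    public ConversationStore(string databasePath)   // e.g. "Data/conversations.db"
    {
        _db = new LiteDatabase(databasePath);
        _conversations = _db.GetCollection<Conversation>("conversations");
    }

    // Insert or update a session; LiteDB keys the document on the Id property.
    public void Save(Conversation conversation) => _conversations.Upsert(conversation);

    public Conversation? Load(Guid id) => _conversations.FindById(id);

    public void Dispose() => _db.Dispose();
}

// Hypothetical minimal session model; the real Conversation.cs may carry more fields.
public sealed class Conversation
{
    public Guid Id { get; set; } = Guid.NewGuid();
    public string Title { get; set; } = "New Chat";
    public List<string> Messages { get; set; } = new();
}
```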
| Component | Technology |
|---|---|
| Framework | .NET 8.0, Blazor Server |
| UI Components | Radzen.Blazor |
| AI Orchestration | Microsoft Semantic Kernel |
| LLM Backend | Ollama (local inference) |
| Database | LiteDB |
| PDF Processing | PdfPig |
| Excel Processing | EPPlus |
| Markdown | Markdig |
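These pieces come together in SemanticKernelService.cs: Semantic Kernel hosts the chat loop, the file plugin is registered as callable functions, and the model itself is served by Ollama. A minimal sketch of that wiring, assuming the prerelease `Microsoft.SemanticKernel.Connectors.Ollama` connector (method names can differ between Semantic Kernel versions, so treat this as illustrative rather than the project's exact code):

```csharp
using Microsoft.SemanticKernel;

// Root of the sandbox, normally read from FileSystem:RootPath in appsettings.json.
var rootPath = @"C:\Users\YourName\Projects";

// Build a kernel backed by a local Ollama model and expose the file plugin to it.
// AddOllamaChatCompletion comes from the prerelease Ollama connector (an assumption here),
// and the FileSystemPlugin constructor shown is illustrative.
var builder = Kernel.CreateBuilder();
builder.AddOllamaChatCompletion(modelId: "qwen2.5:14b", endpoint: new Uri("http://localhost:11434"));
builder.Plugins.AddFromObject(new FileSystemPlugin(rootPath), "FileSystem");
var kernel = builder.Build();

// Stream the reply token by token and let the model call plugin functions automatically.
var settings = new PromptExecutionSettings { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() };
await foreach (var chunk in kernel.InvokePromptStreamingAsync(
                   "Read the Program.cs file and explain what it does",
                   new KernelArguments(settings)))
{
    Console.Write(chunk);
}
```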
The AI agent can perform these file system operations:
| Function | Description |
|---|---|
| `ReadFile` | Read text, PDF, or Excel files |
| `WriteFile` | Create or overwrite files |
| `ListDirectory` | List folder contents |
| `CreateFolder` | Create new directories |
| `DeleteFile` | Delete files |
| `DeleteFolder` | Delete directories recursively |
| `RenameItem` | Rename files or folders |
| `GetFileInfo` | Get file metadata (size, dates) |
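On the Semantic Kernel side, these operations are exposed by Plugins/FileSystemPlugin.cs as kernel functions. The sketch below shows the general shape such a plugin takes; the descriptions, constructor, and method bodies are illustrative rather than the project's actual implementation:

```csharp
using System.ComponentModel;
using System.IO;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

// Each [KernelFunction] method becomes a tool the model can request during a chat turn;
// the [Description] text is what the model sees when deciding which function to call.
public sealed class FileSystemPlugin
{
    private readonly string _rootPath;

    public FileSystemPlugin(string rootPath) => _rootPath = rootPath;

    [KernelFunction, Description("Read the contents of a text file inside the workspace.")]
    public async Task<string> ReadFile(
        [Description("Path relative to the workspace root, e.g. 'Services/FileSystemService.cs'.")]
        string path)
    {
        // The real service also validates that the path stays inside _rootPath (see Security below).
        var fullPath = Path.GetFullPath(Path.Combine(_rootPath, path));
        return await File.ReadAllTextAsync(fullPath);
    }

    [KernelFunction, Description("List the files and folders in a directory.")]
    public string ListDirectory(
        [Description("Directory path relative to the workspace root.")] string path = ".")
    {
        var fullPath = Path.GetFullPath(Path.Combine(_rootPath, path));
        return string.Join("\n", Directory.EnumerateFileSystemEntries(fullPath));
    }
}
```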
- Path Sandboxing - All operations are restricted to the configured root path
- Path Validation - Every file operation validates paths to prevent directory traversal
- Graceful Error Handling - Unauthorized access attempts are caught and reported
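In practice the sandbox check amounts to canonicalizing every requested path and rejecting anything that resolves outside the configured root. A minimal version of that check (illustrative, not the project's exact code):

```csharp
using System;
using System.IO;

public static class PathSandbox
{
    // Resolves a user- or model-supplied path against the configured root and throws
    // if the canonical result escapes the root (e.g. via ".." segments).
    public static string Resolve(string rootPath, string relativePath)
    {
        var root = Path.GetFullPath(rootPath);
        var full = Path.GetFullPath(Path.Combine(root, relativePath));

        // Append a separator so "C:\Root" does not accidentally match "C:\RootEvil".
        var rootWithSeparator = root.EndsWith(Path.DirectorySeparatorChar)
            ? root
            : root + Path.DirectorySeparatorChar;

        if (!full.Equals(root, StringComparison.OrdinalIgnoreCase) &&
            !full.StartsWith(rootWithSeparator, StringComparison.OrdinalIgnoreCase))
        {
            throw new UnauthorizedAccessException($"Path '{relativePath}' is outside the workspace root.");
        }

        return full;
    }
}
```

Appending the directory separator before the prefix comparison matters: without it, a root of `C:\Projects` would also accept paths under `C:\ProjectsEvil`.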
- Ensure Ollama is running: `ollama serve`
- Check that the endpoint in Configuration matches your Ollama server
- Verify the model is downloaded: `ollama list`
- Default Ollama port is 11434
- Check firewall settings if running Ollama on a different machine
- Verify the FileSystem:RootPath in appsettings.json exists
- Check read permissions on the directory
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Semantic Kernel - AI orchestration framework
- Ollama - Local LLM inference
- Radzen - Blazor UI components
- LiteDB - Embedded NoSQL database