Comprehensive Claude Code plugin for the Burn deep learning framework.
- Domain Skills: Tensors, modules, training, ONNX import, backends, contributing
- Workflow Commands: /burn-new, /burn-train, /burn-import, /burn-smoke
- Specialized Agents: Debugger, code reviewer, ONNX surgeon
- Self-Verification: Auto-format and compile check via hooks
- Evidence-Based: Searches documentation before asserting API behavior
The plugin uses llmx for documentation search. Build and install the MCP server:

```sh
# Clone and build llmx
git clone https://github.com/zackees/llmx.git
cd llmx/ingestor-core
cargo build --release --bin llmx-mcp --features mcp
# Add to PATH (or symlink)
cp target/release/llmx-mcp ~/.local/bin/
```
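
A quick sanity check after copying (this assumes `~/.local/bin` is already on your `PATH`; adjust if you installed the binary elsewhere):

```sh
# Should print the resolved path of the freshly installed binary
command -v llmx-mcp || echo "llmx-mcp not on PATH -- add ~/.local/bin to PATH"
```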

The plugin includes pre-built indexes. To rebuild:

```sh
export LLMX_STORAGE_DIR=/path/to/burn-plugin/data/indexes
llmx-mcp   # Then send index commands via JSON-RPC
```
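
The exact index commands are specific to llmx-mcp and not listed here; as a hypothetical sketch, you can at least discover them by driving the server over stdio with the standard MCP JSON-RPC handshake (`initialize`, `notifications/initialized`, and `tools/list` are part of the MCP spec; the client name and version below are placeholders):

```sh
# Hypothetical sketch: list the tools the MCP server exposes over stdio.
# The actual indexing tool names and arguments come back in the tools/list reply.
# (LLMX_STORAGE_DIR exported above still applies in this shell.)
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"manual-client","version":"0.1.0"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}' \
  | llmx-mcp
```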

Then load the plugin into Claude Code:

```sh
# Option A: Test with the plugin directory flag
claude --plugin-dir /path/to/burn-plugin

# Option B: Create a marketplace wrapper and install
# See the Claude Code plugin documentation for marketplace setup
```

```
burn-expert/
├── .claude-plugin/plugin.json # Plugin manifest
├── .mcp.json # MCP server config (llmx)
├── commands/ # Slash commands
├── skills/ # Domain knowledge & specialized skills
│ ├── burn-router/ # Central routing
│ ├── burn-app-dev/ # Tensors, modules, config
│ ├── burn-training/ # Learner, metrics, loops
│ ├── burn-onnx/ # Model import
│ ├── burn-backends/ # Backend selection
│ ├── burn-contrib/ # Contributing to Burn
│ ├── burn-debugger/ # Error diagnosis
│ ├── burn-reviewer/ # Code review
│ └── burn-onnx-surgeon/ # ONNX import fixes
├── hooks/ # Auto-verification
└── data/ # Documentation corpus
    ├── burn-book/
    ├── burn-contributor-book/
    └── indexes/ # llmx search indexes
```

```
/burn-new my-project wgpu # Scaffold new Burn project
/burn-train mnist # Add MNIST training loop
/burn-import model.onnx # Import ONNX model
/burn-smoke # Format, check, test
```
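
For reference, the format/check/test pass that `/burn-smoke` performs corresponds roughly to running the standard cargo steps by hand (a sketch; the command itself may do more):

```sh
cargo fmt --all            # format the workspace
cargo check --all-targets  # compile check, including tests and examples
cargo test                 # run the test suite
```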
Ask about:
- Tensor operations, shapes, types
- Module definition and configuration
- Training loops and metrics
- ONNX/PyTorch model import
- Backend selection and custom kernels
- Contributing to Burn

Specialized agents:
- burn-debugger: Diagnose runtime errors, shape mismatches
- burn-reviewer: Code review for Burn idioms and best practices
- burn-onnx-surgeon: Fix ONNX import failures

Design principles:
- Version-aware: Detects the project's Burn version and warns on drift
- Evidence-seeking: Cites documentation chunks, not assumptions
- Self-verifying: Runs `cargo check` after every code generation
Documentation from Burn 0.16.x (burn-book + burn-contributor-book).