68 changes: 68 additions & 0 deletions docs/adr-guide.md
@@ -0,0 +1,68 @@
# ADR Integration Guide

Architecture Decision Records capture the *why* behind significant technical decisions. This guide covers when to write them, where to store them, and how they integrate with flow-code.

## When to Create an ADR

Write an ADR when making a decision that would be expensive to reverse:

- **Technology choices** — frameworks, libraries, major dependencies
- **Architectural patterns** — data model design, API style, auth strategy
- **Infrastructure decisions** — hosting, build tools, deployment approach
- **Pattern changes** — moving from one approach to another (e.g., REST to GraphQL)

Do NOT write ADRs for routine implementation choices, obvious decisions, or throwaway prototypes.

## Where to Store ADRs

Use `docs/decisions/` with sequential numbering:

```
docs/decisions/
ADR-001-use-libsql-for-storage.md
ADR-002-wave-checkpoint-execution-model.md
ADR-003-teams-file-locking-protocol.md
```

Use the template at `references/adr-template.md` as your starting point.
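
As a hypothetical sketch of what the filled-in template might look like (section names are illustrative assumptions; the authoritative fields live in `references/adr-template.md`):

```
# ADR-001: Use libSQL for storage

Status: ACCEPTED

## Context
Why a decision was needed, and the constraints in play.

## Decision
What was decided, stated in one or two sentences.

## Consequences
What becomes easier or harder as a result, including trade-offs accepted.
```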

## Referencing ADRs in Task Specs

When a task implements or depends on an architectural decision, reference the ADR in the task spec:

```bash
flowctl task create --title "Implement file locking" \
--spec "Implements ADR-003. See docs/decisions/ADR-003-teams-file-locking-protocol.md"
```

In code comments, link to the ADR near the relevant implementation:

```
// File locking per ADR-003. See docs/decisions/ADR-003-teams-file-locking-protocol.md
```

## Integration with /flow-code:plan

During planning, ADRs surface naturally at two points:

1. **Plan creation** — When `/flow-code:plan` encounters an architectural decision, create the ADR as a task in the epic. The ADR task should complete before implementation tasks that depend on it.

2. **Plan review** — `/flow-code:plan-review` should verify that significant architectural decisions have corresponding ADRs. Missing ADRs are a review finding.

### Example: ADR as a Plan Task

```
Epic: fn-50-migrate-to-graphql
Task 1: Write ADR-005 documenting REST-to-GraphQL migration rationale
Task 2: Implement GraphQL schema (depends on Task 1)
Task 3: Migrate endpoints (depends on Task 2)
```

## ADR Lifecycle

```
PROPOSED -> ACCEPTED -> SUPERSEDED by ADR-XXX
                     -> DEPRECATED
```

Never delete old ADRs. When a decision changes, write a new ADR that supersedes the old one. The historical record is the whole point.
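
An illustrative sketch of the supersession convention (the ADR numbers and field names here are hypothetical, not from the actual template):

```
# In ADR-004-old-decision.md
Status: SUPERSEDED by ADR-007

# In ADR-007-new-decision.md
Status: ACCEPTED
Supersedes: ADR-004
```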
6 changes: 6 additions & 0 deletions flowctl/crates/flowctl-cli/src/commands/db_shim.rs
@@ -30,6 +30,12 @@ impl Connection {
    fn inner(&self) -> libsql::Connection {
        self.conn.clone()
    }

    /// Public accessor for modules that need the raw libsql connection
    /// (e.g. skill commands that call async repos directly).
    pub fn inner_conn(&self) -> libsql::Connection {
        self.conn.clone()
    }
}

fn block_on<F: std::future::Future>(fut: F) -> F::Output {
1 change: 1 addition & 0 deletions flowctl/crates/flowctl-cli/src/commands/mod.rs
@@ -18,4 +18,5 @@ pub mod rp;
pub mod stack;
pub mod stats;
pub mod task;
pub mod skill;
pub mod workflow;
212 changes: 212 additions & 0 deletions flowctl/crates/flowctl-cli/src/commands/skill.rs
@@ -0,0 +1,212 @@
//! Skill commands: register, match.
//!
//! `skill register` scans `skills/*/SKILL.md` files, extracts YAML
//! frontmatter (name + description), and upserts each into the DB with
//! a BGE-small embedding for semantic matching.
//!
//! `skill match` performs semantic vector search against registered
//! skills and returns ranked results.

use clap::Subcommand;
use serde::Deserialize;
use serde_json::json;

use crate::output::{error_exit, json_output, pretty_output};

use super::db_shim;

// ── CLI definition ─────────────────────────────────────────────────

#[derive(Subcommand, Debug)]
pub enum SkillCmd {
    /// Scan skills/*/SKILL.md and register into DB with embeddings.
    Register {
        /// Directory to scan (default: DROID_PLUGIN_ROOT or CLAUDE_PLUGIN_ROOT).
        #[arg(long)]
        dir: Option<String>,
    },
    /// Semantic search against registered skills.
    Match {
        /// Search query text.
        query: String,
        /// Maximum results to return.
        #[arg(long, default_value = "5")]
        limit: usize,
        /// Minimum cosine similarity threshold.
        #[arg(long, default_value = "0.70")]
        threshold: f64,
    },
}

// ── Frontmatter struct ─────────────────────────────────────────────

#[derive(Deserialize)]
struct SkillFrontmatter {
    name: String,
    description: String,
}

// ── Dispatch ───────────────────────────────────────────────────────

pub fn dispatch(cmd: &SkillCmd, json: bool) {
    match cmd {
        SkillCmd::Register { dir } => cmd_skill_register(json, dir.as_deref()),
        SkillCmd::Match {
            query,
            limit,
            threshold,
        } => cmd_skill_match(json, query, *limit, *threshold),
    }
}

// ── Register ───────────────────────────────────────────────────────

fn cmd_skill_register(json: bool, dir: Option<&str>) {
    // Resolve plugin root directory.
    let root = match dir {
        Some(d) => std::path::PathBuf::from(d),
        None => {
            if let Ok(d) = std::env::var("DROID_PLUGIN_ROOT") {
                std::path::PathBuf::from(d)
            } else if let Ok(d) = std::env::var("CLAUDE_PLUGIN_ROOT") {
                std::path::PathBuf::from(d)
            } else {
                error_exit("No --dir given and DROID_PLUGIN_ROOT / CLAUDE_PLUGIN_ROOT not set");
            }
        }
    };

    let skills_dir = root.join("skills");
    if !skills_dir.is_dir() {
        error_exit(&format!("Skills directory not found: {}", skills_dir.display()));
    }

    // Walk skills/*/SKILL.md
    let mut entries: Vec<(String, String, String)> = Vec::new(); // (name, description, path)
    let read_dir = std::fs::read_dir(&skills_dir).unwrap_or_else(|e| {
        error_exit(&format!("Cannot read {}: {e}", skills_dir.display()));
    });

    for entry in read_dir.flatten() {
        if !entry.file_type().map(|ft| ft.is_dir()).unwrap_or(false) {
            continue;
        }
        let skill_md = entry.path().join("SKILL.md");
        if !skill_md.is_file() {
            continue;
        }
        let content = match std::fs::read_to_string(&skill_md) {
            Ok(c) => c,
            Err(e) => {
                eprintln!("warn: cannot read {}: {e}", skill_md.display());
                continue;
            }
        };
        let fm: SkillFrontmatter =
            match flowctl_core::frontmatter::parse_frontmatter(&content) {
                Ok(f) => f,
                Err(e) => {
                    eprintln!(
                        "warn: cannot parse frontmatter in {}: {e}",
                        skill_md.display()
                    );
                    continue;
                }
            };
        entries.push((
            fm.name,
            fm.description,
            skill_md.to_string_lossy().to_string(),
        ));
    }

    // Upsert each skill into DB.
    let conn = db_shim::require_db().unwrap_or_else(|e| {
        error_exit(&format!("Cannot open DB: {e}"));
    });

    let rt = tokio::runtime::Builder::new_current_thread()
        .enable_all()
        .build()
        .expect("failed to create tokio runtime");

    let repo = flowctl_db::skill::SkillRepo::new(conn.inner_conn());

    for (name, desc, path) in &entries {
        rt.block_on(async {
            repo.upsert(name, desc, Some(path.as_str()))
                .await
                .unwrap_or_else(|e| {
                    eprintln!("warn: failed to upsert skill '{}': {e}", name);
                });
        });
    }

    let skills_json: Vec<serde_json::Value> = entries
        .iter()
        .map(|(n, d, _)| json!({"name": n, "description": d}))
        .collect();

    if json {
        json_output(json!({
            "registered": entries.len(),
            "skills": skills_json,
        }));
    } else {
        pretty_output("skill_register", &format!("Registered {} skills", entries.len()));
        for (name, desc, _) in &entries {
            pretty_output("skill_register", &format!("  {} — {}", name, desc));
        }
    }
}

// ── Match ──────────────────────────────────────────────────────────

fn cmd_skill_match(json: bool, query: &str, limit: usize, threshold: f64) {
    let conn = db_shim::require_db().unwrap_or_else(|e| {
        error_exit(&format!("Cannot open DB: {e}"));
    });

    let rt = tokio::runtime::Builder::new_current_thread()
        .enable_all()
        .build()
        .expect("failed to create tokio runtime");

    let repo = flowctl_db::skill::SkillRepo::new(conn.inner_conn());
    let matches = rt.block_on(async {
        repo.match_skills(query, limit, threshold)
            .await
            .unwrap_or_else(|e| {
                error_exit(&format!("match_skills failed: {e}"));
            })
    });

    if json {
        let out: Vec<serde_json::Value> = matches
            .iter()
            .map(|m| {
                json!({
                    "name": m.name,
                    "description": m.description,
                    "score": (m.score * 100.0).round() / 100.0,
                })
            })
            .collect();
        json_output(json!(out));
    } else {
        if matches.is_empty() {
            pretty_output("skill_match", "No matching skills found.");
            return;
        }
        pretty_output(
            "skill_match",
            &format!("  {:<6} {:<28} {}", "Score", "Name", "Description"),
        );
        for m in &matches {
            pretty_output(
                "skill_match",
                &format!("  {:<6.2} {:<28} {}", m.score, m.name, m.description),
            );
        }
    }
}
7 changes: 7 additions & 0 deletions flowctl/crates/flowctl-cli/src/main.rs
@@ -23,6 +23,7 @@ use commands::{
    query,
    ralph::RalphCmd,
    rp::RpCmd,
    skill::SkillCmd,
    stack::{InvariantsCmd, StackCmd},
    stats::StatsCmd,
    task::TaskCmd,
@@ -216,6 +217,11 @@ enum Commands {
        #[command(subcommand)]
        cmd: RalphCmd,
    },
    /// Skill registry commands (register, match).
    Skill {
        #[command(subcommand)]
        cmd: SkillCmd,
    },
    /// RepoPrompt helpers.
    Rp {
        #[command(subcommand)]
@@ -484,6 +490,7 @@ fn main() {
        Commands::Stack { cmd } => commands::stack::dispatch(&cmd, json),
        Commands::Invariants { cmd } => commands::stack::dispatch_invariants(&cmd, json),
        Commands::Ralph { cmd } => commands::ralph::dispatch(&cmd, json),
        Commands::Skill { cmd } => commands::skill::dispatch(&cmd, json),
        Commands::Rp { cmd } => commands::rp::dispatch(&cmd, json),
        Commands::Codex { cmd } => commands::codex::dispatch(&cmd, json),
        Commands::Hook { cmd } => commands::hook::dispatch(&cmd),
2 changes: 2 additions & 0 deletions flowctl/crates/flowctl-db/src/lib.rs
@@ -24,12 +24,14 @@ pub mod memory;
pub mod metrics;
pub mod pool;
pub mod repo;
pub mod skill;

pub use error::DbError;
pub use indexer::{reindex, ReindexResult};
pub use events::{EventLog, TaskTokenSummary, TokenRecord, TokenUsageRow};
pub use memory::{MemoryEntry, MemoryFilter, MemoryRepo};
pub use metrics::StatsQuery;
pub use skill::{SkillEntry, SkillMatch, SkillRepo};
pub use pool::{cleanup, open_async, open_memory_async, resolve_db_path, resolve_libsql_path, resolve_state_dir};
pub use repo::{
    DepRepo, EpicRepo, EventRepo, EventRow, EvidenceRepo, FileLockRepo, FileOwnershipRepo,
6 changes: 3 additions & 3 deletions flowctl/crates/flowctl-db/src/memory.rs
@@ -88,7 +88,7 @@ static EMBEDDER: OnceCell<Result<Mutex<TextEmbedding>, String>> = OnceCell::cons
/// model (~130MB) via fastembed; subsequent calls return the cached
/// instance. Initialization runs on a blocking thread because fastembed
/// performs synchronous file I/O.
pub(crate) async fn ensure_embedder() -> Result<(), DbError> {
    let res = EMBEDDER
        .get_or_init(|| async {
            match tokio::task::spawn_blocking(|| {
@@ -109,7 +109,7 @@ async fn ensure_embedder() -> Result<(), DbError> {
}

/// Embed a single passage into a 384-dim vector.
pub(crate) async fn embed_one(text: &str) -> Result<Vec<f32>, DbError> {
    ensure_embedder().await?;
    let text = text.to_string();
    let result = tokio::task::spawn_blocking(move || {
@@ -132,7 +132,7 @@ async fn embed_one(text: &str) -> Result<Vec<f32>, DbError> {
}

/// Convert a `Vec<f32>` into a libSQL `vector32()` literal string.
pub(crate) fn vec_to_literal(v: &[f32]) -> String {
    let parts: Vec<String> = v.iter().map(std::string::ToString::to_string).collect();
    format!("[{}]", parts.join(","))
}