diff --git a/README.md b/README.md index 010d42b2..3e976e4c 100644 --- a/README.md +++ b/README.md @@ -57,6 +57,7 @@ Skill registries install skills. APM manages **every primitive** your AI agents | **Skills** | AI capabilities, workflows | Form builder, code reviewer | | **Prompts** | Reusable slash commands | `/security-audit`, `/design-review` | | **Agents** | Specialized personas | Accessibility auditor, API designer | +| **Hooks** | Lifecycle event handlers | Pre-tool validation, post-tool linting | | **MCP Servers** | Tool integrations | Database access, API connectors | All declared in one manifest. All installed with one command — including transitive dependencies: @@ -119,7 +120,8 @@ my-standards/ ├── instructions/ # Guardrails (.instructions.md) ├── prompts/ # Slash commands (.prompt.md) ├── skills/ # Agent Skills (SKILL.md) - └── agents/ # Personas (.agent.md) + ├── agents/ # Personas (.agent.md) + └── hooks/ # Event handlers (.json) ``` Add a guardrail and publish: diff --git a/docs/cli-reference.md b/docs/cli-reference.md index 5c6a1cdf..6daacf95 100644 --- a/docs/cli-reference.md +++ b/docs/cli-reference.md @@ -168,6 +168,8 @@ apm install --exclude codex - **Claude Skills**: Repositories with `SKILL.md` (auto-generates `apm.yml` upon installation) - Example: `apm install ComposioHQ/awesome-claude-skills/brand-guidelines` - Skills are transformed to `.github/agents/*.agent.md` for VSCode target +- **Hook Packages**: Repositories with `hooks/*.json` (no `apm.yml` or `SKILL.md` required) + - Example: `apm install anthropics/claude-plugins-official/plugins/hookify` - **Virtual Packages**: Single files or collections installed directly from URLs - Single `.prompt.md` or `.agent.md` files from any GitHub repository - Collections from curated sources (e.g., `github/awesome-copilot`) @@ -215,6 +217,7 @@ When you run `apm install`, APM automatically integrates primitives from install - **Chatmodes**: `.chatmode.md` files → `.github/agents/*-apm.agent.md` 
(renamed to modern format) - **Control**: Disable with `apm config set auto-integrate false` - **Smart updates**: Only updates when package version/commit changes +- **Hooks**: Hook `.json` files → `.github/hooks/*-apm.json` with scripts bundled - **Naming**: Integrated files use `-apm` suffix (e.g., `accessibility-audit-apm.prompt.md`) - **GitIgnore**: Pattern `*-apm.prompt.md` automatically added to `.gitignore` @@ -224,6 +227,7 @@ APM also integrates with Claude Code when `.claude/` directory exists: - **Agents**: `.agent.md` and `.chatmode.md` files → `.claude/agents/*-apm.md` - **Commands**: `.prompt.md` files → `.claude/commands/*-apm.md` +- **Hooks**: Hook definitions merged into `.claude/settings.json` hooks key **Skill Integration:** @@ -278,6 +282,8 @@ apm uninstall microsoft/apm-sample-package --dry-run | Integrated chatmodes | `.github/agents/*-apm.agent.md` | | Claude commands | `.claude/commands/*-apm.md` | | Skill folders | `.github/skills/{folder-name}/` | +| Integrated hooks | `.github/hooks/*-apm.json` | +| Claude hook settings | `.claude/settings.json` (hooks key cleaned) | | Lockfile entries | `apm.lock` (removed packages + orphaned transitives) | **Behavior:** diff --git a/docs/compilation.md b/docs/compilation.md index cfff5149..5f7fa76b 100644 --- a/docs/compilation.md +++ b/docs/compilation.md @@ -41,7 +41,7 @@ target: vscode # or claude, or all | `all` | Both `AGENTS.md` and `CLAUDE.md` | Universal compatibility | | `minimal` | `AGENTS.md` only | Works everywhere, no folder integration | -> **Note**: `AGENTS.md` and `CLAUDE.md` contain **only instructions** (grouped by `applyTo` patterns). Prompts, agents, commands, and skills are integrated by `apm install`, not `apm compile`. See the [Integrations Guide](integrations.md) for details on how `apm install` populates `.github/prompts/`, `.github/agents/`, `.github/skills/`, and `.claude/commands/`. 
+> **Note**: `AGENTS.md` and `CLAUDE.md` contain **only instructions** (grouped by `applyTo` patterns). Prompts, agents, commands, hooks, and skills are integrated by `apm install`, not `apm compile`. See the [Integrations Guide](integrations.md) for details on how `apm install` populates `.github/prompts/`, `.github/agents/`, `.github/skills/`, and `.claude/commands/`. ### How It Works diff --git a/docs/concepts.md b/docs/concepts.md index 8516299d..519e8d1e 100644 --- a/docs/concepts.md +++ b/docs/concepts.md @@ -157,6 +157,7 @@ Package your prompt engineering into reusable, configurable components: - **Agents** (.agent.md) - AI assistant personalities - **Skills** (SKILL.md) - Package meta-guides for AI agents - **Context** (.context.md) - Project knowledge base +- **Hooks** (.json) - Lifecycle event handlers ### Layer 3: Context Engineering @@ -234,6 +235,20 @@ Optimized project knowledge for AI consumption: - Event-driven communication between services ``` +### Hooks (.json) +Lifecycle event handlers that run scripts at specific points during AI operations: + +```json +{ + "hooks": { + "PostToolUse": [{ + "matcher": { "tool_name": "write_file" }, + "hooks": [{ "type": "command", "command": "./scripts/lint.sh" }] + }] + } +} +``` + ## Universal Compatibility APM generates context files for all major coding agents: diff --git a/docs/dependencies.md b/docs/dependencies.md index 9f172666..12d71810 100644 --- a/docs/dependencies.md +++ b/docs/dependencies.md @@ -18,8 +18,7 @@ APM supports multiple dependency types: | Type | Detection | Example | |------|-----------|---------| | **APM Package** | Has `apm.yml` | `microsoft/apm-sample-package` | -| **Claude Skill** | Has `SKILL.md` (no `apm.yml`) | `ComposioHQ/awesome-claude-skills/brand-guidelines` | -| **Virtual Subdirectory Package** | Folder path in monorepo | `ComposioHQ/awesome-claude-skills/mcp-builder` | +| **Claude Skill** | Has `SKILL.md` (no `apm.yml`) | `ComposioHQ/awesome-claude-skills/brand-guidelines` 
| +| **Hook Package** | Has `hooks/*.json` (no `apm.yml` or `SKILL.md`) | `anthropics/claude-plugins-official/plugins/hookify` | +| **Virtual Subdirectory Package** | Folder path in monorepo | `ComposioHQ/awesome-claude-skills/mcp-builder` | | **Virtual Subdirectory Package** | Folder path in repo | `github/awesome-copilot/skills/review-and-refactor` | | **ADO Package** | Azure DevOps repo | `dev.azure.com/org/project/_git/repo` | diff --git a/docs/integrations.md b/docs/integrations.md index a152fd9f..9323e7d5 100644 --- a/docs/integrations.md +++ b/docs/integrations.md @@ -152,6 +152,9 @@ apm install microsoft/apm-sample-package # Agents are automatically integrated to: # .github/agents/*-apm.agent.md (verbatim copy) + +# Hooks are automatically integrated to: +# .github/hooks/*-apm.json (hook definitions with rewritten script paths) ``` **How Auto-Integration Works**: @@ -163,13 +166,14 @@ apm install microsoft/apm-sample-package **Integration Flow**: 1. Run `apm install` to fetch APM packages -2. APM automatically creates `.github/prompts/` and `.github/agents/` directories if needed -3. Discovers `.prompt.md` and `.agent.md` files in each package +2. APM automatically creates `.github/prompts/`, `.github/agents/`, and `.github/hooks/` directories if needed +3. Discovers `.prompt.md`, `.agent.md`, and hook `.json` files in each package 4. Copies prompts to `.github/prompts/` with `-apm` suffix (e.g., `accessibility-audit-apm.prompt.md`) 5. Copies agents to `.github/agents/` with `-apm` suffix (e.g., `security-apm.agent.md`) -6. Updates `.gitignore` to exclude integrated prompts and agents -7. VSCode automatically loads all prompts and agents for your coding agents -8. Run `apm uninstall` to automatically remove integrated prompts and agents +6. Copies hooks to `.github/hooks/` with `-apm` suffix (e.g., `hookify-hooks-apm.json`) and copies referenced scripts +7. Updates `.gitignore` to exclude integrated prompts, agents, and hooks +8.
VSCode automatically loads all prompts, agents, and hooks for your coding agents +9. Run `apm uninstall` to automatically remove integrated primitives **Intent-First Discovery**: The `-apm` suffix pattern enables natural autocomplete in VSCode: @@ -222,10 +226,11 @@ When you run `apm compile`, APM generates Claude-native files: When you run `apm install`, APM integrates package primitives into Claude's native structure: | Location | Purpose | -|----------|---------|| +|----------|---------| | `.claude/agents/*-apm.md` | Sub-agents from installed packages (from `.agent.md` files) | | `.claude/commands/*-apm.md` | Slash commands from installed packages (from `.prompt.md` files) | | `.claude/skills/{folder}/` | Skills from packages with `SKILL.md` or `.apm/` primitives | +| `.claude/settings.json` (hooks key) | Hooks from installed packages (merged into settings) | ### Automatic Agent Integration @@ -286,6 +291,33 @@ apm install ComposioHQ/awesome-claude-skills/mcp-builder 3. Updates `.gitignore` to exclude integrated skills 4. `apm uninstall` removes the skill folder +### Automatic Hook Integration + +APM automatically integrates hooks from installed packages. Hooks define lifecycle event handlers (e.g., `PreToolUse`, `PostToolUse`, `Stop`) supported by both VSCode Copilot and Claude Code. + +> **Note:** Hook packages must be authored in the target platform's native format. APM handles path rewriting and file placement but does not translate between hook schema formats (e.g., Claude's `command` key vs GitHub Copilot's `bash`/`powershell` keys, or event name casing differences). 
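Since APM does not translate between the two schemas, a hook file effectively targets one platform. As an illustrative sketch only (the `detect_hook_schema` helper below is hypothetical, not an APM API), the structural differences the note describes — nested matcher groups with a `command` key and PascalCase events for Claude Code, flat arrays with `bash`/`powershell` keys and camelCase events for Copilot — are enough to guess a file's target:

```python
import json

def detect_hook_schema(hook_json: str) -> str:
    """Best-effort guess at which platform a hook file targets.

    Heuristics (hypothetical helper, based on the two documented formats):
    - Copilot entries carry "bash"/"powershell" keys in flat arrays.
    - Claude entries nest an inner "hooks" array or use a "command" key.
    - Event-name casing ("PreToolUse" vs "preToolUse") is the tiebreaker.
    """
    data = json.loads(hook_json)
    for event, entries in data.get("hooks", {}).items():
        for entry in entries:
            if "bash" in entry or "powershell" in entry:
                return "copilot"
            if "hooks" in entry or "command" in entry:
                return "claude"
        if event and event[0].islower():
            return "copilot"
        if event and event[0].isupper():
            return "claude"
    return "unknown"

claude_hook = '{"hooks": {"PreToolUse": [{"hooks": [{"type": "command", "command": "./scripts/validate.sh"}]}]}}'
copilot_hook = '{"version": 1, "hooks": {"preToolUse": [{"type": "command", "bash": "./scripts/validate.sh"}]}}'
print(detect_hook_schema(claude_hook))   # claude
print(detect_hook_schema(copilot_hook))  # copilot
```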
+ +```bash +# Install a package with hooks +apm install anthropics/claude-plugins-official/plugins/hookify + +# VSCode result (.github/hooks/): +# .github/hooks/hookify-hooks-apm.json → Hook definitions +# .github/hooks/scripts/hookify/hooks/*.py → Referenced scripts + +# Claude result (.claude/settings.json): +# Hooks merged into .claude/settings.json hooks key +# Scripts copied to .claude/hooks/hookify/ +``` + +**How hook integration works:** +1. `apm install` discovers hook JSON files in `.apm/hooks/` or `hooks/` directories +2. For VSCode: copies hook JSON to `.github/hooks/` with `-apm` suffix and rewrites script paths +3. For Claude: merges hook definitions into `.claude/settings.json` under the `hooks` key +4. Copies referenced scripts to the target location +5. Rewrites `${CLAUDE_PLUGIN_ROOT}` and relative script paths for the target platform +6. `apm uninstall` removes hook files and cleans up merged settings + ### Target-Specific Compilation Generate only Claude formats when needed: diff --git a/docs/primitives.md b/docs/primitives.md index 043b9f5d..a629d5a9 100644 --- a/docs/primitives.md +++ b/docs/primitives.md @@ -69,12 +69,13 @@ apm run review-copilot --param files="src/auth/" ## Overview -The APM CLI supports four types of primitives: +The APM CLI supports the following types of primitives: - **Agents** (`.agent.md`) - Define AI assistant personalities and behaviors (legacy: `.chatmode.md`) - **Instructions** (`.instructions.md`) - Provide coding standards and guidelines for specific file types - **Skills** (`SKILL.md`) - Package meta-guides that help AI agents understand what a package does - **Context** (`.context.md`, `.memory.md`) - Supply background information and project context +- **Hooks** (`.json` in `.apm/hooks/` or `hooks/`) - Define lifecycle event handlers with script references > **Note**: Both `.agent.md` (new format) and `.chatmode.md` (legacy format) are fully supported. 
VSCode provides Quick Fix actions to help migrate from `.chatmode.md` to `.agent.md`. @@ -95,8 +96,12 @@ APM discovers primitives in these locations: │ └── *.instructions.md ├── context/ # Project context files │ └── *.context.md -└── memory/ # Team info, contacts, etc. - └── *.memory.md +├── memory/ # Team info, contacts, etc. +│ └── *.memory.md +└── hooks/ # Lifecycle event handlers + ├── *.json # Hook definitions (JSON) + └── scripts/ # Referenced scripts + └── *.sh, *.py # VSCode-compatible structure .github/ @@ -117,7 +122,7 @@ APM discovers primitives in these locations: ## Component Types Overview -Context implements the complete [AI-Native Development framework](https://danielmeppiel.github.io/awesome-ai-native/docs/concepts/) through four core component types: +Context implements the complete [AI-Native Development framework](https://danielmeppiel.github.io/awesome-ai-native/docs/concepts/) through the following core component types: ### Instructions (.instructions.md) **Context Engineering Layer** - Targeted guidance by file type and domain @@ -340,6 +345,38 @@ Team information (`.apm/memory/team-contacts.memory.md`): - Sprint planning: Mondays 2:00 PM PST ``` +### Hooks + +Hooks define lifecycle event handlers that run scripts at specific points during AI agent operations (e.g., before/after tool use). 
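Discovery is a simple glob over the two conventional locations, with `.apm/hooks/` checked first and duplicates skipped. A minimal sketch mirroring the integrator's `find_hook_files` lookup (the demo file names are hypothetical):

```python
import tempfile
from pathlib import Path

def find_hook_files(package_path: Path) -> list:
    """Collect hook JSON files from .apm/hooks/ and hooks/,
    skipping files already seen once resolved."""
    hook_files, seen = [], set()
    for hooks_dir in (package_path / ".apm" / "hooks", package_path / "hooks"):
        if hooks_dir.is_dir():
            for f in sorted(hooks_dir.glob("*.json")):
                resolved = f.resolve()
                if resolved not in seen:
                    seen.add(resolved)
                    hook_files.append(f)
    return hook_files

# Demo against a throwaway package layout (hypothetical file names):
with tempfile.TemporaryDirectory() as tmp:
    pkg = Path(tmp)
    (pkg / ".apm" / "hooks").mkdir(parents=True)
    (pkg / "hooks").mkdir()
    (pkg / ".apm" / "hooks" / "format-on-save.json").write_text("{}")
    (pkg / "hooks" / "lint.json").write_text("{}")
    discovered = [f.name for f in find_hook_files(pkg)]

print(discovered)  # ['format-on-save.json', 'lint.json']
```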
+ +**Format:** `.json` files in `hooks/` or `.apm/hooks/` + +**Structure:** +```json +{ + "hooks": { + "PostToolUse": [ + { + "matcher": { "tool_name": "write_file" }, + "hooks": [ + { + "type": "command", + "command": "./scripts/lint-changed.sh $TOOL_INPUT_path" + } + ] + } + ] + } +} +``` + +**Supported Events:** `PreToolUse`, `PostToolUse`, `Stop`, `Notification`, `SubagentStop` + +**Integration:** +- VSCode: Hook JSON files are copied to `.github/hooks/*-apm.json` with script paths rewritten +- Claude: Hooks are merged into `.claude/settings.json` under the `hooks` key +- Scripts referenced by hooks are bundled alongside the hook definitions + ## Discovery and Parsing The APM CLI automatically discovers and parses all primitive files in your project. diff --git a/docs/skills.md b/docs/skills.md index c1d89bee..059f9916 100644 --- a/docs/skills.md +++ b/docs/skills.md @@ -276,6 +276,7 @@ APM automatically detects package types: |-----|------|-----------| | `apm.yml` only | APM Package | Standard APM primitives | | `SKILL.md` only | Claude Skill | Treated as native skill | +| `hooks/*.json` only | Hook Package | Hook handlers only | | Both files | Hybrid Package | Best of both worlds | ## Target Detection diff --git a/src/apm_cli/cli.py b/src/apm_cli/cli.py index 6c2827f4..874376b7 100644 --- a/src/apm_cli/cli.py +++ b/src/apm_cli/cli.py @@ -1317,13 +1317,15 @@ def _find_transitive_orphans(lockfile, removed_urls): agents_cleaned = 0 commands_cleaned = 0 skills_cleaned = 0 + hooks_cleaned = 0 try: - from apm_cli.models.apm_package import APMPackage, PackageInfo, PackageType, validate_package + from apm_cli.models.apm_package import APMPackage, PackageInfo, PackageType, validate_apm_package from apm_cli.integration.prompt_integrator import PromptIntegrator from apm_cli.integration.agent_integrator import AgentIntegrator from apm_cli.integration.skill_integrator import SkillIntegrator from apm_cli.integration.command_integrator import CommandIntegrator + from 
apm_cli.integration.hook_integrator import HookIntegrator apm_package = APMPackage.from_apm_yml(Path("apm.yml")) project_root = Path(".") @@ -1354,11 +1356,17 @@ def _find_transitive_orphans(lockfile, removed_urls): result = integrator.sync_integration(apm_package, project_root) commands_cleaned = result.get("files_removed", 0) + # Clean hooks (.github/hooks/ and .claude/settings.json) + hook_integrator_cleanup = HookIntegrator() + result = hook_integrator_cleanup.sync_integration(apm_package, project_root) + hooks_cleaned = result.get("files_removed", 0) + # Phase 2: Re-integrate from remaining installed packages in apm_modules/ prompt_integrator = PromptIntegrator() agent_integrator = AgentIntegrator() skill_integrator = SkillIntegrator() command_integrator = CommandIntegrator() + hook_integrator_reint = HookIntegrator() for dep in apm_package.get_apm_dependencies(): dep_ref = dep if hasattr(dep, 'repo_url') else None @@ -1372,7 +1380,7 @@ def _find_transitive_orphans(lockfile, removed_urls): continue # Build minimal PackageInfo for re-integration - result = validate_package(install_path) + result = validate_apm_package(install_path) pkg = result.package if result and result.package else None if not pkg: continue @@ -1393,6 +1401,8 @@ def _find_transitive_orphans(lockfile, removed_urls): skill_integrator.integrate_package_skill(pkg_info, project_root) if command_integrator.should_integrate(project_root): command_integrator.integrate_package_commands(pkg_info, project_root) + hook_integrator_reint.integrate_package_hooks(pkg_info, project_root) + hook_integrator_reint.integrate_package_hooks_claude(pkg_info, project_root) except Exception: pass # Best effort re-integration @@ -1408,6 +1418,8 @@ def _find_transitive_orphans(lockfile, removed_urls): _rich_info(f"✓ Cleaned up {skills_cleaned} skill(s)") if commands_cleaned > 0: _rich_info(f"✓ Cleaned up {commands_cleaned} command(s)") + if hooks_cleaned > 0: + _rich_info(f"✓ Cleaned up {hooks_cleaned} hook(s)") # 
Final summary summary_lines = [] @@ -1617,15 +1629,18 @@ def matches_filter(dep): agent_integrator = AgentIntegrator() from apm_cli.integration.skill_integrator import SkillIntegrator, should_install_skill from apm_cli.integration.command_integrator import CommandIntegrator + from apm_cli.integration.hook_integrator import HookIntegrator skill_integrator = SkillIntegrator() command_integrator = CommandIntegrator() + hook_integrator = HookIntegrator() total_prompts_integrated = 0 total_agents_integrated = 0 total_skills_integrated = 0 total_sub_skills_promoted = 0 total_instructions_found = 0 total_commands_integrated = 0 + total_hooks_integrated = 0 total_links_resolved = 0 # Collect installed packages for lockfile generation @@ -1873,6 +1888,26 @@ def matches_filter(dep): f" └─ {command_result.files_updated} commands updated" ) total_links_resolved += command_result.links_resolved + + # Hook integration (target-aware) + if integrate_vscode: + hook_result = hook_integrator.integrate_package_hooks( + cached_package_info, project_root + ) + if hook_result.hooks_integrated > 0: + total_hooks_integrated += hook_result.hooks_integrated + _rich_info( + f" └─ {hook_result.hooks_integrated} hook(s) integrated → .github/hooks/" + ) + if integrate_claude: + hook_result_claude = hook_integrator.integrate_package_hooks_claude( + cached_package_info, project_root + ) + if hook_result_claude.hooks_integrated > 0: + total_hooks_integrated += hook_result_claude.hooks_integrated + _rich_info( + f" └─ {hook_result_claude.hooks_integrated} hook(s) integrated → .claude/settings.json" + ) except Exception as e: # Don't fail installation if integration fails _rich_warning( @@ -2058,6 +2093,26 @@ def matches_filter(dep): f" └─ {command_result.files_updated} commands updated" ) total_links_resolved += command_result.links_resolved + + # Hook integration (target-aware) + if integrate_vscode: + hook_result = hook_integrator.integrate_package_hooks( + package_info, project_root + ) + if 
hook_result.hooks_integrated > 0: + total_hooks_integrated += hook_result.hooks_integrated + _rich_info( + f" └─ {hook_result.hooks_integrated} hook(s) integrated → .github/hooks/" + ) + if integrate_claude: + hook_result_claude = hook_integrator.integrate_package_hooks_claude( + package_info, project_root + ) + if hook_result_claude.hooks_integrated > 0: + total_hooks_integrated += hook_result_claude.hooks_integrated + _rich_info( + f" └─ {hook_result_claude.hooks_integrated} hook(s) integrated → .claude/settings.json" + ) except Exception as e: # Don't fail installation if integration fails _rich_warning(f" ⚠ Failed to integrate primitives: {e}") @@ -2126,6 +2181,17 @@ def matches_filter(dep): except Exception as e: _rich_warning(f"Could not update .gitignore for Claude agents: {e}") + # Update .gitignore for integrated hooks if any were integrated + if integrate_vscode and total_hooks_integrated > 0: + try: + updated = hook_integrator.update_gitignore(project_root) + if updated: + _rich_info( + "Updated .gitignore for integrated hooks (*-apm.json)" + ) + except Exception as e: + _rich_warning(f"Could not update .gitignore for hooks: {e}") + # Show link resolution stats if any were resolved if total_links_resolved > 0: _rich_info(f"✓ Resolved {total_links_resolved} context file links") @@ -2134,6 +2200,12 @@ def matches_filter(dep): if total_commands_integrated > 0: _rich_info(f"✓ Integrated {total_commands_integrated} command(s)") + # Show hooks stats if any were integrated + if total_hooks_integrated > 0: + _rich_info(f"✓ Integrated {total_hooks_integrated} hook(s)") + + _rich_success(f"Installed {installed_count} APM dependencies") + return installed_count, total_prompts_integrated, total_agents_integrated except Exception as e: diff --git a/src/apm_cli/commands/deps.py b/src/apm_cli/commands/deps.py index 8968c4ae..a2c32dbb 100644 --- a/src/apm_cli/commands/deps.py +++ b/src/apm_cli/commands/deps.py @@ -170,6 +170,7 @@ def list_packages(): 
table.add_column("Instructions", style="green", justify="center") table.add_column("Agents", style="cyan", justify="center") table.add_column("Skills", style="yellow", justify="center") + table.add_column("Hooks", style="red", justify="center") for pkg in installed_packages: p = pkg['primitives'] @@ -181,6 +182,7 @@ def list_packages(): str(p.get('instructions', 0)) if p.get('instructions', 0) > 0 else "-", str(p.get('agents', 0)) if p.get('agents', 0) > 0 else "-", str(p.get('skills', 0)) if p.get('skills', 0) > 0 else "-", + str(p.get('hooks', 0)) if p.get('hooks', 0) > 0 else "-", ) console.print(table) @@ -194,8 +196,8 @@ def list_packages(): else: # Fallback text table click.echo("📋 APM Dependencies:") - click.echo(f"{'Package':<30} {'Version':<10} {'Source':<12} {'Prompts':>7} {'Instr':>7} {'Agents':>7} {'Skills':>7}") - click.echo("-" * 90) + click.echo(f"{'Package':<30} {'Version':<10} {'Source':<12} {'Prompts':>7} {'Instr':>7} {'Agents':>7} {'Skills':>7} {'Hooks':>7}") + click.echo("-" * 98) for pkg in installed_packages: p = pkg['primitives'] @@ -206,7 +208,8 @@ def list_packages(): instructions = str(p.get('instructions', 0)) if p.get('instructions', 0) > 0 else "-" agents = str(p.get('agents', 0)) if p.get('agents', 0) > 0 else "-" skills = str(p.get('skills', 0)) if p.get('skills', 0) > 0 else "-" - click.echo(f"{name:<30} {version:<10} {source:<12} {prompts:>7} {instructions:>7} {agents:>7} {skills:>7}") + hooks = str(p.get('hooks', 0)) if p.get('hooks', 0) > 0 else "-" + click.echo(f"{name:<30} {version:<10} {source:<12} {prompts:>7} {instructions:>7} {agents:>7} {skills:>7} {hooks:>7}") # Show orphaned packages warning if orphaned_packages: @@ -519,6 +522,11 @@ def info(package: str): else: content_lines.append(" • No agent workflows found") + if package_info.get('hooks', 0) > 0: + content_lines.append("") + content_lines.append("[bold]Hooks:[/bold]") + content_lines.append(f" • {package_info['hooks']} hook file(s)") + content = 
"\n".join(content_lines) panel = Panel(content, title=f"ℹ️ Package Info: {package}", border_style="cyan") console.print(panel) @@ -549,6 +557,11 @@ def info(package: str): click.echo(f" • {package_info['workflows']} executable workflows") else: click.echo(" • No agent workflows found") + + if package_info.get('hooks', 0) > 0: + click.echo("") + click.echo("Hooks:") + click.echo(f" • {package_info['hooks']} hook file(s)") except Exception as e: _rich_error(f"Error reading package information: {e}") @@ -563,7 +576,7 @@ def _count_primitives(package_path: Path) -> Dict[str, int]: Returns: dict: Counts for 'prompts', 'instructions', 'agents', 'skills' """ - counts = {'prompts': 0, 'instructions': 0, 'agents': 0, 'skills': 0} + counts = {'prompts': 0, 'instructions': 0, 'agents': 0, 'skills': 0, 'hooks': 0} apm_dir = package_path / ".apm" if apm_dir.exists(): @@ -591,6 +604,11 @@ def _count_primitives(package_path: Path) -> Dict[str, int]: if (package_path / "SKILL.md").exists(): counts['skills'] += 1 + # Count hooks (.json files in hooks/ or .apm/hooks/) + for hooks_dir in [package_path / "hooks", apm_dir / "hooks" if apm_dir.exists() else None]: + if hooks_dir and hooks_dir.exists() and hooks_dir.is_dir(): + counts['hooks'] += len(list(hooks_dir.glob("*.json"))) + return counts @@ -689,6 +707,7 @@ def _get_detailed_package_info(package_path: Path) -> Dict[str, Any]: if apm_yml_path.exists(): package = APMPackage.from_apm_yml(apm_yml_path) context_count, workflow_count = _count_package_files(package_path) + primitives = _count_primitives(package_path) return { 'name': package.name, 'version': package.version or 'unknown', @@ -697,10 +716,12 @@ def _get_detailed_package_info(package_path: Path) -> Dict[str, Any]: 'source': package.source or 'local', 'install_path': str(package_path.resolve()), 'context_files': _get_detailed_context_counts(package_path), - 'workflows': workflow_count + 'workflows': workflow_count, + 'hooks': primitives.get('hooks', 0) } else: 
context_count, workflow_count = _count_package_files(package_path) + primitives = _count_primitives(package_path) return { 'name': package_path.name, 'version': 'unknown', @@ -709,7 +730,8 @@ def _get_detailed_package_info(package_path: Path) -> Dict[str, Any]: 'source': 'unknown', 'install_path': str(package_path.resolve()), 'context_files': _get_detailed_context_counts(package_path), - 'workflows': workflow_count + 'workflows': workflow_count, + 'hooks': primitives.get('hooks', 0) } except Exception as e: return { @@ -720,7 +742,8 @@ def _get_detailed_package_info(package_path: Path) -> Dict[str, Any]: 'source': 'unknown', 'install_path': str(package_path.resolve()), 'context_files': {'instructions': 0, 'chatmodes': 0, 'contexts': 0}, - 'workflows': 0 + 'workflows': 0, + 'hooks': 0 } diff --git a/src/apm_cli/deps/package_validator.py b/src/apm_cli/deps/package_validator.py index 003a7a10..b06b7fae 100644 --- a/src/apm_cli/deps/package_validator.py +++ b/src/apm_cli/deps/package_validator.py @@ -89,6 +89,20 @@ def validate_package_structure(self, package_path: Path) -> ValidationResult: # Validate each primitive file for md_file in md_files: self._validate_primitive_file(md_file, result) + + # Check for hooks (JSON files, not markdown) + hooks_dir = apm_dir / "hooks" + if hooks_dir.exists() and hooks_dir.is_dir(): + json_files = list(hooks_dir.glob("*.json")) + if json_files: + has_primitives = True + + # Also check hooks/ at package root (Claude-native convention) + hooks_root_dir = package_path / "hooks" + if hooks_root_dir.exists() and hooks_root_dir.is_dir(): + json_files = list(hooks_root_dir.glob("*.json")) + if json_files: + has_primitives = True if not has_primitives: result.add_warning("No primitive files found in .apm/ directory") @@ -209,8 +223,20 @@ def get_package_info_summary(self, package_path: Path) -> Optional[str]: primitive_dir = apm_dir / primitive_type if primitive_dir.exists(): primitive_count += len(list(primitive_dir.glob("*.md"))) - - if 
primitive_count > 0: - summary += f" ({primitive_count} primitives)" + # Count hook files in .apm/hooks/ + hooks_dir = apm_dir / "hooks" + if hooks_dir.exists(): + primitive_count += len(list(hooks_dir.glob("*.json"))) + + # Also count hook files in hooks/ (Claude-native convention) + hooks_root_dir = package_path / "hooks" + if hooks_root_dir.exists(): + json_count = len(list(hooks_root_dir.glob("*.json"))) + # Avoid double-counting if .apm/hooks already counted + if not (apm_dir.exists() and (apm_dir / "hooks").exists()): + primitive_count += json_count + + if primitive_count > 0: + summary += f" ({primitive_count} primitives)" return summary \ No newline at end of file diff --git a/src/apm_cli/integration/__init__.py b/src/apm_cli/integration/__init__.py index 68aa8e29..efc3af2c 100644 --- a/src/apm_cli/integration/__init__.py +++ b/src/apm_cli/integration/__init__.py @@ -2,6 +2,7 @@ from .prompt_integrator import PromptIntegrator from .agent_integrator import AgentIntegrator +from .hook_integrator import HookIntegrator from .skill_integrator import ( SkillIntegrator, validate_skill_name, @@ -17,6 +18,7 @@ __all__ = [ 'PromptIntegrator', 'AgentIntegrator', + 'HookIntegrator', 'SkillIntegrator', 'SkillTransformer', 'validate_skill_name', diff --git a/src/apm_cli/integration/hook_integrator.py b/src/apm_cli/integration/hook_integrator.py new file mode 100644 index 00000000..1d828834 --- /dev/null +++ b/src/apm_cli/integration/hook_integrator.py @@ -0,0 +1,532 @@ +"""Hook integration functionality for APM packages. + +Integrates hook JSON files and their referenced scripts during package +installation. Supports both VSCode Copilot (.github/hooks/) and Claude Code +(.claude/settings.json) targets. 
+ +Hook JSON format (Claude Code — nested matcher groups): + { + "hooks": { + "PreToolUse": [ + { + "hooks": [ + {"type": "command", "command": "./scripts/validate.sh", "timeout": 10} + ] + } + ] + } + } + +Hook JSON format (GitHub Copilot — flat arrays with bash/powershell keys): + { + "version": 1, + "hooks": { + "preToolUse": [ + {"type": "command", "bash": "./scripts/validate.sh", "timeoutSec": 10} + ] + } + } + +Script path handling: + - ${CLAUDE_PLUGIN_ROOT}/path → resolved relative to package root, rewritten for target + - ./path → relative path, resolved from hook file's parent directory, rewritten for target + - System commands (no path separators) → passed through unchanged +""" + +import json +import re +import shutil +from pathlib import Path +from typing import List, Dict, Tuple, Optional +from dataclasses import dataclass, field + + +@dataclass +class HookIntegrationResult: + """Result of hook integration operation.""" + hooks_integrated: int + scripts_copied: int + target_paths: List[Path] = field(default_factory=list) + gitignore_updated: bool = False + + +class HookIntegrator: + """Handles integration of APM package hooks into target locations. + + Discovers hook JSON files and their referenced scripts from packages, + then installs them to the appropriate target location: + - VSCode: .github/hooks/&lt;package&gt;-&lt;name&gt;-apm.json + .github/hooks/scripts/&lt;package&gt;/ + - Claude: Merged into .claude/settings.json hooks key + .claude/hooks/&lt;package&gt;/ + """ + + def __init__(self): + """Initialize the hook integrator.""" + pass + + def should_integrate(self, project_root: Path) -> bool: + """Check if hook integration should be performed. + + Args: + project_root: Root directory of the project + + Returns: + bool: Always True - integration happens automatically + """ + return True + + def find_hook_files(self, package_path: Path) -> List[Path]: + """Find all hook JSON files in a package.
+ + Searches in: + - .apm/hooks/ subdirectory (APM convention) + - hooks/ subdirectory (Claude-native convention) + + Args: + package_path: Path to the package directory + + Returns: + List[Path]: List of absolute paths to hook JSON files + """ + hook_files = [] + seen = set() + + # Search in .apm/hooks/ (APM convention) + apm_hooks = package_path / ".apm" / "hooks" + if apm_hooks.exists(): + for f in sorted(apm_hooks.glob("*.json")): + resolved = f.resolve() + if resolved not in seen: + seen.add(resolved) + hook_files.append(f) + + # Search in hooks/ (Claude-native convention) + hooks_dir = package_path / "hooks" + if hooks_dir.exists(): + for f in sorted(hooks_dir.glob("*.json")): + resolved = f.resolve() + if resolved not in seen: + seen.add(resolved) + hook_files.append(f) + + return hook_files + + def _parse_hook_json(self, hook_file: Path) -> Optional[Dict]: + """Parse a hook JSON file and return the data dict. + + Args: + hook_file: Path to the hook JSON file + + Returns: + Optional[Dict]: Parsed JSON dict, or None if invalid + """ + try: + with open(hook_file, 'r', encoding='utf-8') as f: + data = json.load(f) + if not isinstance(data, dict): + return None + return data + except (json.JSONDecodeError, OSError): + return None + + def _rewrite_command_for_target( + self, + command: str, + package_path: Path, + package_name: str, + target: str, + hook_file_dir: Optional[Path] = None, + ) -> Tuple[str, List[Tuple[Path, str]]]: + """Rewrite a hook command to use installed script paths. 
+ + Handles: + - ${CLAUDE_PLUGIN_ROOT}/path references (resolved from package root) + - ./path relative references (resolved from hook file's parent directory) + + Args: + command: Original command string + package_path: Root path of the source package + package_name: Name used for the scripts subdirectory + target: "vscode" or "claude" + hook_file_dir: Directory containing the hook JSON file (for ./path resolution) + + Returns: + Tuple of (rewritten_command, list of (source_file, relative_target_path)) + """ + scripts_to_copy = [] + new_command = command + + if target == "vscode": + scripts_base = f".github/hooks/scripts/{package_name}" + else: + scripts_base = f".claude/hooks/{package_name}" + + # Handle ${CLAUDE_PLUGIN_ROOT} references (always relative to package root) + plugin_root_pattern = r'\$\{CLAUDE_PLUGIN_ROOT\}(/[^\s]+)' + for match in re.finditer(plugin_root_pattern, command): + full_var = match.group(0) + rel_path = match.group(1).lstrip('/') + + source_file = (package_path / rel_path).resolve() + # Reject path traversal outside the package directory + if not source_file.is_relative_to(package_path.resolve()): + continue + if source_file.exists() and source_file.is_file(): + target_rel = f"{scripts_base}/{rel_path}" + scripts_to_copy.append((source_file, target_rel)) + new_command = new_command.replace(full_var, target_rel) + + # Handle relative ./path references (safe to run after ${CLAUDE_PLUGIN_ROOT} + # substitution since replacements produce paths like ".github/..." 
not "./...") + # Resolve from hook file's directory if available, else fall back to package root + resolve_base = hook_file_dir if hook_file_dir else package_path + rel_pattern = r'(\./[^\s]+)' + for match in re.finditer(rel_pattern, new_command): + rel_ref = match.group(1) + rel_path = rel_ref[2:] # Strip ./ + + source_file = (resolve_base / rel_path).resolve() + # Reject path traversal outside the package directory + if not source_file.is_relative_to(package_path.resolve()): + continue + if source_file.exists() and source_file.is_file(): + target_rel = f"{scripts_base}/{rel_path}" + scripts_to_copy.append((source_file, target_rel)) + new_command = new_command.replace(rel_ref, target_rel) + + return new_command, scripts_to_copy + + def _rewrite_hooks_data( + self, + data: Dict, + package_path: Path, + package_name: str, + target: str, + hook_file_dir: Optional[Path] = None, + ) -> Tuple[Dict, List[Tuple[Path, str]]]: + """Rewrite all command paths in a hooks JSON structure. + + Creates a deep copy and rewrites command paths for the target platform. 
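The `${CLAUDE_PLUGIN_ROOT}` substitution performed by `_rewrite_command_for_target` above can be sketched in isolation with the same regex; the package and script names are illustrative:

```python
import re

# Same pattern as the patch: match ${CLAUDE_PLUGIN_ROOT}/... references.
PLUGIN_ROOT = re.compile(r'\$\{CLAUDE_PLUGIN_ROOT\}(/[^\s]+)')

def rewrite(command: str, scripts_base: str) -> str:
    # Replace each plugin-root reference with the installed scripts location.
    return PLUGIN_ROOT.sub(
        lambda m: f"{scripts_base}/{m.group(1).lstrip('/')}", command
    )

# Illustrative command from a hookify-style hooks.json.
result = rewrite("python3 ${CLAUDE_PLUGIN_ROOT}/hooks/pretooluse.py",
                 ".github/hooks/scripts/hookify")
print(result)  # → python3 .github/hooks/scripts/hookify/hooks/pretooluse.py
```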
+ + Args: + data: Parsed hook JSON data + package_path: Root path of the source package + package_name: Name for scripts subdirectory + target: "vscode" or "claude" + hook_file_dir: Directory containing the hook JSON file (for ./path resolution) + + Returns: + Tuple of (rewritten_data_copy, list of (source_file, target_rel_path)) + """ + import copy + rewritten = copy.deepcopy(data) + all_scripts: List[Tuple[Path, str]] = [] + + hooks = rewritten.get("hooks", {}) + for event_name, matchers in hooks.items(): + if not isinstance(matchers, list): + continue + for matcher in matchers: + if not isinstance(matcher, dict): + continue + # Rewrite script paths in the matcher dict itself + # (GitHub Copilot flat format: bash/powershell keys at this level) + for key in ("command", "bash", "powershell"): + if key in matcher: + new_cmd, scripts = self._rewrite_command_for_target( + matcher[key], package_path, package_name, target, + hook_file_dir=hook_file_dir, + ) + matcher[key] = new_cmd + all_scripts.extend(scripts) + + # Rewrite script paths in nested hooks array + # (Claude format: matcher groups with inner hooks array) + for hook in matcher.get("hooks", []): + if not isinstance(hook, dict): + continue + for key in ("command", "bash", "powershell"): + if key in hook: + new_cmd, scripts = self._rewrite_command_for_target( + hook[key], package_path, package_name, target, + hook_file_dir=hook_file_dir, + ) + hook[key] = new_cmd + all_scripts.extend(scripts) + + return rewritten, all_scripts + + def _get_package_name(self, package_info) -> str: + """Get a short package name for use in file/directory naming. + + Args: + package_info: PackageInfo object + + Returns: + str: Package name derived from install path + """ + return package_info.install_path.name + + def integrate_package_hooks(self, package_info, project_root: Path) -> HookIntegrationResult: + """Integrate hooks from a package into .github/hooks/ (VSCode target). 
+
+ Copies hook JSON files with rewritten script paths and copies
+ referenced script files to .github/hooks/scripts/&lt;package-name&gt;/.
+
+ Args:
+ package_info: PackageInfo with package metadata and install path
+ project_root: Root directory of the project
+
+ Returns:
+ HookIntegrationResult: Results of the integration operation
+ """
+ hook_files = self.find_hook_files(package_info.install_path)
+
+ if not hook_files:
+ return HookIntegrationResult(
+ hooks_integrated=0,
+ scripts_copied=0,
+ )
+
+ hooks_dir = project_root / ".github" / "hooks"
+ hooks_dir.mkdir(parents=True, exist_ok=True)
+
+ package_name = self._get_package_name(package_info)
+ hooks_integrated = 0
+ scripts_copied = 0
+ target_paths: List[Path] = []
+
+ for hook_file in hook_files:
+ data = self._parse_hook_json(hook_file)
+ if data is None:
+ continue
+
+ # Rewrite script paths for VSCode target
+ rewritten, scripts = self._rewrite_hooks_data(
+ data, package_info.install_path, package_name, "vscode",
+ hook_file_dir=hook_file.parent,
+ )
+
+ # Generate target filename: {package_name}-{stem}-apm.json
+ stem = hook_file.stem
+ target_filename = f"{package_name}-{stem}-apm.json"
+ target_path = hooks_dir / target_filename
+
+ # Write rewritten JSON
+ with open(target_path, 'w', encoding='utf-8') as f:
+ json.dump(rewritten, f, indent=2)
+ f.write('\n')
+
+ hooks_integrated += 1
+ target_paths.append(target_path)
+
+ # Copy referenced scripts
+ for source_file, target_rel in scripts:
+ target_script = project_root / target_rel
+ target_script.parent.mkdir(parents=True, exist_ok=True)
+ shutil.copy2(source_file, target_script)
+ scripts_copied += 1
+
+ return HookIntegrationResult(
+ hooks_integrated=hooks_integrated,
+ scripts_copied=scripts_copied,
+ target_paths=target_paths,
+ )
+
+ def integrate_package_hooks_claude(self, package_info, project_root: Path) -> HookIntegrationResult:
+ """Integrate hooks from a package into .claude/settings.json (Claude target). 
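The additive merge into the Claude settings named here can be sketched standalone; the `_apm_source` marker mirrors the one the patch attaches for later cleanup (the package name and commands are illustrative):

```python
import json

# Existing user settings (illustrative) plus incoming package hooks.
settings = {
    "hooks": {
        "PreToolUse": [{"hooks": [{"type": "command",
                                   "command": "echo user-hook"}]}],
    }
}
incoming = {
    "Stop": [{"hooks": [{"type": "command",
                         "command": ".claude/hooks/ralph-loop/hooks/stop-hook.sh"}]}],
}

for event_name, matchers in incoming.items():
    settings["hooks"].setdefault(event_name, [])
    for matcher in matchers:
        # Marker used later to strip APM-managed entries during sync.
        matcher["_apm_source"] = "ralph-loop"
    settings["hooks"][event_name].extend(matchers)

print(json.dumps(settings, indent=2))
```

Note that user-authored matchers are left untouched; only the appended entries carry the marker.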
+
+ Merges hook definitions into the Claude settings file and copies
+ referenced script files to .claude/hooks/&lt;package-name&gt;/.
+
+ Args:
+ package_info: PackageInfo with package metadata and install path
+ project_root: Root directory of the project
+
+ Returns:
+ HookIntegrationResult: Results of the integration operation
+ """
+ hook_files = self.find_hook_files(package_info.install_path)
+
+ if not hook_files:
+ return HookIntegrationResult(
+ hooks_integrated=0,
+ scripts_copied=0,
+ )
+
+ package_name = self._get_package_name(package_info)
+ hooks_integrated = 0
+ scripts_copied = 0
+ target_paths: List[Path] = []
+
+ # Read existing settings
+ settings_path = project_root / ".claude" / "settings.json"
+ settings: Dict = {}
+ if settings_path.exists():
+ try:
+ with open(settings_path, 'r', encoding='utf-8') as f:
+ settings = json.load(f)
+ except (json.JSONDecodeError, OSError):
+ settings = {}
+
+ if "hooks" not in settings:
+ settings["hooks"] = {}
+
+ for hook_file in hook_files:
+ data = self._parse_hook_json(hook_file)
+ if data is None:
+ continue
+
+ # Rewrite script paths for Claude target
+ rewritten, scripts = self._rewrite_hooks_data(
+ data, package_info.install_path, package_name, "claude",
+ hook_file_dir=hook_file.parent,
+ )
+
+ # Merge hooks into settings (additive)
+ hooks = rewritten.get("hooks", {})
+ for event_name, matchers in hooks.items():
+ if not isinstance(matchers, list):
+ continue
+ if event_name not in settings["hooks"]:
+ settings["hooks"][event_name] = []
+
+ # Mark each matcher with APM source for sync/cleanup
+ for matcher in matchers:
+ if isinstance(matcher, dict):
+ matcher["_apm_source"] = package_name
+
+ settings["hooks"][event_name].extend(matchers)
+
+ hooks_integrated += 1
+
+ # Copy referenced scripts
+ for source_file, target_rel in scripts:
+ target_script = project_root / target_rel
+ target_script.parent.mkdir(parents=True, exist_ok=True)
+ shutil.copy2(source_file, target_script)
+ scripts_copied += 1
+ 
target_paths.append(target_script) + + # Write settings back + settings_path.parent.mkdir(parents=True, exist_ok=True) + with open(settings_path, 'w', encoding='utf-8') as f: + json.dump(settings, f, indent=2) + f.write('\n') + target_paths.append(settings_path) + + return HookIntegrationResult( + hooks_integrated=hooks_integrated, + scripts_copied=scripts_copied, + target_paths=target_paths, + ) + + def sync_integration(self, apm_package, project_root: Path) -> Dict: + """Remove all APM-managed hook files for clean regeneration. + + Removes: + - .github/hooks/*-apm.json files + - .github/hooks/scripts/ directory + - APM-managed entries from .claude/settings.json + - .claude/hooks/ directory + + Args: + apm_package: APMPackage (unused, kept for interface compatibility) + project_root: Root directory of the project + + Returns: + Dict with cleanup stats: {'files_removed': int, 'errors': int} + """ + stats: Dict[str, int] = {'files_removed': 0, 'errors': 0} + + # Clean VSCode hooks + hooks_dir = project_root / ".github" / "hooks" + if hooks_dir.exists(): + for hook_file in hooks_dir.glob("*-apm.json"): + try: + hook_file.unlink() + stats['files_removed'] += 1 + except Exception: + stats['errors'] += 1 + + # Clean scripts directory + scripts_dir = hooks_dir / "scripts" + if scripts_dir.exists(): + try: + shutil.rmtree(scripts_dir) + stats['files_removed'] += 1 + except Exception: + stats['errors'] += 1 + + # Clean Claude hooks scripts + claude_hooks_dir = project_root / ".claude" / "hooks" + if claude_hooks_dir.exists(): + try: + shutil.rmtree(claude_hooks_dir) + stats['files_removed'] += 1 + except Exception: + stats['errors'] += 1 + + # Clean APM entries from .claude/settings.json + settings_path = project_root / ".claude" / "settings.json" + if settings_path.exists(): + try: + with open(settings_path, 'r', encoding='utf-8') as f: + settings = json.load(f) + + if "hooks" in settings: + modified = False + for event_name in list(settings["hooks"].keys()): + matchers = 
settings["hooks"][event_name] + if isinstance(matchers, list): + filtered = [ + m for m in matchers + if not (isinstance(m, dict) and "_apm_source" in m) + ] + if len(filtered) != len(matchers): + modified = True + settings["hooks"][event_name] = filtered + if not filtered: + del settings["hooks"][event_name] + + if not settings["hooks"]: + del settings["hooks"] + + if modified: + with open(settings_path, 'w', encoding='utf-8') as f: + json.dump(settings, f, indent=2) + f.write('\n') + stats['files_removed'] += 1 + except (json.JSONDecodeError, OSError): + stats['errors'] += 1 + + return stats + + def update_gitignore(self, project_root: Path) -> bool: + """Update .gitignore with patterns for APM-managed hooks. + + Args: + project_root: Root directory of the project + + Returns: + bool: True if .gitignore was updated, False if patterns already exist + """ + gitignore_path = project_root / ".gitignore" + patterns = [ + ".github/hooks/*-apm.json", + ".github/hooks/scripts/", + ] + + existing_content = "" + if gitignore_path.exists(): + existing_content = gitignore_path.read_text() + + # Check if patterns already exist + if ".github/hooks/*-apm.json" in existing_content: + return False + + new_content = existing_content.rstrip() + "\n\n# APM integrated hooks\n" + for pattern in patterns: + new_content += f"{pattern}\n" + + gitignore_path.write_text(new_content) + return True diff --git a/src/apm_cli/models/apm_package.py b/src/apm_cli/models/apm_package.py index 003f72e5..a5484f12 100644 --- a/src/apm_cli/models/apm_package.py +++ b/src/apm_cli/models/apm_package.py @@ -21,10 +21,11 @@ class PackageType(Enum): """Types of packages that APM can install. This enum is used internally to classify packages based on their content - (presence of apm.yml, SKILL.md, etc.). + (presence of apm.yml, SKILL.md, hooks/, etc.). 
""" APM_PACKAGE = "apm_package" # Has apm.yml CLAUDE_SKILL = "claude_skill" # Has SKILL.md, no apm.yml + HOOK_PACKAGE = "hook_package" # Has hooks/hooks.json, no apm.yml or SKILL.md HYBRID = "hybrid" # Has both apm.yml and SKILL.md INVALID = "invalid" # Neither apm.yml nor SKILL.md @@ -881,23 +882,36 @@ def get_primitives_path(self) -> Path: def has_primitives(self) -> bool: """Check if the package has any primitives.""" apm_dir = self.get_primitives_path() - if not apm_dir.exists(): - return False + if apm_dir.exists(): + # Check for any primitive files in .apm/ subdirectories + for primitive_type in ['instructions', 'chatmodes', 'contexts', 'prompts', 'hooks']: + primitive_dir = apm_dir / primitive_type + if primitive_dir.exists() and any(primitive_dir.iterdir()): + return True + + # Also check hooks/ at package root (Claude-native convention) + hooks_dir = self.install_path / "hooks" + if hooks_dir.exists() and any(hooks_dir.glob("*.json")): + return True - # Check for any primitive files in subdirectories - for primitive_type in ['instructions', 'chatmodes', 'contexts', 'prompts']: - primitive_dir = apm_dir / primitive_type - if primitive_dir.exists() and any(primitive_dir.iterdir()): - return True return False +def _has_hook_json(package_path: Path) -> bool: + """Check if the package has hook JSON files in hooks/ or .apm/hooks/.""" + for hooks_dir in [package_path / "hooks", package_path / ".apm" / "hooks"]: + if hooks_dir.exists() and any(hooks_dir.glob("*.json")): + return True + return False + + def validate_apm_package(package_path: Path) -> ValidationResult: """Validate that a directory contains a valid APM package or Claude Skill. 
- Supports three package types: + Supports four package types: - APM_PACKAGE: Has apm.yml and .apm/ directory - CLAUDE_SKILL: Has SKILL.md but no apm.yml (auto-generates apm.yml) + - HOOK_PACKAGE: Has hooks/*.json but no apm.yml or SKILL.md - HYBRID: Has both apm.yml and SKILL.md Args: @@ -922,6 +936,7 @@ def validate_apm_package(package_path: Path) -> ValidationResult: skill_md_path = package_path / "SKILL.md" has_apm_yml = apm_yml_path.exists() has_skill_md = skill_md_path.exists() + has_hooks = _has_hook_json(package_path) # Determine package type if has_apm_yml and has_skill_md: @@ -930,11 +945,17 @@ def validate_apm_package(package_path: Path) -> ValidationResult: result.package_type = PackageType.APM_PACKAGE elif has_skill_md: result.package_type = PackageType.CLAUDE_SKILL + elif has_hooks: + result.package_type = PackageType.HOOK_PACKAGE else: result.package_type = PackageType.INVALID - result.add_error("Missing required file: apm.yml or SKILL.md") + result.add_error("Missing required file: apm.yml, SKILL.md, or hooks/*.json") return result + # Handle hook-only packages (no apm.yml or SKILL.md) + if result.package_type == PackageType.HOOK_PACKAGE: + return _validate_hook_package(package_path, result) + # Handle Claude Skills (no apm.yml) - auto-generate minimal apm.yml if result.package_type == PackageType.CLAUDE_SKILL: return _validate_claude_skill(package_path, skill_md_path, result) @@ -943,6 +964,34 @@ def validate_apm_package(package_path: Path) -> ValidationResult: return _validate_apm_package_with_yml(package_path, apm_yml_path, result) +def _validate_hook_package(package_path: Path, result: ValidationResult) -> ValidationResult: + """Validate a hook-only package and create APMPackage from its metadata. + + A hook package has hooks/*.json (or .apm/hooks/*.json) defining hook + handlers per the Claude Code hooks specification, but no apm.yml or SKILL.md. 
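A minimal sketch of the detection rule this validator relies on, mirroring `_has_hook_json`; the directory names follow the patch, everything else is illustrative:

```python
import json
import tempfile
from pathlib import Path

def has_hook_json(package_path: Path) -> bool:
    # A directory counts as a hook package when hooks/*.json or
    # .apm/hooks/*.json exists (mirrors _has_hook_json in the patch).
    for hooks_dir in (package_path / "hooks", package_path / ".apm" / "hooks"):
        if hooks_dir.is_dir() and any(hooks_dir.glob("*.json")):
            return True
    return False

with tempfile.TemporaryDirectory() as tmp:
    pkg = Path(tmp)
    before = has_hook_json(pkg)  # empty dir: not a hook package
    (pkg / "hooks").mkdir()
    (pkg / "hooks" / "hooks.json").write_text(json.dumps({"hooks": {}}))
    after = has_hook_json(pkg)   # now detected
```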
+ + Args: + package_path: Path to the package directory + result: ValidationResult to populate + + Returns: + ValidationResult: Updated validation result + """ + package_name = package_path.name + + # Create APMPackage from directory name + package = APMPackage( + name=package_name, + version="1.0.0", + description=f"Hook package: {package_name}", + package_path=package_path, + type=PackageContentType.HYBRID + ) + result.package = package + + return result + + def _validate_claude_skill(package_path: Path, skill_md_path: Path, result: ValidationResult) -> ValidationResult: """Validate a Claude Skill and create APMPackage directly from SKILL.md metadata. @@ -1032,6 +1081,10 @@ def _validate_apm_package_with_yml(package_path: Path, apm_yml_path: Path, resul except Exception as e: result.add_warning(f"Could not read primitive file {md_file.relative_to(package_path)}: {e}") + # Also check for hooks (JSON files in .apm/hooks/ or hooks/) + if not has_primitives: + has_primitives = _has_hook_json(package_path) + if not has_primitives: result.add_warning("No primitive files found in .apm/ directory") diff --git a/tests/test_apm_package_models.py b/tests/test_apm_package_models.py index 0129307c..e60f737b 100644 --- a/tests/test_apm_package_models.py +++ b/tests/test_apm_package_models.py @@ -1,5 +1,6 @@ """Unit tests for APM package data models and validation.""" +import json import pytest import tempfile import yaml @@ -15,6 +16,7 @@ PackageInfo, GitReferenceType, PackageContentType, + PackageType, validate_apm_package, parse_git_reference, ) @@ -632,11 +634,11 @@ def test_validate_file_instead_of_directory(self): assert any("not a directory" in error for error in result.errors) def test_validate_missing_apm_yml(self): - """Test validating directory without apm.yml.""" + """Test validating directory without apm.yml, SKILL.md, or hooks.""" with tempfile.TemporaryDirectory() as tmpdir: result = validate_apm_package(Path(tmpdir)) assert not result.is_valid - assert 
any("Missing required file: apm.yml" in error for error in result.errors) + assert any("Missing required file" in error for error in result.errors) def test_validate_invalid_apm_yml(self): """Test validating directory with invalid apm.yml.""" @@ -848,6 +850,71 @@ def test_validate_skill_without_description(self): assert "Claude Skill: minimal-skill" in result.package.description +class TestHookPackageValidation: + """Test hook-only package validation.""" + + def test_validate_hook_package_with_hooks_dir(self): + """Test validating a package with only hooks/hooks.json.""" + with tempfile.TemporaryDirectory() as tmpdir: + hooks_dir = Path(tmpdir) / "hooks" + hooks_dir.mkdir() + hooks_json = hooks_dir / "hooks.json" + hooks_json.write_text(json.dumps({ + "hooks": { + "PreToolUse": [{ + "hooks": [{ + "type": "command", + "command": "echo hello" + }] + }] + } + })) + + result = validate_apm_package(Path(tmpdir)) + assert result.is_valid, f"Errors: {result.errors}" + assert result.package_type == PackageType.HOOK_PACKAGE + assert result.package is not None + assert result.package.name == Path(tmpdir).name + + def test_validate_hook_package_with_apm_hooks_dir(self): + """Test validating a package with .apm/hooks/*.json.""" + with tempfile.TemporaryDirectory() as tmpdir: + hooks_dir = Path(tmpdir) / ".apm" / "hooks" + hooks_dir.mkdir(parents=True) + hooks_json = hooks_dir / "my-hooks.json" + hooks_json.write_text(json.dumps({ + "hooks": {"Stop": [{"hooks": [{"type": "command", "command": "echo bye"}]}]} + })) + + result = validate_apm_package(Path(tmpdir)) + assert result.is_valid, f"Errors: {result.errors}" + assert result.package_type == PackageType.HOOK_PACKAGE + + def test_validate_hook_package_prefers_apm_yml(self): + """Test that apm.yml takes precedence over hooks/ for type detection.""" + with tempfile.TemporaryDirectory() as tmpdir: + # Create both apm.yml + .apm/ and hooks/ + apm_yml = Path(tmpdir) / "apm.yml" + apm_yml.write_text("name: test\nversion: 1.0.0") + 
apm_dir = Path(tmpdir) / ".apm" / "instructions" + apm_dir.mkdir(parents=True) + (apm_dir / "main.md").write_text("# Instructions") + hooks_dir = Path(tmpdir) / "hooks" + hooks_dir.mkdir() + (hooks_dir / "hooks.json").write_text('{"hooks": {}}') + + result = validate_apm_package(Path(tmpdir)) + assert result.is_valid, f"Errors: {result.errors}" + assert result.package_type == PackageType.APM_PACKAGE + + def test_validate_empty_dir_is_invalid(self): + """Test that a dir with no apm.yml, SKILL.md, or hooks is invalid.""" + with tempfile.TemporaryDirectory() as tmpdir: + result = validate_apm_package(Path(tmpdir)) + assert not result.is_valid + assert result.package_type == PackageType.INVALID + + class TestGitReferenceUtils: """Test Git reference parsing utilities.""" diff --git a/tests/unit/integration/test_hook_integrator.py b/tests/unit/integration/test_hook_integrator.py new file mode 100644 index 00000000..7869c1c6 --- /dev/null +++ b/tests/unit/integration/test_hook_integrator.py @@ -0,0 +1,1282 @@ +"""Unit tests for HookIntegrator. 
+ +Tests cover: +- Hook file discovery (.apm/hooks/ and hooks/ directories) +- VSCode integration (JSON copy + script copy + path rewriting) +- Claude integration (settings.json merge + script copy) +- Sync/cleanup integration (nuke-and-regenerate) +- Official plugin formats (hookify, learning-output-style, ralph-loop) +- Script path rewriting for ${CLAUDE_PLUGIN_ROOT} references +- Gitignore updates +""" + +import json +import tempfile +import shutil +from pathlib import Path +from unittest.mock import MagicMock + +import pytest + +from apm_cli.integration.hook_integrator import HookIntegrator, HookIntegrationResult +from apm_cli.models.apm_package import APMPackage, PackageInfo + + +def _make_package_info(install_path: Path, name: str = "test-pkg") -> PackageInfo: + """Create a minimal PackageInfo for testing.""" + package = APMPackage(name=name, version="1.0.0") + return PackageInfo(package=package, install_path=install_path) + + +# ─── Hook file fixtures mirroring official Claude plugins ───────────────────── + +HOOKIFY_HOOKS_JSON = { + "description": "Hookify plugin - User-configurable hooks from .local.md files", + "hooks": { + "PreToolUse": [ + { + "hooks": [ + { + "type": "command", + "command": "python3 ${CLAUDE_PLUGIN_ROOT}/hooks/pretooluse.py", + "timeout": 10, + } + ] + } + ], + "PostToolUse": [ + { + "hooks": [ + { + "type": "command", + "command": "python3 ${CLAUDE_PLUGIN_ROOT}/hooks/posttooluse.py", + "timeout": 10, + } + ] + } + ], + "Stop": [ + { + "hooks": [ + { + "type": "command", + "command": "python3 ${CLAUDE_PLUGIN_ROOT}/hooks/stop.py", + "timeout": 10, + } + ] + } + ], + "UserPromptSubmit": [ + { + "hooks": [ + { + "type": "command", + "command": "python3 ${CLAUDE_PLUGIN_ROOT}/hooks/userpromptsubmit.py", + "timeout": 10, + } + ] + } + ], + }, +} + +LEARNING_OUTPUT_STYLE_HOOKS_JSON = { + "description": "Learning mode hook that adds interactive learning instructions", + "hooks": { + "SessionStart": [ + { + "hooks": [ + { + "type": "command", + 
"command": "${CLAUDE_PLUGIN_ROOT}/hooks-handlers/session-start.sh", + } + ] + } + ] + }, +} + +RALPH_LOOP_HOOKS_JSON = { + "description": "Ralph Loop plugin stop hook for self-referential loops", + "hooks": { + "Stop": [ + { + "hooks": [ + { + "type": "command", + "command": "${CLAUDE_PLUGIN_ROOT}/hooks/stop-hook.sh", + } + ] + } + ] + }, +} + + +# ─── Discovery tests ───────────────────────────────────────────────────────── + + +class TestHookDiscovery: + """Tests for finding hook JSON files in packages.""" + + @pytest.fixture + def temp_project(self): + temp_dir = tempfile.mkdtemp() + yield Path(temp_dir) + shutil.rmtree(temp_dir, ignore_errors=True) + + def test_find_no_hooks(self, temp_project): + """No hooks found when package has no hook directories.""" + pkg_dir = temp_project / "pkg" + pkg_dir.mkdir() + integrator = HookIntegrator() + assert integrator.find_hook_files(pkg_dir) == [] + + def test_find_hooks_in_apm_hooks(self, temp_project): + """Find hook JSON files in .apm/hooks/ directory.""" + pkg_dir = temp_project / "pkg" + hooks_dir = pkg_dir / ".apm" / "hooks" + hooks_dir.mkdir(parents=True) + (hooks_dir / "security.json").write_text(json.dumps({"hooks": {}})) + (hooks_dir / "quality.json").write_text(json.dumps({"hooks": {}})) + (hooks_dir / "readme.md").write_text("# Not a hook") # Should be ignored + + integrator = HookIntegrator() + files = integrator.find_hook_files(pkg_dir) + assert len(files) == 2 + assert all(f.suffix == ".json" for f in files) + + def test_find_hooks_in_hooks_dir(self, temp_project): + """Find hook JSON files in hooks/ directory (Claude-native convention).""" + pkg_dir = temp_project / "pkg" + hooks_dir = pkg_dir / "hooks" + hooks_dir.mkdir(parents=True) + (hooks_dir / "hooks.json").write_text(json.dumps({"hooks": {}})) + + integrator = HookIntegrator() + files = integrator.find_hook_files(pkg_dir) + assert len(files) == 1 + assert files[0].name == "hooks.json" + + def test_find_hooks_deduplicates(self, temp_project): + """Do 
not return duplicate hook files when .apm/hooks/ and hooks/ overlap.""" + pkg_dir = temp_project / "pkg" + # Create both directories pointing to the same conceptual hooks + apm_hooks = pkg_dir / ".apm" / "hooks" + apm_hooks.mkdir(parents=True) + (apm_hooks / "a.json").write_text(json.dumps({"hooks": {}})) + + hooks_dir = pkg_dir / "hooks" + hooks_dir.mkdir(parents=True) + (hooks_dir / "b.json").write_text(json.dumps({"hooks": {}})) + + integrator = HookIntegrator() + files = integrator.find_hook_files(pkg_dir) + assert len(files) == 2 # Different files, should both be found + + def test_should_integrate_always_true(self, temp_project): + """Integration is always enabled (zero-config).""" + integrator = HookIntegrator() + assert integrator.should_integrate(temp_project) + + +# ─── Parsing tests ──────────────────────────────────────────────────────────── + + +class TestHookParsing: + """Tests for parsing hook JSON files.""" + + @pytest.fixture + def temp_project(self): + temp_dir = tempfile.mkdtemp() + yield Path(temp_dir) + shutil.rmtree(temp_dir, ignore_errors=True) + + def test_parse_valid_hook_json(self, temp_project): + hook_file = temp_project / "hooks.json" + hook_file.write_text(json.dumps(HOOKIFY_HOOKS_JSON)) + + integrator = HookIntegrator() + data = integrator._parse_hook_json(hook_file) + assert data is not None + assert "hooks" in data + assert "PreToolUse" in data["hooks"] + + def test_parse_invalid_json(self, temp_project): + hook_file = temp_project / "bad.json" + hook_file.write_text("not valid json {{{") + + integrator = HookIntegrator() + assert integrator._parse_hook_json(hook_file) is None + + def test_parse_non_dict_json(self, temp_project): + hook_file = temp_project / "array.json" + hook_file.write_text(json.dumps([1, 2, 3])) + + integrator = HookIntegrator() + assert integrator._parse_hook_json(hook_file) is None + + def test_parse_missing_file(self, temp_project): + integrator = HookIntegrator() + assert 
integrator._parse_hook_json(temp_project / "missing.json") is None + + +# ─── VSCode integration tests ──────────────────────────────────────────────── + + +class TestVSCodeIntegration: + """Tests for VSCode hook integration (.github/hooks/).""" + + @pytest.fixture + def temp_project(self): + temp_dir = tempfile.mkdtemp() + project = Path(temp_dir) + (project / ".github").mkdir() + yield project + shutil.rmtree(temp_dir, ignore_errors=True) + + def _setup_hookify_package(self, project: Path) -> PackageInfo: + """Create a hookify-like package structure.""" + pkg_dir = project / "apm_modules" / "anthropics" / "hookify" + hooks_dir = pkg_dir / "hooks" + hooks_dir.mkdir(parents=True) + + (hooks_dir / "hooks.json").write_text(json.dumps(HOOKIFY_HOOKS_JSON, indent=2)) + + # Create the script files + for script in ["pretooluse.py", "posttooluse.py", "stop.py", "userpromptsubmit.py"]: + (hooks_dir / script).write_text(f"#!/usr/bin/env python3\n# {script}") + + return _make_package_info(pkg_dir, "hookify") + + def test_integrate_hookify_vscode(self, temp_project): + """Test VSCode integration of hookify plugin (multiple events + Python scripts).""" + pkg_info = self._setup_hookify_package(temp_project) + integrator = HookIntegrator() + + result = integrator.integrate_package_hooks(pkg_info, temp_project) + + assert result.hooks_integrated == 1 + assert result.scripts_copied == 4 + + # Check hook JSON was created + target_json = temp_project / ".github" / "hooks" / "hookify-hooks-apm.json" + assert target_json.exists() + + # Verify rewritten paths + data = json.loads(target_json.read_text()) + cmd = data["hooks"]["PreToolUse"][0]["hooks"][0]["command"] + assert "${CLAUDE_PLUGIN_ROOT}" not in cmd + assert ".github/hooks/scripts/hookify/hooks/pretooluse.py" in cmd + assert cmd.startswith("python3 ") + + # Check scripts were copied + scripts_dir = temp_project / ".github" / "hooks" / "scripts" / "hookify" / "hooks" + assert (scripts_dir / "pretooluse.py").exists() + assert 
(scripts_dir / "posttooluse.py").exists() + assert (scripts_dir / "stop.py").exists() + assert (scripts_dir / "userpromptsubmit.py").exists() + + def test_integrate_learning_output_style_vscode(self, temp_project): + """Test VSCode integration of learning-output-style plugin (different script dir).""" + pkg_dir = temp_project / "apm_modules" / "anthropics" / "learning-output-style" + hooks_dir = pkg_dir / "hooks" + handlers_dir = pkg_dir / "hooks-handlers" + hooks_dir.mkdir(parents=True) + handlers_dir.mkdir(parents=True) + + (hooks_dir / "hooks.json").write_text(json.dumps(LEARNING_OUTPUT_STYLE_HOOKS_JSON)) + (handlers_dir / "session-start.sh").write_text("#!/bin/bash\necho 'start'") + + pkg_info = _make_package_info(pkg_dir, "learning-output-style") + integrator = HookIntegrator() + + result = integrator.integrate_package_hooks(pkg_info, temp_project) + + assert result.hooks_integrated == 1 + assert result.scripts_copied == 1 + + # Verify rewritten paths + target_json = temp_project / ".github" / "hooks" / "learning-output-style-hooks-apm.json" + data = json.loads(target_json.read_text()) + cmd = data["hooks"]["SessionStart"][0]["hooks"][0]["command"] + assert "${CLAUDE_PLUGIN_ROOT}" not in cmd + assert "learning-output-style" in cmd + assert "session-start.sh" in cmd + + # Check script was copied + assert ( + temp_project + / ".github" + / "hooks" + / "scripts" + / "learning-output-style" + / "hooks-handlers" + / "session-start.sh" + ).exists() + + def test_integrate_ralph_loop_vscode(self, temp_project): + """Test VSCode integration of ralph-loop plugin (Stop hook).""" + pkg_dir = temp_project / "apm_modules" / "anthropics" / "ralph-loop" + hooks_dir = pkg_dir / "hooks" + hooks_dir.mkdir(parents=True) + + (hooks_dir / "hooks.json").write_text(json.dumps(RALPH_LOOP_HOOKS_JSON)) + (hooks_dir / "stop-hook.sh").write_text("#!/bin/bash\nexit 0") + + pkg_info = _make_package_info(pkg_dir, "ralph-loop") + integrator = HookIntegrator() + + result = 
integrator.integrate_package_hooks(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        assert result.scripts_copied == 1
+
+        target_json = temp_project / ".github" / "hooks" / "ralph-loop-hooks-apm.json"
+        data = json.loads(target_json.read_text())
+        cmd = data["hooks"]["Stop"][0]["hooks"][0]["command"]
+        assert "ralph-loop" in cmd
+        assert "stop-hook.sh" in cmd
+
+    def test_integrate_no_hooks(self, temp_project):
+        """Test integration with package that has no hooks."""
+        pkg_dir = temp_project / "pkg"
+        pkg_dir.mkdir()
+
+        pkg_info = _make_package_info(pkg_dir)
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks(pkg_info, temp_project)
+        assert result.hooks_integrated == 0
+        assert result.scripts_copied == 0
+
+    def test_integrate_hooks_from_apm_convention(self, temp_project):
+        """Test VSCode integration using .apm/hooks/ convention."""
+        pkg_dir = temp_project / "apm_modules" / "myorg" / "security-hooks"
+        hooks_dir = pkg_dir / ".apm" / "hooks"
+        scripts_dir = pkg_dir / "scripts"
+        hooks_dir.mkdir(parents=True)
+        scripts_dir.mkdir(parents=True)
+
+        hook_data = {
+            "hooks": {
+                "PreToolUse": [
+                    {
+                        "hooks": [
+                            {"type": "command", "command": "./scripts/validate.sh"}
+                        ]
+                    }
+                ]
+            }
+        }
+        (hooks_dir / "security.json").write_text(json.dumps(hook_data))
+        (scripts_dir / "validate.sh").write_text("#!/bin/bash\necho 'validate'")
+
+        pkg_info = _make_package_info(pkg_dir, "security-hooks")
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        target_json = temp_project / ".github" / "hooks" / "security-hooks-security-apm.json"
+        assert target_json.exists()
+
+    def test_integrate_system_command_passthrough(self, temp_project):
+        """Test that system commands without file paths are passed through unchanged."""
+        pkg_dir = temp_project / "apm_modules" / "myorg" / "format-pkg"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+
+        hook_data = {
+            "hooks": {
+                "PreToolUse": [
+                    {
+                        "hooks": [
+                            {"type": "command", "command": "npx prettier --check ."}
+                        ]
+                    }
+                ]
+            }
+        }
+        (hooks_dir / "format.json").write_text(json.dumps(hook_data))
+
+        pkg_info = _make_package_info(pkg_dir, "format-pkg")
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        assert result.scripts_copied == 0  # No scripts to copy for system commands
+
+        target_json = temp_project / ".github" / "hooks" / "format-pkg-format-apm.json"
+        data = json.loads(target_json.read_text())
+        cmd = data["hooks"]["PreToolUse"][0]["hooks"][0]["command"]
+        assert cmd == "npx prettier --check ."
+
+    def test_invalid_json_skipped(self, temp_project):
+        """Test that invalid JSON hook files are skipped gracefully."""
+        pkg_dir = temp_project / "pkg"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+        (hooks_dir / "bad.json").write_text("not json")
+
+        pkg_info = _make_package_info(pkg_dir)
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks(pkg_info, temp_project)
+        assert result.hooks_integrated == 0
+
+    def test_creates_github_hooks_dir(self, temp_project):
+        """Test that .github/hooks/ directory is created if it doesn't exist."""
+        pkg_dir = temp_project / "pkg"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+        (hooks_dir / "hooks.json").write_text(json.dumps({"hooks": {"Stop": []}}))
+
+        pkg_info = _make_package_info(pkg_dir)
+        integrator = HookIntegrator()
+
+        integrator.integrate_package_hooks(pkg_info, temp_project)
+        assert (temp_project / ".github" / "hooks").exists()
+
+
+# ─── Claude integration tests ────────────────────────────────────────────────
+
+
+class TestClaudeIntegration:
+    """Tests for Claude hook integration (.claude/settings.json merge)."""
+
+    @pytest.fixture
+    def temp_project(self):
+        temp_dir = tempfile.mkdtemp()
+        project = Path(temp_dir)
+        (project / ".claude").mkdir()
+        yield project
+        shutil.rmtree(temp_dir, ignore_errors=True)
+
+    def _setup_hookify_package(self, project: Path) -> PackageInfo:
+        """Create a hookify-like package structure."""
+        pkg_dir = project / "apm_modules" / "anthropics" / "hookify"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+
+        (hooks_dir / "hooks.json").write_text(json.dumps(HOOKIFY_HOOKS_JSON, indent=2))
+
+        for script in ["pretooluse.py", "posttooluse.py", "stop.py", "userpromptsubmit.py"]:
+            (hooks_dir / script).write_text(f"#!/usr/bin/env python3\n# {script}")
+
+        return _make_package_info(pkg_dir, "hookify")
+
+    def test_integrate_hookify_claude(self, temp_project):
+        """Test Claude integration of hookify plugin (merge into settings.json)."""
+        pkg_info = self._setup_hookify_package(temp_project)
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        assert result.scripts_copied == 4
+
+        # Check settings.json was created/updated
+        settings_path = temp_project / ".claude" / "settings.json"
+        assert settings_path.exists()
+
+        settings = json.loads(settings_path.read_text())
+        assert "hooks" in settings
+        assert "PreToolUse" in settings["hooks"]
+        assert "PostToolUse" in settings["hooks"]
+        assert "Stop" in settings["hooks"]
+        assert "UserPromptSubmit" in settings["hooks"]
+
+        # Check APM source marker for cleanup
+        assert settings["hooks"]["PreToolUse"][0]["_apm_source"] == "hookify"
+
+        # Verify rewritten paths
+        cmd = settings["hooks"]["PreToolUse"][0]["hooks"][0]["command"]
+        assert ".claude/hooks/hookify/hooks/pretooluse.py" in cmd
+
+    def test_integrate_learning_output_style_claude(self, temp_project):
+        """Test Claude integration of learning-output-style plugin."""
+        pkg_dir = temp_project / "apm_modules" / "anthropics" / "learning-output-style"
+        hooks_dir = pkg_dir / "hooks"
+        handlers_dir = pkg_dir / "hooks-handlers"
+        hooks_dir.mkdir(parents=True)
+        handlers_dir.mkdir(parents=True)
+
+        (hooks_dir / "hooks.json").write_text(json.dumps(LEARNING_OUTPUT_STYLE_HOOKS_JSON))
+        (handlers_dir / "session-start.sh").write_text("#!/bin/bash\necho 'start'")
+
+        pkg_info = _make_package_info(pkg_dir, "learning-output-style")
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        settings = json.loads((temp_project / ".claude" / "settings.json").read_text())
+        assert "SessionStart" in settings["hooks"]
+
+    def test_integrate_ralph_loop_claude(self, temp_project):
+        """Test Claude integration of ralph-loop plugin."""
+        pkg_dir = temp_project / "apm_modules" / "anthropics" / "ralph-loop"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+
+        (hooks_dir / "hooks.json").write_text(json.dumps(RALPH_LOOP_HOOKS_JSON))
+        (hooks_dir / "stop-hook.sh").write_text("#!/bin/bash\nexit 0")
+
+        pkg_info = _make_package_info(pkg_dir, "ralph-loop")
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        settings = json.loads((temp_project / ".claude" / "settings.json").read_text())
+        assert "Stop" in settings["hooks"]
+        cmd = settings["hooks"]["Stop"][0]["hooks"][0]["command"]
+        assert "ralph-loop" in cmd
+
+    def test_merge_into_existing_settings(self, temp_project):
+        """Test that hooks are merged into existing settings.json without clobbering."""
+        settings_path = temp_project / ".claude" / "settings.json"
+        settings_path.write_text(json.dumps({
+            "model": "claude-sonnet-4-20250514",
+            "hooks": {
+                "PreToolUse": [{"hooks": [{"type": "command", "command": "echo user-hook"}]}]
+            }
+        }))
+
+        pkg_dir = temp_project / "pkg"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+        (hooks_dir / "hooks.json").write_text(json.dumps(RALPH_LOOP_HOOKS_JSON))
+        (hooks_dir / "stop-hook.sh").write_text("#!/bin/bash\nexit 0")
+
+        pkg_info = _make_package_info(pkg_dir, "ralph-loop")
+        integrator = HookIntegrator()
+
+        integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+
+        settings = json.loads(settings_path.read_text())
+        # Original settings preserved
+        assert settings["model"] == "claude-sonnet-4-20250514"
+        # User hook preserved
+        assert len(settings["hooks"]["PreToolUse"]) == 1
+        # New hook added
+        assert "Stop" in settings["hooks"]
+
+    def test_additive_merge_same_event(self, temp_project):
+        """Test that multiple packages can add hooks to the same event (additive)."""
+        integrator = HookIntegrator()
+
+        # First package: ralph-loop with Stop hook
+        pkg1_dir = temp_project / "pkg1"
+        hooks1_dir = pkg1_dir / "hooks"
+        hooks1_dir.mkdir(parents=True)
+        (hooks1_dir / "hooks.json").write_text(json.dumps(RALPH_LOOP_HOOKS_JSON))
+        (hooks1_dir / "stop-hook.sh").write_text("#!/bin/bash\nexit 0")
+        pkg1_info = _make_package_info(pkg1_dir, "ralph-loop")
+
+        integrator.integrate_package_hooks_claude(pkg1_info, temp_project)
+
+        # Second package: also has Stop hook
+        pkg2_dir = temp_project / "pkg2"
+        hooks2_dir = pkg2_dir / "hooks"
+        hooks2_dir.mkdir(parents=True)
+        other_hooks = {
+            "hooks": {
+                "Stop": [{"hooks": [{"type": "command", "command": "echo other-stop"}]}]
+            }
+        }
+        (hooks2_dir / "hooks.json").write_text(json.dumps(other_hooks))
+        pkg2_info = _make_package_info(pkg2_dir, "other-pkg")
+
+        integrator.integrate_package_hooks_claude(pkg2_info, temp_project)
+
+        settings = json.loads((temp_project / ".claude" / "settings.json").read_text())
+        # Both Stop hooks should be present (additive)
+        assert len(settings["hooks"]["Stop"]) == 2
+
+    def test_no_hooks_returns_empty_result(self, temp_project):
+        """Test Claude integration with no hook files returns empty result."""
+        pkg_dir = temp_project / "pkg"
+        pkg_dir.mkdir()
+
+        pkg_info = _make_package_info(pkg_dir)
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+        assert result.hooks_integrated == 0
+
+    def test_creates_settings_json(self, temp_project):
+        """Test that .claude/settings.json is created if it doesn't exist."""
+        # Remove existing .claude dir
+        shutil.rmtree(temp_project / ".claude")
+
+        pkg_dir = temp_project / "pkg"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+        (hooks_dir / "hooks.json").write_text(json.dumps(RALPH_LOOP_HOOKS_JSON))
+        (hooks_dir / "stop-hook.sh").write_text("#!/bin/bash\nexit 0")
+
+        pkg_info = _make_package_info(pkg_dir, "ralph-loop")
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+        assert result.hooks_integrated == 1
+        assert (temp_project / ".claude" / "settings.json").exists()
+
+    def test_integrate_hooks_with_scripts_in_hooks_subdir_claude(self, temp_project):
+        """Test Claude integration when hook JSON and scripts are both inside hooks/ subdir."""
+        pkg_dir = temp_project / "apm_modules" / "myorg" / "lint-hooks"
+        hooks_dir = pkg_dir / "hooks"
+        scripts_dir = hooks_dir / "scripts"
+        scripts_dir.mkdir(parents=True)
+
+        hook_data = {
+            "hooks": {
+                "PostToolUse": [
+                    {
+                        "matcher": {"tool_name": "write_to_file"},
+                        "hooks": [
+                            {"type": "command", "command": "./scripts/lint.sh", "timeout": 10}
+                        ]
+                    }
+                ]
+            }
+        }
+        (hooks_dir / "hooks.json").write_text(json.dumps(hook_data))
+        (scripts_dir / "lint.sh").write_text("#!/bin/bash\necho lint")
+
+        pkg_info = _make_package_info(pkg_dir, "lint-hooks")
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        assert result.scripts_copied == 1
+
+        # Verify rewritten command in settings.json
+        settings = json.loads((temp_project / ".claude" / "settings.json").read_text())
+        cmd = settings["hooks"]["PostToolUse"][0]["hooks"][0]["command"]
+        assert ".claude/hooks/lint-hooks/scripts/lint.sh" in cmd
+        assert "./" not in cmd
+
+        # Verify script was copied to Claude target location
+        copied_script = temp_project / ".claude" / "hooks" / "lint-hooks" / "scripts" / "lint.sh"
+        assert copied_script.exists()
+        assert copied_script.read_text() == "#!/bin/bash\necho lint"
+
+
+# ─── Sync/cleanup tests ──────────────────────────────────────────────────────
+
+
+class TestSyncIntegration:
+    """Tests for sync_integration (nuke-and-regenerate during uninstall)."""
+
+    @pytest.fixture
+    def temp_project(self):
+        temp_dir = tempfile.mkdtemp()
+        yield Path(temp_dir)
+        shutil.rmtree(temp_dir, ignore_errors=True)
+
+    def test_sync_removes_vscode_hook_files(self, temp_project):
+        """Test that sync removes all *-apm.json files from .github/hooks/."""
+        hooks_dir = temp_project / ".github" / "hooks"
+        hooks_dir.mkdir(parents=True)
+
+        (hooks_dir / "hookify-hooks-apm.json").write_text("{}")
+        (hooks_dir / "ralph-loop-hooks-apm.json").write_text("{}")
+        (hooks_dir / "user-custom.json").write_text("{}")  # Should NOT be removed
+
+        integrator = HookIntegrator()
+        stats = integrator.sync_integration(None, temp_project)
+
+        assert stats["files_removed"] == 2
+        assert not (hooks_dir / "hookify-hooks-apm.json").exists()
+        assert not (hooks_dir / "ralph-loop-hooks-apm.json").exists()
+        assert (hooks_dir / "user-custom.json").exists()
+
+    def test_sync_removes_scripts_directory(self, temp_project):
+        """Test that sync removes the scripts/ directory from .github/hooks/."""
+        hooks_dir = temp_project / ".github" / "hooks"
+        scripts_dir = hooks_dir / "scripts" / "hookify" / "hooks"
+        scripts_dir.mkdir(parents=True)
+        (scripts_dir / "pretooluse.py").write_text("# script")
+
+        integrator = HookIntegrator()
+        integrator.sync_integration(None, temp_project)
+
+        assert not (hooks_dir / "scripts").exists()
+
+    def test_sync_removes_claude_hook_entries(self, temp_project):
+        """Test that sync removes APM-managed entries from .claude/settings.json."""
+        claude_dir = temp_project / ".claude"
+        claude_dir.mkdir()
+        settings_path = claude_dir / "settings.json"
+
+        settings = {
+            "model": "claude-sonnet-4-20250514",
+            "hooks": {
+                "Stop": [
+                    {"_apm_source": "ralph-loop", "hooks": [{"type": "command", "command": "..."}]},
+                    {"hooks": [{"type": "command", "command": "echo user-hook"}]},
+                ],
+                "PreToolUse": [
+                    {"_apm_source": "hookify", "hooks": [{"type": "command", "command": "..."}]}
+                ],
+            },
+        }
+        settings_path.write_text(json.dumps(settings))
+
+        integrator = HookIntegrator()
+        integrator.sync_integration(None, temp_project)
+
+        updated_settings = json.loads(settings_path.read_text())
+        # Model preserved
+        assert updated_settings["model"] == "claude-sonnet-4-20250514"
+        # APM entries removed, user entries preserved
+        assert "Stop" in updated_settings["hooks"]
+        assert len(updated_settings["hooks"]["Stop"]) == 1
+        assert "_apm_source" not in updated_settings["hooks"]["Stop"][0]
+        # PreToolUse completely removed (only had APM entries)
+        assert "PreToolUse" not in updated_settings["hooks"]
+
+    def test_sync_removes_claude_hooks_dir(self, temp_project):
+        """Test that sync removes .claude/hooks/ directory."""
+        claude_hooks = temp_project / ".claude" / "hooks" / "hookify"
+        claude_hooks.mkdir(parents=True)
+        (claude_hooks / "pretooluse.py").write_text("# script")
+
+        integrator = HookIntegrator()
+        integrator.sync_integration(None, temp_project)
+
+        assert not (temp_project / ".claude" / "hooks").exists()
+
+    def test_sync_empty_project(self, temp_project):
+        """Test sync on project with no hook artifacts."""
+        integrator = HookIntegrator()
+        stats = integrator.sync_integration(None, temp_project)
+        assert stats["files_removed"] == 0
+        assert stats["errors"] == 0
+
+    def test_sync_removes_empty_hooks_key(self, temp_project):
+        """Test that empty hooks key is removed from settings.json after cleanup."""
+        claude_dir = temp_project / ".claude"
+        claude_dir.mkdir()
+        settings_path = claude_dir / "settings.json"
+        settings = {
+            "hooks": {
+                "Stop": [{"_apm_source": "test", "hooks": []}]
+            }
+        }
+        settings_path.write_text(json.dumps(settings))
+
+        integrator = HookIntegrator()
+        integrator.sync_integration(None, temp_project)
+
+        updated = json.loads(settings_path.read_text())
+        assert "hooks" not in updated  # Completely removed when empty
+
+
+# ─── Script path rewriting tests ─────────────────────────────────────────────
+
+
+class TestScriptPathRewriting:
+    """Tests for command path rewriting logic."""
+
+    @pytest.fixture
+    def temp_project(self):
+        temp_dir = tempfile.mkdtemp()
+        yield Path(temp_dir)
+        shutil.rmtree(temp_dir, ignore_errors=True)
+
+    def test_rewrite_claude_plugin_root(self, temp_project):
+        """Test rewriting ${CLAUDE_PLUGIN_ROOT} variable."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "hooks").mkdir(parents=True)
+        (pkg_dir / "hooks" / "script.sh").write_text("#!/bin/bash")
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "bash ${CLAUDE_PLUGIN_ROOT}/hooks/script.sh",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        assert "${CLAUDE_PLUGIN_ROOT}" not in cmd
+        assert ".github/hooks/scripts/my-pkg/hooks/script.sh" in cmd
+        assert len(scripts) == 1
+
+    def test_rewrite_relative_path(self, temp_project):
+        """Test rewriting relative ./path references."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "scripts").mkdir(parents=True)
+        (pkg_dir / "scripts" / "check.sh").write_text("#!/bin/bash")
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "./scripts/check.sh",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        assert "./" not in cmd
+        assert ".github/hooks/scripts/my-pkg/scripts/check.sh" in cmd
+        assert len(scripts) == 1
+
+    def test_system_command_unchanged(self, temp_project):
+        """Test that system commands are not modified."""
+        pkg_dir = temp_project / "pkg"
+        pkg_dir.mkdir(parents=True)
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "npx prettier --check .",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        assert cmd == "npx prettier --check ."
+        assert len(scripts) == 0
+
+    def test_rewrite_for_claude_target(self, temp_project):
+        """Test that Claude target uses .claude/hooks/ path."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "hooks").mkdir(parents=True)
+        (pkg_dir / "hooks" / "run.sh").write_text("#!/bin/bash")
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "${CLAUDE_PLUGIN_ROOT}/hooks/run.sh",
+            pkg_dir,
+            "my-pkg",
+            "claude",
+        )
+
+        assert ".claude/hooks/my-pkg/hooks/run.sh" in cmd
+        assert len(scripts) == 1
+
+    def test_nonexistent_script_not_rewritten(self, temp_project):
+        """Test that references to non-existent scripts are left as-is."""
+        pkg_dir = temp_project / "pkg"
+        pkg_dir.mkdir(parents=True)
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "${CLAUDE_PLUGIN_ROOT}/missing/script.sh",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        # Variable is left in the command since the file doesn't exist
+        assert "${CLAUDE_PLUGIN_ROOT}" in cmd
+        assert len(scripts) == 0
+
+    def test_rewrite_preserves_binary_prefix(self, temp_project):
+        """Test that binary prefix (e.g., python3) is preserved in rewritten commands."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "hooks").mkdir(parents=True)
+        (pkg_dir / "hooks" / "check.py").write_text("#!/usr/bin/env python3")
+
+        integrator = HookIntegrator()
+        cmd, _ = integrator._rewrite_command_for_target(
+            "python3 ${CLAUDE_PLUGIN_ROOT}/hooks/check.py",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        assert cmd.startswith("python3 ")
+        assert cmd.endswith("hooks/check.py")
+
+    def test_rewrite_relative_path_with_hook_file_dir(self, temp_project):
+        """Test that ./path is resolved from hook_file_dir, not package root."""
+        pkg_dir = temp_project / "pkg"
+        hooks_dir = pkg_dir / "hooks"
+        scripts_dir = hooks_dir / "scripts"
+        scripts_dir.mkdir(parents=True)
+        (scripts_dir / "lint.sh").write_text("#!/bin/bash")
+
+        integrator = HookIntegrator()
+        # Script lives at hooks/scripts/lint.sh — only resolvable from hooks/ dir
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "./scripts/lint.sh",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+            hook_file_dir=hooks_dir,
+        )
+
+        assert "./" not in cmd
+        assert ".github/hooks/scripts/my-pkg/scripts/lint.sh" in cmd
+        assert len(scripts) == 1
+        assert scripts[0][0] == (scripts_dir / "lint.sh").resolve()
+
+    def test_rewrite_relative_path_fails_without_hook_file_dir(self, temp_project):
+        """Test that ./path is NOT found when resolved from package root (no hook_file_dir)."""
+        pkg_dir = temp_project / "pkg"
+        hooks_dir = pkg_dir / "hooks"
+        scripts_dir = hooks_dir / "scripts"
+        scripts_dir.mkdir(parents=True)
+        (scripts_dir / "lint.sh").write_text("#!/bin/bash")
+
+        integrator = HookIntegrator()
+        # Without hook_file_dir, resolves from pkg_dir — scripts/lint.sh doesn't exist there
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "./scripts/lint.sh",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        # Script not found at pkg_dir/scripts/lint.sh, so left unchanged
+        assert cmd == "./scripts/lint.sh"
+        assert len(scripts) == 0
+
+    def test_rewrite_rejects_plugin_root_path_traversal(self, temp_project):
+        """Test that ${CLAUDE_PLUGIN_ROOT}/../ paths are rejected (path traversal)."""
+        pkg_dir = temp_project / "pkg"
+        pkg_dir.mkdir(parents=True)
+        # Create a file outside the package directory
+        secret = temp_project / "secrets.txt"
+        secret.write_text("top-secret")
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "cat ${CLAUDE_PLUGIN_ROOT}/../secrets.txt",
+            pkg_dir,
+            "evil-pkg",
+            "vscode",
+        )
+
+        # The traversal path should NOT be rewritten and no scripts copied
+        assert "${CLAUDE_PLUGIN_ROOT}/../secrets.txt" in cmd
+        assert len(scripts) == 0
+
+    def test_rewrite_rejects_relative_path_traversal(self, temp_project):
+        """Test that ./../../ paths are rejected (path traversal via relative refs)."""
+        pkg_dir = temp_project / "pkg"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+        # Create a file outside the package directory
+        secret = temp_project / "secrets.txt"
+        secret.write_text("top-secret")
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "./../../secrets.txt",
+            pkg_dir,
+            "evil-pkg",
+            "claude",
+            hook_file_dir=hooks_dir,
+        )
+
+        # The traversal path should NOT be rewritten and no scripts copied
+        assert cmd == "./../../secrets.txt"
+        assert len(scripts) == 0
+
+    def test_rewrite_bash_key(self, temp_project):
+        """Test rewriting a bash script path (the value of GitHub Copilot's bash key)."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "scripts").mkdir(parents=True)
+        (pkg_dir / "scripts" / "check.sh").write_text("#!/bin/bash")
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "./scripts/check.sh",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        assert "./" not in cmd
+        assert ".github/hooks/scripts/my-pkg/scripts/check.sh" in cmd
+        assert len(scripts) == 1
+
+    def test_rewrite_powershell_key(self, temp_project):
+        """Test rewriting a PowerShell script path (the value of GitHub Copilot's powershell key)."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "scripts").mkdir(parents=True)
+        (pkg_dir / "scripts" / "check.ps1").write_text("Write-Host 'ok'")
+
+        integrator = HookIntegrator()
+        cmd, scripts = integrator._rewrite_command_for_target(
+            "./scripts/check.ps1",
+            pkg_dir,
+            "my-pkg",
+            "vscode",
+        )
+
+        assert "./" not in cmd
+        assert ".github/hooks/scripts/my-pkg/scripts/check.ps1" in cmd
+        assert len(scripts) == 1
+
+    def test_rewrite_hooks_data_github_copilot_flat_format(self, temp_project):
+        """Test _rewrite_hooks_data handles GitHub Copilot flat format (bash/powershell at top level)."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "scripts").mkdir(parents=True)
+        (pkg_dir / "scripts" / "validate.sh").write_text("#!/bin/bash")
+        (pkg_dir / "scripts" / "validate.ps1").write_text("Write-Host 'ok'")
+
+        data = {
+            "version": 1,
+            "hooks": {
+                "preToolUse": [
+                    {
+                        "type": "command",
+                        "bash": "./scripts/validate.sh",
+                        "powershell": "./scripts/validate.ps1",
+                    }
+                ]
+            }
+        }
+
+        integrator = HookIntegrator()
+        rewritten, scripts = integrator._rewrite_hooks_data(
+            data, pkg_dir, "my-pkg", "vscode",
+        )
+
+        hook = rewritten["hooks"]["preToolUse"][0]
+        assert ".github/hooks/scripts/my-pkg/scripts/validate.sh" in hook["bash"]
+        assert ".github/hooks/scripts/my-pkg/scripts/validate.ps1" in hook["powershell"]
+        assert len(scripts) == 2
+
+    def test_rewrite_hooks_data_claude_nested_format(self, temp_project):
+        """Test _rewrite_hooks_data handles Claude nested format (command in inner hooks array)."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "scripts").mkdir(parents=True)
+        (pkg_dir / "scripts" / "validate.sh").write_text("#!/bin/bash")
+
+        data = {
+            "hooks": {
+                "PreToolUse": [
+                    {
+                        "matcher": "Bash",
+                        "hooks": [
+                            {"type": "command", "command": "./scripts/validate.sh"}
+                        ]
+                    }
+                ]
+            }
+        }
+
+        integrator = HookIntegrator()
+        rewritten, scripts = integrator._rewrite_hooks_data(
+            data, pkg_dir, "my-pkg", "vscode",
+        )
+
+        hook = rewritten["hooks"]["PreToolUse"][0]["hooks"][0]
+        assert ".github/hooks/scripts/my-pkg/scripts/validate.sh" in hook["command"]
+        assert len(scripts) == 1
+
+    def test_integrate_hooks_with_scripts_in_hooks_subdir(self, temp_project):
+        """Test full integration when hook JSON and scripts are both inside hooks/ subdir."""
+        pkg_dir = temp_project / "apm_modules" / "myorg" / "lint-hooks"
+        hooks_dir = pkg_dir / "hooks"
+        scripts_dir = hooks_dir / "scripts"
+        scripts_dir.mkdir(parents=True)
+
+        hook_data = {
+            "hooks": {
+                "PostToolUse": [
+                    {
+                        "matcher": {"tool_name": "write_to_file"},
+                        "hooks": [
+                            {"type": "command", "command": "./scripts/lint.sh", "timeout": 10}
+                        ]
+                    }
+                ]
+            }
+        }
+        (hooks_dir / "hooks.json").write_text(json.dumps(hook_data))
+        (scripts_dir / "lint.sh").write_text("#!/bin/bash\necho lint")
+
+        pkg_info = _make_package_info(pkg_dir, "lint-hooks")
+        integrator = HookIntegrator()
+
+        result = integrator.integrate_package_hooks(pkg_info, temp_project)
+
+        assert result.hooks_integrated == 1
+        assert result.scripts_copied == 1
+
+        # Verify the rewritten command points to the bundled script
+        target_json = temp_project / ".github" / "hooks" / "lint-hooks-hooks-apm.json"
+        data = json.loads(target_json.read_text())
+        cmd = data["hooks"]["PostToolUse"][0]["hooks"][0]["command"]
+        assert ".github/hooks/scripts/lint-hooks/scripts/lint.sh" in cmd
+        assert "./" not in cmd
+
+        # Verify the script was actually copied
+        copied_script = temp_project / ".github" / "hooks" / "scripts" / "lint-hooks" / "scripts" / "lint.sh"
+        assert copied_script.exists()
+        assert copied_script.read_text() == "#!/bin/bash\necho lint"
+
+
+# ─── Gitignore tests ─────────────────────────────────────────────────────────
+
+
+class TestGitignore:
+    """Tests for .gitignore updates."""
+
+    @pytest.fixture
+    def temp_project(self):
+        temp_dir = tempfile.mkdtemp()
+        yield Path(temp_dir)
+        shutil.rmtree(temp_dir, ignore_errors=True)
+
+    def test_update_gitignore_adds_patterns(self, temp_project):
+        """Test that hook patterns are added to .gitignore."""
+        (temp_project / ".gitignore").write_text("node_modules/\n")
+
+        integrator = HookIntegrator()
+        result = integrator.update_gitignore(temp_project)
+
+        assert result is True
+        content = (temp_project / ".gitignore").read_text()
+        assert ".github/hooks/*-apm.json" in content
+        assert ".github/hooks/scripts/" in content
+
+    def test_update_gitignore_idempotent(self, temp_project):
+        """Test that patterns are not duplicated on repeated calls."""
+        (temp_project / ".gitignore").write_text(
+            "node_modules/\n\n# APM integrated hooks\n.github/hooks/*-apm.json\n.github/hooks/scripts/\n"
+        )
+
+        integrator = HookIntegrator()
+        result = integrator.update_gitignore(temp_project)
+
+        assert result is False
+
+    def test_update_gitignore_creates_file(self, temp_project):
+        """Test that .gitignore is created if it doesn't exist."""
+        integrator = HookIntegrator()
+        result = integrator.update_gitignore(temp_project)
+
+        assert result is True
+        assert (temp_project / ".gitignore").exists()
+
+
+# ─── End-to-end: install → verify → cleanup ──────────────────────────────────
+
+
+class TestEndToEnd:
+    """End-to-end tests covering full install → verify → cleanup cycle."""
+
+    @pytest.fixture
+    def temp_project(self):
+        temp_dir = tempfile.mkdtemp()
+        project = Path(temp_dir)
+        (project / ".github").mkdir()
+        (project / ".claude").mkdir()
+        yield project
+        shutil.rmtree(temp_dir, ignore_errors=True)
+
+    def test_full_hookify_lifecycle(self, temp_project):
+        """Test full lifecycle: install hookify → verify → cleanup."""
+        integrator = HookIntegrator()
+
+        # Setup hookify package
+        pkg_dir = temp_project / "apm_modules" / "anthropics" / "hookify"
+        hooks_dir = pkg_dir / "hooks"
+        hooks_dir.mkdir(parents=True)
+        (hooks_dir / "hooks.json").write_text(json.dumps(HOOKIFY_HOOKS_JSON))
+        for script in ["pretooluse.py", "posttooluse.py", "stop.py", "userpromptsubmit.py"]:
+            (hooks_dir / script).write_text(f"# {script}")
+
+        pkg_info = _make_package_info(pkg_dir, "hookify")
+
+        # Install VSCode hooks
+        vscode_result = integrator.integrate_package_hooks(pkg_info, temp_project)
+        assert vscode_result.hooks_integrated == 1
+        assert vscode_result.scripts_copied == 4
+
+        # Install Claude hooks
+        claude_result = integrator.integrate_package_hooks_claude(pkg_info, temp_project)
+        assert claude_result.hooks_integrated == 1
+
+        # Verify files exist
+        assert (temp_project / ".github" / "hooks" / "hookify-hooks-apm.json").exists()
+        assert (temp_project / ".claude" / "settings.json").exists()
+
+        # Cleanup
+        stats = integrator.sync_integration(None, temp_project)
+        assert stats["files_removed"] > 0
+
+        # Verify cleanup
+        assert not (temp_project / ".github" / "hooks" / "hookify-hooks-apm.json").exists()
+        assert not (temp_project / ".github" / "hooks" / "scripts").exists()
+        assert not (temp_project / ".claude" / "hooks").exists()
+
+    def test_multiple_packages_lifecycle(self, temp_project):
+        """Test installing hooks from multiple packages, then cleaning up."""
+        integrator = HookIntegrator()
+
+        # Package 1: ralph-loop
+        pkg1_dir = temp_project / "apm_modules" / "anthropics" / "ralph-loop"
+        hooks1_dir = pkg1_dir / "hooks"
+        hooks1_dir.mkdir(parents=True)
+        (hooks1_dir / "hooks.json").write_text(json.dumps(RALPH_LOOP_HOOKS_JSON))
+        (hooks1_dir / "stop-hook.sh").write_text("#!/bin/bash")
+        pkg1_info = _make_package_info(pkg1_dir, "ralph-loop")
+
+        # Package 2: learning-output-style
+        pkg2_dir = temp_project / "apm_modules" / "anthropics" / "learning-output-style"
+        hooks2_dir = pkg2_dir / "hooks"
+        handlers_dir = pkg2_dir / "hooks-handlers"
+        hooks2_dir.mkdir(parents=True)
+        handlers_dir.mkdir(parents=True)
+        (hooks2_dir / "hooks.json").write_text(json.dumps(LEARNING_OUTPUT_STYLE_HOOKS_JSON))
+        (handlers_dir / "session-start.sh").write_text("#!/bin/bash")
+        pkg2_info = _make_package_info(pkg2_dir, "learning-output-style")
+
+        # Install both
+        integrator.integrate_package_hooks(pkg1_info, temp_project)
+        integrator.integrate_package_hooks(pkg2_info, temp_project)
+
+        # Both hook JSONs should exist
+        assert (temp_project / ".github" / "hooks" / "ralph-loop-hooks-apm.json").exists()
+        assert (temp_project / ".github" / "hooks" / "learning-output-style-hooks-apm.json").exists()
+
+        # Cleanup removes all
+        stats = integrator.sync_integration(None, temp_project)
+        assert stats["files_removed"] >= 2
+        assert not (temp_project / ".github" / "hooks" / "ralph-loop-hooks-apm.json").exists()
+        assert not (temp_project / ".github" / "hooks" / "learning-output-style-hooks-apm.json").exists()
+
+
+# ─── Deep copy safety test ───────────────────────────────────────────────────
+
+
+class TestDeepCopySafety:
+    """Test that rewriting doesn't mutate the original data."""
+
+    @pytest.fixture
+    def temp_project(self):
+        temp_dir = tempfile.mkdtemp()
+        yield Path(temp_dir)
+        shutil.rmtree(temp_dir, ignore_errors=True)
+
+    def test_rewrite_does_not_mutate_original(self, temp_project):
+        """Ensure _rewrite_hooks_data returns a copy, not mutating original."""
+        pkg_dir = temp_project / "pkg"
+        (pkg_dir / "hooks").mkdir(parents=True)
+        (pkg_dir / "hooks" / "script.sh").write_text("#!/bin/bash")
+
+        data = {
+            "hooks": {
+                "Stop": [{"hooks": [{"type": "command", "command": "${CLAUDE_PLUGIN_ROOT}/hooks/script.sh"}]}]
+            }
+        }
+        original_cmd = data["hooks"]["Stop"][0]["hooks"][0]["command"]
+
+        integrator = HookIntegrator()
+        rewritten, _ = integrator._rewrite_hooks_data(data, pkg_dir, "test", "vscode")
+
+        # Original should be unchanged
+        assert data["hooks"]["Stop"][0]["hooks"][0]["command"] == original_cmd
+        # Rewritten should be different
+        assert rewritten["hooks"]["Stop"][0]["hooks"][0]["command"] != original_cmd