diff --git a/CHANGELOG.md b/CHANGELOG.md index 3f9ae011..e5506a47 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -8,6 +8,19 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ## [Unreleased] +### Added + +- `apm pack --format plugin` — export APM packages as standalone plugin directories consumable by Copilot CLI, Claude Code, and other plugin hosts. Transforms `.apm/` layout to plugin-native directories (agents, skills, commands, instructions, contexts) with hooks/MCP merging, collision handling, and security scanning (#378) +- `apm init --plugin` — initialize a plugin authoring project with both `plugin.json` and `apm.yml` (includes `devDependencies` section). Validates kebab-case plugin names per plugin spec (#378) +- `devDependencies` support in `apm.yml` and `APMPackage` model — same syntax as `dependencies`, parsed with `get_dev_apm_dependencies()`/`get_dev_mcp_dependencies()` accessors. Dev deps are excluded from plugin bundles (#378) +- `apm install --dev` — install packages as development dependencies, writing to `devDependencies` instead of `dependencies` (#378) +- `apm pack --force` flag — on collision, last writer wins instead of first (#378) +- `synthesize_plugin_json_from_apm_yml()` — generates `plugin.json` from `apm.yml` identity fields when no plugin manifest exists (#378) + +### Security + +- Content integrity hashing — SHA-256 checksums of package file trees are stored in `apm.lock.yaml` (`content_hash` field) and verified on subsequent installs. 
Detects tampering, MITM modifications, or force-pushed commits (#315, #378) +- Lockfile `is_dev` tracking — dev dependencies are explicitly marked in the lockfile for auditability (#378) ### Changed - Install URLs now use short `aka.ms/apm-unix` and `aka.ms/apm-windows` redirects across README, docs, CLI output, and install script @@ -250,7 +263,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - **SKILL.md Parsing**: Parse SKILL.md directly without requiring apm.yml generation - **Git Host Errors**: Actionable error messages for unsupported Git hosts -## [0.7.0] - 2025-12-19 +## [0.7.0] - 2024-12-19 ### Changed diff --git a/copilot-banner.png b/copilot-banner.png new file mode 100644 index 00000000..c495353f Binary files /dev/null and b/copilot-banner.png differ diff --git a/copilot-cli-screenshot.png b/copilot-cli-screenshot.png new file mode 100644 index 00000000..c495353f Binary files /dev/null and b/copilot-cli-screenshot.png differ diff --git a/docs/src/content/docs/contributing/integration-testing.md b/docs/src/content/docs/contributing/integration-testing.md index 755c833f..57e9eac5 100644 --- a/docs/src/content/docs/contributing/integration-testing.md +++ b/docs/src/content/docs/contributing/integration-testing.md @@ -48,11 +48,9 @@ pytest tests/integration/test_runtime_smoke.py::TestRuntimeSmoke::test_codex_run #### Option 1: Complete CI Process Simulation (Recommended) ```bash -```bash export GITHUB_TOKEN=your_token_here ./scripts/test-integration.sh ``` -``` This script (`scripts/test-integration.sh`) is a unified script that automatically adapts to your environment: diff --git a/docs/src/content/docs/enterprise/security.md b/docs/src/content/docs/enterprise/security.md index ba5ea0f1..73406942 100644 --- a/docs/src/content/docs/enterprise/security.md +++ b/docs/src/content/docs/enterprise/security.md @@ -59,7 +59,7 @@ The `resolved_commit` field is a full 40-character SHA, not a branch name or tag ### No registry 
-APM does not use a package registry. Dependencies are specified as git repository URLs in `apm.yaml`. This eliminates the registry compromise vector entirely — there is no centralized service that can be poisoned to redirect installs. +APM does not use a package registry. Dependencies are specified as git repository URLs in `apm.yml`. This eliminates the registry compromise vector entirely — there is no centralized service that can be poisoned to redirect installs. ## Content scanning @@ -139,9 +139,26 @@ Content scanning detects hidden Unicode characters. It does not detect: ### Planned hardening -- **Content integrity hashing** — SHA-256 checksums stored in `apm.lock.yaml` to verify downloaded content hasn't been tampered with. - **Hook transparency** — display hook script contents during install so developers can review what will execute. +## Content integrity hashing + +APM computes a SHA-256 hash of each downloaded package's file tree and stores it in `apm.lock.yaml` as `content_hash`. On subsequent installs, cached packages are verified against the lockfile hash. A mismatch triggers a warning and re-download. + +```yaml +# apm.lock.yaml +dependencies: + - repo_url: https://github.com/acme-corp/security-baseline + resolved_commit: a1b2c3d4e5f6... + content_hash: "sha256:9f86d081884c7d659a2feaa0c55ad015..." +``` + +The hash is deterministic — computed over sorted file paths and contents, independent of filesystem metadata (timestamps, permissions). `.git/` and `__pycache__/` directories are excluded. + +Lock files generated before this feature omit `content_hash`. APM handles this gracefully — verification is skipped and the hash is populated on the next install. + +See the [Lock File Specification](../../reference/lockfile-spec/#44-content-integrity) for field details. + ## Path security APM deploys files only to controlled subdirectories within the project root. 
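The deterministic content hashing scheme described above can be sketched in a few lines. This is an illustrative reconstruction under the stated assumptions — sorted relative paths plus file contents, metadata ignored, `.git/` and `__pycache__/` excluded — not APM's actual implementation; the `content_hash` helper name is hypothetical:

```python
import hashlib
from pathlib import Path

# Assumption: these match the exclusions documented above.
EXCLUDED_DIRS = {".git", "__pycache__"}

def content_hash(root: str) -> str:
    """Hash a package file tree deterministically: sorted relative paths
    plus file contents, independent of timestamps and permissions."""
    h = hashlib.sha256()
    root_path = Path(root)
    files = sorted(
        p for p in root_path.rglob("*")
        if p.is_file() and not any(part in EXCLUDED_DIRS for part in p.parts)
    )
    for p in files:
        rel = p.relative_to(root_path).as_posix()
        h.update(rel.encode("utf-8"))  # path participates in the hash
        h.update(p.read_bytes())       # so does the content
    return "sha256:" + h.hexdigest()
```

Because only paths and bytes are hashed, touching a file's mtime or changing its permissions leaves the hash unchanged, while any content edit changes it.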
@@ -163,6 +180,7 @@ Symlinks are never followed during artifact operations: - **Tree copy operations** skip symlinks entirely — they are excluded from the copy via an ignore filter. - **MCP configuration files** that are symlinks are rejected with a warning and not parsed. - **Manifest parsing** requires files to pass both `.is_file()` and `not .is_symlink()` checks. +- **Archive creation** — `apm pack` excludes symlinks from bundled archives. Packaged artifacts contain no symbolic links, preventing symlink-based escape attacks in distributed bundles. This prevents symlink-based attacks that could escape allowed directories or cause APM to read or write outside the project root. @@ -174,13 +192,22 @@ When APM deploys a file, it checks whether a file already exists at the target p - If the file is **not tracked** (user-authored or created by another tool), APM skips it and prints a warning. - The `--force` flag overrides collision detection, allowing APM to overwrite untracked files. +### Development dependency isolation + +APM separates production and development dependencies: + +- **Production dependencies** (`dependencies.apm`) are included in plugin bundles and shared packages. +- **Development dependencies** (`devDependencies.apm`, installed via `apm install --dev`) are resolved and cached locally but **excluded** from `apm pack --format plugin` output. + +This prevents transitive inclusion of development-only packages (test fixtures, linting rules, internal helpers) in distributed artifacts. The lockfile marks dev dependencies with `is_dev: true` for explicit tracking. See the [Lock File Specification](../../reference/lockfile-spec/#42-dependency-entries) for field details. + ## MCP server trust model APM integrates MCP (Model Context Protocol) server configurations from packages. Trust is explicit and scoped by dependency depth. ### Direct dependencies -MCP servers declared by your direct dependencies (packages listed in your `apm.yaml`) are auto-trusted. 
You explicitly chose to depend on these packages, so their MCP server declarations are accepted. +MCP servers declared by your direct dependencies (packages listed in your `apm.yml`) are auto-trusted. You explicitly chose to depend on these packages, so their MCP server declarations are accepted. ### Transitive dependencies @@ -188,7 +215,7 @@ MCP servers declared by transitive dependencies (dependencies of your dependenci To allow transitive MCP servers, you must either: -- **Re-declare the dependency** in your own `apm.yaml`, promoting it to a direct dependency. +- **Re-declare the dependency** in your own `apm.yml`, promoting it to a direct dependency. - **Pass `--trust-transitive-mcp`** to explicitly opt in to transitive MCP servers for that install. ## Token handling @@ -200,7 +227,7 @@ APM authenticates to git hosts using personal access tokens (PATs) read from env | GitHub packages | `GITHUB_APM_PAT`, `GITHUB_TOKEN`, `GH_TOKEN` | | Azure DevOps packages | `ADO_APM_PAT` | -- **Never stored in files.** Tokens are read from the environment at runtime. They are never written to `apm.yaml`, `apm.lock.yaml`, or any generated file. +- **Never stored in files.** Tokens are read from the environment at runtime. They are never written to `apm.yml`, `apm.lock.yaml`, or any generated file. - **Never logged.** Token values are not included in console output, error messages, or debug logs. - **Scoped to their git host.** A GitHub token is only sent to GitHub. An Azure DevOps token is only sent to Azure DevOps. Tokens are never transmitted to any other endpoint. 
@@ -211,7 +238,7 @@ For GitHub, a fine-grained PAT with read-only `Contents` permission on the repos | Vector | Traditional package manager | APM | |---|---|---| | Registry compromise | Attacker poisons central registry | No registry exists | -| Version substitution | Malicious version replaces legitimate one | Lock file pins exact commit SHA | +| Version substitution | Malicious version replaces legitimate one | Lock file pins exact commit SHA; content hash detects post-download tampering | | Post-install scripts | Arbitrary code runs after install | No code execution | | Typosquatting | Similar package names on registry | Dependencies are full git URLs | | Build-time injection | Malicious build steps execute | No build step — files are copied | diff --git a/docs/src/content/docs/getting-started/first-package.md b/docs/src/content/docs/getting-started/first-package.md index 336a7d30..a4b44eaa 100644 --- a/docs/src/content/docs/getting-started/first-package.md +++ b/docs/src/content/docs/getting-started/first-package.md @@ -117,4 +117,5 @@ This produces `AGENTS.md` (for Codex, Gemini) and `CLAUDE.md` for tools that nee - Add [skills](/apm/guides/skills/) to your package - Set up [dependencies](/apm/guides/dependencies/) on other packages +- Distribute as a standalone plugin — see [Plugin authoring](../../guides/plugins/#plugin-authoring) and [Pack & Distribute](../../guides/pack-distribute/) - Explore the [CLI reference](/apm/reference/cli-commands/) for more commands diff --git a/docs/src/content/docs/getting-started/migration.md b/docs/src/content/docs/getting-started/migration.md index 948e0f86..78a06b93 100644 --- a/docs/src/content/docs/getting-started/migration.md +++ b/docs/src/content/docs/getting-started/migration.md @@ -56,7 +56,7 @@ No uninstall script, no cleanup command. Zero risk. 
## Next steps -- [Quick start](../quickstart/) — first-time setup walkthrough +- [Quick start](../quick-start/) — first-time setup walkthrough - [Dependencies](../../guides/dependencies/) — managing external packages - [Manifest schema](../../reference/manifest-schema/) — full `apm.yml` reference - [CLI commands](../../reference/cli-commands/) — complete command reference diff --git a/docs/src/content/docs/guides/dependencies.md b/docs/src/content/docs/guides/dependencies.md index 4b8a5702..d9777762 100644 --- a/docs/src/content/docs/guides/dependencies.md +++ b/docs/src/content/docs/guides/dependencies.md @@ -25,7 +25,9 @@ APM supports multiple dependency types: |------|-----------|---------| | **APM Package** | Has `apm.yml` | `microsoft/apm-sample-package` | | **Marketplace Plugin** | Has `plugin.json` (no `apm.yml`) | `github/awesome-copilot/plugins/context-engineering` | -| **Claude Skill** | Has `SKILL.md` (no `apm.yml`) | `ComposioHQ/awesome-claude-skills/brand-guidelines` || **Hook Package** | Has `hooks/*.json` (no `apm.yml` or `SKILL.md`) | `anthropics/claude-plugins-official/plugins/hookify` || **Virtual Subdirectory Package** | Folder path in monorepo | `ComposioHQ/awesome-claude-skills/mcp-builder` | +| **Claude Skill** | Has `SKILL.md` (no `apm.yml`) | `ComposioHQ/awesome-claude-skills/brand-guidelines` | +| **Hook Package** | Has `hooks/*.json` (no `apm.yml` or `SKILL.md`) | `anthropics/claude-plugins-official/plugins/hookify` | +| **Virtual Subdirectory Package** | Folder path in monorepo | `ComposioHQ/awesome-claude-skills/mcp-builder` | | **Virtual Subdirectory Package** | Folder path in repo | `github/awesome-copilot/skills/review-and-refactor` | | **Local Path Package** | Path starts with `./`, `../`, or `/` | `./packages/my-shared-skills` | | **ADO Package** | Azure DevOps repo | `dev.azure.com/org/project/_git/repo` | @@ -221,11 +223,28 @@ apm deps info apm-sample-package # Compile with dependencies apm compile -# The compilation process 
generates distributed AGENTS.md files across the project +# Compilation generates distributed files across the project # Instructions with matching applyTo patterns are merged from all sources -# See docs/wip/distributed-agents-compilation-strategy.md for detailed compilation logic ``` +## Development Dependencies + +Some packages are only needed during authoring — test fixtures, linting rules, internal helpers. Install them as dev dependencies so they stay out of distributed bundles: + +```bash +apm install --dev owner/test-helpers +``` + +Or declare them directly: + +```yaml +devDependencies: + apm: + - source: owner/test-helpers +``` + +Dev dependencies install to `apm_modules/` like production deps but are excluded from `apm pack --format plugin` output. See [Pack & Distribute](../pack-distribute/) for details. + ## Local Path Dependencies Install packages from the local filesystem for fast iteration during development. diff --git a/docs/src/content/docs/guides/pack-distribute.md b/docs/src/content/docs/guides/pack-distribute.md index 90c856b5..530691b6 100644 --- a/docs/src/content/docs/guides/pack-distribute.md +++ b/docs/src/content/docs/guides/pack-distribute.md @@ -66,6 +66,7 @@ apm pack --dry-run | `--archive` | off | Produce `.tar.gz` instead of directory | | `-o, --output` | `./build` | Output directory | | `--dry-run` | off | List files without writing | +| `--force` | off | On collision (plugin format), last writer wins | ### Target filtering @@ -145,6 +146,65 @@ build/my-project-1.0.0/ The bundle is self-describing: its `apm.lock.yaml` lists every file it contains and the dependency graph that produced them. +## Plugin format + +`apm pack --format plugin` transforms your project into a standalone plugin directory consumable by Copilot CLI, Claude Code, or other plugin hosts. The output contains no APM-specific files — no `apm.yml`, `apm_modules/`, `.apm/`, or `apm.lock.yaml`. 
+ +Use this when you want to distribute your APM package as a standalone plugin that works without APM. + +```bash +apm pack --format plugin +``` + +### Output mapping + +The exporter remaps `.apm/` content into plugin-native paths: + +| APM source | Plugin output | +|---|---| +| `.apm/agents/*.agent.md` | `agents/*.agent.md` | +| `.apm/skills/*/SKILL.md` | `skills/*/SKILL.md` | +| `.apm/prompts/*.prompt.md` | `commands/*.md` | +| `.apm/prompts/*.md` | `commands/*.md` | +| `.apm/instructions/*.instructions.md` | `instructions/*.instructions.md` | +| `.apm/hooks/*.json` | `hooks.json` (merged) | +| `.apm/commands/*.md` | `commands/*.md` | + +Prompt files are renamed: `review.prompt.md` becomes `review.md` in `commands/`. + +**Excluded from plugin output:** `devDependencies` are excluded from plugin bundles — see [devDependencies](../../reference/manifest-schema/#5-devdependencies). + +### plugin.json generation + +The bundle includes a `plugin.json`. If one already exists in the project (at the root, `.github/plugin/`, `.claude-plugin/`, or `.cursor-plugin/`), it is used and updated with component paths reflecting the output layout. Otherwise, APM synthesizes one from `apm.yml` metadata. + +### devDependencies exclusion + +Dependencies listed under [`devDependencies`](../../reference/manifest-schema/#5-devdependencies) in `apm.yml` are excluded from the plugin bundle. Use [`apm install --dev`](../../reference/cli-commands/#apm-install---install-apm-and-mcp-dependencies) to add dev deps: + +```bash +apm install --dev owner/test-helpers +``` + +This keeps development-only packages (test helpers, lint rules) out of distributed plugins. + +### Example output + +``` +build/my-plugin-1.0.0/ + agents/ + architect.agent.md + skills/ + security-scan/ + SKILL.md + commands/ + review.md + instructions/ + coding-standards.instructions.md + hooks.json + plugin.json +``` + ## Lockfile enrichment The bundle includes a copy of `apm.lock.yaml` enriched with a `pack:` section. 
The project's own `apm.lock.yaml` is never modified. diff --git a/docs/src/content/docs/guides/plugins.md b/docs/src/content/docs/guides/plugins.md index 03e641fd..ec133d9f 100644 --- a/docs/src/content/docs/guides/plugins.md +++ b/docs/src/content/docs/guides/plugins.md @@ -6,6 +6,33 @@ sidebar: APM supports plugins through the `plugin.json` format. Plugins are automatically detected and integrated into your project as standard APM dependencies. +## Plugin authoring + +Plugin ecosystems handle distribution but lack dependency management, security scanning, version locking, and dev/prod separation. As plugins depend on shared primitives, these gaps compound. + +APM is the supply-chain layer. Author packages with full tooling — transitive dependencies, lockfile pinning, [security scanning](../../enterprise/security/), [`devDependencies`](../../reference/manifest-schema/#5-devdependencies) — then export as standard plugins. Consumers never need APM installed. + +### Three modes + +| Mode | Manifests | Use when | +|------|-----------|----------| +| **APM-only** | `apm.yml` | Full APM workflow — deploy to `.github/`, `.claude/`, `.cursor/`, `.opencode/` | +| **Plugin-only** | `plugin.json` | Standard plugin consumed by APM via `apm install` — metadata synthesized automatically | +| **Hybrid** | `apm.yml` + `plugin.json` | Author with dependency management + security, export as standalone plugins | + +**APM-only** is the default for teams using APM end-to-end. **Plugin-only** requires no changes to existing plugins — APM consumes them as-is. **Hybrid** is for plugin authors who want APM's supply-chain features during development while distributing standard plugins. 
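The three modes in the table above reduce to a manifest-presence check. A minimal sketch — illustrative only; the hypothetical `detect_mode` helper checks the project root, whereas APM also recognizes `plugin.json` in alternate locations such as `.github/plugin/` and `.claude-plugin/`:

```python
from pathlib import Path

def detect_mode(project_root: str) -> str:
    """Classify a project by which manifests exist at its root.
    Sketch only: ignores alternate plugin.json locations."""
    root = Path(project_root)
    has_apm = (root / "apm.yml").is_file()
    has_plugin = (root / "plugin.json").is_file()
    if has_apm and has_plugin:
        return "hybrid"
    if has_apm:
        return "apm-only"
    if has_plugin:
        return "plugin-only"
    return "none"
```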
+ +### Hybrid authoring workflow + +```bash +apm init my-plugin --plugin # Creates both apm.yml and plugin.json +apm install --dev owner/helpers # Dev-only dependency (excluded from export) +apm install owner/core-rules # Production dependency +apm pack --format plugin # Export — dev deps excluded, security scanned +``` + +The exported plugin directory contains no APM-specific files. See [Pack & Distribute — Plugin format](../../guides/pack-distribute/#plugin-format) for the output mapping. + ## Overview Plugins are packages that contain: @@ -306,6 +333,10 @@ This: - Integrates skills into the runtime - Includes prompt primitives +## Exporting APM packages as plugins + +Use the [hybrid authoring workflow](#hybrid-authoring-workflow) to develop plugins with APM's full tooling and export them as standalone plugin directories. See [Pack & Distribute — Plugin format](../../guides/pack-distribute/#plugin-format) for the output mapping and structure. + ## Finding Plugins Plugins can be found through: diff --git a/docs/src/content/docs/guides/private-packages.md b/docs/src/content/docs/guides/private-packages.md index f75173f5..ae20ad59 100644 --- a/docs/src/content/docs/guides/private-packages.md +++ b/docs/src/content/docs/guides/private-packages.md @@ -2,7 +2,7 @@ title: "Private Packages" description: "Create and distribute private APM packages within your team or organization." sidebar: - order: 8 + order: 9 --- A private APM package is just a private git repository with an `apm.yml`. There is no registry and no publish step — make the repo private, grant read access, and `apm install` handles the rest. 
diff --git a/docs/src/content/docs/integrations/ci-cd.md index 886668e7..b10131d6 100644 --- a/docs/src/content/docs/integrations/ci-cd.md +++ b/docs/src/content/docs/integrations/ci-cd.md @@ -151,6 +151,17 @@ Use `apm pack` in CI to build a distributable bundle once, then consume it in do path: build/*.tar.gz ``` +### Pack as standalone plugin + +```yaml +# Export as standalone plugin archive +- run: apm pack --format plugin --archive +- uses: actions/upload-artifact@v4 + with: + name: plugin-bundle + path: build/*.tar.gz +``` + ### Consume in another job (no APM needed) ```yaml diff --git a/docs/src/content/docs/integrations/gh-aw.md index 8d676c65..8c950de9 100644 --- a/docs/src/content/docs/integrations/gh-aw.md +++ b/docs/src/content/docs/integrations/gh-aw.md @@ -74,7 +74,7 @@ The APM compilation target is automatically inferred from the configured `engine ### apm-action Pre-Step -For more control over the installation process, use [`microsoft/apm-action@v1`](https://github.com/microsoft/apm-action) as an explicit workflow step. This approach runs `apm install && apm compile` directly, giving you access to the full APM CLI. +For more control over the installation process, use [`microsoft/apm-action@v1`](https://github.com/microsoft/apm-action) as an explicit workflow step. This approach runs `apm install` directly, giving you access to the full APM CLI. To also compile, add `compile: true` to the action configuration.
```yaml --- diff --git a/docs/src/content/docs/integrations/ide-tool-integration.md b/docs/src/content/docs/integrations/ide-tool-integration.md index a730e4b1..f382ae59 100644 --- a/docs/src/content/docs/integrations/ide-tool-integration.md +++ b/docs/src/content/docs/integrations/ide-tool-integration.md @@ -420,7 +420,7 @@ APM provides first-class support for MCP servers, including registry-based serve APM auto-discovers MCP server declarations from packages during `apm install`: - **apm.yml dependencies**: MCP servers listed under `dependencies.mcp` in a package's `apm.yml` are collected automatically. -- **plugin.json**: Packages with a `plugin.json` (at the root, `.github/plugin/`, or `.claude-plugin/`) are recognized as marketplace plugins. APM synthesizes an `apm.yml` from `plugin.json` metadata when no `apm.yml` exists. +- **plugin.json**: Packages with a `plugin.json` (at the root, `.github/plugin/`, or `.claude-plugin/`) are recognized as marketplace plugins. APM synthesizes an `apm.yml` from `plugin.json` metadata when no `apm.yml` exists. When both files are present (hybrid mode), APM uses `apm.yml` for dependency management while preserving `plugin.json` for plugin ecosystem compatibility. See [Plugin authoring](../../guides/plugins/#plugin-authoring). - **Transitive collection**: APM walks the dependency tree and collects MCP servers from all transitive packages. 
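The collection rules above amount to a depth-based partition of discovered MCP servers. A sketch under stated assumptions — the `collect_mcp_servers` helper and its input shape (dicts with `name`, `depth`, `mcp` keys) are hypothetical, not APM's actual internals:

```python
def collect_mcp_servers(packages):
    """Partition MCP server declarations from a resolved dependency list:
    depth-1 (direct) declarations are auto-trusted; deeper (transitive)
    declarations are held pending explicit opt-in."""
    trusted, pending = [], []
    for pkg in packages:
        for server in pkg.get("mcp", []):
            entry = {"server": server, "declared_by": pkg["name"]}
            (trusted if pkg["depth"] == 1 else pending).append(entry)
    return trusted, pending
```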
### Trust Model diff --git a/docs/src/content/docs/introduction/how-it-works.md b/docs/src/content/docs/introduction/how-it-works.md index f4baa746..dbe642b1 100644 --- a/docs/src/content/docs/introduction/how-it-works.md +++ b/docs/src/content/docs/introduction/how-it-works.md @@ -160,7 +160,6 @@ Package your prompt engineering into reusable, configurable components: - **Prompts** (.prompt.md) - Executable AI workflows - **Agents** (.agent.md) - AI assistant personalities - **Skills** (SKILL.md) - Package meta-guides for AI agents -- **Context** (.context.md) - Project knowledge base - **Hooks** (.json) - Lifecycle event handlers ### Layer 3: Context Engineering @@ -227,18 +226,6 @@ Apply these colors and typography standards... Skills provide AI agents with a quick summary of package purpose and usage. -### Context (.context.md) -Optimized project knowledge for AI consumption: - -```markdown -# Project Architecture - -## Core Patterns -- Repository pattern for data access -- Clean architecture with domain separation -- Event-driven communication between services -``` - ### Hooks (.json) Lifecycle event handlers that run scripts at specific points during AI operations: diff --git a/docs/src/content/docs/introduction/key-concepts.md b/docs/src/content/docs/introduction/key-concepts.md index c0261291..a60857d4 100644 --- a/docs/src/content/docs/introduction/key-concepts.md +++ b/docs/src/content/docs/introduction/key-concepts.md @@ -29,11 +29,9 @@ my-project/ ├── instructions/ # Targeted guidance by file type and domain │ ├── security.instructions.md # applyTo: "auth/**" │ └── testing.instructions.md # applyTo: "**/*test*" - ├── prompts/ # Reusable agent workflows - │ ├── code-review.prompt.md # Systematic review process - │ └── feature-spec.prompt.md # Spec-first development - └── context/ # Optimized information retrieval - └── architecture.context.md # Project patterns and decisions + └── prompts/ # Reusable agent workflows + ├── code-review.prompt.md # 
Systematic review process + └── feature-spec.prompt.md # Spec-first development ``` ### Intelligent Compilation @@ -77,9 +75,8 @@ The APM CLI supports the following types of primitives: - **Agents** (`.agent.md`) - Define AI assistant personalities and behaviors (legacy: `.chatmode.md`) - **Instructions** (`.instructions.md`) - Provide coding standards and guidelines for specific file types - **Skills** (`SKILL.md`) - Package meta-guides that help AI agents understand what a package does -- **Context** (`.context.md`, `.memory.md`) - Supply background information and project context - **Hooks** (`.json` in `.apm/hooks/` or `hooks/`) - Define lifecycle event handlers with script references -- **Plugins** (`plugin.json`) - Pre-packaged agent bundles auto-normalized into APM packages +- **Plugins** (`plugin.json`) - Pre-packaged agent bundles auto-normalized into APM packages. Projects may use `apm.yml` only, `plugin.json` only, or both. See [Plugin authoring](../../guides/plugins/#plugin-authoring) > **Note**: Both `.agent.md` (new format) and `.chatmode.md` (legacy format) are fully supported. VSCode provides Quick Fix actions to help migrate from `.chatmode.md` to `.agent.md`. @@ -98,10 +95,6 @@ APM discovers primitives in these locations: │ └── *.chatmode.md ├── instructions/ # Coding standards and guidelines │ └── *.instructions.md -├── context/ # Project context files -│ └── *.context.md -├── memory/ # Team info, contacts, etc. -│ └── *.memory.md └── hooks/ # Lifecycle event handlers ├── *.json # Hook definitions (JSON) └── scripts/ # Referenced scripts @@ -120,8 +113,6 @@ APM discovers primitives in these locations: *.agent.md *.chatmode.md *.instructions.md -*.context.md -*.memory.md ``` ## Component Types Overview @@ -198,19 +189,6 @@ When asked about branding, apply these standards... 
→ [Complete Skills Guide](../../guides/skills/) -### Context (.context.md) -**Knowledge Management Layer** - Optimized project information for AI consumption - -Context files package project knowledge, architectural decisions, and team standards in formats optimized for LLM consumption and token efficiency. - -```markdown -# Project Architecture -## Core Patterns -- Repository pattern for data access -- Clean architecture with domain separation -- Event-driven communication between services -``` - ## Primitive Types ### Agents @@ -297,58 +275,6 @@ def calculate_metrics(data: List[Dict], threshold: float = 0.5) -> Dict[str, flo """ ``` -### Context Files - -Context files provide background information, project details, and other relevant context that AI assistants should be aware of. - -**Format:** `.context.md` or `.memory.md` files - -**Frontmatter:** -- `description` (optional) - Brief description of the context -- `author` (optional) - Creator information -- `version` (optional) - Version string - -**Examples:** - -Project context (`.apm/context/project-info.context.md`): -```markdown ---- -description: Project overview and architecture ---- - -# APM CLI Project - -## Overview -Command-line tool for AI-powered development workflows. 
- -## Key Technologies -- Python 3.10+ with Click framework -- YAML frontmatter for configuration -- Rich library for terminal output - -## Architecture -- Modular runtime system -- Plugin-based workflow engine -- Extensible primitive system -``` - -Team information (`.apm/memory/team-contacts.memory.md`): -```markdown -# Team Contacts - -## Development Team -- Lead Developer: Alice Johnson (alice@company.com) -- Backend Engineer: Bob Smith (bob@company.com) - -## Emergency Contacts -- On-call: +1-555-0123 -- Incidents: incidents@company.com - -## Meeting Schedule -- Daily standup: 9:00 AM PST -- Sprint planning: Mondays 2:00 PM PST -``` - ### Hooks Hooks define lifecycle event handlers that run scripts at specific points during AI agent operations (e.g., before/after tool use). @@ -391,7 +317,6 @@ All primitives are automatically validated during discovery: - **Agents**: Must have description and content (supports both `.agent.md` and `.chatmode.md`) - **Instructions**: Must have description, applyTo pattern, and content -- **Context**: Must have content (description optional) Invalid files are skipped with warning messages, allowing valid primitives to continue loading. @@ -475,12 +400,9 @@ Use the structured `.apm/` directories for better organization: ├── agents/ │ ├── code-reviewer.agent.md │ └── documentation-writer.agent.md -├── instructions/ -│ ├── python-style.instructions.md -│ └── typescript-conventions.instructions.md -└── context/ - ├── project-info.context.md - └── architecture-overview.context.md +└── instructions/ + ├── python-style.instructions.md + └── typescript-conventions.instructions.md ``` ### 5. 
Team Collaboration diff --git a/docs/src/content/docs/reference/cli-commands.md b/docs/src/content/docs/reference/cli-commands.md index 1ad67622..92707b59 100644 --- a/docs/src/content/docs/reference/cli-commands.md +++ b/docs/src/content/docs/reference/cli-commands.md @@ -35,6 +35,7 @@ apm init [PROJECT_NAME] [OPTIONS] **Options:** - `-y, --yes` - Skip interactive prompts and use auto-detected defaults +- `--plugin` - Initialize as a plugin authoring project (creates `plugin.json` + `apm.yml` with `devDependencies`) **Examples:** ```bash @@ -49,6 +50,9 @@ apm init my-hello-world # Create project with auto-detected defaults apm init my-project --yes + +# Initialize a plugin authoring project +apm init my-plugin --plugin ``` **Behavior:** @@ -56,9 +60,11 @@ apm init my-project --yes - **Interactive mode**: Prompts for project details unless `--yes` specified - **Auto-detection**: Automatically detects author from `git config user.name` and description from project context - **Brownfield friendly**: Works cleanly in existing projects without file pollution +- **Plugin mode** (`--plugin`): Creates both `plugin.json` and `apm.yml` with an empty `devDependencies` section. Plugin names must be kebab-case (`^[a-z][a-z0-9-]{0,63}$`), max 64 characters **Creates:** - `apm.yml` - Minimal project configuration with empty dependencies and scripts sections +- `plugin.json` - Plugin manifest (only with `--plugin`) **Auto-detected fields:** - `name` - From project directory name @@ -87,6 +93,7 @@ apm install [PACKAGES...] [OPTIONS] - `--parallel-downloads INTEGER` - Max concurrent package downloads (default: 4, 0 to disable) - `--verbose` - Show individual file paths and full error details in the diagnostic summary - `--trust-transitive-mcp` - Trust self-defined MCP servers from transitive packages (skip re-declaration requirement) +- `--dev` - Add packages to [`devDependencies`](../manifest-schema/#5-devdependencies) instead of `dependencies`. 
Dev deps are installed locally but excluded from `apm pack --format plugin` bundles **Behavior:** - `apm install` (no args): Installs **all** packages from `apm.yml` @@ -137,6 +144,9 @@ apm install --exclude codex # Trust self-defined MCP servers from transitive packages apm install --trust-transitive-mcp +# Install as a dev dependency (excluded from plugin bundles) +apm install --dev owner/test-helpers + # Install from a local path (copies to apm_modules/_local/) apm install ./packages/my-shared-skills apm install /home/user/repos/my-ai-package @@ -407,7 +417,8 @@ apm pack [OPTIONS] - `-t, --target [copilot|vscode|claude|cursor|opencode|all]` - Filter files by target. Auto-detects from `apm.yml` if not specified. `vscode` is an alias for `copilot` - `--archive` - Produce a `.tar.gz` archive instead of a directory - `--dry-run` - List files that would be packed without writing anything -- `--format [apm|plugin]` - Bundle format (default: `apm`) +- `--format [apm|plugin]` - Bundle format (default: `apm`). 
`plugin` produces a standalone plugin directory with `plugin.json` +- `--force` - On collision (plugin format), last writer wins instead of first **Examples:** ```bash @@ -420,6 +431,9 @@ apm pack --archive # Pack only VS Code / Copilot files apm pack --target vscode +# Export as a standalone plugin directory +apm pack --format plugin + # Preview what would be packed apm pack --dry-run @@ -432,6 +446,7 @@ apm pack -o dist/ - Scans files for hidden Unicode characters before bundling — warns if findings are detected (non-blocking; consumers are protected by `apm install`/`apm unpack` which block on critical) - Copies files preserving directory structure - Writes an enriched `apm.lock.yaml` inside the bundle with a `pack:` metadata section (the project's own `apm.lock.yaml` is never modified) +- **Plugin format** (`--format plugin`): Remaps `.apm/` content into plugin-native paths (`agents/`, `skills/`, `commands/`, etc.), generates or updates `plugin.json`, merges hooks into a single `hooks.json`. `devDependencies` are also excluded from plugin bundles. See [Pack & Distribute](../../guides/pack-distribute/#plugin-format) for the full mapping table **Target filtering:** @@ -994,7 +1009,6 @@ The structure is entirely dictated by the instruction context found in `.apm/` a **Primitive Discovery:** - **Chatmodes**: `.chatmode.md` files in `.apm/chatmodes/`, `.github/chatmodes/` - **Instructions**: `.instructions.md` files in `.apm/instructions/`, `.github/instructions/` -- **Contexts**: `.context.md`, `.memory.md` files in `.apm/context/`, `.github/context/` - **Workflows**: `.prompt.md` files in project and `.github/prompts/` APM integrates seamlessly with [Spec-kit](https://github.com/github/spec-kit) for specification-driven development, automatically injecting Spec-kit `constitution` into the compiled context layer. 
diff --git a/docs/src/content/docs/reference/examples.md b/docs/src/content/docs/reference/examples.md index bda59bbd..43faa6d6 100644 --- a/docs/src/content/docs/reference/examples.md +++ b/docs/src/content/docs/reference/examples.md @@ -291,7 +291,11 @@ apm run feature-review --param feature="user-dashboard" **Scenario**: Quickly get new developers productive with company standards ```yaml -# .apm/context/company-standards.context.md +# .apm/instructions/company-standards.instructions.md +--- +description: Development standards for all AcmeCorp projects +applyTo: "**/*" +--- # Development Standards at AcmeCorp diff --git a/docs/src/content/docs/reference/lockfile-spec.md b/docs/src/content/docs/reference/lockfile-spec.md index 987cf76b..4a54f697 100644 --- a/docs/src/content/docs/reference/lockfile-spec.md +++ b/docs/src/content/docs/reference/lockfile-spec.md @@ -121,6 +121,8 @@ fields: | `depth` | integer | MUST | Dependency depth. `1` = direct dependency, `2`+ = transitive. | | `resolved_by` | string | MAY | `repo_url` of the parent that introduced this transitive dependency. Present only when `depth >= 2`. | | `package_type` | string | MUST | Package type: `apm_package`, `plugin`, `virtual`, or other registered types. | +| `content_hash` | string | MAY | SHA-256 hash of the package file tree, in the format `"sha256:<hex>"`. Used to verify cached packages on subsequent installs. Omitted for local path dependencies. See [section 4.4](#44-content-integrity). | +| `is_dev` | boolean | MAY | `true` if the dependency was resolved through [`devDependencies`](../manifest-schema/#5-devdependencies). Omitted when `false`. Dev deps are excluded from `apm pack --format plugin` bundles. | | `deployed_files` | array of strings | MUST | Every file path APM deployed for this dependency, relative to project root. | | `source` | string | MAY | Dependency source. `"local"` for local path dependencies. Omitted for remote (git) dependencies.
| | `local_path` | string | MAY | Filesystem path (relative or absolute) to the local package. Present only when `source` is `"local"`. | @@ -128,6 +130,8 @@ fields: Fields with empty or default values (empty strings, `false` booleans, empty lists) SHOULD be omitted from the serialized output to keep the file concise. +**Dev dependency tracking:** Packages installed via `apm install --dev` are marked with `is_dev: true`. When building plugin bundles (`apm pack --format plugin`), dev dependencies are excluded from the output. Resolvers and CI tools should respect this flag when producing distributable artifacts. + ### 4.3 Unique Key Each dependency is uniquely identified by its `repo_url`, or by the @@ -136,6 +140,26 @@ For local path dependencies (`source: "local"`), the unique key is the `local_path` value. A conforming lock file MUST NOT contain duplicate entries for the same key. +### 4.4 Content Integrity + +APM computes a SHA-256 hash of each package's file tree after download and stores +it as `content_hash` in the lock file. On subsequent installs, cached packages are +verified against this hash. A mismatch triggers a warning and re-download. + +The hash covers all regular files sorted by POSIX path (deterministic regardless of +filesystem ordering). `.git/` and `__pycache__/` directories are excluded. + +```yaml +dependencies: + - repo_url: https://github.com/acme-corp/security-baseline + resolved_commit: a1b2c3d4e5f6789012345678901234567890abcd + content_hash: "sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" + # ... +``` + +Lock files generated before this feature omit `content_hash`. APM handles this +gracefully — verification is skipped and the hash is populated on the next install. + ## 5. 
Path Conventions All paths in `deployed_files` MUST use forward slashes (POSIX format), @@ -247,6 +271,7 @@ dependencies: version: "2.1.0" depth: 1 package_type: apm_package + content_hash: "sha256:9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08" deployed_files: - .github/instructions/security.instructions.md - .github/agents/security-auditor.agent.md @@ -258,6 +283,7 @@ dependencies: depth: 2 resolved_by: https://github.com/acme-corp/security-baseline package_type: apm_package + content_hash: "sha256:d7a8fbb307d7809469ca9abcb0082e4f8d5651e46d3cdb762d02d0bf37c9e592" deployed_files: - .github/instructions/common-guidelines.instructions.md @@ -273,6 +299,16 @@ dependencies: deployed_files: - .github/instructions/linter.instructions.md + - repo_url: https://github.com/acme-corp/test-helpers + resolved_commit: abcdef1234567890abcdef1234567890abcdef12 + resolved_ref: main + depth: 1 + package_type: apm_package + is_dev: true + content_hash: "sha256:4a44dc15364204a80fe80e9039455cc1608281820fe2b24f1e5233ade6af1dd5" + deployed_files: + - .github/instructions/test-helpers.instructions.md + mcp_servers: - security-scanner diff --git a/docs/src/content/docs/reference/manifest-schema.md b/docs/src/content/docs/reference/manifest-schema.md index 78d37f98..5a46c6ce 100644 --- a/docs/src/content/docs/reference/manifest-schema.md +++ b/docs/src/content/docs/reference/manifest-schema.md @@ -51,6 +51,9 @@ scripts: > dependencies: apm: > mcp: > +devDependencies: + apm: > + mcp: > compilation: ``` @@ -319,7 +322,32 @@ dependencies: --- -## 5. Compilation +## 5. devDependencies + +| | | +|---|---| +| **Type** | `object` | +| **Required** | OPTIONAL | +| **Known keys** | `apm`, `mcp` | + +Development-only dependencies installed locally but excluded from plugin bundles (`apm pack --format plugin`). Uses the same structure as [`dependencies`](#4-dependencies). 
+ +```yaml +devDependencies: + apm: + - owner/test-helpers + - owner/lint-rules#v2.0.0 +``` + +Created automatically by `apm init --plugin`. Use [`apm install --dev`](../cli-commands/#apm-install---install-apm-and-mcp-dependencies) to add packages: + +```bash +apm install --dev owner/test-helpers +``` + +--- + +## 6. Compilation The `compilation` key is OPTIONAL. It controls `apm compile` behaviour. All fields have sensible defaults; omitting the entire section is valid. @@ -333,9 +361,9 @@ The `compilation` key is OPTIONAL. It controls `apm compile` behaviour. All fiel | `resolve_links` | `bool` | `true` | | Resolve relative Markdown links in primitives. | | `source_attribution` | `bool` | `true` | | Include source-file origin comments in compiled output. | | `exclude` | `list` or `string` | `[]` | Glob patterns | Directories to skip during compilation (e.g. `apm_modules/**`). | -| `placement` | `object` | — | | Placement tuning. See §5.1. | +| `placement` | `object` | — | | Placement tuning. See §6.1. | -### 5.1. `compilation.placement` +### 6.1. `compilation.placement` | Field | Type | Default | Description | |---|---|---|---| @@ -355,11 +383,11 @@ compilation: --- -## 6. Lockfile (`apm.lock.yaml`) +## 7. Lockfile (`apm.lock.yaml`) After successful dependency resolution, a conforming resolver MUST write a lockfile capturing the exact resolved state. The lockfile MUST be a YAML file named `apm.lock.yaml` at the project root. It SHOULD be committed to version control. -### 6.1. Structure +### 7.1. Structure ```yaml lockfile_version: "1" @@ -376,11 +404,13 @@ dependencies: # YAML list (not a map) depth: # 1 = direct, 2+ = transitive resolved_by: # Parent dependency (transitive only) package_type: # Package type (e.g. "apm_package", "marketplace_plugin") + content_hash: # SHA-256 of package file tree (e.g. 
"sha256:a1b2c3...") + is_dev: # True for devDependencies deployed_files: > # Workspace-relative paths of installed files mcp_servers: > # MCP dependency references managed by APM (OPTIONAL, e.g. "io.github.github/github-mcp-server") ``` -### 6.2. Resolver Behaviour +### 7.2. Resolver Behaviour 1. **First install** — Resolve all dependencies, write `apm.lock.yaml`. 2. **Subsequent installs** — Read `apm.lock.yaml`, use locked commit SHAs. A resolver SHOULD skip download if local checkout already matches. @@ -388,7 +418,7 @@ mcp_servers: > # MCP dependency references managed b --- -## 7. Integrator Contract +## 8. Integrator Contract Any runtime adopting this format (e.g. GitHub Agentic Workflows, CI systems, IDEs) MUST implement these steps: @@ -396,7 +426,7 @@ Any runtime adopting this format (e.g. GitHub Agentic Workflows, CI systems, IDE 2. **Resolve `dependencies.apm`** — For each entry, clone/fetch the git repo (respecting `ref`), locate the `.apm/` directory (or virtual path), and extract primitives. 3. **Resolve `dependencies.mcp`** — For each entry, resolve from the MCP registry or validate self-defined transport config per §4.2.3. 4. **Transitive resolution** — Resolved packages MAY contain their own `apm.yml` with further dependencies, forming a dependency tree. Resolvers MUST resolve transitively. Conflicts are merged at instruction level (by `applyTo` pattern), not file level. -5. **Write lockfile** — Record exact commit SHAs and deployed file paths in `apm.lock.yaml` per §6. +5. **Write lockfile** — Record exact commit SHAs and deployed file paths in `apm.lock.yaml` per §7. 
--- @@ -431,6 +461,10 @@ dependencies: env: API_KEY: ${{ secrets.KEY }} +devDependencies: + apm: + - owner/test-helpers + compilation: target: all strategy: distributed diff --git a/src/apm_cli/bundle/__init__.py b/src/apm_cli/bundle/__init__.py index 31dc0988..8cba3e1f 100644 --- a/src/apm_cli/bundle/__init__.py +++ b/src/apm_cli/bundle/__init__.py @@ -1,6 +1,13 @@ """Bundle creation and consumption for APM packages.""" from .packer import pack_bundle, PackResult +from .plugin_exporter import export_plugin_bundle from .unpacker import unpack_bundle, UnpackResult -__all__ = ["pack_bundle", "PackResult", "unpack_bundle", "UnpackResult"] +__all__ = [ + "pack_bundle", + "PackResult", + "export_plugin_bundle", + "unpack_bundle", + "UnpackResult", +] diff --git a/src/apm_cli/bundle/packer.py b/src/apm_cli/bundle/packer.py index d1860cb4..ff7d75f9 100644 --- a/src/apm_cli/bundle/packer.py +++ b/src/apm_cli/bundle/packer.py @@ -46,6 +46,7 @@ def pack_bundle( target: Optional[str] = None, archive: bool = False, dry_run: bool = False, + force: bool = False, ) -> PackResult: """Create a self-contained bundle from installed APM dependencies. @@ -57,6 +58,7 @@ (auto-detect from apm.yml / project structure). archive: If *True*, produce a ``.tar.gz`` and remove the directory. dry_run: If *True*, resolve the file list but write nothing to disk. + force: On collision (plugin format), last writer wins. Returns: :class:`PackResult` describing what was (or would be) produced. @@ -67,6 +69,20 @@ """ # 1.
Read lockfile (migrate legacy apm.lock → apm.lock.yaml if needed) migrate_lockfile_if_needed(project_root) + + # Plugin format: delegate to dedicated exporter + if fmt == "plugin": + from .plugin_exporter import export_plugin_bundle + + return export_plugin_bundle( + project_root=project_root, + output_dir=output_dir, + target=target, + archive=archive, + dry_run=dry_run, + force=force, + ) + lockfile_path = get_lockfile_path(project_root) lockfile = LockFile.read(lockfile_path) if lockfile is None: diff --git a/src/apm_cli/bundle/plugin_exporter.py b/src/apm_cli/bundle/plugin_exporter.py new file mode 100644 index 00000000..8d1c9ee2 --- /dev/null +++ b/src/apm_cli/bundle/plugin_exporter.py @@ -0,0 +1,639 @@ +"""Plugin exporter -- transforms APM packages into plugin-native directories. + +Produces a standalone plugin directory that Copilot CLI, Claude Code, or other +plugin hosts can consume directly. The output contains no APM-specific files +(no ``apm.yml``, ``apm_modules/``, ``.apm/``, or ``apm.lock.yaml``). +""" + +import json +import os +import re +import shutil +import tarfile +from pathlib import Path +from typing import Dict, List, Optional, Set, Tuple + +import yaml + +from ..deps.lockfile import ( + LockFile, + LockedDependency, + get_lockfile_path, + migrate_lockfile_if_needed, +) +from ..models.apm_package import APMPackage, DependencyReference +from ..utils.console import _rich_info, _rich_warning +from ..utils.path_security import PathTraversalError, ensure_path_within, safe_rmtree +from .packer import PackResult + +# --------------------------------------------------------------------------- +# Path helpers +# --------------------------------------------------------------------------- + + +def _validate_output_rel(rel: str) -> bool: + """Return True when *rel* is safe to write inside the output directory.""" + p = Path(rel) + return not p.is_absolute() and ".."
not in p.parts + + +_SAFE_BUNDLE_NAME_RE = re.compile(r"[^a-zA-Z0-9._-]") + + +def _sanitize_bundle_name(name: str) -> str: + """Sanitize a package name/version for use as a directory component. + + Replaces path separators and traversal characters with hyphens, then + validates the result is a single safe path component. + """ + sanitized = _SAFE_BUNDLE_NAME_RE.sub("-", name).strip("-") or "unnamed" + if ".." in sanitized or "/" in sanitized or "\\" in sanitized: + sanitized = "unnamed" + return sanitized + + +def _rename_prompt(name: str) -> str: + """Strip the ``.prompt`` infix so ``foo.prompt.md`` becomes ``foo.md``.""" + if name.endswith(".prompt.md"): + return name[: -len(".prompt.md")] + ".md" + return name + + +# --------------------------------------------------------------------------- +# Component collectors +# --------------------------------------------------------------------------- + + +def _collect_apm_components(apm_dir: Path) -> List[Tuple[Path, str]]: + """Collect all components from a package's ``.apm/`` directory. + + Returns a list of ``(source_abs, output_rel_posix)`` tuples using the + APM → plugin mapping table. + """ + components: List[Tuple[Path, str]] = [] + if not apm_dir.is_dir(): + return components + + # agents/ -> agents/ + _collect_flat(apm_dir / "agents", "agents", components) + + # skills/ -> skills/ (preserve sub-directory structure) + _collect_recursive(apm_dir / "skills", "skills", components) + + # prompts/ -> commands/ (rename .prompt.md -> .md) + _collect_recursive( + apm_dir / "prompts", "commands", components, rename=_rename_prompt + ) + + # instructions/ -> instructions/ + _collect_recursive(apm_dir / "instructions", "instructions", components) + + # commands/ -> commands/ + _collect_recursive(apm_dir / "commands", "commands", components) + + return components + + +def _collect_root_plugin_components(project_root: Path) -> List[Tuple[Path, str]]: + """Collect plugin-native components authored at root level. 
+ + Packages that already follow the plugin directory convention (``agents/``, + ``skills/``, etc. at the repo root) have their files picked up here. + """ + components: List[Tuple[Path, str]] = [] + for dir_name in ("agents", "skills", "commands", "instructions"): + _collect_recursive(project_root / dir_name, dir_name, components) + return components + + +def _collect_bare_skill( + install_path: Path, + dep: "LockedDependency", + out: List[Tuple[Path, str]], +) -> None: + """Detect a bare Claude skill (SKILL.md at dep root, no skills/ subdir). + + Bare skills are packages consisting of just ``SKILL.md`` + supporting files + at the package root. They have no ``.apm/`` directory or ``skills/`` + subdirectory, so the normal collectors miss them. Map the entire package + into ``skills/{name}/`` so the plugin host can discover it. + """ + skill_md = install_path / "SKILL.md" + if not skill_md.is_file(): + return + # Already collected via .apm/skills/ or root skills/ — skip + if any(rel.startswith("skills/") for _, rel in out): + return + # Derive a slug: prefer virtual_path (e.g. "frontend-design"), else last + # segment of repo_url (e.g. 
"my-skill" from "owner/my-skill") + slug = (getattr(dep, "virtual_path", "") or "").strip("/") + if not slug: + slug = dep.repo_url.rsplit("/", 1)[-1] if dep.repo_url else "skill" + for f in sorted(install_path.iterdir()): + if f.is_file() and not f.is_symlink() and f.name not in ( + "apm.yml", "apm.lock.yaml", "plugin.json", + ): + out.append((f, f"skills/{slug}/{f.name}")) + + +# -- low-level walkers ------------------------------------------------------- + + +def _collect_flat( + src_dir: Path, + output_prefix: str, + out: List[Tuple[Path, str]], + *, + rename=None, +) -> None: + """Add every regular non-symlink file directly inside *src_dir*.""" + if not src_dir.is_dir(): + return + for f in sorted(src_dir.iterdir()): + if f.is_file() and not f.is_symlink(): + name = rename(f.name) if rename else f.name + out.append((f, f"{output_prefix}/{name}")) + + +def _collect_recursive( + src_dir: Path, + output_prefix: str, + out: List[Tuple[Path, str]], + *, + rename=None, +) -> None: + """Add every regular non-symlink file under *src_dir*, preserving hierarchy.""" + if not src_dir.is_dir(): + return + for f in sorted(src_dir.rglob("*")): + if not f.is_file() or f.is_symlink(): + continue + rel = f.relative_to(src_dir) + name = rename(rel.name) if rename else rel.name + out_rel = (rel.parent / name).as_posix() + out.append((f, f"{output_prefix}/{out_rel}")) + + +# --------------------------------------------------------------------------- +# Hooks / MCP merging +# --------------------------------------------------------------------------- + + +_MAX_MERGE_DEPTH = 20 + + +def _deep_merge( + base: dict, overlay: dict, *, overwrite: bool = False, _depth: int = 0 +) -> None: + """Recursively merge *overlay* into *base*. + + When *overwrite* is False (default), existing base keys win. + When *overwrite* is True, overlay keys overwrite base keys. + + Raises ``ValueError`` if nesting exceeds ``_MAX_MERGE_DEPTH``. 
+ """ + if _depth > _MAX_MERGE_DEPTH: + raise ValueError( + f"Hooks/MCP config exceeds maximum nesting depth ({_MAX_MERGE_DEPTH})" + ) + for key, value in overlay.items(): + if key not in base: + base[key] = value + elif overwrite: + if isinstance(base[key], dict) and isinstance(value, dict): + _deep_merge(base[key], value, overwrite=True, _depth=_depth + 1) + else: + base[key] = value + else: + if isinstance(base[key], dict) and isinstance(value, dict): + _deep_merge(base[key], value, overwrite=False, _depth=_depth + 1) + + +def _collect_hooks_from_apm(apm_dir: Path) -> dict: + """Return merged hooks from ``.apm/hooks/*.json``.""" + hooks: dict = {} + hooks_dir = apm_dir / "hooks" + if not hooks_dir.is_dir(): + return hooks + for f in sorted(hooks_dir.iterdir()): + if f.is_file() and f.suffix == ".json" and not f.is_symlink(): + try: + data = json.loads(f.read_text(encoding="utf-8")) + if isinstance(data, dict): + _deep_merge(hooks, data, overwrite=False) + except (json.JSONDecodeError, OSError): + pass + return hooks + + +def _collect_hooks_from_root(package_root: Path) -> dict: + """Return hooks from a root-level ``hooks.json`` or ``hooks/`` directory.""" + hooks: dict = {} + # Single file + hooks_file = package_root / "hooks.json" + if hooks_file.is_file() and not hooks_file.is_symlink(): + try: + data = json.loads(hooks_file.read_text(encoding="utf-8")) + if isinstance(data, dict): + _deep_merge(hooks, data, overwrite=False) + except (json.JSONDecodeError, OSError): + pass + # Directory + hooks_dir = package_root / "hooks" + if hooks_dir.is_dir(): + for f in sorted(hooks_dir.iterdir()): + if f.is_file() and f.suffix == ".json" and not f.is_symlink(): + try: + data = json.loads(f.read_text(encoding="utf-8")) + if isinstance(data, dict): + _deep_merge(hooks, data, overwrite=False) + except (json.JSONDecodeError, OSError): + pass + return hooks + + +def _collect_mcp(package_root: Path) -> dict: + """Return ``mcpServers`` dict from ``.mcp.json``.""" + mcp_file = 
package_root / ".mcp.json" + if not mcp_file.is_file() or mcp_file.is_symlink(): + return {} + try: + data = json.loads(mcp_file.read_text(encoding="utf-8")) + if isinstance(data, dict): + servers = data.get("mcpServers", {}) + return dict(servers) if isinstance(servers, dict) else {} + except (json.JSONDecodeError, OSError): + pass + return {} + + +# --------------------------------------------------------------------------- +# devDependencies filtering +# --------------------------------------------------------------------------- + + +def _get_dev_dependency_urls(apm_yml_path: Path) -> Set[Tuple[str, str]]: + """Read ``devDependencies.apm`` from raw YAML and return a set of + ``(repo_url, virtual_path)`` tuples for matching against lockfile entries. + + Using the composite key avoids false positives when multiple virtual + packages share the same base repo (e.g. different sub-paths under + ``github/awesome-copilot``). + """ + try: + data = yaml.safe_load(apm_yml_path.read_text(encoding="utf-8")) + except (yaml.YAMLError, OSError, ValueError): + return set() + if not isinstance(data, dict): + return set() + dev_deps = data.get("devDependencies", {}) + if not isinstance(dev_deps, dict): + return set() + apm_dev = dev_deps.get("apm", []) + if not isinstance(apm_dev, list): + return set() + keys: Set[Tuple[str, str]] = set() + for dep in apm_dev: + if isinstance(dep, str): + try: + ref = DependencyReference.parse(dep) + keys.add((ref.repo_url, ref.virtual_path or "")) + except ValueError: + keys.add((dep, "")) + elif isinstance(dep, dict): + try: + ref = DependencyReference.parse_from_dict(dep) + keys.add((ref.repo_url, ref.virtual_path or "")) + except ValueError: + pass + return keys + + +# --------------------------------------------------------------------------- +# Plugin.json helpers +# --------------------------------------------------------------------------- + + +def _find_or_synthesize_plugin_json( + project_root: Path, apm_yml_path: Path +) -> dict: + 
"""Locate an existing ``plugin.json`` or synthesise one from ``apm.yml``.""" + from ..deps.plugin_parser import synthesize_plugin_json_from_apm_yml + from ..utils.helpers import find_plugin_json + + plugin_json_path = find_plugin_json(project_root) + if plugin_json_path is not None: + try: + return json.loads(plugin_json_path.read_text(encoding="utf-8")) + except (json.JSONDecodeError, OSError) as exc: + _rich_warning( + f"Found plugin.json at {plugin_json_path} but could not parse it: {exc}. " + "Falling back to synthesis from apm.yml." + ) + + else: + _rich_warning( + "No plugin.json found. Synthesizing from apm.yml. " + "Consider running 'apm init --plugin'." + ) + return synthesize_plugin_json_from_apm_yml(apm_yml_path) + + +def _update_plugin_json_paths(plugin_json: dict, output_files: List[str]) -> dict: + """Update component paths in ``plugin.json`` to reflect the output layout.""" + result = dict(plugin_json) + + # Detect which top-level directories actually exist in the output + top_dirs: Set[str] = set() + for f in output_files: + parts = Path(f).parts + if parts: + top_dirs.add(parts[0]) + + # Map component keys to their output directories + component_dirs = { + "agents": "agents", + "skills": "skills", + "commands": "commands", + "instructions": "instructions", + } + for key, dirname in component_dirs.items(): + if dirname in top_dirs: + result[key] = [f"{dirname}/"] + else: + result.pop(key, None) + + return result + + +# --------------------------------------------------------------------------- +# Dep → filesystem helpers +# --------------------------------------------------------------------------- + + +def _dep_install_path(dep: LockedDependency, apm_modules_dir: Path) -> Path: + """Compute the filesystem install path for a locked dependency.""" + dep_ref = DependencyReference( + repo_url=dep.repo_url, + host=dep.host, + virtual_path=dep.virtual_path, + is_virtual=dep.is_virtual, + is_local=(dep.source == "local"), + local_path=dep.local_path, + ) 
+ return dep_ref.get_install_path(apm_modules_dir) + + +# --------------------------------------------------------------------------- +# Main exporter +# --------------------------------------------------------------------------- + + +def export_plugin_bundle( + project_root: Path, + output_dir: Path, + target: Optional[str] = None, + archive: bool = False, + dry_run: bool = False, + force: bool = False, +) -> PackResult: + """Export the project as a plugin-native directory. + + The output contains only plugin-spec artefacts (``agents/``, ``skills/``, + ``commands/``, ``plugin.json``, …) with no APM-specific files. + + Args: + project_root: Root of the project containing ``apm.yml``. + output_dir: Parent directory for the generated bundle. + target: Unused for plugin format (reserved for future use). + archive: If True, produce a ``.tar.gz`` and remove the directory. + dry_run: If True, resolve the file list without writing to disk. + force: On collision, last writer wins instead of first. + + Returns: + :class:`PackResult` describing what was produced. + """ + # 1. Read lockfile + migrate_lockfile_if_needed(project_root) + lockfile_path = get_lockfile_path(project_root) + lockfile = LockFile.read(lockfile_path) + + # 2. Read apm.yml + apm_yml_path = project_root / "apm.yml" + package = APMPackage.from_apm_yml(apm_yml_path) + pkg_name = package.name + pkg_version = package.version or "0.0.0" + + # Guard: reject local-path dependencies (non-portable) + for dep_ref in package.get_apm_dependencies(): + if dep_ref.is_local: + raise ValueError( + f"Cannot pack — apm.yml contains local path dependency: " + f"{dep_ref.local_path}\n" + f"Local dependencies are for development only. Replace them with " + f"remote references (e.g., 'owner/repo') before packing." + ) + + # 3. Find or synthesize plugin.json + plugin_json = _find_or_synthesize_plugin_json(project_root, apm_yml_path) + + # 4. 
devDependencies filtering + dev_dep_urls = _get_dev_dependency_urls(apm_yml_path) + + # 5. Collect components -- deps first (lockfile order), then root package + # file_map: output_rel_posix -> (source_abs, owner_name) + file_map: Dict[str, Tuple[Path, str]] = {} + collisions: List[str] = [] + merged_hooks: dict = {} + merged_mcp: dict = {} + + apm_modules_dir = project_root / "apm_modules" + + if lockfile: + for dep in lockfile.get_all_dependencies(): + # Prefer lockfile is_dev flag (covers transitive deps); + # fall back to apm.yml URL matching for older lockfiles + if getattr(dep, "is_dev", False) or ( + dep.repo_url, getattr(dep, "virtual_path", "") or "" + ) in dev_dep_urls: + continue + + install_path = _dep_install_path(dep, apm_modules_dir) + if not install_path.is_dir(): + continue + + dep_name = dep.repo_url + + # Collect from .apm/ + dep_apm_dir = install_path / ".apm" + dep_components = _collect_apm_components(dep_apm_dir) + + # Also collect root-level plugin-native dirs from the dep + dep_components.extend(_collect_root_plugin_components(install_path)) + + # Bare Claude skills: SKILL.md at dep root with no skills/ subdir + _collect_bare_skill(install_path, dep, dep_components) + + _merge_file_map( + file_map, dep_components, dep_name, force, collisions + ) + + # Hooks -- deps merge (first wins among deps) + dep_hooks = _collect_hooks_from_apm(dep_apm_dir) + dep_hooks_root = _collect_hooks_from_root(install_path) + _deep_merge(dep_hooks, dep_hooks_root, overwrite=False) + _deep_merge(merged_hooks, dep_hooks, overwrite=False) + + # MCP -- deps merge (first wins among deps) + dep_mcp = _collect_mcp(install_path) + _deep_merge(merged_mcp, dep_mcp, overwrite=False) + + # 6. 
Collect own components (.apm/ and root-level)
+    own_apm_dir = project_root / ".apm"
+    own_components = _collect_apm_components(own_apm_dir)
+    own_components.extend(_collect_root_plugin_components(project_root))
+    _merge_file_map(file_map, own_components, pkg_name, force, collisions)
+
+    # Hooks -- root package wins on key collision
+    root_hooks = _collect_hooks_from_apm(own_apm_dir)
+    root_hooks_top = _collect_hooks_from_root(project_root)
+    _deep_merge(root_hooks, root_hooks_top, overwrite=False)
+    _deep_merge(merged_hooks, root_hooks, overwrite=True)
+
+    # MCP -- root package wins on server-name collision
+    root_mcp = _collect_mcp(project_root)
+    _deep_merge(merged_mcp, root_mcp, overwrite=True)
+
+    # 7. Emit collision warnings
+    for msg in collisions:
+        _rich_warning(msg)
+
+    # 8. Build output file list (sorted for determinism)
+    output_files = sorted(file_map.keys())
+
+    # Add generated files to the list
+    if merged_hooks:
+        output_files.append("hooks.json")
+    if merged_mcp:
+        output_files.append(".mcp.json")
+    output_files.append("plugin.json")
+
+    # 9. Dry run -- return file list without writing
+    safe_name = _sanitize_bundle_name(pkg_name)
+    safe_version = _sanitize_bundle_name(pkg_version)
+    bundle_dir = output_dir / f"{safe_name}-{safe_version}"
+    ensure_path_within(bundle_dir, output_dir)
+    if dry_run:
+        return PackResult(bundle_path=bundle_dir, files=output_files)
+
+    # 10. Security scan (warn-only, never blocks)
+    from ..security.gate import WARN_POLICY, SecurityGate
+
+    scan_findings_total = 0
+    for _rel, (src, _owner) in file_map.items():
+        if src.is_symlink():
+            continue
+        if src.is_dir():
+            verdict = SecurityGate.scan_files(src, policy=WARN_POLICY)
+            scan_findings_total += len(verdict.all_findings)
+        elif src.is_file():
+            try:
+                text = src.read_text(encoding="utf-8", errors="replace")
+            except OSError:
+                continue
+            verdict = SecurityGate.scan_text(text, str(src), policy=WARN_POLICY)
+            scan_findings_total += len(verdict.all_findings)
+    if scan_findings_total:
+        _rich_warning(
+            f"Bundle contains {scan_findings_total} hidden character(s) across "
+            f"source files — run 'apm audit' to inspect before publishing"
+        )
+
+    # 11. Write files to output directory (clean slate to prevent symlink attacks)
+    if bundle_dir.exists():
+        safe_rmtree(bundle_dir, output_dir)
+    bundle_dir.mkdir(parents=True, exist_ok=True)
+
+    for output_rel, (source_abs, _owner) in file_map.items():
+        if not _validate_output_rel(output_rel):
+            continue
+        dest = bundle_dir / output_rel
+        if source_abs.is_symlink():
+            continue
+        dest.parent.mkdir(parents=True, exist_ok=True)
+        try:
+            ensure_path_within(dest, bundle_dir)
+        except PathTraversalError:
+            continue
+        shutil.copy2(source_abs, dest, follow_symlinks=False)
+
+    # 12. Write merged hooks.json
+    if merged_hooks:
+        (bundle_dir / "hooks.json").write_text(
+            json.dumps(merged_hooks, indent=2, sort_keys=True), encoding="utf-8"
+        )
+
+    # 13. Write merged .mcp.json
+    if merged_mcp:
+        (bundle_dir / ".mcp.json").write_text(
+            json.dumps({"mcpServers": merged_mcp}, indent=2, sort_keys=True),
+            encoding="utf-8",
+        )
+
+    # 14. Write plugin.json with updated component paths
+    plugin_json = _update_plugin_json_paths(plugin_json, output_files)
+    (bundle_dir / "plugin.json").write_text(
+        json.dumps(plugin_json, indent=2, sort_keys=False), encoding="utf-8"
+    )
+
+    result = PackResult(bundle_path=bundle_dir, files=output_files)
+
+    # 15. Archive if requested
+    if archive:
+        archive_path = output_dir / f"{bundle_dir.name}.tar.gz"
+        ensure_path_within(archive_path, output_dir)
+        with tarfile.open(archive_path, "w:gz") as tar:
+
+            def _tar_filter(info: tarfile.TarInfo) -> Optional[tarfile.TarInfo]:
+                if info.issym() or info.islnk():
+                    return None  # reject symlinks injected after write
+                return info
+
+            tar.add(bundle_dir, arcname=bundle_dir.name, filter=_tar_filter)
+        shutil.rmtree(bundle_dir)
+        result.bundle_path = archive_path
+
+    return result
+
+
+# ---------------------------------------------------------------------------
+# Collision handling
+# ---------------------------------------------------------------------------
+
+
+def _merge_file_map(
+    file_map: Dict[str, Tuple[Path, str]],
+    components: List[Tuple[Path, str]],
+    owner: str,
+    force: bool,
+    collisions: List[str],
+) -> None:
+    """Merge *components* into *file_map* with collision handling.
+
+    Without ``--force``: first writer wins (skip with warning).
+    With ``--force``: last writer wins (overwrite with warning).
+ """ + for source, output_rel in components: + if not _validate_output_rel(output_rel): + continue + if output_rel in file_map: + existing_owner = file_map[output_rel][1] + collisions.append( + f"{output_rel} — collision between '{existing_owner}' and " + f"'{owner}' ({'last writer wins' if force else 'first writer wins'})" + ) + if force: + file_map[output_rel] = (source, owner) + # else: first writer wins, skip + else: + file_map[output_rel] = (source, owner) diff --git a/src/apm_cli/commands/_helpers.py b/src/apm_cli/commands/_helpers.py index 54fd26a6..bc0c886a 100644 --- a/src/apm_cli/commands/_helpers.py +++ b/src/apm_cli/commands/_helpers.py @@ -394,8 +394,43 @@ def _get_default_config(project_name): } -def _create_minimal_apm_yml(config): - """Create minimal apm.yml file with auto-detected metadata.""" +def _validate_plugin_name(name): + """Validate plugin name is kebab-case (lowercase, numbers, hyphens). + + Returns True if valid, False otherwise. + """ + import re + + return bool(re.match(r"^[a-z][a-z0-9-]{0,63}$", name)) + + +def _create_plugin_json(config): + """Create plugin.json file with package metadata. + + Args: + config: dict with name, version, description, author keys. + """ + import json + + plugin_data = { + "name": config["name"], + "version": config.get("version", "0.1.0"), + "description": config.get("description", ""), + "author": {"name": config.get("author", "")}, + "license": "MIT", + } + + with open("plugin.json", "w", encoding="utf-8") as f: + f.write(json.dumps(plugin_data, indent=2) + "\n") + + +def _create_minimal_apm_yml(config, plugin=False): + """Create minimal apm.yml file with auto-detected metadata. + + Args: + config: dict with name, version, description, author keys. + plugin: if True, include a devDependencies section. 
+ """ yaml = _lazy_yaml() # Create minimal apm.yml structure @@ -405,9 +440,13 @@ def _create_minimal_apm_yml(config): "description": config["description"], "author": config["author"], "dependencies": {"apm": [], "mcp": []}, - "scripts": {}, } + if plugin: + apm_yml_data["devDependencies"] = {"apm": []} + + apm_yml_data["scripts"] = {} + # Write apm.yml with open(APM_YML_FILENAME, "w") as f: yaml.safe_dump(apm_yml_data, f, default_flow_style=False, sort_keys=False) diff --git a/src/apm_cli/commands/init.py b/src/apm_cli/commands/init.py index e54021c9..e0394f76 100644 --- a/src/apm_cli/commands/init.py +++ b/src/apm_cli/commands/init.py @@ -20,10 +20,12 @@ INFO, RESET, _create_minimal_apm_yml, + _create_plugin_json, _get_console, _get_default_config, _lazy_confirm, _rich_blank_line, + _validate_plugin_name, ) @@ -32,11 +34,15 @@ @click.option( "--yes", "-y", is_flag=True, help="Skip interactive prompts and use auto-detected defaults" ) +@click.option( + "--plugin", is_flag=True, help="Initialize as plugin author (creates plugin.json + apm.yml)" +) @click.pass_context -def init(ctx, project_name, yes): +def init(ctx, project_name, yes, plugin): """Initialize a new APM project (like npm init). Creates a minimal apm.yml with auto-detected metadata. + With --plugin, also creates plugin.json for plugin authors. """ try: # Handle explicit current directory @@ -54,6 +60,15 @@ def init(ctx, project_name, yes): project_dir = Path.cwd() final_project_name = project_dir.name + # Validate plugin name early + if plugin and not _validate_plugin_name(final_project_name): + _rich_error( + f"Invalid plugin name '{final_project_name}'. " + "Must be kebab-case (lowercase letters, numbers, hyphens), " + "start with a letter, and be at most 64 characters." 
+            )
+            sys.exit(1)
+
         # Check for existing apm.yml
         apm_yml_exists = Path(APM_YML_FILENAME).exists()

@@ -84,10 +99,18 @@ def init(ctx, project_name, yes):
             # Use auto-detected defaults
             config = _get_default_config(final_project_name)

+        # Plugin mode uses 0.1.0 as default version
+        if plugin and yes:
+            config["version"] = "0.1.0"
+
         _rich_success(f"Initializing APM project: {config['name']}", symbol="rocket")

-        # Create minimal apm.yml
-        _create_minimal_apm_yml(config)
+        # Create apm.yml (with devDependencies for plugin mode)
+        _create_minimal_apm_yml(config, plugin=plugin)
+
+        # Create plugin.json for plugin mode
+        if plugin:
+            _create_plugin_json(config)

         _rich_success("APM project initialized successfully!", symbol="sparkles")

@@ -98,21 +121,31 @@ def init(ctx, project_name, yes):
             files_data = [
                 ("*", APM_YML_FILENAME, "Project configuration"),
             ]
+            if plugin:
+                files_data.append(("*", "plugin.json", "Plugin metadata"))
             table = _create_files_table(files_data, title="Created Files")
             console.print(table)
         except (ImportError, NameError):
             _rich_info("Created:")
             _rich_echo("  * apm.yml - Project configuration", style="muted")
+            if plugin:
+                _rich_echo("  * plugin.json - Plugin metadata", style="muted")

         _rich_blank_line()

         # Next steps - actionable commands matching README workflow
-        next_steps = [
-            "Install a runtime: apm runtime setup copilot",
-            "Add APM dependencies: apm install /",
-            "Compile agent context: apm compile",
-            "Run your first workflow: apm run start",
-        ]
+        if plugin:
+            next_steps = [
+                "Add dev dependencies: apm install --dev /",
+                "Pack as plugin: apm pack --format plugin",
+            ]
+        else:
+            next_steps = [
+                "Install a runtime: apm runtime setup copilot",
+                "Add APM dependencies: apm install /",
+                "Compile agent context: apm compile",
+                "Run your first workflow: apm run start",
+            ]

         try:
             _rich_panel(
diff --git a/src/apm_cli/commands/install.py b/src/apm_cli/commands/install.py
index f4de155b..501377c6 100644
--- a/src/apm_cli/commands/install.py
+++ b/src/apm_cli/commands/install.py
@@ -56,12 +56,17 @@
 # ---------------------------------------------------------------------------


-def _validate_and_add_packages_to_apm_yml(packages, dry_run=False):
+def _validate_and_add_packages_to_apm_yml(packages, dry_run=False, dev=False):
     """Validate packages exist and can be accessed, then add to apm.yml dependencies section.

     Implements normalize-on-write: any input form (HTTPS URL, SSH URL, FQDN, shorthand)
     is canonicalized before storage. Default host (github.com) is stripped; non-default
     hosts are preserved. Duplicates are detected by identity.
+
+    Args:
+        packages: Package specifiers to validate and add.
+        dry_run: If True, only show what would be added.
+        dev: If True, write to devDependencies instead of dependencies.
     """
     import subprocess
     import tempfile
@@ -80,12 +85,13 @@ def _validate_and_add_packages_to_apm_yml(packages, dry_run=False):
         sys.exit(1)

     # Ensure dependencies structure exists
-    if "dependencies" not in data:
-        data["dependencies"] = {}
-    if "apm" not in data["dependencies"]:
-        data["dependencies"]["apm"] = []
+    dep_section = "devDependencies" if dev else "dependencies"
+    if dep_section not in data:
+        data[dep_section] = {}
+    if "apm" not in data[dep_section]:
+        data[dep_section]["apm"] = []

-    current_deps = data["dependencies"]["apm"] or []
+    current_deps = data[dep_section]["apm"] or []
     validated_packages = []

     # Build identity set from existing deps for duplicate detection
@@ -151,12 +157,13 @@ def _validate_and_add_packages_to_apm_yml(packages, dry_run=False):
         return validated_packages

     # Add validated packages to dependencies (already canonical)
+    dep_label = "devDependencies" if dev else "apm.yml"
     for package in validated_packages:
         current_deps.append(package)
-        _rich_info(f"Added {package} to apm.yml")
+        _rich_info(f"Added {package} to {dep_label}")

     # Update dependencies
-    data["dependencies"]["apm"] = current_deps
+    data[dep_section]["apm"] = current_deps

     # Write back to apm.yml
     try:
@@ -191,8 +198,11 @@ def _validate_package_exists(package):
         local = local.resolve()
         if not local.is_dir():
             return False
-        # Must contain apm.yml or SKILL.md
-        return (local / "apm.yml").exists() or (local / "SKILL.md").exists()
+        # Must contain apm.yml, SKILL.md, or plugin.json
+        if (local / "apm.yml").exists() or (local / "SKILL.md").exists():
+            return True
+        from apm_cli.utils.helpers import find_plugin_json
+        return find_plugin_json(local) is not None

     # For virtual packages, use the downloader's validation method
     if dep_ref.is_virtual:
@@ -324,8 +334,14 @@
     show_default=True,
     help="Max concurrent package downloads (0 to disable parallelism)",
 )
+@click.option(
+    "--dev",
+    is_flag=True,
+    default=False,
+    help="Install as development dependency (devDependencies)",
+)
 @click.pass_context
-def install(ctx, packages, runtime, exclude, only, update, dry_run, force, verbose, trust_transitive_mcp, parallel_downloads):
+def install(ctx, packages, runtime, exclude, only, update, dry_run, force, verbose, trust_transitive_mcp, parallel_downloads, dev):
     """Install APM and MCP dependencies from apm.yml (like npm install).
     This command automatically detects AI runtimes from your apm.yml scripts and installs
@@ -367,7 +383,7 @@ def install(ctx, packages, runtime, exclude, only, update, dry_run, force, verbo
     # If packages are specified, validate and add them to apm.yml first
     if packages:
         validated_packages = _validate_and_add_packages_to_apm_yml(
-            packages, dry_run
+            packages, dry_run, dev=dev
         )
         # Note: Empty validated_packages is OK if packages are already in apm.yml
         # We'll proceed with installation from apm.yml to ensure everything is synced
@@ -383,6 +399,8 @@ def install(ctx, packages, runtime, exclude, only, update, dry_run, force, verbo
     # Get APM and MCP dependencies
     apm_deps = apm_package.get_apm_dependencies()
+    dev_apm_deps = apm_package.get_dev_apm_dependencies()
+    has_any_apm_deps = bool(apm_deps) or bool(dev_apm_deps)
     mcp_deps = apm_package.get_mcp_dependencies()

     # Convert --only string to InstallMode enum
@@ -412,7 +430,7 @@ def install(ctx, packages, runtime, exclude, only, update, dry_run, force, verbo
             for dep in mcp_deps:
                 _rich_info(f"  - {dep}")

-        if not apm_deps and not mcp_deps:
+        if not apm_deps and not dev_apm_deps and not mcp_deps:
             _rich_warning("No dependencies found in apm.yml")

         _rich_success("Dry run complete - no changes made")
@@ -439,7 +457,7 @@ def install(ctx, packages, runtime, exclude, only, update, dry_run, force, verbo
         old_mcp_configs = builtins.dict(_existing_lock.mcp_configs)

     apm_diagnostics = None
-    if should_install_apm and apm_deps:
+    if should_install_apm and has_any_apm_deps:
         if not APM_DEPS_AVAILABLE:
             _rich_error("APM dependency system not available")
             _rich_info(f"Import error: {_APM_IMPORT_ERROR}")
@@ -460,7 +478,7 @@ def install(ctx, packages, runtime, exclude, only, update, dry_run, force, verbo
         except Exception as e:
             _rich_error(f"Failed to install APM dependencies: {e}")
             sys.exit(1)
-    elif should_install_apm and not apm_deps:
+    elif should_install_apm and not has_any_apm_deps:
         _rich_info("No APM dependencies found in apm.yml")

     # When --update is used, package files on disk may have changed.
@@ -795,6 +813,7 @@ def _integrate_package_primitives(
     return result

+
 def _copy_local_package(dep_ref, install_path, project_root):
     """Copy a local package to apm_modules/.

@@ -817,9 +836,14 @@ def _copy_local_package(dep_ref, install_path, project_root):
     if not local.is_dir():
         _rich_error(f"Local package path does not exist: {dep_ref.local_path}")
         return None
-    if not (local / "apm.yml").exists() and not (local / "SKILL.md").exists():
+    from apm_cli.utils.helpers import find_plugin_json
+    if (
+        not (local / "apm.yml").exists()
+        and not (local / "SKILL.md").exists()
+        and find_plugin_json(local) is None
+    ):
         _rich_error(
-            f"Local package is not a valid APM package (no apm.yml or SKILL.md): {dep_ref.local_path}"
+            f"Local package is not a valid APM package (no apm.yml, SKILL.md, or plugin.json): {dep_ref.local_path}"
         )
         return None

@@ -857,10 +881,12 @@ def _install_apm_dependencies(
         raise RuntimeError("APM dependency system not available")

     apm_deps = apm_package.get_apm_dependencies()
-    if not apm_deps:
+    dev_apm_deps = apm_package.get_dev_apm_dependencies()
+    all_apm_deps = apm_deps + dev_apm_deps
+    if not all_apm_deps:
         return InstallResult()

-    _rich_info(f"Installing APM dependencies ({len(apm_deps)})...")
+    _rich_info(f"Installing APM dependencies ({len(all_apm_deps)})...")

     project_root = Path.cwd()

@@ -1077,9 +1103,11 @@ def _collect_descendants(node, visited=None):
     # Collect installed packages for lockfile generation
     from apm_cli.deps.lockfile import LockFile, LockedDependency, get_lockfile_path
-    installed_packages: List[tuple] = []  # List of (dep_ref, resolved_commit, depth, resolved_by)
+    from ..utils.content_hash import compute_package_hash as _compute_hash
+    installed_packages: List[tuple] = []  # List of (dep_ref, resolved_commit, depth, resolved_by, is_dev)
     package_deployed_files: builtins.dict = {}  # dep_key → list of relative deployed paths
     package_types: builtins.dict = {}  # dep_key → package type string
+    _package_hashes: builtins.dict = {}  # dep_key → sha256 hash (captured at download/verify time)

     # Build managed_files from existing lockfile for collision detection
     managed_files = builtins.set()
@@ -1262,25 +1290,24 @@ def _collect_descendants(node, visited=None):
                 )

             # Detect package type
-            has_skill = (install_path / "SKILL.md").exists()
-            has_apm = (install_path / "apm.yml").exists()
-            from apm_cli.utils.helpers import find_plugin_json
-            has_plugin = find_plugin_json(install_path) is not None
-            if has_plugin and not has_apm:
-                local_info.package_type = PackageType.MARKETPLACE_PLUGIN
-            elif has_skill and has_apm:
-                local_info.package_type = PackageType.HYBRID
-            elif has_skill:
-                local_info.package_type = PackageType.CLAUDE_SKILL
-            elif has_apm:
-                local_info.package_type = PackageType.APM_PACKAGE
+            from apm_cli.models.validation import detect_package_type
+            pkg_type, plugin_json_path = detect_package_type(install_path)
+            local_info.package_type = pkg_type
+            if pkg_type == PackageType.MARKETPLACE_PLUGIN:
+                # Normalize: synthesize .apm/ from plugin.json so
+                # integration can discover and deploy primitives
+                from apm_cli.deps.plugin_parser import normalize_plugin_directory
+                normalize_plugin_directory(install_path, plugin_json_path)

             # Record for lockfile
             node = dependency_graph.dependency_tree.get_node(dep_ref.get_unique_key())
             depth = node.depth if node else 1
             resolved_by = node.parent.dependency_ref.repo_url if node and node.parent else None
-            installed_packages.append((dep_ref, None, depth, resolved_by))
+            _is_dev = node.is_dev if node else False
+            installed_packages.append((dep_ref, None, depth, resolved_by, _is_dev))
             dep_key = dep_ref.get_unique_key()
+            if install_path.is_dir() and not dep_ref.is_local:
+                _package_hashes[dep_key] = _compute_hash(install_path)

             dep_deployed_files: builtins.list = []
             if hasattr(local_info, 'package_type') and local_info.package_type:
@@ -1380,6 +1407,17 @@ def _collect_descendants(node, visited=None):
                 (is_cacheable and not update_refs)
                 or already_resolved
                 or lockfile_match
             )

+            # Verify content integrity when lockfile has a hash
+            if skip_download and _dep_locked_chk and _dep_locked_chk.content_hash:
+                from ..utils.content_hash import verify_package_hash
+                if not verify_package_hash(install_path, _dep_locked_chk.content_hash):
+                    _rich_warning(
+                        f"Content hash mismatch for "
+                        f"{dep_ref.get_unique_key()} — re-downloading"
+                    )
+                    safe_rmtree(install_path, apm_modules_dir)
+                    skip_download = False
+
             if skip_download:
                 display_name = (
                     str(dep_ref) if dep_ref.is_virtual else dep_ref.repo_url
@@ -1449,23 +1487,15 @@ def _collect_descendants(node, visited=None):

                 # Detect package_type from disk contents so
                 # skill integration is not silently skipped
-                skill_md_exists = (install_path / SKILL_MD_FILENAME).exists()
-                apm_yml_exists = (install_path / APM_YML_FILENAME).exists()
-                from apm_cli.utils.helpers import find_plugin_json
-                plugin_json_exists = find_plugin_json(install_path) is not None
-                if plugin_json_exists and not apm_yml_exists:
-                    cached_package_info.package_type = PackageType.MARKETPLACE_PLUGIN
-                elif skill_md_exists and apm_yml_exists:
-                    cached_package_info.package_type = PackageType.HYBRID
-                elif skill_md_exists:
-                    cached_package_info.package_type = PackageType.CLAUDE_SKILL
-                elif apm_yml_exists:
-                    cached_package_info.package_type = PackageType.APM_PACKAGE
+                from apm_cli.models.validation import detect_package_type
+                pkg_type, _ = detect_package_type(install_path)
+                cached_package_info.package_type = pkg_type

                 # Collect for lockfile (cached packages still need to be tracked)
                 node = dependency_graph.dependency_tree.get_node(dep_ref.get_unique_key())
                 depth = node.depth if node else 1
                 resolved_by = node.parent.dependency_ref.repo_url if node and node.parent else None
+                _is_dev = node.is_dev if node else False
                 # Get commit SHA: callback capture > existing lockfile > explicit reference
                 dep_key = dep_ref.get_unique_key()
                 cached_commit = callback_downloaded.get(dep_key)
@@ -1475,8 +1505,9 @@ def _collect_descendants(node, visited=None):
                         cached_commit = locked_dep.resolved_commit
                 if not cached_commit:
                     cached_commit = dep_ref.reference
-                installed_packages.append((dep_ref, cached_commit, depth, resolved_by))
-
+                installed_packages.append((dep_ref, cached_commit, depth, resolved_by, _is_dev))
+                if install_path.is_dir():
+                    _package_hashes[dep_key] = _compute_hash(install_path)
                 # Track package type for lockfile
                 if hasattr(cached_package_info, 'package_type') and cached_package_info.package_type:
                     package_types[dep_key] = cached_package_info.package_type.value
@@ -1582,7 +1613,10 @@ def _collect_descendants(node, visited=None):
             node = dependency_graph.dependency_tree.get_node(dep_ref.get_unique_key())
             depth = node.depth if node else 1
             resolved_by = node.parent.dependency_ref.repo_url if node and node.parent else None
-            installed_packages.append((dep_ref, resolved_commit, depth, resolved_by))
+            _is_dev = node.is_dev if node else False
+            installed_packages.append((dep_ref, resolved_commit, depth, resolved_by, _is_dev))
+            if install_path.is_dir():
+                _package_hashes[dep_ref.get_unique_key()] = _compute_hash(install_path)

             # Track package type for lockfile
             if hasattr(package_info, 'package_type') and package_info.package_type:
@@ -1719,6 +1753,10 @@ def _collect_descendants(node, visited=None):
     for dep_key, pkg_type in package_types.items():
         if dep_key in lockfile.dependencies:
             lockfile.dependencies[dep_key].package_type = pkg_type
+    # Attach content hashes captured at download/verify time
+    for dep_key, locked_dep in lockfile.dependencies.items():
+        if dep_key in _package_hashes:
+            locked_dep.content_hash = _package_hashes[dep_key]

     # Selectively merge entries from the existing lockfile:
     # - For partial installs (only_packages): preserve all old entries
     #   (sequential install — only the specified package was processed).
diff --git a/src/apm_cli/commands/pack.py b/src/apm_cli/commands/pack.py
index c9112759..ca5e6ab6 100644
--- a/src/apm_cli/commands/pack.py
+++ b/src/apm_cli/commands/pack.py
@@ -34,8 +34,9 @@
     help="Output directory (default: ./build).",
 )
 @click.option("--dry-run", is_flag=True, default=False, help="Show what would be packed without writing.")
+@click.option("--force", is_flag=True, default=False, help="On collision, last writer wins.")
 @click.pass_context
-def pack_cmd(ctx, fmt, target, archive, output, dry_run):
+def pack_cmd(ctx, fmt, target, archive, output, dry_run, force):
     """Create a self-contained APM bundle."""
     try:
         result = pack_bundle(
@@ -45,6 +46,7 @@ def pack_cmd(ctx, fmt, target, archive, output, dry_run):
             target=target,
             archive=archive,
             dry_run=dry_run,
+            force=force,
         )

         if dry_run:
@@ -61,6 +63,12 @@ def pack_cmd(ctx, fmt, target, archive, output, dry_run):
             _rich_warning("No deployed files found -- empty bundle created")
         else:
             _rich_success(f"Packed {len(result.files)} file(s) -> {result.bundle_path}")
+            if fmt == "plugin":
+                _rich_info(
+                    "Plugin bundle ready — contains plugin.json and "
+                    "plugin-native directories (agents/, skills/, commands/, …). "
+                    "No APM-specific files included."
+                )
     except (FileNotFoundError, ValueError) as exc:
         _rich_error(str(exc))
diff --git a/src/apm_cli/deps/apm_resolver.py b/src/apm_cli/deps/apm_resolver.py
index 717a4905..43f1f77b 100644
--- a/src/apm_cli/deps/apm_resolver.py
+++ b/src/apm_cli/deps/apm_resolver.py
@@ -127,8 +127,8 @@ def build_dependency_tree(self, root_apm_yml: Path) -> DependencyTree:
         # Initialize the tree
         tree = DependencyTree(root_package=root_package)

-        # Queue for breadth-first traversal: (dependency_ref, depth, parent_node)
-        processing_queue: deque[Tuple[DependencyReference, int, Optional[DependencyNode]]] = deque()
+        # Queue for breadth-first traversal: (dependency_ref, depth, parent_node, is_dev)
+        processing_queue: deque[Tuple[DependencyReference, int, Optional[DependencyNode], bool]] = deque()

         # Set to track queued unique keys for O(1) lookup instead of O(n) list comprehension
         queued_keys: Set[str] = set()
@@ -136,12 +136,21 @@ def build_dependency_tree(self, root_apm_yml: Path) -> DependencyTree:
         # Add root dependencies to queue
         root_deps = root_package.get_apm_dependencies()
         for dep_ref in root_deps:
-            processing_queue.append((dep_ref, 1, None))
+            processing_queue.append((dep_ref, 1, None, False))
             queued_keys.add(dep_ref.get_unique_key())

+        # Add root devDependencies to queue (marked is_dev=True)
+        root_dev_deps = root_package.get_dev_apm_dependencies()
+        for dep_ref in root_dev_deps:
+            key = dep_ref.get_unique_key()
+            if key not in queued_keys:
+                processing_queue.append((dep_ref, 1, None, True))
+                queued_keys.add(key)
+            # If already queued as prod, prod wins — skip
+
         # Process dependencies breadth-first
         while processing_queue:
-            dep_ref, depth, parent_node = processing_queue.popleft()
+            dep_ref, depth, parent_node, is_dev = processing_queue.popleft()

             # Remove from queued set since we're now processing this dependency
             queued_keys.discard(dep_ref.get_unique_key())
@@ -153,6 +162,9 @@ def build_dependency_tree(self, root_apm_yml: Path) -> DependencyTree:
             # Check if we already processed this dependency at this level or higher
             existing_node = tree.get_node(dep_ref.get_unique_key())
             if existing_node and existing_node.depth <= depth:
+                # Prod wins over dev: if existing was dev and this is prod, promote it
+                if existing_node.is_dev and not is_dev:
+                    existing_node.is_dev = False
                 # We've already processed this dependency at a shallower or equal depth
                 # Create parent-child relationship if parent exists
                 if parent_node and existing_node not in parent_node.children:
@@ -172,7 +184,8 @@ def build_dependency_tree(self, root_apm_yml: Path) -> DependencyTree:
                 package=placeholder_package,
                 dependency_ref=dep_ref,
                 depth=depth,
-                parent=parent_node
+                parent=parent_node,
+                is_dev=is_dev,
             )

             # Add to tree
@@ -194,12 +207,13 @@ def build_dependency_tree(self, root_apm_yml: Path) -> DependencyTree:
                     node.package = loaded_package

                     # Get sub-dependencies and add them to the processing queue
+                    # Transitive deps inherit is_dev from parent
                     sub_dependencies = loaded_package.get_apm_dependencies()
                     for sub_dep in sub_dependencies:
                         # Avoid infinite recursion by checking if we're already processing this dep
                         # Use O(1) set lookup instead of O(n) list comprehension
                         if sub_dep.get_unique_key() not in queued_keys:
-                            processing_queue.append((sub_dep, depth + 1, node))
+                            processing_queue.append((sub_dep, depth + 1, node, is_dev))
                             queued_keys.add(sub_dep.get_unique_key())
                 except (ValueError, FileNotFoundError) as e:
                     # Could not load dependency package - this is expected for remote dependencies
diff --git a/src/apm_cli/deps/dependency_graph.py b/src/apm_cli/deps/dependency_graph.py
index 42ddb43d..1a59b016 100644
--- a/src/apm_cli/deps/dependency_graph.py
+++ b/src/apm_cli/deps/dependency_graph.py
@@ -15,6 +15,7 @@ class DependencyNode:
     depth: int = 0
     children: List['DependencyNode'] = field(default_factory=list)
     parent: Optional['DependencyNode'] = None
+    is_dev: bool = False  # True when reached exclusively through devDependencies

     def get_id(self) -> str:
         """Get unique identifier for this node."""
diff --git a/src/apm_cli/deps/lockfile.py b/src/apm_cli/deps/lockfile.py
index cda16fb1..01046bca 100644
--- a/src/apm_cli/deps/lockfile.py
+++ b/src/apm_cli/deps/lockfile.py
@@ -33,6 +33,8 @@ class LockedDependency:
     deployed_files: List[str] = field(default_factory=list)
     source: Optional[str] = None  # "local" for local deps, None/absent for remote
     local_path: Optional[str] = None  # Original local path (relative to project root)
+    content_hash: Optional[str] = None  # SHA-256 of package file tree
+    is_dev: bool = False  # True for devDependencies

     def get_unique_key(self) -> str:
         """Returns unique key for this dependency."""
@@ -69,6 +71,10 @@ def to_dict(self) -> Dict[str, Any]:
             result["source"] = self.source
         if self.local_path:
             result["local_path"] = self.local_path
+        if self.content_hash:
+            result["content_hash"] = self.content_hash
+        if self.is_dev:
+            result["is_dev"] = True
         return result

     @classmethod
@@ -102,6 +108,8 @@ def from_dict(cls, data: Dict[str, Any]) -> "LockedDependency":
             deployed_files=deployed_files,
             source=data.get("source"),
             local_path=data.get("local_path"),
+            content_hash=data.get("content_hash"),
+            is_dev=data.get("is_dev", False),
         )

     @classmethod
@@ -111,6 +119,7 @@ def from_dependency_ref(
         resolved_commit: Optional[str],
         depth: int,
         resolved_by: Optional[str],
+        is_dev: bool = False,
     ) -> "LockedDependency":
         """Create from a DependencyReference with resolution info."""
         return cls(
@@ -124,6 +133,7 @@ def from_dependency_ref(
             resolved_by=resolved_by,
             source="local" if dep_ref.is_local else None,
             local_path=dep_ref.local_path if dep_ref.is_local else None,
+            is_dev=is_dev,
         )

@@ -222,7 +232,9 @@ def from_installed_packages(
         """Create a lock file from installed packages.

         Args:
-            installed_packages: List of (dep_ref, resolved_commit, depth, resolved_by) tuples
+            installed_packages: List of (dep_ref, resolved_commit, depth, resolved_by)
+                or (dep_ref, resolved_commit, depth, resolved_by, is_dev) tuples.
+                The 5th element is optional for backward compatibility.
             dependency_graph: The resolved DependencyGraph for additional metadata
         """
         # Get APM version
@@ -234,12 +246,18 @@ def from_installed_packages(

         lock = cls(apm_version=apm_version)

-        for dep_ref, resolved_commit, depth, resolved_by in installed_packages:
+        for entry in installed_packages:
+            if len(entry) >= 5:
+                dep_ref, resolved_commit, depth, resolved_by, is_dev = entry[:5]
+            else:
+                dep_ref, resolved_commit, depth, resolved_by = entry[:4]
+                is_dev = False
             locked_dep = LockedDependency.from_dependency_ref(
                 dep_ref=dep_ref,
                 resolved_commit=resolved_commit,
                 depth=depth,
                 resolved_by=resolved_by,
+                is_dev=is_dev,
             )
             lock.add_dependency(locked_dep)
diff --git a/src/apm_cli/deps/plugin_parser.py b/src/apm_cli/deps/plugin_parser.py
index 9f66d678..5cc2fbca 100644
--- a/src/apm_cli/deps/plugin_parser.py
+++ b/src/apm_cli/deps/plugin_parser.py
@@ -500,6 +500,50 @@ def _generate_apm_yml(manifest: Dict[str, Any]) -> str:
     return yaml.dump(apm_package, default_flow_style=False, sort_keys=False)


+def synthesize_plugin_json_from_apm_yml(apm_yml_path: Path) -> dict:
+    """Create a minimal ``plugin.json`` dict from ``apm.yml`` identity fields.
+
+    Reads ``apm.yml`` and extracts ``name``, ``version``, ``description``,
+    ``author``, and ``license``. The ``author`` string is mapped to the plugin
+    spec's ``{"name": author}`` object format.
+
+    Args:
+        apm_yml_path: Path to the ``apm.yml`` file.
+
+    Returns:
+        dict suitable for writing as ``plugin.json``.
+
+    Raises:
+        ValueError: If ``name`` is missing from ``apm.yml``.
+        FileNotFoundError: If the file does not exist.
+ """ + if not apm_yml_path.exists(): + raise FileNotFoundError(f"apm.yml not found: {apm_yml_path}") + + try: + data = yaml.safe_load(apm_yml_path.read_text(encoding="utf-8")) + except yaml.YAMLError as exc: + raise ValueError(f"Invalid YAML in {apm_yml_path}: {exc}") from exc + + if not isinstance(data, dict) or not data.get("name"): + raise ValueError( + "apm.yml must contain at least a 'name' field to synthesize plugin.json" + ) + + result: Dict[str, Any] = {"name": data["name"]} + + if data.get("version"): + result["version"] = data["version"] + if data.get("description"): + result["description"] = data["description"] + if data.get("author"): + result["author"] = {"name": str(data["author"])} + if data.get("license"): + result["license"] = data["license"] + + return result + + def validate_plugin_package(plugin_path: Path) -> bool: """Check whether a directory looks like a Claude plugin. diff --git a/src/apm_cli/models/__init__.py b/src/apm_cli/models/__init__.py index 31b3cb64..05d9a2c0 100644 --- a/src/apm_cli/models/__init__.py +++ b/src/apm_cli/models/__init__.py @@ -14,6 +14,7 @@ PackageType, ValidationError, ValidationResult, + detect_package_type, validate_apm_package, ) from .results import InstallResult, PrimitiveCounts @@ -35,6 +36,7 @@ "PackageType", "ValidationError", "ValidationResult", + "detect_package_type", "validate_apm_package", # Results "InstallResult", diff --git a/src/apm_cli/models/apm_package.py b/src/apm_cli/models/apm_package.py index a7e20eb4..8058b34a 100644 --- a/src/apm_cli/models/apm_package.py +++ b/src/apm_cli/models/apm_package.py @@ -68,6 +68,7 @@ class APMPackage: source: Optional[str] = None # Source location (for dependencies) resolved_commit: Optional[str] = None # Resolved commit SHA (for dependencies) dependencies: Optional[Dict[str, List[Union[DependencyReference, str, dict]]]] = None # Mixed types for APM/MCP/inline + dev_dependencies: Optional[Dict[str, List[Union[DependencyReference, str, dict]]]] = None scripts: 
Optional[Dict[str, str]] = None package_path: Optional[Path] = None # Local path to package target: Optional[str] = None # Target agent: vscode, claude, or all (applies to compile and install) @@ -148,6 +149,40 @@ def from_apm_yml(cls, apm_yml_path: Path) -> "APMPackage": # Other dependency types: keep as-is dependencies[dep_type] = [dep for dep in dep_list if isinstance(dep, (str, dict))] + # Parse devDependencies (same structure as dependencies) + dev_dependencies = None + if 'devDependencies' in data and isinstance(data['devDependencies'], dict): + dev_dependencies = {} + for dep_type, dep_list in data['devDependencies'].items(): + if isinstance(dep_list, list): + if dep_type == 'apm': + parsed_deps = [] + for dep_entry in dep_list: + if isinstance(dep_entry, str): + try: + parsed_deps.append(DependencyReference.parse(dep_entry)) + except ValueError as e: + raise ValueError(f"Invalid dev APM dependency '{dep_entry}': {e}") + elif isinstance(dep_entry, dict): + try: + parsed_deps.append(DependencyReference.parse_from_dict(dep_entry)) + except ValueError as e: + raise ValueError(f"Invalid dev APM dependency {dep_entry}: {e}") + dev_dependencies[dep_type] = parsed_deps + elif dep_type == 'mcp': + parsed_mcp = [] + for dep in dep_list: + if isinstance(dep, str): + parsed_mcp.append(MCPDependency.from_string(dep)) + elif isinstance(dep, dict): + try: + parsed_mcp.append(MCPDependency.from_dict(dep)) + except ValueError as e: + raise ValueError(f"Invalid dev MCP dependency: {e}") + dev_dependencies[dep_type] = parsed_mcp + else: + dev_dependencies[dep_type] = [dep for dep in dep_list if isinstance(dep, (str, dict))] + # Parse package content type pkg_type = None if 'type' in data and data['type'] is not None: @@ -166,6 +201,7 @@ def from_apm_yml(cls, apm_yml_path: Path) -> "APMPackage": author=data.get('author'), license=data.get('license'), dependencies=dependencies, + dev_dependencies=dev_dependencies, scripts=data.get('scripts'), package_path=apm_yml_path.parent, 
target=data.get('target'), @@ -192,6 +228,19 @@ def has_apm_dependencies(self) -> bool: """Check if this package has APM dependencies.""" return bool(self.get_apm_dependencies()) + def get_dev_apm_dependencies(self) -> List[DependencyReference]: + """Get list of dev APM dependencies.""" + if not self.dev_dependencies or 'apm' not in self.dev_dependencies: + return [] + return [dep for dep in self.dev_dependencies['apm'] if isinstance(dep, DependencyReference)] + + def get_dev_mcp_dependencies(self) -> List["MCPDependency"]: + """Get list of dev MCP dependencies.""" + if not self.dev_dependencies or 'mcp' not in self.dev_dependencies: + return [] + return [dep for dep in (self.dev_dependencies.get('mcp') or []) + if isinstance(dep, MCPDependency)] + @dataclass class PackageInfo: diff --git a/src/apm_cli/models/validation.py b/src/apm_cli/models/validation.py index 419f85c8..8cccdbd6 100644 --- a/src/apm_cli/models/validation.py +++ b/src/apm_cli/models/validation.py @@ -6,7 +6,7 @@ from dataclasses import dataclass from enum import Enum from pathlib import Path -from typing import TYPE_CHECKING, List, Optional +from typing import TYPE_CHECKING, List, Optional, Tuple from ..constants import APM_DIR, APM_YML_FILENAME, SKILL_MD_FILENAME @@ -135,6 +135,45 @@ def _has_hook_json(package_path: Path) -> bool: return False +def detect_package_type( + package_path: Path, +) -> Tuple[PackageType, Optional[Path]]: + """Classify a package directory into a ``PackageType``. + + This is the **single source of truth** for the detection cascade. + The function is pure — no side-effects, no file mutations. + + Returns: + A ``(package_type, plugin_json_path)`` tuple. + *plugin_json_path* is non-None only for ``MARKETPLACE_PLUGIN``. 
+ """ + from ..utils.helpers import find_plugin_json + + has_apm_yml = (package_path / APM_YML_FILENAME).exists() + has_skill_md = (package_path / SKILL_MD_FILENAME).exists() + + if has_apm_yml and has_skill_md: + return PackageType.HYBRID, None + if has_apm_yml: + return PackageType.APM_PACKAGE, None + if has_skill_md: + return PackageType.CLAUDE_SKILL, None + if _has_hook_json(package_path): + return PackageType.HOOK_PACKAGE, None + + plugin_json_path = find_plugin_json(package_path) + has_plugin_evidence = ( + plugin_json_path is not None + or (package_path / "agents").is_dir() + or (package_path / "skills").is_dir() + or (package_path / "commands").is_dir() + ) + if has_plugin_evidence: + return PackageType.MARKETPLACE_PLUGIN, plugin_json_path + + return PackageType.INVALID, None + + def validate_apm_package(package_path: Path) -> ValidationResult: """Validate that a directory contains a valid APM package or Claude Skill. @@ -163,49 +202,22 @@ def validate_apm_package(package_path: Path) -> ValidationResult: return result # Detect package type - apm_yml_path = package_path / APM_YML_FILENAME - skill_md_path = package_path / SKILL_MD_FILENAME - - # Check for plugin.json -- optional metadata, not a detection gate - from ..utils.helpers import find_plugin_json - plugin_json_path = find_plugin_json(package_path) + pkg_type, plugin_json_path = detect_package_type(package_path) + result.package_type = pkg_type - has_apm_yml = apm_yml_path.exists() - has_skill_md = skill_md_path.exists() - has_hooks = _has_hook_json(package_path) - - # Determine package type. apm.yml / SKILL.md take precedence; everything - # else (hooks-only or bare plugin directories) normalizes as a Claude plugin. 
- if has_apm_yml and has_skill_md: - result.package_type = PackageType.HYBRID - elif has_apm_yml: - result.package_type = PackageType.APM_PACKAGE - elif has_skill_md: - result.package_type = PackageType.CLAUDE_SKILL - elif has_hooks: - result.package_type = PackageType.HOOK_PACKAGE - else: - # Require plugin.json or at least one standard component directory - has_plugin_evidence = ( - plugin_json_path is not None - or (package_path / "agents").is_dir() - or (package_path / "skills").is_dir() - or (package_path / "commands").is_dir() + if pkg_type == PackageType.INVALID: + result.add_error( + f"Not a valid APM package: no apm.yml, SKILL.md, hooks, or " + f"plugin structure found in {package_path.name}" ) - if has_plugin_evidence: - result.package_type = PackageType.MARKETPLACE_PLUGIN - else: - result.add_error( - f"Not a valid APM package: no apm.yml, SKILL.md, hooks, or " - f"plugin structure found in {package_path.name}" - ) - return result + return result # Handle hook-only packages (no apm.yml or SKILL.md) if result.package_type == PackageType.HOOK_PACKAGE: return _validate_hook_package(package_path, result) # Handle Claude Skills (no apm.yml) - auto-generate minimal apm.yml + skill_md_path = package_path / SKILL_MD_FILENAME if result.package_type == PackageType.CLAUDE_SKILL: return _validate_claude_skill(package_path, skill_md_path, result) @@ -214,6 +226,7 @@ def validate_apm_package(package_path: Path) -> ValidationResult: return _validate_marketplace_plugin(package_path, plugin_json_path, result) # Standard APM package validation (has apm.yml) + apm_yml_path = package_path / APM_YML_FILENAME return _validate_apm_package_with_yml(package_path, apm_yml_path, result) diff --git a/src/apm_cli/utils/content_hash.py b/src/apm_cli/utils/content_hash.py new file mode 100644 index 00000000..0fb2eace --- /dev/null +++ b/src/apm_cli/utils/content_hash.py @@ -0,0 +1,72 @@ +"""Deterministic SHA-256 content hashing for package integrity verification.""" + +import hashlib 
+from pathlib import Path
+from typing import Optional
+
+# Directories excluded from hashing (not relevant to package content)
+_EXCLUDED_DIRS = {".git", "__pycache__"}
+
+# Well-known hash for empty/missing packages
+_EMPTY_HASH = "sha256:" + hashlib.sha256(b"").hexdigest()
+
+
+def compute_package_hash(package_path: Path) -> str:
+    """Compute a deterministic SHA-256 hash of a package's file tree.
+
+    The hash is computed over sorted file paths and their contents,
+    making it independent of filesystem ordering and metadata (timestamps,
+    permissions).
+
+    Args:
+        package_path: Root directory of the installed package.
+
+    Returns:
+        Hash string in the format ``"sha256:<hex digest>"``.
+    """
+    if not package_path.is_dir():
+        return _EMPTY_HASH
+
+    hasher = hashlib.sha256()
+    file_count = 0
+
+    # Collect all regular files, skipping excluded dirs and symlinks
+    regular_files: list[Path] = []
+    for item in package_path.rglob("*"):
+        # Skip symlinks
+        if item.is_symlink():
+            continue
+        # Skip excluded directories and their contents
+        rel = item.relative_to(package_path)
+        if any(part in _EXCLUDED_DIRS for part in rel.parts):
+            continue
+        if item.is_file():
+            regular_files.append(rel)
+
+    # Sort lexicographically by POSIX path for determinism
+    regular_files.sort(key=lambda p: p.as_posix())
+
+    for rel_path in regular_files:
+        # Hash the relative path then the file contents
+        hasher.update(rel_path.as_posix().encode("utf-8"))
+        hasher.update((package_path / rel_path).read_bytes())
+        file_count += 1
+
+    if file_count == 0:
+        return _EMPTY_HASH
+
+    return f"sha256:{hasher.hexdigest()}"
+
+
+def verify_package_hash(package_path: Path, expected_hash: str) -> bool:
+    """Verify a package's content matches the expected hash.
+
+    Args:
+        package_path: Root directory of the installed package.
+        expected_hash: Expected hash string (e.g., ``"sha256:abc123..."``).
+
+    Returns:
+        True if hash matches, False if mismatch.
+ """ + actual = compute_package_hash(package_path) + return actual == expected_hash diff --git a/tests/test_apm_package_models.py b/tests/test_apm_package_models.py index d3e3893b..34d8b03d 100644 --- a/tests/test_apm_package_models.py +++ b/tests/test_apm_package_models.py @@ -651,7 +651,7 @@ def test_validate_missing_apm_yml(self): result = validate_apm_package(Path(tmpdir)) # Empty directories without plugin.json or component dirs are not valid assert not result.is_valid - assert result.package_type is None + assert result.package_type == PackageType.INVALID def test_validate_invalid_apm_yml(self): """Test validating directory with invalid apm.yml.""" @@ -926,7 +926,86 @@ def test_validate_empty_dir_is_invalid(self): result = validate_apm_package(Path(tmpdir)) # Empty directories without plugin.json or component dirs are not valid assert not result.is_valid - assert result.package_type is None + assert result.package_type == PackageType.INVALID + + +from src.apm_cli.models.validation import detect_package_type + + +class TestDetectPackageType: + """Tests for the centralized detect_package_type() function.""" + + def test_hybrid_when_both_apm_yml_and_skill_md(self, tmp_path): + (tmp_path / "apm.yml").write_text("name: test") + (tmp_path / "SKILL.md").write_text("# Skill") + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.HYBRID + assert pj_path is None + + def test_apm_package_when_only_apm_yml(self, tmp_path): + (tmp_path / "apm.yml").write_text("name: test") + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.APM_PACKAGE + assert pj_path is None + + def test_claude_skill_when_only_skill_md(self, tmp_path): + (tmp_path / "SKILL.md").write_text("# Skill") + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.CLAUDE_SKILL + assert pj_path is None + + def test_hook_package_when_hooks_json(self, tmp_path): + hooks_dir = tmp_path / "hooks" + hooks_dir.mkdir() + 
(hooks_dir / "pre-commit.json").write_text("{}") + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.HOOK_PACKAGE + assert pj_path is None + + def test_marketplace_plugin_with_plugin_json(self, tmp_path): + (tmp_path / "plugin.json").write_text('{"name": "test"}') + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.MARKETPLACE_PLUGIN + assert pj_path is not None + assert pj_path.name == "plugin.json" + + def test_marketplace_plugin_with_agents_dir(self, tmp_path): + (tmp_path / "agents").mkdir() + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.MARKETPLACE_PLUGIN + assert pj_path is None + + def test_marketplace_plugin_with_skills_dir(self, tmp_path): + (tmp_path / "skills").mkdir() + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.MARKETPLACE_PLUGIN + assert pj_path is None + + def test_marketplace_plugin_with_commands_dir(self, tmp_path): + (tmp_path / "commands").mkdir() + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.MARKETPLACE_PLUGIN + assert pj_path is None + + def test_invalid_when_empty_dir(self, tmp_path): + pkg_type, pj_path = detect_package_type(tmp_path) + assert pkg_type == PackageType.INVALID + assert pj_path is None + + def test_apm_yml_takes_precedence_over_plugin_json(self, tmp_path): + (tmp_path / "apm.yml").write_text("name: test") + (tmp_path / "plugin.json").write_text('{"name": "test"}') + pkg_type, _ = detect_package_type(tmp_path) + assert pkg_type == PackageType.APM_PACKAGE + + def test_hook_package_apm_yml_precedence(self, tmp_path): + """apm.yml takes precedence even when hooks exist.""" + (tmp_path / "apm.yml").write_text("name: test") + hooks_dir = tmp_path / "hooks" + hooks_dir.mkdir() + (hooks_dir / "pre-commit.json").write_text("{}") + pkg_type, _ = detect_package_type(tmp_path) + assert pkg_type == PackageType.APM_PACKAGE class TestGitReferenceUtils: diff --git 
a/tests/unit/test_apm_package.py b/tests/unit/test_apm_package.py new file mode 100644 index 00000000..037357e8 --- /dev/null +++ b/tests/unit/test_apm_package.py @@ -0,0 +1,233 @@ +"""Unit tests for APMPackage devDependencies support.""" + +from pathlib import Path + +import pytest +import yaml + +from apm_cli.models.apm_package import ( + APMPackage, + DependencyReference, + MCPDependency, + clear_apm_yml_cache, +) + + +def _write_apm_yml(tmp_path: Path, data: dict) -> Path: + """Write an apm.yml file and return its path.""" + clear_apm_yml_cache() + path = tmp_path / "apm.yml" + path.write_text(yaml.dump(data), encoding="utf-8") + return path + + +class TestDevDependencies: + """Tests for devDependencies support in APMPackage.""" + + def test_parse_dev_dependencies(self, tmp_path): + """devDependencies section is parsed from apm.yml.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": { + "apm": ["owner/dev-tool"], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + + assert pkg.dev_dependencies is not None + assert "apm" in pkg.dev_dependencies + + def test_get_dev_apm_dependencies(self, tmp_path): + """get_dev_apm_dependencies returns DependencyReference objects.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": { + "apm": ["owner/dev-tool", "org/test-helper"], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + dev_deps = pkg.get_dev_apm_dependencies() + + assert len(dev_deps) == 2 + assert all(isinstance(d, DependencyReference) for d in dev_deps) + urls = {d.repo_url for d in dev_deps} + assert "owner/dev-tool" in urls + assert "org/test-helper" in urls + + def test_get_dev_mcp_dependencies(self, tmp_path): + """get_dev_mcp_dependencies returns MCPDependency objects.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": { + "mcp": [ + {"name": "io.github.test/mcp-server", "transport": "stdio"}, + ], + }, + }) 
+ + pkg = APMPackage.from_apm_yml(yml) + dev_mcp = pkg.get_dev_mcp_dependencies() + + assert len(dev_mcp) == 1 + assert isinstance(dev_mcp[0], MCPDependency) + assert dev_mcp[0].name == "io.github.test/mcp-server" + assert dev_mcp[0].transport == "stdio" + + def test_get_dev_mcp_from_string(self, tmp_path): + """MCP dev dependencies can be specified as plain strings.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": { + "mcp": ["io.github.test/mcp-server"], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + dev_mcp = pkg.get_dev_mcp_dependencies() + + assert len(dev_mcp) == 1 + assert dev_mcp[0].name == "io.github.test/mcp-server" + + def test_missing_dev_dependencies_returns_empty(self, tmp_path): + """No devDependencies section returns empty lists.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + }) + + pkg = APMPackage.from_apm_yml(yml) + + assert pkg.dev_dependencies is None + assert pkg.get_dev_apm_dependencies() == [] + assert pkg.get_dev_mcp_dependencies() == [] + + def test_empty_dev_dependencies_returns_empty(self, tmp_path): + """Empty devDependencies.apm list returns empty.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": {"apm": []}, + }) + + pkg = APMPackage.from_apm_yml(yml) + + assert pkg.dev_dependencies is not None + assert pkg.get_dev_apm_dependencies() == [] + + def test_dev_and_prod_dependencies_independent(self, tmp_path): + """devDeps and deps are independent — changing one doesn't affect other.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "dependencies": { + "apm": ["owner/prod-dep"], + }, + "devDependencies": { + "apm": ["owner/dev-dep"], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + + prod_deps = pkg.get_apm_dependencies() + dev_deps = pkg.get_dev_apm_dependencies() + + assert len(prod_deps) == 1 + assert len(dev_deps) == 1 + assert prod_deps[0].repo_url == 
"owner/prod-dep" + assert dev_deps[0].repo_url == "owner/dev-dep" + + def test_dev_dependencies_do_not_pollute_prod(self, tmp_path): + """Dev dependencies don't appear in get_apm_dependencies().""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "dependencies": {"apm": []}, + "devDependencies": { + "apm": ["owner/dev-only"], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + + prod_urls = {d.repo_url for d in pkg.get_apm_dependencies()} + assert "owner/dev-only" not in prod_urls + + def test_dev_dependencies_dict_format(self, tmp_path): + """devDependencies support dict-format entries (Cargo-style git objects).""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": { + "apm": [ + {"git": "https://github.com/owner/complex-dep.git", "ref": "main"}, + ], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + dev_deps = pkg.get_dev_apm_dependencies() + + assert len(dev_deps) == 1 + assert isinstance(dev_deps[0], DependencyReference) + + def test_mixed_dev_dependency_types(self, tmp_path): + """devDependencies can have both apm and mcp types.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": { + "apm": ["owner/dev-apm"], + "mcp": ["io.github.test/mcp-debug"], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + + assert len(pkg.get_dev_apm_dependencies()) == 1 + assert len(pkg.get_dev_mcp_dependencies()) == 1 + + def test_dev_apm_no_mcp_key(self, tmp_path): + """get_dev_mcp_dependencies returns empty when only apm devDeps exist.""" + yml = _write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + "devDependencies": { + "apm": ["owner/dev-tool"], + }, + }) + + pkg = APMPackage.from_apm_yml(yml) + + assert pkg.get_dev_mcp_dependencies() == [] + + +class TestClearCache: + """Tests for clear_apm_yml_cache.""" + + def test_clear_forces_reparse(self, tmp_path): + """After clear, the same file is re-parsed (not cached).""" + yml = 
_write_apm_yml(tmp_path, { + "name": "test-pkg", + "version": "1.0.0", + }) + + pkg1 = APMPackage.from_apm_yml(yml) + + # Overwrite with different data + clear_apm_yml_cache() + yml.write_text(yaml.dump({ + "name": "changed-pkg", + "version": "2.0.0", + }), encoding="utf-8") + + pkg2 = APMPackage.from_apm_yml(yml) + + assert pkg1.name == "test-pkg" + assert pkg2.name == "changed-pkg" diff --git a/tests/unit/test_content_hash.py b/tests/unit/test_content_hash.py new file mode 100644 index 00000000..c5433f63 --- /dev/null +++ b/tests/unit/test_content_hash.py @@ -0,0 +1,236 @@ +"""Tests for SHA-256 content integrity hashing.""" + +import os +from pathlib import Path + +import pytest + +from apm_cli.utils.content_hash import compute_package_hash, verify_package_hash + + +# --------------------------------------------------------------------------- +# compute_package_hash +# --------------------------------------------------------------------------- + +class TestComputePackageHash: + def test_basic_hash(self, tmp_path): + """Computes deterministic hash for a package directory.""" + (tmp_path / "file.txt").write_text("hello") + result = compute_package_hash(tmp_path) + assert result.startswith("sha256:") + assert len(result) == len("sha256:") + 64 # SHA-256 hex digest is 64 chars + + def test_deterministic_across_calls(self, tmp_path): + """Same content produces same hash.""" + (tmp_path / "a.txt").write_text("content") + assert compute_package_hash(tmp_path) == compute_package_hash(tmp_path) + + def test_different_content_different_hash(self, tmp_path): + """Different file content produces different hash.""" + (tmp_path / "a.txt").write_text("version1") + hash1 = compute_package_hash(tmp_path) + (tmp_path / "a.txt").write_text("version2") + hash2 = compute_package_hash(tmp_path) + assert hash1 != hash2 + + def test_file_order_independent(self, tmp_path): + """Hash is the same regardless of filesystem ordering.""" + # Create files in two different orders, hash should be 
the same + d1 = tmp_path / "dir1" + d1.mkdir() + (d1 / "b.txt").write_text("B") + (d1 / "a.txt").write_text("A") + + d2 = tmp_path / "dir2" + d2.mkdir() + (d2 / "a.txt").write_text("A") + (d2 / "b.txt").write_text("B") + + assert compute_package_hash(d1) == compute_package_hash(d2) + + def test_skips_git_directory(self, tmp_path): + """The .git directory is excluded from hashing.""" + (tmp_path / "code.py").write_text("print('hi')") + hash_before = compute_package_hash(tmp_path) + + git_dir = tmp_path / ".git" + git_dir.mkdir() + (git_dir / "HEAD").write_text("ref: refs/heads/main") + hash_after = compute_package_hash(tmp_path) + + assert hash_before == hash_after + + def test_skips_pycache(self, tmp_path): + """__pycache__ directories are excluded from hashing.""" + (tmp_path / "module.py").write_text("x = 1") + hash_before = compute_package_hash(tmp_path) + + cache_dir = tmp_path / "__pycache__" + cache_dir.mkdir() + (cache_dir / "module.cpython-312.pyc").write_bytes(b"\x00\x01\x02") + hash_after = compute_package_hash(tmp_path) + + assert hash_before == hash_after + + def test_empty_directory(self, tmp_path): + """Empty directory returns a well-known hash.""" + empty = tmp_path / "empty" + empty.mkdir() + result = compute_package_hash(empty) + assert result.startswith("sha256:") + # Empty hash is the SHA-256 of an empty bytestring + import hashlib + expected = "sha256:" + hashlib.sha256(b"").hexdigest() + assert result == expected + + def test_nonexistent_directory(self, tmp_path): + """Non-existent path returns the empty hash.""" + import hashlib + expected = "sha256:" + hashlib.sha256(b"").hexdigest() + assert compute_package_hash(tmp_path / "nope") == expected + + def test_binary_files_handled(self, tmp_path): + """Binary files are hashed correctly.""" + (tmp_path / "data.bin").write_bytes(bytes(range(256))) + result = compute_package_hash(tmp_path) + assert result.startswith("sha256:") + # Verify it doesn't raise and produces a valid digest + assert 
len(result) == len("sha256:") + 64 + + def test_symlinks_skipped(self, tmp_path): + """Symlinks are not followed during hashing.""" + (tmp_path / "real.txt").write_text("real") + hash_before = compute_package_hash(tmp_path) + + # Create a symlink + link = tmp_path / "link.txt" + try: + link.symlink_to(tmp_path / "real.txt") + except OSError: + pytest.skip("Cannot create symlinks on this platform") + hash_after = compute_package_hash(tmp_path) + assert hash_before == hash_after + + def test_hash_format(self, tmp_path): + """Hash starts with 'sha256:' prefix.""" + (tmp_path / "f.txt").write_text("x") + result = compute_package_hash(tmp_path) + assert result.startswith("sha256:") + hex_part = result[len("sha256:"):] + # Validate it's a valid hex string + int(hex_part, 16) + + def test_nested_directories(self, tmp_path): + """Nested directory structure is hashed correctly.""" + sub = tmp_path / "sub" / "deep" + sub.mkdir(parents=True) + (sub / "nested.txt").write_text("deep content") + (tmp_path / "top.txt").write_text("top content") + result = compute_package_hash(tmp_path) + assert result.startswith("sha256:") + + def test_path_uses_posix_format(self, tmp_path): + """File paths use POSIX separators for cross-platform determinism.""" + sub = tmp_path / "dir" + sub.mkdir() + (sub / "file.txt").write_text("content") + # Hash should be the same on any platform (POSIX paths used internally) + hash1 = compute_package_hash(tmp_path) + hash2 = compute_package_hash(tmp_path) + assert hash1 == hash2 + + +# --------------------------------------------------------------------------- +# verify_package_hash +# --------------------------------------------------------------------------- + +class TestVerifyPackageHash: + def test_matching_hash(self, tmp_path): + """Verification passes when content matches.""" + (tmp_path / "a.txt").write_text("hello") + expected = compute_package_hash(tmp_path) + assert verify_package_hash(tmp_path, expected) is True + + def 
test_mismatched_hash(self, tmp_path): + """Verification fails when content changed.""" + (tmp_path / "a.txt").write_text("original") + expected = compute_package_hash(tmp_path) + (tmp_path / "a.txt").write_text("tampered") + assert verify_package_hash(tmp_path, expected) is False + + def test_missing_file_fails(self, tmp_path): + """Verification fails when file is deleted.""" + (tmp_path / "a.txt").write_text("data") + (tmp_path / "b.txt").write_text("more") + expected = compute_package_hash(tmp_path) + (tmp_path / "b.txt").unlink() + assert verify_package_hash(tmp_path, expected) is False + + def test_added_file_fails(self, tmp_path): + """Verification fails when an extra file is added.""" + (tmp_path / "a.txt").write_text("data") + expected = compute_package_hash(tmp_path) + (tmp_path / "extra.txt").write_text("injected") + assert verify_package_hash(tmp_path, expected) is False + + +# --------------------------------------------------------------------------- +# Lockfile integration +# --------------------------------------------------------------------------- + +class TestLockfileContentHash: + def test_content_hash_serialized(self): + """content_hash appears in lockfile YAML output.""" + from apm_cli.deps.lockfile import LockedDependency + dep = LockedDependency( + repo_url="owner/repo", + content_hash="sha256:abc123", + ) + d = dep.to_dict() + assert d["content_hash"] == "sha256:abc123" + + def test_content_hash_deserialized(self): + """content_hash is read back from lockfile.""" + from apm_cli.deps.lockfile import LockedDependency + dep = LockedDependency.from_dict({ + "repo_url": "owner/repo", + "content_hash": "sha256:abc123", + }) + assert dep.content_hash == "sha256:abc123" + + def test_missing_content_hash_backward_compat(self): + """Old lockfiles without content_hash parse fine (None).""" + from apm_cli.deps.lockfile import LockedDependency + dep = LockedDependency.from_dict({ + "repo_url": "owner/repo", + }) + assert dep.content_hash is None + + def 
test_content_hash_none_not_emitted(self): + """content_hash=None is not written to YAML.""" + from apm_cli.deps.lockfile import LockedDependency + dep = LockedDependency( + repo_url="owner/repo", + content_hash=None, + ) + d = dep.to_dict() + assert "content_hash" not in d + + def test_content_hash_roundtrip_yaml(self, tmp_path): + """content_hash survives a full write/read YAML cycle.""" + from apm_cli.deps.lockfile import LockFile, LockedDependency + lockfile = LockFile(apm_version="test") + dep = LockedDependency( + repo_url="owner/repo", + resolved_commit="abc123", + content_hash="sha256:deadbeef", + ) + lockfile.add_dependency(dep) + path = tmp_path / "apm.lock.yaml" + lockfile.save(path) + + loaded = LockFile.read(path) + assert loaded is not None + loaded_dep = loaded.get_dependency("owner/repo") + assert loaded_dep is not None + assert loaded_dep.content_hash == "sha256:deadbeef" diff --git a/tests/unit/test_dev_dependencies.py b/tests/unit/test_dev_dependencies.py new file mode 100644 index 00000000..825e9b6d --- /dev/null +++ b/tests/unit/test_dev_dependencies.py @@ -0,0 +1,448 @@ +"""Tests for devDependencies support: --dev flag, resolver awareness, lockfile is_dev.""" + +import contextlib +import os +import tempfile +from pathlib import Path +from unittest.mock import MagicMock, Mock, patch + +import pytest +import yaml +from click.testing import CliRunner + +from apm_cli.deps.dependency_graph import DependencyNode +from apm_cli.deps.lockfile import LockedDependency, LockFile +from apm_cli.models.apm_package import APMPackage, DependencyReference +from apm_cli.models.results import InstallResult + + +# --------------------------------------------------------------------------- +# Part 3d: LockedDependency.is_dev field +# --------------------------------------------------------------------------- + + +class TestLockedDependencyIsDev: + """Tests for the is_dev field on LockedDependency.""" + + def test_is_dev_defaults_to_false(self): + dep = 
LockedDependency(repo_url="owner/repo") + assert dep.is_dev is False + + def test_is_dev_can_be_set_true(self): + dep = LockedDependency(repo_url="owner/repo", is_dev=True) + assert dep.is_dev is True + + def test_to_dict_omits_is_dev_when_false(self): + dep = LockedDependency(repo_url="owner/repo", is_dev=False) + result = dep.to_dict() + assert "is_dev" not in result + + def test_to_dict_includes_is_dev_when_true(self): + dep = LockedDependency(repo_url="owner/repo", is_dev=True) + result = dep.to_dict() + assert result["is_dev"] is True + + def test_from_dict_reads_is_dev_true(self): + data = {"repo_url": "owner/repo", "is_dev": True} + dep = LockedDependency.from_dict(data) + assert dep.is_dev is True + + def test_from_dict_defaults_missing_is_dev(self): + data = {"repo_url": "owner/repo"} + dep = LockedDependency.from_dict(data) + assert dep.is_dev is False + + def test_from_dependency_ref_passes_is_dev(self): + dep_ref = DependencyReference(repo_url="owner/repo", host="github.com") + locked = LockedDependency.from_dependency_ref( + dep_ref, "abc123", 1, None, is_dev=True + ) + assert locked.is_dev is True + + def test_from_dependency_ref_defaults_is_dev_false(self): + dep_ref = DependencyReference(repo_url="owner/repo", host="github.com") + locked = LockedDependency.from_dependency_ref(dep_ref, "abc123", 1, None) + assert locked.is_dev is False + + def test_is_dev_round_trip_yaml(self, tmp_path): + """is_dev survives a write/read YAML cycle.""" + lock = LockFile() + lock.add_dependency(LockedDependency(repo_url="prod/dep")) + lock.add_dependency(LockedDependency(repo_url="dev/dep", is_dev=True)) + lock_path = tmp_path / "apm.lock.yaml" + lock.write(lock_path) + + loaded = LockFile.read(lock_path) + assert loaded is not None + assert loaded.dependencies["prod/dep"].is_dev is False + assert loaded.dependencies["dev/dep"].is_dev is True + + def test_backward_compat_old_lockfile_no_is_dev(self): + """Old lockfiles without is_dev deserialize with is_dev=False.""" 
+ yaml_str = ( + 'lockfile_version: "1"\n' + "dependencies:\n" + " - repo_url: legacy/dep\n" + " resolved_commit: abc123\n" + ) + lock = LockFile.from_yaml(yaml_str) + assert lock.dependencies["legacy/dep"].is_dev is False + + +class TestFromInstalledPackagesIsDev: + """Tests for LockFile.from_installed_packages with 5-element tuples.""" + + def _mock_dep_ref(self, repo_url): + ref = Mock() + ref.repo_url = repo_url + ref.host = None + ref.reference = "main" + ref.virtual_path = None + ref.is_virtual = False + ref.is_local = False + ref.local_path = None + return ref + + def test_5_element_tuple_with_is_dev_true(self): + dep_ref = self._mock_dep_ref("dev/pkg") + installed = [(dep_ref, "sha1", 1, None, True)] + lock = LockFile.from_installed_packages(installed, Mock()) + assert lock.dependencies["dev/pkg"].is_dev is True + + def test_5_element_tuple_with_is_dev_false(self): + dep_ref = self._mock_dep_ref("prod/pkg") + installed = [(dep_ref, "sha1", 1, None, False)] + lock = LockFile.from_installed_packages(installed, Mock()) + assert lock.dependencies["prod/pkg"].is_dev is False + + def test_4_element_tuple_backward_compat(self): + """Old callers passing 4-element tuples still work (is_dev defaults False).""" + dep_ref = self._mock_dep_ref("old/pkg") + installed = [(dep_ref, "sha1", 1, None)] + lock = LockFile.from_installed_packages(installed, Mock()) + assert lock.dependencies["old/pkg"].is_dev is False + + def test_mixed_prod_and_dev(self): + prod = self._mock_dep_ref("prod/pkg") + dev = self._mock_dep_ref("dev/pkg") + installed = [ + (prod, "sha1", 1, None, False), + (dev, "sha2", 1, None, True), + ] + lock = LockFile.from_installed_packages(installed, Mock()) + assert lock.dependencies["prod/pkg"].is_dev is False + assert lock.dependencies["dev/pkg"].is_dev is True + + +# --------------------------------------------------------------------------- +# Part 3c: Resolver devDependencies awareness +# 
--------------------------------------------------------------------------- + + +class TestDependencyNodeIsDev: + """Tests for DependencyNode.is_dev field.""" + + def test_is_dev_defaults_false(self): + pkg = APMPackage(name="test", version="1.0.0") + ref = DependencyReference(repo_url="owner/repo") + node = DependencyNode(package=pkg, dependency_ref=ref, depth=1) + assert node.is_dev is False + + def test_is_dev_can_be_set(self): + pkg = APMPackage(name="test", version="1.0.0") + ref = DependencyReference(repo_url="owner/repo") + node = DependencyNode(package=pkg, dependency_ref=ref, depth=1, is_dev=True) + assert node.is_dev is True + + +class TestResolverDevDeps: + """Tests for APMDependencyResolver handling devDependencies.""" + + def test_resolver_includes_dev_deps(self, tmp_path): + """Dev dependencies should appear in the resolved tree.""" + from apm_cli.deps.apm_resolver import APMDependencyResolver + + apm_yml = tmp_path / "apm.yml" + apm_yml.write_text( + yaml.dump( + { + "name": "test-project", + "version": "1.0.0", + "dependencies": {"apm": ["prod/pkg"]}, + "devDependencies": {"apm": ["dev/pkg"]}, + } + ) + ) + + resolver = APMDependencyResolver(apm_modules_dir=tmp_path / "apm_modules") + graph = resolver.resolve_dependencies(tmp_path) + + tree = graph.dependency_tree + # Both prod and dev deps should be in the tree + assert tree.has_dependency("prod/pkg") + assert tree.has_dependency("dev/pkg") + + def test_resolver_marks_dev_deps(self, tmp_path): + """Dev-only dependencies should have is_dev=True in the tree.""" + from apm_cli.deps.apm_resolver import APMDependencyResolver + + apm_yml = tmp_path / "apm.yml" + apm_yml.write_text( + yaml.dump( + { + "name": "test-project", + "version": "1.0.0", + "dependencies": {"apm": ["prod/pkg"]}, + "devDependencies": {"apm": ["dev/pkg"]}, + } + ) + ) + + resolver = APMDependencyResolver(apm_modules_dir=tmp_path / "apm_modules") + graph = resolver.resolve_dependencies(tmp_path) + + tree = graph.dependency_tree + 
prod_node = tree.get_node("prod/pkg") + dev_node = tree.get_node("dev/pkg") + assert prod_node is not None + assert dev_node is not None + assert prod_node.is_dev is False + assert dev_node.is_dev is True + + def test_resolver_prod_wins_over_dev(self, tmp_path): + """A dep in both dependencies and devDependencies should be is_dev=False.""" + from apm_cli.deps.apm_resolver import APMDependencyResolver + + apm_yml = tmp_path / "apm.yml" + apm_yml.write_text( + yaml.dump( + { + "name": "test-project", + "version": "1.0.0", + "dependencies": {"apm": ["shared/pkg"]}, + "devDependencies": {"apm": ["shared/pkg"]}, + } + ) + ) + + resolver = APMDependencyResolver(apm_modules_dir=tmp_path / "apm_modules") + graph = resolver.resolve_dependencies(tmp_path) + + tree = graph.dependency_tree + node = tree.get_node("shared/pkg") + assert node is not None + # Prod takes precedence + assert node.is_dev is False + + def test_resolver_only_dev_deps(self, tmp_path): + """When only devDependencies exist, they resolve correctly.""" + from apm_cli.deps.apm_resolver import APMDependencyResolver + + apm_yml = tmp_path / "apm.yml" + apm_yml.write_text( + yaml.dump( + { + "name": "test-project", + "version": "1.0.0", + "devDependencies": {"apm": ["dev/only"]}, + } + ) + ) + + resolver = APMDependencyResolver(apm_modules_dir=tmp_path / "apm_modules") + graph = resolver.resolve_dependencies(tmp_path) + tree = graph.dependency_tree + node = tree.get_node("dev/only") + assert node is not None + assert node.is_dev is True + + +# --------------------------------------------------------------------------- +# Part 3b: apm install --dev flag +# --------------------------------------------------------------------------- + + +class TestInstallDevFlag: + """Tests for the --dev flag on apm install.""" + + def setup_method(self): + self.runner = CliRunner() + try: + self.original_dir = os.getcwd() + except FileNotFoundError: + self.original_dir = str(Path(__file__).parent.parent.parent) + 
os.chdir(self.original_dir) + + def teardown_method(self): + try: + os.chdir(self.original_dir) + except (FileNotFoundError, OSError): + repo_root = Path(__file__).parent.parent.parent + os.chdir(str(repo_root)) + + @contextlib.contextmanager + def _chdir_tmp(self): + with tempfile.TemporaryDirectory() as tmp_dir: + try: + os.chdir(tmp_dir) + yield Path(tmp_dir) + finally: + os.chdir(self.original_dir) + + @patch("apm_cli.commands.install._validate_package_exists") + @patch("apm_cli.commands.install.APM_DEPS_AVAILABLE", True) + @patch("apm_cli.commands.install.APMPackage") + @patch("apm_cli.commands.install._install_apm_dependencies") + def test_dev_flag_writes_to_dev_dependencies( + self, mock_install_apm, mock_apm_package, mock_validate + ): + """--dev should add packages to devDependencies.apm.""" + from apm_cli.cli import cli + + with self._chdir_tmp(): + # Create minimal apm.yml + apm_yml = { + "name": "test-project", + "version": "1.0.0", + "dependencies": {"apm": [], "mcp": []}, + } + with open("apm.yml", "w") as f: + yaml.dump(apm_yml, f) + + mock_validate.return_value = True + + mock_pkg = MagicMock() + mock_pkg.get_apm_dependencies.return_value = [] + mock_pkg.get_dev_apm_dependencies.return_value = [ + MagicMock(repo_url="test/dev-pkg", reference="main") + ] + mock_pkg.get_mcp_dependencies.return_value = [] + mock_pkg.target = None + mock_apm_package.from_apm_yml.return_value = mock_pkg + + mock_install_apm.return_value = InstallResult( + diagnostics=MagicMock(has_diagnostics=False, has_critical_security=False) + ) + + result = self.runner.invoke(cli, ["install", "--dev", "test/dev-pkg"]) + assert result.exit_code == 0 + + with open("apm.yml") as f: + config = yaml.safe_load(f) + assert "devDependencies" in config + assert "test/dev-pkg" in config["devDependencies"]["apm"] + # Prod dependencies should be untouched + assert config["dependencies"]["apm"] == [] + + @patch("apm_cli.commands.install._validate_package_exists") + 
@patch("apm_cli.commands.install.APM_DEPS_AVAILABLE", True) + @patch("apm_cli.commands.install.APMPackage") + @patch("apm_cli.commands.install._install_apm_dependencies") + def test_no_dev_flag_writes_to_dependencies( + self, mock_install_apm, mock_apm_package, mock_validate + ): + """Without --dev, packages go to dependencies.apm.""" + from apm_cli.cli import cli + + with self._chdir_tmp(): + apm_yml = { + "name": "test-project", + "version": "1.0.0", + "dependencies": {"apm": [], "mcp": []}, + } + with open("apm.yml", "w") as f: + yaml.dump(apm_yml, f) + + mock_validate.return_value = True + + mock_pkg = MagicMock() + mock_pkg.get_apm_dependencies.return_value = [ + MagicMock(repo_url="test/prod-pkg", reference="main") + ] + mock_pkg.get_dev_apm_dependencies.return_value = [] + mock_pkg.get_mcp_dependencies.return_value = [] + mock_pkg.target = None + mock_apm_package.from_apm_yml.return_value = mock_pkg + + mock_install_apm.return_value = InstallResult( + diagnostics=MagicMock(has_diagnostics=False, has_critical_security=False) + ) + + result = self.runner.invoke(cli, ["install", "test/prod-pkg"]) + assert result.exit_code == 0 + + with open("apm.yml") as f: + config = yaml.safe_load(f) + assert "test/prod-pkg" in config["dependencies"]["apm"] + assert "devDependencies" not in config + + +class TestValidateAndAddDevDeps: + """Tests for _validate_and_add_packages_to_apm_yml with dev=True.""" + + def setup_method(self): + try: + self.original_dir = os.getcwd() + except FileNotFoundError: + self.original_dir = str(Path(__file__).parent.parent.parent) + os.chdir(self.original_dir) + + def teardown_method(self): + try: + os.chdir(self.original_dir) + except (FileNotFoundError, OSError): + repo_root = Path(__file__).parent.parent.parent + os.chdir(str(repo_root)) + + @patch("apm_cli.commands.install._validate_package_exists") + def test_dev_creates_dev_dependencies_section(self, mock_validate, tmp_path): + """dev=True creates devDependencies.apm if missing.""" + from 
apm_cli.commands.install import _validate_and_add_packages_to_apm_yml + + os.chdir(tmp_path) + apm_yml = tmp_path / "apm.yml" + apm_yml.write_text( + yaml.dump( + { + "name": "test", + "version": "1.0.0", + "dependencies": {"apm": [], "mcp": []}, + } + ) + ) + + mock_validate.return_value = True + result = _validate_and_add_packages_to_apm_yml(["org/dev-pkg"], dev=True) + assert "org/dev-pkg" in result + + with open(apm_yml) as f: + data = yaml.safe_load(f) + assert "devDependencies" in data + assert "org/dev-pkg" in data["devDependencies"]["apm"] + # Prod deps untouched + assert data["dependencies"]["apm"] == [] + + @patch("apm_cli.commands.install._validate_package_exists") + def test_dev_false_writes_to_dependencies(self, mock_validate, tmp_path): + """dev=False (default) writes to dependencies.apm.""" + from apm_cli.commands.install import _validate_and_add_packages_to_apm_yml + + os.chdir(tmp_path) + apm_yml = tmp_path / "apm.yml" + apm_yml.write_text( + yaml.dump( + { + "name": "test", + "version": "1.0.0", + "dependencies": {"apm": [], "mcp": []}, + } + ) + ) + + mock_validate.return_value = True + _validate_and_add_packages_to_apm_yml(["org/prod-pkg"], dev=False) + + with open(apm_yml) as f: + data = yaml.safe_load(f) + assert "org/prod-pkg" in data["dependencies"]["apm"] + assert "devDependencies" not in data diff --git a/tests/unit/test_init_command.py b/tests/unit/test_init_command.py index 9ead69f6..af21b9e7 100644 --- a/tests/unit/test_init_command.py +++ b/tests/unit/test_init_command.py @@ -1,5 +1,6 @@ """Tests for the apm init command.""" +import json import pytest import tempfile import os @@ -294,3 +295,27 @@ def test_init_does_not_create_skill_md(self): assert not Path("SKILL.md").exists() finally: os.chdir(self.original_dir) # restore CWD before TemporaryDirectory cleanup + + + +class TestPluginNameValidation: + """Unit tests for _validate_plugin_name helper.""" + + def test_valid_names(self): + from apm_cli.commands._helpers import 
_validate_plugin_name + + assert _validate_plugin_name("a") is True + assert _validate_plugin_name("my-plugin") is True + assert _validate_plugin_name("plugin2") is True + assert _validate_plugin_name("a" * 64) is True + + def test_invalid_names(self): + from apm_cli.commands._helpers import _validate_plugin_name + + assert _validate_plugin_name("") is False + assert _validate_plugin_name("A") is False + assert _validate_plugin_name("my_plugin") is False + assert _validate_plugin_name("1plugin") is False + assert _validate_plugin_name("-plugin") is False + assert _validate_plugin_name("a" * 65) is False + assert _validate_plugin_name("My-Plugin") is False diff --git a/tests/unit/test_init_plugin.py b/tests/unit/test_init_plugin.py new file mode 100644 index 00000000..1b27237e --- /dev/null +++ b/tests/unit/test_init_plugin.py @@ -0,0 +1,301 @@ +"""Tests for apm init --plugin flag. + +Focused test suite for the plugin author initialization workflow. +Complements the broader ``TestInitCommand`` tests in ``test_init_command.py``. 
+""" + +import json +import os +import tempfile +from pathlib import Path + +import pytest +import yaml +from click.testing import CliRunner + +from apm_cli.cli import cli +from apm_cli.commands._helpers import _validate_plugin_name + + +# --------------------------------------------------------------------------- +# CLI integration tests +# --------------------------------------------------------------------------- + + +class TestInitPlugin: + """Tests for apm init --plugin.""" + + def setup_method(self): + self.runner = CliRunner() + try: + self.original_dir = os.getcwd() + except FileNotFoundError: + self.original_dir = str(Path(__file__).parent.parent.parent) + os.chdir(self.original_dir) + + def teardown_method(self): + try: + os.chdir(self.original_dir) + except (FileNotFoundError, OSError): + repo_root = Path(__file__).parent.parent.parent + os.chdir(str(repo_root)) + + def test_plugin_creates_two_files(self): + """--plugin creates exactly plugin.json and apm.yml.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + result = self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + + assert result.exit_code == 0, result.output + created = {e.name for e in project_dir.iterdir()} + assert "apm.yml" in created + assert "plugin.json" in created + # Only these two files, nothing else + assert created == {"apm.yml", "plugin.json"} + finally: + os.chdir(self.original_dir) + + def test_plugin_json_structure(self): + """plugin.json has required fields: name, version, description, author, license.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + result = self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + assert result.exit_code == 0, result.output + + with open("plugin.json") as f: + data = json.load(f) + + assert data["name"] == "my-plugin" + assert "version" in 
data + assert isinstance(data["description"], str) + assert isinstance(data["author"], dict) + assert "name" in data["author"] + assert data["license"] == "MIT" + finally: + os.chdir(self.original_dir) + + def test_apm_yml_has_dev_dependencies(self): + """apm.yml includes devDependencies section when --plugin.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + result = self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + assert result.exit_code == 0, result.output + + with open("apm.yml") as f: + config = yaml.safe_load(f) + + assert "devDependencies" in config + assert config["devDependencies"] == {"apm": []} + finally: + os.chdir(self.original_dir) + + def test_no_skill_md_created(self): + """SKILL.md is NOT created (not mandatory per spec).""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + assert not Path("SKILL.md").exists() + finally: + os.chdir(self.original_dir) + + def test_no_empty_directories_created(self): + """No empty agents/, skills/ dirs (only files).""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + + entries = list(project_dir.iterdir()) + dirs = [e for e in entries if e.is_dir()] + assert dirs == [], f"Unexpected directories: {dirs}" + finally: + os.chdir(self.original_dir) + + def test_name_validation_rejects_uppercase(self): + """Plugin names must be lowercase kebab-case.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "MyPlugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + result = self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + assert result.exit_code != 0 + assert "Invalid 
plugin name" in result.output + finally: + os.chdir(self.original_dir) + + def test_name_validation_rejects_too_long(self): + """Plugin names max 64 chars.""" + assert _validate_plugin_name("a" * 65) is False + assert _validate_plugin_name("a" * 64) is True + + def test_name_validation_accepts_valid(self): + """Valid kebab-case names pass.""" + assert _validate_plugin_name("my-plugin") is True + assert _validate_plugin_name("plugin2") is True + assert _validate_plugin_name("a") is True + assert _validate_plugin_name("cool-plugin-v3") is True + + def test_name_validation_rejects_underscores(self): + """Underscores are not valid in plugin names.""" + assert _validate_plugin_name("my_plugin") is False + + def test_name_validation_rejects_start_with_number(self): + """Names starting with numbers are invalid.""" + assert _validate_plugin_name("1plugin") is False + + def test_name_validation_rejects_start_with_hyphen(self): + """Names starting with hyphens are invalid.""" + assert _validate_plugin_name("-plugin") is False + + def test_name_validation_rejects_empty(self): + """Empty name is invalid.""" + assert _validate_plugin_name("") is False + + def test_yes_mode_works_with_plugin(self): + """--yes and --plugin together work without interaction.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "auto-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + result = self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + + assert result.exit_code == 0, result.output + assert Path("apm.yml").exists() + assert Path("plugin.json").exists() + + with open("apm.yml") as f: + config = yaml.safe_load(f) + # --yes + --plugin uses 0.1.0 version + assert config["version"] == "0.1.0" + finally: + os.chdir(self.original_dir) + + def test_plugin_flag_without_plugin(self): + """Regular init (no --plugin) still works unchanged.""" + with tempfile.TemporaryDirectory() as tmp_dir: + os.chdir(tmp_dir) + try: + result = self.runner.invoke(cli, 
["init", "--yes"]) + + assert result.exit_code == 0, result.output + assert Path("apm.yml").exists() + assert not Path("plugin.json").exists() + + with open("apm.yml") as f: + config = yaml.safe_load(f) + assert "devDependencies" not in config + finally: + os.chdir(self.original_dir) + + def test_plugin_version_defaults_to_0_1_0(self): + """--plugin --yes defaults version to 0.1.0 (not 1.0.0).""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + + with open("plugin.json") as f: + data = json.load(f) + assert data["version"] == "0.1.0" + + with open("apm.yml") as f: + config = yaml.safe_load(f) + assert config["version"] == "0.1.0" + finally: + os.chdir(self.original_dir) + + def test_plugin_author_is_object(self): + """Author in plugin.json is an object with 'name' key.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + + with open("plugin.json") as f: + data = json.load(f) + assert isinstance(data["author"], dict) + assert "name" in data["author"] + finally: + os.chdir(self.original_dir) + + def test_plugin_shows_next_steps(self): + """Plugin init shows plugin-specific next steps.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + result = self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + assert result.exit_code == 0, result.output + assert "apm pack" in result.output + finally: + os.chdir(self.original_dir) + + def test_plugin_with_project_name_argument(self): + """--plugin with explicit project_name creates directory.""" + with tempfile.TemporaryDirectory() as tmp_dir: + os.chdir(tmp_dir) + try: + result = self.runner.invoke( + cli, ["init", 
"cool-plugin", "--plugin", "--yes"] + ) + assert result.exit_code == 0, result.output + + project_path = Path(tmp_dir) / "cool-plugin" + assert (project_path / "apm.yml").exists() + assert (project_path / "plugin.json").exists() + + with open(project_path / "plugin.json") as f: + data = json.load(f) + assert data["name"] == "cool-plugin" + finally: + os.chdir(self.original_dir) + + def test_plugin_json_ends_with_newline(self): + """plugin.json ends with a trailing newline.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + raw = Path("plugin.json").read_text() + assert raw.endswith("\n") + finally: + os.chdir(self.original_dir) + + def test_plugin_apm_yml_has_dependencies(self): + """apm.yml created with --plugin still has regular dependencies section.""" + with tempfile.TemporaryDirectory() as tmp_dir: + project_dir = Path(tmp_dir) / "my-plugin" + project_dir.mkdir() + os.chdir(project_dir) + try: + self.runner.invoke(cli, ["init", "--plugin", "--yes"]) + + with open("apm.yml") as f: + config = yaml.safe_load(f) + assert "dependencies" in config + assert config["dependencies"] == {"apm": [], "mcp": []} + finally: + os.chdir(self.original_dir) diff --git a/tests/unit/test_plugin_exporter.py b/tests/unit/test_plugin_exporter.py new file mode 100644 index 00000000..46d9be90 --- /dev/null +++ b/tests/unit/test_plugin_exporter.py @@ -0,0 +1,856 @@ +"""Unit tests for apm_cli.bundle.plugin_exporter.""" + +import json +import os +import tarfile +from pathlib import Path +from unittest.mock import patch + +import pytest +import yaml + +from apm_cli.bundle.plugin_exporter import ( + PackResult, + _collect_apm_components, + _collect_bare_skill, + _collect_hooks_from_apm, + _collect_hooks_from_root, + _collect_mcp, + _collect_root_plugin_components, + _deep_merge, + _get_dev_dependency_urls, + _merge_file_map, + 
_rename_prompt, + _update_plugin_json_paths, + _validate_output_rel, + export_plugin_bundle, +) +from apm_cli.deps.lockfile import LockFile, LockedDependency +from apm_cli.deps.plugin_parser import synthesize_plugin_json_from_apm_yml + + +# --------------------------------------------------------------------------- +# Helpers +# --------------------------------------------------------------------------- + + +def _write_apm_yml( + project: Path, + *, + name: str = "test-pkg", + version: str = "1.0.0", + extra: dict | None = None, +) -> Path: + """Write a minimal apm.yml and return its path.""" + data = {"name": name, "version": version} + if extra: + data.update(extra) + path = project / "apm.yml" + path.write_text(yaml.dump(data), encoding="utf-8") + return path + + +def _write_lockfile( + project: Path, + deps: list[LockedDependency] | None = None, +) -> Path: + lockfile = LockFile() + for d in deps or []: + lockfile.add_dependency(d) + lockfile.write(project / "apm.lock.yaml") + return project / "apm.lock.yaml" + + +def _make_apm_dir( + base: Path, + *, + agents: list[str] | None = None, + skills: dict[str, list[str]] | None = None, + prompts: list[str] | None = None, + instructions: list[str] | None = None, + commands: list[str] | None = None, +) -> Path: + """Create a .apm/ directory tree under *base* with given component files.""" + apm = base / ".apm" + apm.mkdir(parents=True, exist_ok=True) + + def _write_files(subdir, names): + d = apm / subdir + d.mkdir(parents=True, exist_ok=True) + for n in names: + (d / n).write_text(f"content of {n}", encoding="utf-8") + + if agents: + _write_files("agents", agents) + if skills: + for skill_name, files in skills.items(): + skill_dir = apm / "skills" / skill_name + skill_dir.mkdir(parents=True, exist_ok=True) + for fn in files: + (skill_dir / fn).write_text(f"content of {fn}", encoding="utf-8") + if prompts: + _write_files("prompts", prompts) + if instructions: + _write_files("instructions", instructions) + if commands: 
+        _write_files("commands", commands)
+    return apm
+
+
+def _setup_plugin_project(
+    tmp_path: Path,
+    *,
+    deps: list[LockedDependency] | None = None,
+    agents: list[str] | None = None,
+    skills: dict[str, list[str]] | None = None,
+    prompts: list[str] | None = None,
+    instructions: list[str] | None = None,
+    commands: list[str] | None = None,
+    apm_yml_extra: dict | None = None,
+    plugin_json: dict | None = None,
+) -> Path:
+    project = tmp_path / "project"
+    project.mkdir()
+    _write_apm_yml(project, extra=apm_yml_extra)
+    _write_lockfile(project, deps)
+    _make_apm_dir(
+        project,
+        agents=agents,
+        skills=skills,
+        prompts=prompts,
+        instructions=instructions,
+        commands=commands,
+    )
+    if plugin_json is not None:
+        (project / "plugin.json").write_text(
+            json.dumps(plugin_json), encoding="utf-8"
+        )
+    return project
+
+
+# ---------------------------------------------------------------------------
+# Unit tests: helpers
+# ---------------------------------------------------------------------------
+
+
+class TestValidateOutputRel:
+    def test_valid_paths(self):
+        assert _validate_output_rel("agents/a.md") is True
+        assert _validate_output_rel("commands/deep/b.md") is True
+
+    def test_rejects_traversal(self):
+        assert _validate_output_rel("../escape.md") is False
+        assert _validate_output_rel("agents/../../etc/passwd") is False
+
+    def test_rejects_absolute(self):
+        assert _validate_output_rel("/etc/passwd") is False
+
+
+class TestRenamePrompt:
+    def test_strips_prompt_infix(self):
+        assert _rename_prompt("foo.prompt.md") == "foo.md"
+
+    def test_preserves_plain_md(self):
+        assert _rename_prompt("foo.md") == "foo.md"
+
+    def test_preserves_non_md(self):
+        assert _rename_prompt("foo.txt") == "foo.txt"
+
+
+class TestDeepMerge:
+    def test_first_wins_by_default(self):
+        base = {"a": 1, "b": 2}
+        _deep_merge(base, {"a": 99, "c": 3})
+        assert base == {"a": 1, "b": 2, "c": 3}
+
+    def test_overwrite_mode(self):
+        base = {"a": 1, "b": 2}
+        _deep_merge(base, {"a": 99, 
"c": 3}, overwrite=True) + assert base == {"a": 99, "b": 2, "c": 3} + + def test_nested_first_wins(self): + base = {"hooks": {"preCommit": "old"}} + _deep_merge(base, {"hooks": {"preCommit": "new", "postCommit": "added"}}) + assert base == {"hooks": {"preCommit": "old", "postCommit": "added"}} + + def test_nested_overwrite(self): + base = {"hooks": {"preCommit": "old"}} + _deep_merge( + base, + {"hooks": {"preCommit": "new", "postCommit": "added"}}, + overwrite=True, + ) + assert base == {"hooks": {"preCommit": "new", "postCommit": "added"}} + + def test_depth_limit_raises(self): + """Deeply nested dicts beyond _MAX_MERGE_DEPTH raise ValueError.""" + from apm_cli.bundle.plugin_exporter import _MAX_MERGE_DEPTH + + # Build two dicts nested deeper than the limit with overlapping keys + # so _deep_merge actually recurses on every level + def _nested(depth: int) -> dict: + d = {"leaf": True} + for _ in range(depth): + d = {"k": d} + return d + + base = _nested(_MAX_MERGE_DEPTH + 5) + overlay = _nested(_MAX_MERGE_DEPTH + 5) + + with pytest.raises(ValueError, match="maximum nesting depth"): + _deep_merge(base, overlay) + + +# --------------------------------------------------------------------------- +# Unit tests: component collectors +# --------------------------------------------------------------------------- + + +class TestCollectApmComponents: + def test_agents(self, tmp_path): + _make_apm_dir(tmp_path, agents=["helper.agent.md"]) + comps = _collect_apm_components(tmp_path / ".apm") + assert any(r == "agents/helper.agent.md" for _, r in comps) + + def test_skills_preserve_structure(self, tmp_path): + _make_apm_dir(tmp_path, skills={"my-skill": ["SKILL.md", "lib.py"]}) + comps = _collect_apm_components(tmp_path / ".apm") + rels = {r for _, r in comps} + assert "skills/my-skill/SKILL.md" in rels + assert "skills/my-skill/lib.py" in rels + + def test_prompts_rename(self, tmp_path): + _make_apm_dir(tmp_path, prompts=["task.prompt.md", "plain.md"]) + comps = 
_collect_apm_components(tmp_path / ".apm") + rels = {r for _, r in comps} + assert "commands/task.md" in rels + assert "commands/plain.md" in rels + + def test_instructions(self, tmp_path): + _make_apm_dir(tmp_path, instructions=["rules.instructions.md"]) + comps = _collect_apm_components(tmp_path / ".apm") + assert any(r == "instructions/rules.instructions.md" for _, r in comps) + + def test_commands_passthrough(self, tmp_path): + _make_apm_dir(tmp_path, commands=["deploy.md"]) + comps = _collect_apm_components(tmp_path / ".apm") + assert any(r == "commands/deploy.md" for _, r in comps) + + def test_empty_apm_dir(self, tmp_path): + (tmp_path / ".apm").mkdir() + comps = _collect_apm_components(tmp_path / ".apm") + assert comps == [] + + def test_missing_apm_dir(self, tmp_path): + comps = _collect_apm_components(tmp_path / ".apm") + assert comps == [] + + def test_skips_symlinks(self, tmp_path): + apm = _make_apm_dir(tmp_path, agents=["real.agent.md"]) + link = apm / "agents" / "link.agent.md" + target = apm / "agents" / "real.agent.md" + try: + os.symlink(target, link) + except OSError: + pytest.skip("symlinks not supported") + comps = _collect_apm_components(tmp_path / ".apm") + rels = {r for _, r in comps} + assert "agents/link.agent.md" not in rels + assert "agents/real.agent.md" in rels + + +class TestCollectRootPluginComponents: + def test_root_agents(self, tmp_path): + (tmp_path / "agents").mkdir() + (tmp_path / "agents" / "bot.agent.md").write_text("x") + comps = _collect_root_plugin_components(tmp_path) + assert any(r == "agents/bot.agent.md" for _, r in comps) + + def test_ignores_nonexistent(self, tmp_path): + comps = _collect_root_plugin_components(tmp_path) + assert comps == [] + + +class TestCollectBareSkill: + """Tests for _collect_bare_skill — bare SKILL.md at dep root.""" + + def test_bare_skill_detected(self, tmp_path): + """A SKILL.md at root with no skills/ subdir is collected.""" + from apm_cli.bundle.plugin_exporter import _collect_bare_skill + 
+ (tmp_path / "SKILL.md").write_text("# My Skill") + (tmp_path / "LICENSE.txt").write_text("MIT") + dep = LockedDependency( + repo_url="owner/my-skill", + resolved_commit="abc123", + depth=1, + ) + out: list = [] + _collect_bare_skill(tmp_path, dep, out) + rel_paths = [r for _, r in out] + assert "skills/my-skill/SKILL.md" in rel_paths + assert "skills/my-skill/LICENSE.txt" in rel_paths + + def test_virtual_path_used_as_slug(self, tmp_path): + """virtual_path is preferred over repo_url for the skill slug.""" + from apm_cli.bundle.plugin_exporter import _collect_bare_skill + + (tmp_path / "SKILL.md").write_text("# Frontend") + dep = LockedDependency( + repo_url="github/awesome-copilot", + resolved_commit="abc123", + depth=1, + virtual_path="frontend-design", + is_virtual=True, + ) + out: list = [] + _collect_bare_skill(tmp_path, dep, out) + assert any(r.startswith("skills/frontend-design/") for _, r in out) + + def test_skips_when_no_skill_md(self, tmp_path): + """No SKILL.md at root means nothing collected.""" + from apm_cli.bundle.plugin_exporter import _collect_bare_skill + + (tmp_path / "README.md").write_text("hello") + dep = LockedDependency( + repo_url="owner/pkg", resolved_commit="abc", depth=1, + ) + out: list = [] + _collect_bare_skill(tmp_path, dep, out) + assert out == [] + + def test_skips_when_skills_already_collected(self, tmp_path): + """If skills/ was already collected via normal paths, bare skill is skipped.""" + from apm_cli.bundle.plugin_exporter import _collect_bare_skill + + (tmp_path / "SKILL.md").write_text("# Root skill") + dep = LockedDependency( + repo_url="owner/pkg", resolved_commit="abc", depth=1, + ) + out = [(tmp_path / "skills" / "sub" / "SKILL.md", "skills/sub/SKILL.md")] + _collect_bare_skill(tmp_path, dep, out) + # Should not add another entry + assert len(out) == 1 + + def test_excludes_apm_files(self, tmp_path): + """apm.yml, apm.lock.yaml, plugin.json are excluded from bare skill output.""" + from apm_cli.bundle.plugin_exporter 
import _collect_bare_skill
+
+    (tmp_path / "SKILL.md").write_text("# Skill")
+    (tmp_path / "apm.yml").write_text("name: x")
+    (tmp_path / "plugin.json").write_text("{}")
+    (tmp_path / "apm.lock.yaml").write_text("deps: []")
+    dep = LockedDependency(
+        repo_url="owner/pkg", resolved_commit="abc", depth=1,
+    )
+    out: list = []
+    _collect_bare_skill(tmp_path, dep, out)
+    rel_paths = [r for _, r in out]
+    assert "skills/pkg/SKILL.md" in rel_paths
+    assert not any("apm.yml" in r for r in rel_paths)
+    assert not any("plugin.json" in r for r in rel_paths)
+    assert not any("apm.lock.yaml" in r for r in rel_paths)
+
+
+# ---------------------------------------------------------------------------
+# Unit tests: hooks / MCP collection
+# ---------------------------------------------------------------------------
+
+
+class TestCollectHooks:
+    def test_from_apm_hooks_dir(self, tmp_path):
+        apm = tmp_path / ".apm"
+        hooks_dir = apm / "hooks"
+        hooks_dir.mkdir(parents=True)
+        (hooks_dir / "a.json").write_text(json.dumps({"preCommit": ["lint"]}))
+        result = _collect_hooks_from_apm(apm)
+        assert result == {"preCommit": ["lint"]}
+
+    def test_from_root_hooks_json(self, tmp_path):
+        (tmp_path / "hooks.json").write_text(json.dumps({"postPush": ["deploy"]}))
+        result = _collect_hooks_from_root(tmp_path)
+        assert result == {"postPush": ["deploy"]}
+
+    def test_invalid_json_skipped(self, tmp_path):
+        apm = tmp_path / ".apm"
+        hooks_dir = apm / "hooks"
+        hooks_dir.mkdir(parents=True)
+        (hooks_dir / "bad.json").write_text("not json")
+        result = _collect_hooks_from_apm(apm)
+        assert result == {}
+
+
+class TestCollectMcp:
+    def test_reads_mcp_servers(self, tmp_path):
+        (tmp_path / ".mcp.json").write_text(
+            json.dumps({"mcpServers": {"db": {"command": "db-server"}}})
+        )
+        result = _collect_mcp(tmp_path)
+        assert result == {"db": {"command": "db-server"}}
+
+    def test_missing_file(self, tmp_path):
+        assert _collect_mcp(tmp_path) == {}
+
+
+# ---------------------------------------------------------------------------
+# Unit tests: devDependencies filtering
+# ---------------------------------------------------------------------------
+
+
+class TestDevDependencyUrls:
+    def test_simple_list(self, tmp_path):
+        apm_yml = tmp_path / "apm.yml"
+        apm_yml.write_text(
+            yaml.dump({
+                "name": "test",
+                "version": "1.0.0",
+                "devDependencies": {"apm": ["owner/dev-tool", "other/helper"]},
+            })
+        )
+        urls = _get_dev_dependency_urls(apm_yml)
+        assert ("owner/dev-tool", "") in urls
+        assert ("other/helper", "") in urls
+
+    def test_virtual_path_preserved(self, tmp_path):
+        """Deps from the same repo but different virtual paths are distinct."""
+        apm_yml = tmp_path / "apm.yml"
+        apm_yml.write_text(
+            yaml.dump({
+                "name": "test",
+                "version": "1.0.0",
+                "devDependencies": {
+                    "apm": ["owner/repo/sub/dev-tool"]
+                },
+            })
+        )
+        keys = _get_dev_dependency_urls(apm_yml)
+        assert ("owner/repo", "sub/dev-tool") in keys
+        # The bare repo should NOT match
+        assert ("owner/repo", "") not in keys
+
+    def test_no_dev_deps(self, tmp_path):
+        apm_yml = tmp_path / "apm.yml"
+        apm_yml.write_text(yaml.dump({"name": "test", "version": "1.0.0"}))
+        assert _get_dev_dependency_urls(apm_yml) == set()
+
+    def test_missing_file(self, tmp_path):
+        assert _get_dev_dependency_urls(tmp_path / "missing.yml") == set()
+
+
+# ---------------------------------------------------------------------------
+# Unit tests: collision handling
+# ---------------------------------------------------------------------------
+
+
+class TestMergeFileMap:
+    def test_first_writer_wins_by_default(self, tmp_path):
+        f1 = tmp_path / "a.md"
+        f2 = tmp_path / "b.md"
+        f1.write_text("first")
+        f2.write_text("second")
+        file_map: dict = {}
+        collisions: list = []
+        _merge_file_map(file_map, [(f1, "agents/a.md")], "pkg-a", False, collisions)
+        _merge_file_map(file_map, [(f2, "agents/a.md")], "pkg-b", False, collisions)
+        assert file_map["agents/a.md"][0] == f1
+        assert len(collisions) == 1
+        assert "first writer wins" in collisions[0]
+
+    def test_force_last_writer_wins(self, tmp_path):
+        f1 = tmp_path / "a.md"
+        f2 = tmp_path / "b.md"
+        f1.write_text("first")
+        f2.write_text("second")
+        file_map: dict = {}
+        collisions: list = []
+        _merge_file_map(file_map, [(f1, "agents/a.md")], "pkg-a", True, collisions)
+        _merge_file_map(file_map, [(f2, "agents/a.md")], "pkg-b", True, collisions)
+        assert file_map["agents/a.md"][0] == f2
+        assert len(collisions) == 1
+        assert "last writer wins" in collisions[0]
+
+
+# ---------------------------------------------------------------------------
+# Unit tests: plugin.json synthesis
+# ---------------------------------------------------------------------------
+
+
+class TestSynthesizePluginJson:
+    def test_basic_synthesis(self, tmp_path):
+        _write_apm_yml(tmp_path, extra={"description": "A tool", "author": "Alice"})
+        result = synthesize_plugin_json_from_apm_yml(tmp_path / "apm.yml")
+        assert result["name"] == "test-pkg"
+        assert result["version"] == "1.0.0"
+        assert result["description"] == "A tool"
+        assert result["author"] == {"name": "Alice"}
+
+    def test_missing_name_raises(self, tmp_path):
+        (tmp_path / "apm.yml").write_text(yaml.dump({"version": "1.0.0"}))
+        with pytest.raises(ValueError, match="name"):
+            synthesize_plugin_json_from_apm_yml(tmp_path / "apm.yml")
+
+    def test_missing_file_raises(self, tmp_path):
+        with pytest.raises(FileNotFoundError):
+            synthesize_plugin_json_from_apm_yml(tmp_path / "nope.yml")
+
+    def test_license_included(self, tmp_path):
+        _write_apm_yml(tmp_path, extra={"license": "MIT"})
+        result = synthesize_plugin_json_from_apm_yml(tmp_path / "apm.yml")
+        assert result["license"] == "MIT"
+
+
+# ---------------------------------------------------------------------------
+# Unit tests: plugin.json path updating
+# ---------------------------------------------------------------------------
+
+
+class TestUpdatePluginJsonPaths:
+    def test_adds_present_directories(self):
+        pj = {"name": "test"}
+        files = ["agents/a.md", "commands/b.md"]
+        result = _update_plugin_json_paths(pj, files)
+        assert result["agents"] == ["agents/"]
+        assert result["commands"] == ["commands/"]
+        assert "skills" not in result
+
+    def test_removes_absent_directories(self):
+        pj = {"name": "test", "skills": ["skills/"]}
+        files = ["agents/a.md"]
+        result = _update_plugin_json_paths(pj, files)
+        assert "skills" not in result
+
+
+# ---------------------------------------------------------------------------
+# Integration tests: export_plugin_bundle
+# ---------------------------------------------------------------------------
+
+
+class TestExportPluginBundle:
+    def test_basic_export(self, tmp_path):
+        project = _setup_plugin_project(
+            tmp_path,
+            agents=["helper.agent.md"],
+            prompts=["task.prompt.md"],
+        )
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        assert result.bundle_path == out / "test-pkg-1.0.0"
+        assert result.bundle_path.exists()
+        assert (result.bundle_path / "agents" / "helper.agent.md").exists()
+        assert (result.bundle_path / "commands" / "task.md").exists()
+        assert (result.bundle_path / "plugin.json").exists()
+        # No APM artifacts in output
+        assert not (result.bundle_path / "apm.yml").exists()
+        assert not (result.bundle_path / "apm.lock.yaml").exists()
+        assert not (result.bundle_path / ".apm").exists()
+        assert not (result.bundle_path / "apm_modules").exists()
+
+    def test_uses_existing_plugin_json(self, tmp_path):
+        project = _setup_plugin_project(
+            tmp_path,
+            agents=["a.agent.md"],
+            plugin_json={"name": "custom-name", "version": "2.0.0"},
+        )
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        pj = json.loads((result.bundle_path / "plugin.json").read_text())
+        assert pj["name"] == "custom-name"
+        assert pj["version"] == "2.0.0"
+
+    def test_synthesizes_plugin_json_when_absent(self, tmp_path):
+        project = _setup_plugin_project(tmp_path, agents=["a.agent.md"])
+        out = tmp_path / "build"
+
+        with patch("apm_cli.bundle.plugin_exporter._rich_warning") as mock_warn:
+            result = export_plugin_bundle(project, out)
+
+        pj = json.loads((result.bundle_path / "plugin.json").read_text())
+        assert pj["name"] == "test-pkg"
+        # Warning emitted about synthesis
+        assert any("plugin.json" in str(c) for c in mock_warn.call_args_list)
+
+    def test_prompt_md_rename(self, tmp_path):
+        project = _setup_plugin_project(
+            tmp_path,
+            prompts=["do-thing.prompt.md", "plain.md"],
+        )
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        assert (result.bundle_path / "commands" / "do-thing.md").exists()
+        assert (result.bundle_path / "commands" / "plain.md").exists()
+        # The .prompt.md variant should NOT exist
+        assert not (result.bundle_path / "commands" / "do-thing.prompt.md").exists()
+
+    def test_skills_structure_preserved(self, tmp_path):
+        project = _setup_plugin_project(
+            tmp_path,
+            skills={"my-skill": ["SKILL.md"]},
+        )
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+        assert (result.bundle_path / "skills" / "my-skill" / "SKILL.md").exists()
+
+    def test_dry_run_no_output(self, tmp_path):
+        project = _setup_plugin_project(tmp_path, agents=["a.agent.md"])
+        out = tmp_path / "build"
+
+        result = export_plugin_bundle(project, out, dry_run=True)
+
+        assert not out.exists()
+        assert len(result.files) > 0
+        assert "plugin.json" in result.files
+
+    def test_archive(self, tmp_path):
+        project = _setup_plugin_project(tmp_path, agents=["a.agent.md"])
+        out = tmp_path / "build"
+
+        result = export_plugin_bundle(project, out, archive=True)
+
+        assert result.bundle_path.name == "test-pkg-1.0.0.tar.gz"
+        assert result.bundle_path.exists()
+        assert not (out / "test-pkg-1.0.0").exists()
+        with tarfile.open(result.bundle_path, "r:gz") as tar:
+            names = tar.getnames()
+            assert any("agent.md" in n for n in names)
+
+    def test_dependency_components_included(self, tmp_path):
+        project = _setup_plugin_project(tmp_path, agents=["own.agent.md"])
+
+        # Set up a dependency in apm_modules
+        dep = LockedDependency(repo_url="acme/tools", depth=1)
+        _write_lockfile(project, [dep])
+        dep_path = project / "apm_modules" / "acme" / "tools"
+        _make_apm_dir(dep_path, agents=["dep-agent.agent.md"])
+
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        assert (result.bundle_path / "agents" / "dep-agent.agent.md").exists()
+        assert (result.bundle_path / "agents" / "own.agent.md").exists()
+
+    def test_dev_dependency_excluded(self, tmp_path):
+        project = _setup_plugin_project(
+            tmp_path,
+            agents=["own.agent.md"],
+            apm_yml_extra={"devDependencies": {"apm": ["acme/dev-only"]}},
+        )
+
+        dep = LockedDependency(repo_url="acme/dev-only", depth=1)
+        _write_lockfile(project, [dep])
+        dep_path = project / "apm_modules" / "acme" / "dev-only"
+        _make_apm_dir(dep_path, agents=["dev-agent.agent.md"])
+
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        assert (result.bundle_path / "agents" / "own.agent.md").exists()
+        assert not (result.bundle_path / "agents" / "dev-agent.agent.md").exists()
+
+    def test_collision_first_wins(self, tmp_path):
+        project = _setup_plugin_project(tmp_path)
+
+        # Two deps with the same agent file
+        dep1 = LockedDependency(repo_url="acme/first", depth=1)
+        dep2 = LockedDependency(repo_url="acme/second", depth=1)
+        _write_lockfile(project, [dep1, dep2])
+
+        dep1_path = project / "apm_modules" / "acme" / "first"
+        _make_apm_dir(dep1_path, agents=["shared.agent.md"])
+        dep2_path = project / "apm_modules" / "acme" / "second"
+        _make_apm_dir(dep2_path, agents=["shared.agent.md"])
+
+        out = tmp_path / "build"
+        with patch("apm_cli.bundle.plugin_exporter._rich_warning"):
+            result = export_plugin_bundle(project, out)
+
+        content = (result.bundle_path / "agents" / "shared.agent.md").read_text()
+        assert "shared.agent.md" in content  # From dep1
+
+    def test_collision_force_last_wins(self, tmp_path):
+        project = _setup_plugin_project(tmp_path)
+
+        dep1 = LockedDependency(repo_url="acme/first", depth=1)
+        dep2 = LockedDependency(repo_url="acme/second", depth=1)
+        _write_lockfile(project, [dep1, dep2])
+
+        dep1_path = project / "apm_modules" / "acme" / "first"
+        agents1 = dep1_path / ".apm" / "agents"
+        agents1.mkdir(parents=True)
+        (agents1 / "shared.agent.md").write_text("from-first")
+
+        dep2_path = project / "apm_modules" / "acme" / "second"
+        agents2 = dep2_path / ".apm" / "agents"
+        agents2.mkdir(parents=True)
+        (agents2 / "shared.agent.md").write_text("from-second")
+
+        out = tmp_path / "build"
+        with patch("apm_cli.bundle.plugin_exporter._rich_warning"):
+            result = export_plugin_bundle(project, out, force=True)
+
+        content = (result.bundle_path / "agents" / "shared.agent.md").read_text()
+        assert content == "from-second"
+
+    def test_hooks_merged(self, tmp_path):
+        project = _setup_plugin_project(tmp_path)
+
+        # Root hooks
+        root_hooks_dir = project / ".apm" / "hooks"
+        root_hooks_dir.mkdir(parents=True, exist_ok=True)
+        (root_hooks_dir / "hooks.json").write_text(
+            json.dumps({"preCommit": ["root-lint"]})
+        )
+
+        # Dep hooks
+        dep = LockedDependency(repo_url="acme/hooks-pkg", depth=1)
+        _write_lockfile(project, [dep])
+        dep_path = project / "apm_modules" / "acme" / "hooks-pkg"
+        dep_hooks_dir = dep_path / ".apm" / "hooks"
+        dep_hooks_dir.mkdir(parents=True)
+        (dep_hooks_dir / "hooks.json").write_text(
+            json.dumps({"preCommit": ["dep-lint"], "postPush": ["deploy"]})
+        )
+
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        hooks = json.loads((result.bundle_path / "hooks.json").read_text())
+        # Root wins on key collision
+        assert hooks["preCommit"] == ["root-lint"]
+        # Dep-only key preserved
+        assert hooks["postPush"] == ["deploy"]
+
+    def test_mcp_merged(self, tmp_path):
+        project = _setup_plugin_project(tmp_path)
+
+        # Root MCP
+        (project / ".mcp.json").write_text(
+            json.dumps({"mcpServers": {"root-db": {"command": "root-server"}}})
+        )
+
+        # Dep MCP
+        dep = LockedDependency(repo_url="acme/mcp-pkg", depth=1)
+        _write_lockfile(project, [dep])
+        dep_path = project / "apm_modules" / "acme" / "mcp-pkg"
+        dep_path.mkdir(parents=True)
+        (dep_path / ".mcp.json").write_text(
+            json.dumps(
+                {
+                    "mcpServers": {
+                        "root-db": {"command": "dep-server"},
+                        "dep-only": {"command": "extra"},
+                    }
+                }
+            )
+        )
+
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        mcp = json.loads((result.bundle_path / ".mcp.json").read_text())
+        # Root wins on name collision
+        assert mcp["mcpServers"]["root-db"]["command"] == "root-server"
+        # Dep-only server preserved
+        assert mcp["mcpServers"]["dep-only"]["command"] == "extra"
+
+    def test_empty_project(self, tmp_path):
+        project = _setup_plugin_project(tmp_path)
+        out = tmp_path / "build"
+
+        result = export_plugin_bundle(project, out)
+
+        assert result.bundle_path.exists()
+        assert (result.bundle_path / "plugin.json").exists()
+
+    def test_no_lockfile_still_exports(self, tmp_path):
+        project = tmp_path / "project"
+        project.mkdir()
+        _write_apm_yml(project)
+        (project / ".apm").mkdir()
+
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        assert result.bundle_path.exists()
+        assert (result.bundle_path / "plugin.json").exists()
+
+    def test_security_scan_warns(self, tmp_path):
+        project = _setup_plugin_project(tmp_path, agents=["sneaky.agent.md"])
+        # Inject hidden Unicode
+        sneaky = project / ".apm" / "agents" / "sneaky.agent.md"
+        sneaky.write_text("Hello \U000E0001 world", encoding="utf-8")
+
+        out = tmp_path / "build"
+        with patch("apm_cli.bundle.plugin_exporter._rich_warning") as mock_warn:
+            result = export_plugin_bundle(project, out)
+
+        assert result.bundle_path.exists()
+        assert any("hidden character" in str(c) for c in mock_warn.call_args_list)
+
+    def test_plugin_json_updated_with_component_dirs(self, tmp_path):
+        project = _setup_plugin_project(
+            tmp_path,
+            agents=["a.agent.md"],
+            skills={"s1": ["SKILL.md"]},
+            plugin_json={"name": "custom"},
+        )
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+
+        pj = json.loads((result.bundle_path / "plugin.json").read_text())
+        assert pj["agents"] == ["agents/"]
+        assert pj["skills"] == ["skills/"]
+
+    def test_root_level_plugin_dirs_collected(self, tmp_path):
+        """Root-level agents/ commands/ etc. are picked up for plugin-native repos."""
+        project = _setup_plugin_project(tmp_path)
+        # Create root-level agents dir (no .apm/)
+        root_agents = project / "agents"
+        root_agents.mkdir()
+        (root_agents / "root-bot.agent.md").write_text("root bot")
+
+        out = tmp_path / "build"
+        result = export_plugin_bundle(project, out)
+        assert (result.bundle_path / "agents" / "root-bot.agent.md").exists()
+
+
+class TestExportPluginBundleViaPackBundle:
+    """Verify pack_bundle(fmt='plugin') delegates correctly."""
+
+    def test_fmt_plugin_delegates(self, tmp_path):
+        from apm_cli.bundle.packer import pack_bundle
+
+        project = _setup_plugin_project(tmp_path, agents=["a.agent.md"])
+        out = tmp_path / "build"
+
+        result = pack_bundle(project, out, fmt="plugin")
+
+        assert (result.bundle_path / "plugin.json").exists()
+        assert (result.bundle_path / "agents" / "a.agent.md").exists()
+
+    def test_force_flag_passed_through(self, tmp_path):
+        from apm_cli.bundle.packer import pack_bundle
+
+        project = _setup_plugin_project(tmp_path)
+        dep1 = LockedDependency(repo_url="acme/first", depth=1)
+        dep2 = LockedDependency(repo_url="acme/second", depth=1)
+        _write_lockfile(project, [dep1, dep2])
+
+        dep1_path = project / "apm_modules" / "acme" / "first"
+        a1 = dep1_path / ".apm" / "agents"
+        a1.mkdir(parents=True)
+        (a1 / "shared.agent.md").write_text("from-first")
+
+        dep2_path = project / "apm_modules" / "acme" / "second"
+        a2 = dep2_path / ".apm" / "agents"
+        a2.mkdir(parents=True)
+        (a2 / "shared.agent.md").write_text("from-second")
+
+        out = tmp_path / "build"
+        with patch("apm_cli.bundle.plugin_exporter._rich_warning"):
+            result = pack_bundle(project, out, fmt="plugin", force=True)
+
+        content = (result.bundle_path / "agents" / "shared.agent.md").read_text()
+        assert content == "from-second"
diff --git a/tests/unit/test_plugin_synthesis.py b/tests/unit/test_plugin_synthesis.py
new file mode 100644
index 00000000..b4f5207b
--- /dev/null
+++ b/tests/unit/test_plugin_synthesis.py
@@ -0,0 +1,167 @@
+"""Unit tests for synthesize_plugin_json_from_apm_yml.
+
+Focused test suite for the plugin.json synthesis from apm.yml identity fields.
+"""
+
+from pathlib import Path
+
+import pytest
+import yaml
+
+from apm_cli.deps.plugin_parser import synthesize_plugin_json_from_apm_yml
+
+
+def _write_apm_yml(tmp_path: Path, data: dict) -> Path:
+    """Write an apm.yml file and return its path."""
+    path = tmp_path / "apm.yml"
+    path.write_text(yaml.dump(data), encoding="utf-8")
+    return path
+
+
+class TestPluginJsonSynthesis:
+    """Tests for synthesize_plugin_json_from_apm_yml."""
+
+    def test_basic_synthesis(self, tmp_path):
+        """Synthesizes plugin.json with mapped fields."""
+        yml = _write_apm_yml(tmp_path, {
+            "name": "my-plugin",
+            "version": "1.0.0",
+            "description": "A cool plugin",
+            "author": "Jane Doe",
+            "license": "MIT",
+        })
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert result["name"] == "my-plugin"
+        assert result["version"] == "1.0.0"
+        assert result["description"] == "A cool plugin"
+        assert result["author"] == {"name": "Jane Doe"}
+        assert result["license"] == "MIT"
+
+    def test_author_string_to_object(self, tmp_path):
+        """Author string maps to {name: string} object."""
+        yml = _write_apm_yml(tmp_path, {
+            "name": "test",
+            "version": "1.0.0",
+            "author": "John Smith",
+        })
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert result["author"] == {"name": "John Smith"}
+        assert isinstance(result["author"], dict)
+
+    def test_author_numeric_coerced_to_string(self, tmp_path):
+        """Numeric author values are coerced to strings."""
+        yml = _write_apm_yml(tmp_path, {
+            "name": "test",
+            "version": "1.0.0",
+            "author": 42,
+        })
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert result["author"] == {"name": "42"}
+
+    def test_missing_name_raises(self, tmp_path):
+        """Missing name in apm.yml raises ValueError."""
+        yml = _write_apm_yml(tmp_path, {"version": "1.0.0"})
+
+        with pytest.raises(ValueError, match="name"):
+            synthesize_plugin_json_from_apm_yml(yml)
+
+    def test_empty_name_raises(self, tmp_path):
+        """Empty string name raises ValueError."""
+        yml = _write_apm_yml(tmp_path, {"name": "", "version": "1.0.0"})
+
+        with pytest.raises(ValueError, match="name"):
+            synthesize_plugin_json_from_apm_yml(yml)
+
+    def test_optional_fields_omitted_if_missing(self, tmp_path):
+        """Optional fields (description, license, author) not in output if missing from apm.yml."""
+        yml = _write_apm_yml(tmp_path, {
+            "name": "minimal-pkg",
+            "version": "1.0.0",
+        })
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert result["name"] == "minimal-pkg"
+        assert result["version"] == "1.0.0"
+        assert "description" not in result
+        assert "author" not in result
+        assert "license" not in result
+
+    def test_version_omitted_if_missing(self, tmp_path):
+        """Version is optional in output when absent from apm.yml."""
+        yml = _write_apm_yml(tmp_path, {"name": "no-version"})
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert result["name"] == "no-version"
+        assert "version" not in result
+
+    def test_file_not_found_raises(self, tmp_path):
+        """Non-existent file raises FileNotFoundError."""
+        with pytest.raises(FileNotFoundError):
+            synthesize_plugin_json_from_apm_yml(tmp_path / "nonexistent.yml")
+
+    def test_invalid_yaml_raises(self, tmp_path):
+        """Invalid YAML raises ValueError."""
+        bad = tmp_path / "apm.yml"
+        bad.write_text("{{invalid: yaml: [", encoding="utf-8")
+
+        with pytest.raises(ValueError, match="Invalid YAML"):
+            synthesize_plugin_json_from_apm_yml(bad)
+
+    def test_non_dict_yaml_raises(self, tmp_path):
+        """YAML that is a list instead of dict raises ValueError."""
+        bad = tmp_path / "apm.yml"
+        bad.write_text("- item1\n- item2\n", encoding="utf-8")
+
+        with pytest.raises(ValueError, match="name"):
+            synthesize_plugin_json_from_apm_yml(bad)
+
+    def test_license_without_author(self, tmp_path):
+        """License can be present without author."""
+        yml = _write_apm_yml(tmp_path, {
+            "name": "test",
+            "version": "1.0.0",
+            "license": "Apache-2.0",
+        })
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert result["license"] == "Apache-2.0"
+        assert "author" not in result
+
+    def test_all_fields_present(self, tmp_path):
+        """All supported fields are mapped correctly."""
+        yml = _write_apm_yml(tmp_path, {
+            "name": "full-pkg",
+            "version": "2.1.0",
+            "description": "Full package",
+            "author": "Acme Corp",
+            "license": "ISC",
+        })
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert set(result.keys()) == {"name", "version", "description", "author", "license"}
+
+    def test_extra_apm_fields_ignored(self, tmp_path):
+        """Fields not part of plugin spec (dependencies, scripts) are not in output."""
+        yml = _write_apm_yml(tmp_path, {
+            "name": "test",
+            "version": "1.0.0",
+            "dependencies": {"apm": ["owner/repo"]},
+            "scripts": {"build": "echo hi"},
+            "target": "vscode",
+        })
+
+        result = synthesize_plugin_json_from_apm_yml(yml)
+
+        assert "dependencies" not in result
+        assert "scripts" not in result
+        assert "target" not in result