diff --git a/docs/adr/26903-frontmatter-redirect-support-for-workflow-updates.md b/docs/adr/26903-frontmatter-redirect-support-for-workflow-updates.md new file mode 100644 index 00000000000..3f18ae02985 --- /dev/null +++ b/docs/adr/26903-frontmatter-redirect-support-for-workflow-updates.md @@ -0,0 +1,92 @@ +# ADR-26903: Frontmatter Redirect Support for Workflow Updates + +**Date**: 2026-04-17 +**Status**: Draft +**Deciders**: pelikhan, Copilot + +--- + +## Part 1 — Narrative (Human-Friendly) + +### Context + +Agentic workflows installed via `gh aw update` track their upstream source using a `source` frontmatter field. When a workflow is renamed, relocated, or replaced by a newer version in the upstream repository, consumers have no automated way to discover the new location — the next `gh aw update` continues fetching the old (potentially stale or deleted) path. The system needs a lightweight mechanism for upstream workflow authors to declare that a workflow has moved, and for the update command to transparently follow that declaration without requiring manual consumer intervention. + +### Decision + +We will introduce a `redirect` frontmatter field in the workflow schema that points to the new canonical location of a workflow. When `gh aw update` encounters a workflow whose upstream content declares a `redirect`, it will follow the redirect chain (up to a depth of 20), rewrite the local `source` field to the resolved destination, and disable 3-way merge for the hop to avoid spurious conflicts. A `--no-redirect` flag is added so operators who require explicit control over source changes can refuse any update that would silently relocate a workflow. + +### Alternatives Considered + +#### Alternative 1: Out-of-band redirect registry + +Maintain a central registry (e.g., a JSON file in the upstream repo or a service endpoint) that maps old workflow paths to new ones. The update command would consult this registry before fetching. 
This was not chosen because it requires additional infrastructure, creates a coupling to a registry format that all upstream repos must adopt, and does not compose well with forks or private repositories. + +#### Alternative 2: HTTP-style redirect at the content-hosting layer + +Rely on GitHub repository redirects (e.g., redirecting the raw content URL via a GitHub redirect when a file is moved). This was not chosen because GitHub does not provide automatic content-level HTTP redirects for individual file paths within a repository, and even if it did, the redirect would lose the semantic information needed to update the `source` field in the local workflow file. + +#### Alternative 3: Require manual consumer re-pinning + +Document that when a workflow moves, consumers must manually run `gh aw add` with the new location and delete the old file. This was not chosen because it places the burden on every consumer rather than the upstream author, and silent staleness is a worse outcome than a transparent automated redirect. + +### Consequences + +#### Positive +- Upstream workflow authors can declare a move once, and all consumers transparently follow it on the next `gh aw update` run. +- The `source` field in consumer files is automatically rewritten to the resolved canonical location, keeping provenance accurate. +- Redirect loops (A → B → A) are detected and reported rather than spinning indefinitely. +- The `--no-redirect` flag gives security- or stability-conscious operators an explicit opt-out with a clear error message. +- The compiler emits an informational message when compiling a workflow that has a `redirect` configured, making the stub status visible during local development. + +#### Negative +- Redirect chains are followed silently at update time; consumers may not notice that the source of their workflow has changed unless they inspect the diff. 
+- Disabling 3-way merge on redirect hops means local customizations to a redirected workflow will be overwritten on the first update after a redirect is followed. +- The maximum redirect depth (20) is an arbitrary constant; very long chains will fail rather than succeed. +- Adding `noRedirect bool` to the already-long parameter list of `UpdateWorkflows` / `updateWorkflow` / `RunUpdateWorkflows` increases function arity and slightly complicates call sites. + +#### Neutral +- The `redirect` field is defined in the JSON schema for workflow frontmatter, so existing schema validation tooling will recognize it without special-casing. +- The `FrontmatterConfig` struct gains a `Redirect` field that is serialized/deserialized symmetrically with the existing `Source` field. +- No changes are required to lock file format or compilation output; the redirect field is used only at update time and during compiler validation messages. + +--- + +## Part 2 — Normative Specification (RFC 2119) + +> The key words **MUST**, **MUST NOT**, **REQUIRED**, **SHALL**, **SHALL NOT**, **SHOULD**, **SHOULD NOT**, **RECOMMENDED**, **MAY**, and **OPTIONAL** in this section are to be interpreted as described in [RFC 2119](https://www.rfc-editor.org/rfc/rfc2119). + +### Redirect Field + +1. The `redirect` field **MUST** be a string when present in workflow frontmatter; non-string values **MUST** cause the update to fail with a descriptive error. +2. The `redirect` field value **MUST** be either a workflow source spec (`owner/repo/path@ref`) or a GitHub blob URL (`https://github.com/owner/repo/blob/ref/path`); other formats **MUST** be rejected with a parse error. +3. The `redirect` field **MUST NOT** point to a local path or a non-remote location; redirect targets **MUST** resolve to a remote repository slug. + +### Redirect Chain Resolution + +1. Implementations **MUST** resolve redirect chains iteratively, following each `redirect` field until a workflow without a redirect is reached. +2. 
Implementations **MUST** detect redirect loops using a visited-location set and **MUST** return an error when a previously visited location is encountered. +3. Implementations **MUST NOT** follow more than 20 redirect hops; exceeding this limit **MUST** result in an error. +4. Implementations **MUST** rewrite the local `source` field to the fully resolved final location after following a redirect chain. +5. When a redirect is followed, implementations **MUST** disable 3-way merge for that update and override the local file with the redirected content. +6. Implementations **SHOULD** emit a warning message to stderr for each redirect hop followed, naming the source and destination locations. + +### `--no-redirect` Flag + +1. When `--no-redirect` is specified, implementations **MUST** refuse any update where the upstream workflow declares a `redirect` field, and **MUST** return a non-zero exit code with an explanatory error message. +2. The error message **MUST** identify the workflow name, the current upstream location, and the redirect target so the operator understands what redirect was refused. +3. When `--no-redirect` is not specified, redirect following **MUST** be the default behavior. + +### Compiler Behavior + +1. The compiler **MUST** emit an informational message to stderr when compiling a workflow whose frontmatter contains a non-empty `redirect` field. +2. The informational message **MUST** include the redirect target value so developers are aware the compiled file is a redirect stub. +3. The presence of a `redirect` field **MUST NOT** cause compilation to fail; it is advisory metadata only. + +### Conformance + +An implementation is considered conformant with this ADR if it satisfies all **MUST** and **MUST NOT** requirements above. Failure to meet any **MUST** or **MUST NOT** requirement constitutes non-conformance. + +--- + +*This is a DRAFT ADR generated by the [Design Decision Gate](https://github.com/github/gh-aw/actions/runs/24575079707) workflow. 
The PR author must review, complete, and finalize this document before the PR can merge.* diff --git a/pkg/cli/compile_safe_update_integration_test.go b/pkg/cli/compile_safe_update_integration_test.go index 6c253e9a85f..b2f052794f9 100644 --- a/pkg/cli/compile_safe_update_integration_test.go +++ b/pkg/cli/compile_safe_update_integration_test.go @@ -78,6 +78,21 @@ engine: copilot Test workflow that uses only GITHUB_TOKEN. ` +const safeUpdateWorkflowWithRedirect = `--- +name: Safe Update Redirect Test +on: + workflow_dispatch: +permissions: + contents: read +engine: copilot +redirect: owner/repo/workflows/new-location.md@main +--- + +# Safe Update Redirect Test + +Test workflow that declares frontmatter redirect. +` + // manifestWithAPISecret is a minimal lock file content containing a gh-aw-manifest // that pre-approves MY_API_SECRET. Writing this to the lock file path // before compilation simulates a workflow that was previously compiled and approved. @@ -154,6 +169,34 @@ func TestSafeUpdateFirstCompileCreatesBaselineForActions(t *testing.T) { t.Logf("First compile correctly emitted warnings for new action.\nOutput:\n%s", outputStr) } +// TestSafeUpdateFirstCompileCreatesBaselineForRedirect verifies that adding a +// frontmatter redirect is surfaced by safe update enforcement so it is reviewed. 
+func TestSafeUpdateFirstCompileCreatesBaselineForRedirect(t *testing.T) { + setup := setupIntegrationTest(t) + defer setup.cleanup() + + workflowPath := filepath.Join(setup.workflowsDir, "safe-update-redirect.md") + require.NoError(t, os.WriteFile(workflowPath, []byte(safeUpdateWorkflowWithRedirect), 0o644), + "should write workflow file") + + cmd := exec.Command(setup.binaryPath, "compile", workflowPath) + cmd.Env = append(os.Environ(), "GH_AW_ACTION_MODE=release") + output, err := cmd.CombinedOutput() + outputStr := string(output) + + assert.NoError(t, err, "first compile should succeed (warnings don't fail the build)\nOutput:\n%s", outputStr) + assert.Contains(t, outputStr, "SECURITY REVIEW REQUIRED", + "first compile should emit safe update warnings for redirect changes") + assert.Contains(t, outputStr, "New redirect configured", + "warning should include redirect additions for security review") + + lockFilePath := filepath.Join(setup.workflowsDir, "safe-update-redirect.lock.yml") + lockContent, readErr := os.ReadFile(lockFilePath) + require.NoError(t, readErr, "should read lock file after first compile") + assert.Contains(t, string(lockContent), `"redirect":"owner/repo/workflows/new-location.md@main"`, + "manifest should include redirect for future safe-update comparisons") +} + // TestSafeUpdateAllowsKnownSecretWithPriorManifest verifies that safe update // enforcement allows a compilation when the secret is already recorded in the // prior manifest embedded in the existing lock file. 
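The resolution behavior the ADR's normative section mandates (iterative chain following, loop detection via a visited set, and the 20-hop depth cap) can be sketched as a small self-contained Go program. The `resolve` helper and its `map[string]string` stand-in for fetching upstream frontmatter are hypothetical and for illustration only; the actual implementation is `resolveRedirectedUpdateLocation` in `pkg/cli/update_redirects.go` below, which resolves refs and downloads content for real.

```go
package main

import (
	"errors"
	"fmt"
)

// Mirrors the ADR's maximum redirect depth.
const maxRedirectDepth = 20

var (
	errRedirectLoop = errors.New("redirect loop detected")
	errTooManyHops  = errors.New("redirect chain exceeded maximum depth")
)

// resolve follows redirects in a map of location -> redirect target,
// where an absent or empty target means "no redirect declared".
// The map is a stand-in for fetching and parsing upstream frontmatter.
func resolve(start string, redirects map[string]string) (string, []string, error) {
	visited := map[string]struct{}{}
	var history []string
	current := start
	for i := 0; i < maxRedirectDepth; i++ {
		// Loop detection: revisiting a location is a hard error.
		if _, seen := visited[current]; seen {
			return "", history, fmt.Errorf("%w at %s", errRedirectLoop, current)
		}
		visited[current] = struct{}{}
		next, ok := redirects[current]
		if !ok || next == "" {
			// Terminal workflow: no redirect declared, resolution is done.
			return current, history, nil
		}
		history = append(history, fmt.Sprintf("%s -> %s", current, next))
		current = next
	}
	return "", history, errTooManyHops
}

func main() {
	final, hops, err := resolve("owner/repo/a.md@main", map[string]string{
		"owner/repo/a.md@main": "owner/repo/b.md@main",
	})
	fmt.Println(final, len(hops), err)
	// prints: owner/repo/b.md@main 1 <nil>
}
```

In the real command the terminal case also rewrites the local `source` field and disables 3-way merge for the hop, which this sketch deliberately omits.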
diff --git a/pkg/cli/update_command.go b/pkg/cli/update_command.go index 02487e4649c..cb305d3da02 100644 --- a/pkg/cli/update_command.go +++ b/pkg/cli/update_command.go @@ -46,6 +46,7 @@ Examples: ` + string(constants.CLIExtensionPrefix) + ` update --force # Force update even if no changes ` + string(constants.CLIExtensionPrefix) + ` update --disable-release-bump # Update without force-bumping all action versions ` + string(constants.CLIExtensionPrefix) + ` update --no-compile # Update without regenerating lock files + ` + string(constants.CLIExtensionPrefix) + ` update --no-redirect # Refuse workflows that use redirect frontmatter ` + string(constants.CLIExtensionPrefix) + ` update --dir custom/workflows # Update workflows in custom directory ` + string(constants.CLIExtensionPrefix) + ` update --create-pull-request # Update and open a pull request`, RunE: func(cmd *cobra.Command, args []string) error { @@ -59,6 +60,7 @@ Examples: noMergeFlag, _ := cmd.Flags().GetBool("no-merge") disableReleaseBump, _ := cmd.Flags().GetBool("disable-release-bump") noCompile, _ := cmd.Flags().GetBool("no-compile") + noRedirect, _ := cmd.Flags().GetBool("no-redirect") createPRFlag, _ := cmd.Flags().GetBool("create-pull-request") prFlagAlias, _ := cmd.Flags().GetBool("pr") createPR := createPRFlag || prFlagAlias @@ -73,7 +75,7 @@ Examples: } } - if err := RunUpdateWorkflows(cmd.Context(), args, majorFlag, forceFlag, verbose, engineOverride, workflowDir, noStopAfter, stopAfter, noMergeFlag, disableReleaseBump, noCompile); err != nil { + if err := RunUpdateWorkflows(cmd.Context(), args, majorFlag, forceFlag, verbose, engineOverride, workflowDir, noStopAfter, stopAfter, noMergeFlag, disableReleaseBump, noCompile, noRedirect); err != nil { return err } @@ -96,6 +98,7 @@ Examples: cmd.Flags().Bool("no-merge", false, "Override local changes with upstream version instead of merging") cmd.Flags().Bool("disable-release-bump", false, "Disable automatic major version bumps for all actions (only 
core actions/* are force-updated)") cmd.Flags().Bool("no-compile", false, "Skip recompiling workflows (do not modify lock files)") + cmd.Flags().Bool("no-redirect", false, "Refuse updates when redirect frontmatter is present") cmd.Flags().Bool("create-pull-request", false, "Create a pull request with the update changes") cmd.Flags().Bool("pr", false, "Alias for --create-pull-request") _ = cmd.Flags().MarkHidden("pr") // Hide the short alias from help output @@ -110,12 +113,12 @@ Examples: // RunUpdateWorkflows updates workflows from their source repositories. // Each workflow is compiled immediately after update. -func RunUpdateWorkflows(ctx context.Context, workflowNames []string, allowMajor, force, verbose bool, engineOverride string, workflowsDir string, noStopAfter bool, stopAfter string, noMerge bool, disableReleaseBump bool, noCompile bool) error { - updateLog.Printf("Starting update process: workflows=%v, allowMajor=%v, force=%v, noMerge=%v, disableReleaseBump=%v, noCompile=%v", workflowNames, allowMajor, force, noMerge, disableReleaseBump, noCompile) +func RunUpdateWorkflows(ctx context.Context, workflowNames []string, allowMajor, force, verbose bool, engineOverride string, workflowsDir string, noStopAfter bool, stopAfter string, noMerge bool, disableReleaseBump bool, noCompile bool, noRedirect bool) error { + updateLog.Printf("Starting update process: workflows=%v, allowMajor=%v, force=%v, noMerge=%v, disableReleaseBump=%v, noCompile=%v, noRedirect=%v", workflowNames, allowMajor, force, noMerge, disableReleaseBump, noCompile, noRedirect) var firstErr error - if err := UpdateWorkflows(ctx, workflowNames, allowMajor, force, verbose, engineOverride, workflowsDir, noStopAfter, stopAfter, noMerge, noCompile); err != nil { + if err := UpdateWorkflows(ctx, workflowNames, allowMajor, force, verbose, engineOverride, workflowsDir, noStopAfter, stopAfter, noMerge, noCompile, noRedirect); err != nil { firstErr = fmt.Errorf("workflow update failed: %w", err) } diff 
--git a/pkg/cli/update_command_test.go b/pkg/cli/update_command_test.go index 07113e4b167..4aac6b3143c 100644 --- a/pkg/cli/update_command_test.go +++ b/pkg/cli/update_command_test.go @@ -980,7 +980,7 @@ func TestRunUpdateWorkflows_NoSourceWorkflows(t *testing.T) { os.Chdir(tmpDir) // Running update with no source workflows should succeed with an info message, not an error - err := RunUpdateWorkflows(context.Background(), nil, false, false, false, "", "", false, "", false, false, false) + err := RunUpdateWorkflows(context.Background(), nil, false, false, false, "", "", false, "", false, false, false, false) assert.NoError(t, err, "Should not error when no workflows with source field exist") } @@ -996,7 +996,7 @@ func TestRunUpdateWorkflows_SpecificWorkflowNotFound(t *testing.T) { os.Chdir(tmpDir) // Running update with a specific name that doesn't exist should fail - err := RunUpdateWorkflows(context.Background(), []string{"nonexistent"}, false, false, false, "", "", false, "", false, false, false) + err := RunUpdateWorkflows(context.Background(), []string{"nonexistent"}, false, false, false, "", "", false, "", false, false, false, false) require.Error(t, err, "Should error when specified workflow not found") assert.Contains(t, err.Error(), "no workflows found matching the specified names") } diff --git a/pkg/cli/update_integration_test.go b/pkg/cli/update_integration_test.go index 5fc693daad9..30a5d3cbfdf 100644 --- a/pkg/cli/update_integration_test.go +++ b/pkg/cli/update_integration_test.go @@ -174,6 +174,21 @@ func TestUpdateCommand_NoMergeFlag(t *testing.T) { assert.Contains(t, outputStr, "no workflows found", "Should report no workflows found") } +// TestUpdateCommand_NoRedirectFlag verifies that --no-redirect flag is recognized. 
+func TestUpdateCommand_NoRedirectFlag(t *testing.T) { + setup := setupUpdateIntegrationTest(t) + defer setup.cleanup() + + cmd := exec.Command(setup.binaryPath, "update", "--no-redirect", "--verbose") + cmd.Dir = setup.tempDir + output, err := cmd.CombinedOutput() + outputStr := string(output) + + assert.NoError(t, err, "Should succeed (no source workflows = info message, not error), output: %s", outputStr) + assert.NotContains(t, outputStr, "unknown flag", "The --no-redirect flag should be recognized") + assert.Contains(t, outputStr, "no workflows found", "Should report no workflows found") +} + // TestUpdateCommand_RemovedFlags verifies that old flags are no longer accepted. func TestUpdateCommand_RemovedFlags(t *testing.T) { setup := setupUpdateIntegrationTest(t) @@ -208,6 +223,7 @@ func TestUpdateCommand_HelpText(t *testing.T) { // Should mention merge behavior assert.Contains(t, outputStr, "no-merge", "Help should document --no-merge flag") + assert.Contains(t, outputStr, "no-redirect", "Help should document --no-redirect flag") assert.Contains(t, outputStr, "3-way merge", "Help should explain merge behavior") // Should reference upgrade for other features diff --git a/pkg/cli/update_merge.go b/pkg/cli/update_merge.go index f86848c83f2..17c9ec2cbe0 100644 --- a/pkg/cli/update_merge.go +++ b/pkg/cli/update_merge.go @@ -88,8 +88,8 @@ func hasLocalModifications(sourceContent, localContent, sourceSpec, localWorkflo // localWorkflowPath is the filesystem path of the local workflow file being updated; // when non-empty its directory is used to preserve relative import paths whose files // exist locally rather than rewriting them to cross-repo references. 
-func MergeWorkflowContent(base, current, new, oldSourceSpec, newRef, localWorkflowPath string, verbose bool) (string, bool, error) { - updateMergeLog.Printf("Starting 3-way merge: old_ref=%s, new_ref=%s", oldSourceSpec, newRef) +func MergeWorkflowContent(base, current, new, oldSourceSpec, newRefOrSourceSpec, localWorkflowPath string, verbose bool) (string, bool, error) { + updateMergeLog.Printf("Starting 3-way merge: old_source=%s, new_ref_or_source=%s", oldSourceSpec, newRefOrSourceSpec) if verbose { fmt.Fprintln(os.Stderr, console.FormatVerboseMessage("Performing 3-way merge using git merge-file")) @@ -103,6 +103,17 @@ func MergeWorkflowContent(base, current, new, oldSourceSpec, newRef, localWorkfl } currentSourceSpec := fmt.Sprintf("%s/%s@%s", sourceSpec.Repo, sourceSpec.Path, sourceSpec.Ref) + // Support both legacy ref-only and full source spec for the merge target. + newSourceSpec := fmt.Sprintf("%s/%s@%s", sourceSpec.Repo, sourceSpec.Path, newRefOrSourceSpec) + if tentativeSourceSpec, parseErr := parseSourceSpec(newRefOrSourceSpec); parseErr == nil { + newSourceSpec = sourceSpecWithRef(tentativeSourceSpec, tentativeSourceSpec.Ref) + } + parsedNewSourceSpec, err := parseSourceSpec(newSourceSpec) + if err != nil { + return "", false, fmt.Errorf("failed to parse new source spec: %w", err) + } + newRef := parsedNewSourceSpec.Ref + // Fix the base version by adding the source field to match what both current and new have // This prevents unnecessary conflicts over the source field baseWithSource, err := UpdateFieldInFrontmatter(base, "source", currentSourceSpec) @@ -115,7 +126,7 @@ func MergeWorkflowContent(base, current, new, oldSourceSpec, newRef, localWorkfl } // Update the source field in the new content with the new ref - newWithUpdatedSource, err := UpdateFieldInFrontmatter(new, "source", fmt.Sprintf("%s/%s@%s", sourceSpec.Repo, sourceSpec.Path, newRef)) + newWithUpdatedSource, err := UpdateFieldInFrontmatter(new, "source", newSourceSpec) if err != nil { 
if verbose { fmt.Fprintln(os.Stderr, console.FormatWarningMessage(fmt.Sprintf("Failed to update source in new content: %v", err))) @@ -202,29 +213,26 @@ func MergeWorkflowContent(base, current, new, oldSourceSpec, newRef, localWorkfl // Process @include directives if present and no conflicts // Skip include processing if there are conflicts to avoid errors if !hasConflicts { - sourceSpec, err := parseSourceSpec(oldSourceSpec) - if err == nil { - workflow := &WorkflowSpec{ - RepoSpec: RepoSpec{ - RepoSlug: sourceSpec.Repo, - Version: newRef, - }, - WorkflowPath: sourceSpec.Path, - } + workflow := &WorkflowSpec{ + RepoSpec: RepoSpec{ + RepoSlug: parsedNewSourceSpec.Repo, + Version: newRef, + }, + WorkflowPath: parsedNewSourceSpec.Path, + } - localWorkflowDir := "" - if localWorkflowPath != "" { - localWorkflowDir = filepath.Dir(localWorkflowPath) - } - processedContent, err := processIncludesInContent(mergedStr, workflow, newRef, localWorkflowDir, verbose) - if err != nil { - if verbose { - fmt.Fprintln(os.Stderr, console.FormatWarningMessage(fmt.Sprintf("Failed to process includes: %v", err))) - } - // Return unprocessed content on error - } else { - mergedStr = processedContent + localWorkflowDir := "" + if localWorkflowPath != "" { + localWorkflowDir = filepath.Dir(localWorkflowPath) + } + processedContent, err := processIncludesInContent(mergedStr, workflow, newRef, localWorkflowDir, verbose) + if err != nil { + if verbose { + fmt.Fprintln(os.Stderr, console.FormatWarningMessage(fmt.Sprintf("Failed to process includes: %v", err))) } + // Return unprocessed content on error + } else { + mergedStr = processedContent } } diff --git a/pkg/cli/update_redirects.go b/pkg/cli/update_redirects.go new file mode 100644 index 00000000000..7ec4ffe78d0 --- /dev/null +++ b/pkg/cli/update_redirects.go @@ -0,0 +1,170 @@ +package cli + +import ( + "context" + "errors" + "fmt" + "os" + "strings" + + "github.com/github/gh-aw/pkg/console" + "github.com/github/gh-aw/pkg/parser" +) + 
+const maxRedirectDepth = 20 + +var resolveLatestRefFn = resolveLatestRef +var downloadWorkflowContentFn = downloadWorkflowContent + +type resolvedUpdateLocation struct { + sourceSpec *SourceSpec + currentRef string + latestRef string + sourceFieldRef string + content []byte + redirectHistory []string +} + +func resolveRedirectedUpdateLocation(ctx context.Context, workflowName string, initialSource *SourceSpec, allowMajor, verbose bool, noRedirect bool) (*resolvedUpdateLocation, error) { + current := &SourceSpec{ + Repo: initialSource.Repo, + Path: initialSource.Path, + Ref: initialSource.Ref, + } + visited := make(map[string]struct{}) + history := make([]string, 0, 2) + + for range maxRedirectDepth { + currentRef := current.Ref + if currentRef == "" { + currentRef = "main" + } + + locationKey := sourceSpecWithRef(current, currentRef) + if _, exists := visited[locationKey]; exists { + return nil, fmt.Errorf("redirect loop detected while updating %s at %s", workflowName, locationKey) + } + visited[locationKey] = struct{}{} + + latestRef, err := resolveLatestRefFn(ctx, current.Repo, currentRef, allowMajor, verbose) + if err != nil { + return nil, fmt.Errorf("failed to resolve latest ref for %s: %w", sourceSpecWithRef(current, currentRef), err) + } + + content, err := downloadWorkflowContentFn(ctx, current.Repo, current.Path, latestRef, verbose) + if err != nil { + return nil, fmt.Errorf("failed to download workflow %s: %w", sourceSpecWithRef(current, latestRef), err) + } + + redirect, err := extractRedirectFromContent(string(content)) + if err != nil { + return nil, err + } + + sourceFieldRef := latestRef + if isBranchRef(currentRef) { + sourceFieldRef = currentRef + } + + if redirect == "" { + return &resolvedUpdateLocation{ + sourceSpec: current, + currentRef: currentRef, + latestRef: latestRef, + sourceFieldRef: sourceFieldRef, + content: content, + redirectHistory: history, + }, nil + } + + if noRedirect { + return nil, fmt.Errorf("redirect is disabled by 
--no-redirect for %s: %s declares redirect to %s (remove redirect frontmatter or run update without --no-redirect)", workflowName, sourceSpecWithRef(current, latestRef), redirect) + } + + redirectedSource, err := normalizeRedirectToSourceSpec(redirect) + if err != nil { + return nil, fmt.Errorf("invalid redirect %q in %s: %w", redirect, sourceSpecWithRef(current, latestRef), err) + } + + nextRef := redirectedSource.Ref + if nextRef == "" { + nextRef = "main" + } + + redirectMessage := fmt.Sprintf("Workflow %s redirect: %s → %s", workflowName, sourceSpecWithRef(current, latestRef), sourceSpecWithRef(redirectedSource, nextRef)) + fmt.Fprintln(os.Stderr, console.FormatWarningMessage(redirectMessage)) + history = append(history, redirectMessage) + current = redirectedSource + } + + return nil, fmt.Errorf("redirect chain exceeded maximum depth (%d) while updating %s", maxRedirectDepth, workflowName) +} + +func extractRedirectFromContent(content string) (string, error) { + result, err := parser.ExtractFrontmatterFromContent(content) + if err != nil { + return "", fmt.Errorf("failed to parse redirected workflow frontmatter: %w", err) + } + + value, ok := result.Frontmatter["redirect"] + if !ok { + return "", nil + } + + redirect, ok := value.(string) + if !ok { + return "", fmt.Errorf("redirect must be a string, got %T", value) + } + + return strings.TrimSpace(redirect), nil +} + +func normalizeRedirectToSourceSpec(redirect string) (*SourceSpec, error) { + redirect = strings.TrimSpace(redirect) + if redirect == "" { + return nil, errors.New("redirect cannot be empty") + } + + if strings.Contains(redirect, "://") { + workflowSpec, workflowErr := parseWorkflowSpec(redirect) + if workflowErr != nil { + return nil, fmt.Errorf("must be a workflow spec or GitHub URL: %w", workflowErr) + } + if workflowSpec.RepoSlug == "" { + return nil, errors.New("redirect must point to a remote workflow location") + } + return &SourceSpec{ + Repo: workflowSpec.RepoSlug, + Path: 
workflowSpec.WorkflowPath, + Ref: workflowSpec.Version, + }, nil + } + + // First try strict source syntax (owner/repo/path@ref). + sourceSpec, sourceErr := parseSourceSpec(redirect) + if sourceErr == nil { + return sourceSpec, nil + } + + // Fall back to workflow spec syntax (including GitHub URLs). + workflowSpec, workflowErr := parseWorkflowSpec(redirect) + if workflowErr != nil { + return nil, fmt.Errorf("must be a workflow spec or GitHub URL: %w", workflowErr) + } + if workflowSpec.RepoSlug == "" { + return nil, errors.New("redirect must point to a remote workflow location") + } + + return &SourceSpec{ + Repo: workflowSpec.RepoSlug, + Path: workflowSpec.WorkflowPath, + Ref: workflowSpec.Version, + }, nil +} + +func sourceSpecWithRef(spec *SourceSpec, ref string) string { + if ref == "" { + return fmt.Sprintf("%s/%s", spec.Repo, spec.Path) + } + return fmt.Sprintf("%s/%s@%s", spec.Repo, spec.Path, ref) +} diff --git a/pkg/cli/update_redirects_test.go b/pkg/cli/update_redirects_test.go new file mode 100644 index 00000000000..74ee04c283c --- /dev/null +++ b/pkg/cli/update_redirects_test.go @@ -0,0 +1,130 @@ +//go:build !integration + +package cli + +import ( + "context" + "fmt" + "testing" + + "github.com/github/gh-aw/pkg/testutil" + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" +) + +func TestNormalizeRedirectToSourceSpec(t *testing.T) { + t.Run("parses workflow spec", func(t *testing.T) { + spec, err := normalizeRedirectToSourceSpec("owner/repo/workflows/new.md@main") + require.NoError(t, err, "redirect spec should parse") + assert.Equal(t, "owner/repo", spec.Repo, "repo should be parsed from redirect spec") + assert.Equal(t, "workflows/new.md", spec.Path, "path should be parsed from redirect spec") + assert.Equal(t, "main", spec.Ref, "ref should be parsed from redirect spec") + }) + + t.Run("parses GitHub URL", func(t *testing.T) { + spec, err := 
normalizeRedirectToSourceSpec("https://github.com/owner/repo/blob/main/workflows/new.md") + require.NoError(t, err, "redirect URL should parse") + assert.Equal(t, "owner/repo", spec.Repo, "repo should be parsed from redirect URL") + assert.Equal(t, "workflows/new.md", spec.Path, "path should be parsed from redirect URL") + assert.Equal(t, "main", spec.Ref, "ref should be parsed from redirect URL") + }) +} + +func TestResolveRedirectedUpdateLocation(t *testing.T) { + originalResolveLatestRef := resolveLatestRefFn + originalDownloadWorkflow := downloadWorkflowContentFn + t.Cleanup(func() { + resolveLatestRefFn = originalResolveLatestRef + downloadWorkflowContentFn = originalDownloadWorkflow + }) + + t.Run("follows redirect chain", func(t *testing.T) { + resolveLatestRefFn = func(_ context.Context, _ string, currentRef string, _ bool, _ bool) (string, error) { + return currentRef, nil + } + downloadWorkflowContentFn = func(_ context.Context, repo, path, ref string, _ bool) ([]byte, error) { + key := fmt.Sprintf("%s/%s@%s", repo, path, ref) + switch key { + case "owner/repo/workflows/original.md@main": + return []byte("---\nredirect: owner/repo/workflows/new.md@main\n---\n"), nil + case "owner/repo/workflows/new.md@main": + return []byte("---\nname: New Workflow\n---\n"), nil + default: + return nil, fmt.Errorf("unexpected download key %s", key) + } + } + + var result *resolvedUpdateLocation + output := testutil.CaptureStderr(t, func() { + var err error + result, err = resolveRedirectedUpdateLocation( + context.Background(), + "test-workflow", + &SourceSpec{Repo: "owner/repo", Path: "workflows/original.md", Ref: "main"}, + false, + false, + false, + ) + require.NoError(t, err, "redirect chain should resolve") + }) + assert.Equal(t, "owner/repo", result.sourceSpec.Repo, "final redirect repo should be tracked") + assert.Equal(t, "workflows/new.md", result.sourceSpec.Path, "final redirect path should be tracked") + assert.Equal(t, "main", result.sourceFieldRef, "branch 
ref should be preserved in source field") + assert.Len(t, result.redirectHistory, 1, "redirect history should include the followed hop") + assert.Contains(t, output, "Workflow test-workflow redirect: owner/repo/workflows/original.md@main → owner/repo/workflows/new.md@main", "redirect warning should explain source movement") + }) + + t.Run("detects redirect loops", func(t *testing.T) { + resolveLatestRefFn = func(_ context.Context, _ string, currentRef string, _ bool, _ bool) (string, error) { + return currentRef, nil + } + downloadWorkflowContentFn = func(_ context.Context, repo, path, ref string, _ bool) ([]byte, error) { + key := fmt.Sprintf("%s/%s@%s", repo, path, ref) + switch key { + case "owner/repo/workflows/a.md@main": + return []byte("---\nredirect: owner/repo/workflows/b.md@main\n---\n"), nil + case "owner/repo/workflows/b.md@main": + return []byte("---\nredirect: owner/repo/workflows/a.md@main\n---\n"), nil + default: + return nil, fmt.Errorf("unexpected download key %s", key) + } + } + + _, err := resolveRedirectedUpdateLocation( + context.Background(), + "loop-workflow", + &SourceSpec{Repo: "owner/repo", Path: "workflows/a.md", Ref: "main"}, + false, + false, + false, + ) + require.Error(t, err, "redirect loop should return an error") + assert.Contains(t, err.Error(), "redirect loop detected", "error should explain redirect loop") + }) + + t.Run("refuses redirect when no-redirect is enabled", func(t *testing.T) { + resolveLatestRefFn = func(_ context.Context, _ string, currentRef string, _ bool, _ bool) (string, error) { + return currentRef, nil + } + downloadWorkflowContentFn = func(_ context.Context, repo, path, ref string, _ bool) ([]byte, error) { + key := fmt.Sprintf("%s/%s@%s", repo, path, ref) + switch key { + case "owner/repo/workflows/original.md@main": + return []byte("---\nredirect: owner/repo/workflows/new.md@main\n---\n"), nil + default: + return nil, fmt.Errorf("unexpected download key %s", key) + } + } + + _, err := 
resolveRedirectedUpdateLocation( + context.Background(), + "no-redirect-workflow", + &SourceSpec{Repo: "owner/repo", Path: "workflows/original.md", Ref: "main"}, + false, + false, + true, + ) + require.Error(t, err, "redirect should be refused with --no-redirect") + assert.Contains(t, err.Error(), "redirect is disabled by --no-redirect", "error should explain redirect refusal") + }) +} diff --git a/pkg/cli/update_workflows.go b/pkg/cli/update_workflows.go index dc997ae8bc4..60ef2d9d571 100644 --- a/pkg/cli/update_workflows.go +++ b/pkg/cli/update_workflows.go @@ -17,8 +17,8 @@ import ( ) // UpdateWorkflows updates workflows from their source repositories -func UpdateWorkflows(ctx context.Context, workflowNames []string, allowMajor, force, verbose bool, engineOverride string, workflowsDir string, noStopAfter bool, stopAfter string, noMerge bool, noCompile bool) error { - updateLog.Printf("Scanning for workflows with source field: dir=%s, filter=%v, noMerge=%v, noCompile=%v", workflowsDir, workflowNames, noMerge, noCompile) +func UpdateWorkflows(ctx context.Context, workflowNames []string, allowMajor, force, verbose bool, engineOverride string, workflowsDir string, noStopAfter bool, stopAfter string, noMerge bool, noCompile bool, noRedirect bool) error { + updateLog.Printf("Scanning for workflows with source field: dir=%s, filter=%v, noMerge=%v, noCompile=%v, noRedirect=%v", workflowsDir, workflowNames, noMerge, noCompile, noRedirect) // Use provided workflows directory or default if workflowsDir == "" { @@ -50,7 +50,7 @@ func UpdateWorkflows(ctx context.Context, workflowNames []string, allowMajor, fo // Update each workflow for _, wf := range workflows { updateLog.Printf("Updating workflow: %s (source: %s)", wf.Name, wf.SourceSpec) - if err := updateWorkflow(ctx, wf, allowMajor, force, verbose, engineOverride, noStopAfter, stopAfter, noMerge, noCompile); err != nil { + if err := updateWorkflow(ctx, wf, allowMajor, force, verbose, engineOverride, noStopAfter, 
stopAfter, noMerge, noCompile, noRedirect); err != nil { updateLog.Printf("Failed to update workflow %s: %v", wf.Name, err) failedUpdates = append(failedUpdates, updateFailure{ Name: wf.Name, @@ -368,7 +368,7 @@ func resolveLatestRelease(ctx context.Context, repo, currentRef string, allowMaj } // updateWorkflow updates a single workflow from its source -func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, force, verbose bool, engineOverride string, noStopAfter bool, stopAfter string, noMerge bool, noCompile bool) error { +func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, force, verbose bool, engineOverride string, noStopAfter bool, stopAfter string, noMerge bool, noCompile bool, noRedirect bool) error { updateLog.Printf("Updating workflow: name=%s, source=%s, force=%v, noMerge=%v", wf.Name, wf.SourceSpec, force, noMerge) if verbose { @@ -378,31 +378,22 @@ func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, for } // Parse source spec - sourceSpec, err := parseSourceSpec(wf.SourceSpec) + initialSourceSpec, err := parseSourceSpec(wf.SourceSpec) if err != nil { updateLog.Printf("Failed to parse source spec: %v", err) return fmt.Errorf("failed to parse source spec: %w", err) } - // If no ref specified, use default branch - currentRef := sourceSpec.Ref - if currentRef == "" { - currentRef = "main" - } - - // Resolve latest ref - latestRef, err := resolveLatestRef(ctx, sourceSpec.Repo, currentRef, allowMajor, verbose) + resolvedLocation, err := resolveRedirectedUpdateLocation(ctx, wf.Name, initialSourceSpec, allowMajor, verbose, noRedirect) if err != nil { - return fmt.Errorf("failed to resolve latest ref: %w", err) + return err } - // For branch refs, resolveLatestRef returns the branch-head SHA so that - // we can detect upstream changes (currentRef != latestRef). However the - // source field must keep the branch *name* to avoid SHA-pinning. 
- sourceFieldRef := latestRef - if isBranchRef(currentRef) { - sourceFieldRef = currentRef - } + sourceSpec := resolvedLocation.sourceSpec + currentRef := resolvedLocation.currentRef + latestRef := resolvedLocation.latestRef + sourceFieldRef := resolvedLocation.sourceFieldRef + newContent := resolvedLocation.content if verbose { fmt.Fprintln(os.Stderr, console.FormatVerboseMessage("Current ref: "+currentRef)) @@ -410,11 +401,11 @@ func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, for } // Check if update is needed - if !force && currentRef == latestRef { + if !force && currentRef == latestRef && len(resolvedLocation.redirectHistory) == 0 { updateLog.Printf("Workflow already at latest ref: %s, checking for local modifications", currentRef) // Download the source content to check if local file has been modified - sourceContent, err := downloadWorkflowContent(ctx, sourceSpec.Repo, sourceSpec.Path, currentRef, verbose) + sourceContent, err := downloadWorkflowContentFn(ctx, sourceSpec.Repo, sourceSpec.Path, currentRef, verbose) if err != nil { // If we can't download for comparison, just show the up-to-date message if verbose { @@ -443,25 +434,22 @@ func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, for return nil } - // Download the latest version - if verbose { - fmt.Fprintln(os.Stderr, console.FormatVerboseMessage(fmt.Sprintf("Downloading latest version from %s/%s@%s", sourceSpec.Repo, sourceSpec.Path, latestRef))) - } - - newContent, err := downloadWorkflowContent(ctx, sourceSpec.Repo, sourceSpec.Path, latestRef, verbose) - if err != nil { - return fmt.Errorf("failed to download workflow: %w", err) + if len(resolvedLocation.redirectHistory) > 0 { + fmt.Fprintln(os.Stderr, console.FormatWarningMessage(fmt.Sprintf("Workflow %s source location changed; updating source to %s/%s@%s", wf.Name, sourceSpec.Repo, sourceSpec.Path, sourceFieldRef))) } // Determine merge mode. 
Merge is the default behaviour — it detects // local modifications and performs a 3-way merge to preserve them. // When --no-merge is used, local changes are overridden with upstream. merge := !noMerge + if len(resolvedLocation.redirectHistory) > 0 { + merge = false + } // When merge mode is on, detect local modifications to confirm we // actually need to merge (if no local mods, override is fine either way). if merge { - baseContent, dlErr := downloadWorkflowContent(ctx, sourceSpec.Repo, sourceSpec.Path, currentRef, verbose) + baseContent, dlErr := downloadWorkflowContentFn(ctx, sourceSpec.Repo, sourceSpec.Path, currentRef, verbose) if dlErr == nil { localContent, readErr := os.ReadFile(wf.Path) if readErr == nil && hasLocalModifications(string(baseContent), string(localContent), wf.SourceSpec, filepath.Dir(wf.Path), verbose) { @@ -489,7 +477,7 @@ func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, for fmt.Fprintln(os.Stderr, console.FormatVerboseMessage(fmt.Sprintf("Downloading base version from %s/%s@%s", sourceSpec.Repo, sourceSpec.Path, currentRef))) } - baseContent, err := downloadWorkflowContent(ctx, sourceSpec.Repo, sourceSpec.Path, currentRef, verbose) + baseContent, err := downloadWorkflowContentFn(ctx, sourceSpec.Repo, sourceSpec.Path, currentRef, verbose) if err != nil { return fmt.Errorf("failed to download base workflow: %w", err) } @@ -502,7 +490,7 @@ func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, for // Perform 3-way merge using git merge-file updateLog.Printf("Performing 3-way merge for workflow: %s", wf.Name) - mergedContent, conflicts, err := MergeWorkflowContent(string(baseContent), string(currentContent), string(newContent), wf.SourceSpec, sourceFieldRef, wf.Path, verbose) + mergedContent, conflicts, err := MergeWorkflowContent(string(baseContent), string(currentContent), string(newContent), wf.SourceSpec, sourceSpecWithRef(sourceSpec, sourceFieldRef), wf.Path, verbose) if err != nil { 
updateLog.Printf("Merge failed for workflow %s: %v", wf.Name, err) return fmt.Errorf("failed to merge workflow content: %w", err) @@ -521,7 +509,7 @@ func updateWorkflow(ctx context.Context, wf *workflowWithSource, allowMajor, for } // Update the source field in the new content with the new ref - newWithUpdatedSource, err := UpdateFieldInFrontmatter(string(newContent), "source", fmt.Sprintf("%s/%s@%s", sourceSpec.Repo, sourceSpec.Path, sourceFieldRef)) + newWithUpdatedSource, err := UpdateFieldInFrontmatter(string(newContent), "source", sourceSpecWithRef(sourceSpec, sourceFieldRef)) if err != nil { if verbose { fmt.Fprintln(os.Stderr, console.FormatWarningMessage(fmt.Sprintf("Failed to update source in new content: %v", err))) diff --git a/pkg/parser/schemas/main_workflow_schema.json b/pkg/parser/schemas/main_workflow_schema.json index d1b3c563c44..42abef8ec10 100644 --- a/pkg/parser/schemas/main_workflow_schema.json +++ b/pkg/parser/schemas/main_workflow_schema.json @@ -25,6 +25,11 @@ "description": "Optional source reference indicating where this workflow was added from. Format: owner/repo/path@ref (e.g., githubnext/agentics/workflows/ci-doctor.md@v1.0.0). Rendered as a comment in the generated lock file.", "examples": ["githubnext/agentics/workflows/ci-doctor.md", "githubnext/agentics/workflows/daily-perf-improver.md@1f181b37d3fe5862ab590648f25a292e345b5de6"] }, + "redirect": { + "type": "string", + "description": "Optional workflow location redirect for updates. Format: workflow spec or GitHub URL (e.g., owner/repo/path@ref or https://github.com/owner/repo/blob/main/path.md). 
When present, update follows this location and rewrites source.", + "examples": ["githubnext/agentics/workflows/ci-doctor-v2.md@main", "https://github.com/githubnext/agentics/blob/main/workflows/ci-doctor-v2.md"] + }, "tracker-id": { "type": "string", "minLength": 8, diff --git a/pkg/workflow/compiler.go b/pkg/workflow/compiler.go index 653b22a8b14..6853cc26d06 100644 --- a/pkg/workflow/compiler.go +++ b/pkg/workflow/compiler.go @@ -300,6 +300,12 @@ func (c *Compiler) validateWorkflowData(workflowData *WorkflowData, markdownPath c.IncrementWarningCount() } + // Inform users when this workflow is a redirect stub for updates. + if workflowData.Redirect != "" { + fmt.Fprintln(os.Stderr, formatCompilerMessage(markdownPath, "info", + "workflow redirect configured: updates move to "+workflowData.Redirect)) + } + // Validate workflow_run triggers have branch restrictions log.Printf("Validating workflow_run triggers for branch restrictions") if err := c.validateWorkflowRunBranches(workflowData, markdownPath); err != nil { @@ -762,7 +768,7 @@ func (c *Compiler) CompileWorkflowData(workflowData *WorkflowData, markdownPath // Emitting a warning instead of failing allows compilation to succeed so that the lock // file is written and the agent receives the actionable guidance embedded in the warning. 
if c.effectiveSafeUpdate(workflowData) { - if enforceErr := EnforceSafeUpdate(oldManifest, bodySecrets, bodyActions); enforceErr != nil { + if enforceErr := EnforceSafeUpdate(oldManifest, bodySecrets, bodyActions, workflowData.Redirect); enforceErr != nil { warningMsg := buildSafeUpdateWarningPrompt(enforceErr.Error()) c.AddSafeUpdateWarning(warningMsg) fmt.Fprintln(os.Stderr, formatCompilerMessage(markdownPath, "warning", enforceErr.Error())) diff --git a/pkg/workflow/compiler_types.go b/pkg/workflow/compiler_types.go index c1ba519b780..1bf0bfa7296 100644 --- a/pkg/workflow/compiler_types.go +++ b/pkg/workflow/compiler_types.go @@ -389,6 +389,7 @@ type WorkflowData struct { FrontmatterHash string // SHA-256 hash of frontmatter (computed before job building, used to derive stable heredoc delimiters) Description string // optional description rendered as comment in lock file Source string // optional source field (owner/repo@ref/path) rendered as comment in lock file + Redirect string // optional redirect field describing a moved workflow location TrackerID string // optional tracker identifier for created assets (min 8 chars, alphanumeric + hyphens/underscores) ImportedFiles []string // list of files imported via imports field (rendered as comment in lock file) ImportedMarkdown string // Only imports WITH inputs (for compile-time substitution) diff --git a/pkg/workflow/compiler_yaml.go b/pkg/workflow/compiler_yaml.go index 410bbb325c5..d2ab35ca8c0 100644 --- a/pkg/workflow/compiler_yaml.go +++ b/pkg/workflow/compiler_yaml.go @@ -118,7 +118,7 @@ func (c *Compiler) generateWorkflowHeader(yaml *strings.Builder, data *WorkflowD // Embed the gh-aw-manifest immediately after gh-aw-metadata for easy machine parsing. // The manifest records all secrets, external actions, and container images detected at // compile time so that subsequent compilations can perform safe update enforcement. 
- manifest := NewGHAWManifest(secrets, actions, data.DockerImagePins) + manifest := NewGHAWManifest(secrets, actions, data.DockerImagePins, data.Redirect) if manifestJSON, err := manifest.ToJSON(); err == nil { fmt.Fprintf(yaml, "# gh-aw-manifest: %s\n", manifestJSON) } else { diff --git a/pkg/workflow/frontmatter_extraction_metadata.go b/pkg/workflow/frontmatter_extraction_metadata.go index 29ec9ec6f16..cd761ad74c1 100644 --- a/pkg/workflow/frontmatter_extraction_metadata.go +++ b/pkg/workflow/frontmatter_extraction_metadata.go @@ -68,6 +68,21 @@ func (c *Compiler) extractSource(frontmatter map[string]any) string { return "" } +// extractRedirect extracts the redirect field from frontmatter +func (c *Compiler) extractRedirect(frontmatter map[string]any) string { + value, exists := frontmatter["redirect"] + if !exists { + return "" + } + + // Convert the value to string + if strValue, ok := value.(string); ok { + return strings.TrimSpace(strValue) + } + + return "" +} + // extractTrackerID extracts and validates the tracker-id field from frontmatter func (c *Compiler) extractTrackerID(frontmatter map[string]any) (string, error) { value, exists := frontmatter["tracker-id"] diff --git a/pkg/workflow/frontmatter_serialization.go b/pkg/workflow/frontmatter_serialization.go index 94f9e50510d..bb90c989357 100644 --- a/pkg/workflow/frontmatter_serialization.go +++ b/pkg/workflow/frontmatter_serialization.go @@ -71,6 +71,9 @@ func (fc *FrontmatterConfig) ToMap() map[string]any { if fc.Source != "" { result["source"] = fc.Source } + if fc.Redirect != "" { + result["redirect"] = fc.Redirect + } if fc.TrackerID != "" { result["tracker-id"] = fc.TrackerID } diff --git a/pkg/workflow/frontmatter_types.go b/pkg/workflow/frontmatter_types.go index d3eeb8626ff..399c15fda3a 100644 --- a/pkg/workflow/frontmatter_types.go +++ b/pkg/workflow/frontmatter_types.go @@ -146,6 +146,7 @@ type FrontmatterConfig struct { // ParseFrontmatterConfig to return nil and break features that depend 
on it (e.g. OTLP). Engine any `json:"engine,omitempty"` Source string `json:"source,omitempty"` + Redirect string `json:"redirect,omitempty"` TrackerID string `json:"tracker-id,omitempty"` Version string `json:"version,omitempty"` TimeoutMinutes *TemplatableInt32 `json:"timeout-minutes,omitempty"` diff --git a/pkg/workflow/frontmatter_types_test.go b/pkg/workflow/frontmatter_types_test.go index 621d71560bf..5602f417041 100644 --- a/pkg/workflow/frontmatter_types_test.go +++ b/pkg/workflow/frontmatter_types_test.go @@ -58,6 +58,7 @@ func TestParseFrontmatterConfig(t *testing.T) { "description": "A complete workflow", "engine": "copilot", "source": "owner/repo/path@main", + "redirect": "owner/repo/new-path@main", "tracker-id": "test-tracker-123", "tools": map[string]any{ "bash": map[string]any{ @@ -97,6 +98,9 @@ func TestParseFrontmatterConfig(t *testing.T) { if config.Source != "owner/repo/path@main" { t.Errorf("Source = %q, want %q", config.Source, "owner/repo/path@main") } + if config.Redirect != "owner/repo/new-path@main" { + t.Errorf("Redirect = %q, want %q", config.Redirect, "owner/repo/new-path@main") + } if config.TrackerID != "test-tracker-123" { t.Errorf("TrackerID = %q, want %q", config.TrackerID, "test-tracker-123") diff --git a/pkg/workflow/redirect_field_test.go b/pkg/workflow/redirect_field_test.go new file mode 100644 index 00000000000..a43a8855b2f --- /dev/null +++ b/pkg/workflow/redirect_field_test.go @@ -0,0 +1,51 @@ +//go:build !integration + +package workflow + +import ( + "os" + "path/filepath" + "testing" + + "github.com/github/gh-aw/pkg/stringutil" + "github.com/github/gh-aw/pkg/testutil" + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" +) + +func TestRedirectFieldExtraction(t *testing.T) { + compiler := NewCompiler() + redirect := compiler.extractRedirect(map[string]any{"redirect": "owner/repo/workflows/new.md@main"}) + assert.Equal(t, "owner/repo/workflows/new.md@main", redirect, "redirect should be extracted 
from frontmatter") + + redirect = compiler.extractRedirect(map[string]any{}) + assert.Empty(t, redirect, "missing redirect should return empty string") +} + +func TestCompileWorkflow_PrintsRedirectInfoMessage(t *testing.T) { + tmpDir := testutil.TempDir(t, "redirect-compile-test") + workflowFile := filepath.Join(tmpDir, "redirected.md") + workflowContent := `--- +redirect: owner/repo/workflows/new-location.md@main +on: + workflow_dispatch: +permissions: + contents: read +engine: copilot +--- + +# Redirected Workflow` + require.NoError(t, os.WriteFile(workflowFile, []byte(workflowContent), 0644), "workflow fixture should be written") + + compiler := NewCompiler() + output := testutil.CaptureStderr(t, func() { + err := compiler.CompileWorkflow(workflowFile) + require.NoError(t, err, "workflow should compile when redirect is configured") + }) + + assert.Contains(t, output, "workflow redirect configured", "compile output should describe redirect usage") + assert.Contains(t, output, "owner/repo/workflows/new-location.md@main", "compile output should include redirect target") + + lockFile := stringutil.MarkdownToLockFile(workflowFile) + require.NoError(t, os.Remove(lockFile), "lock file should be cleaned up") +} diff --git a/pkg/workflow/safe_update_enforcement.go b/pkg/workflow/safe_update_enforcement.go index 9fdcbc09566..45e4fc728ca 100644 --- a/pkg/workflow/safe_update_enforcement.go +++ b/pkg/workflow/safe_update_enforcement.go @@ -48,7 +48,7 @@ var ghAwInternalSecrets = map[string]bool{ // e.g. "actions/checkout@abc1234 # v4". // // Returns a structured, actionable error when violations are found. -func EnforceSafeUpdate(manifest *GHAWManifest, secretNames []string, actionRefs []string) error { +func EnforceSafeUpdate(manifest *GHAWManifest, secretNames []string, actionRefs []string, currentRedirect string) error { if manifest == nil { // Lock file exists but predates the safe-updates feature (no gh-aw-manifest // section). 
Skip enforcement so legacy lock files are not flagged on upgrade. @@ -58,8 +58,9 @@ func EnforceSafeUpdate(manifest *GHAWManifest, secretNames []string, actionRefs secretViolations := collectSecretViolations(manifest, secretNames) addedActions, removedActions := collectActionViolations(manifest, actionRefs) + addedRedirect, removedRedirect := collectRedirectViolations(manifest, currentRedirect) - if len(secretViolations) == 0 && len(addedActions) == 0 && len(removedActions) == 0 { + if len(secretViolations) == 0 && len(addedActions) == 0 && len(removedActions) == 0 && addedRedirect == "" && removedRedirect == "" { safeUpdateLog.Printf("Safe update check passed (%d secret(s), %d action(s) verified)", len(secretNames), len(actionRefs)) return nil @@ -77,8 +78,14 @@ func EnforceSafeUpdate(manifest *GHAWManifest, secretNames []string, actionRefs safeUpdateLog.Printf("Safe update violation: %d action(s) removed: %s", len(removedActions), strings.Join(removedActions, ", ")) } + if addedRedirect != "" { + safeUpdateLog.Printf("Safe update violation: redirect added: %s", addedRedirect) + } + if removedRedirect != "" { + safeUpdateLog.Printf("Safe update violation: redirect removed: %s", removedRedirect) + } - return buildSafeUpdateError(secretViolations, addedActions, removedActions) + return buildSafeUpdateError(secretViolations, addedActions, removedActions, addedRedirect, removedRedirect) } // collectSecretViolations returns the normalized secret names that are new (not in the @@ -196,9 +203,32 @@ func collectActionViolations(manifest *GHAWManifest, actionRefs []string) (added return added, removed } +// collectRedirectViolations compares the redirect recorded in the previous manifest +// with the redirect currently configured in frontmatter. 
+// It returns: +// - added: a redirect newly configured in current frontmatter +// - removed: a previously-approved redirect that is now absent +func collectRedirectViolations(manifest *GHAWManifest, currentRedirect string) (added string, removed string) { + knownRedirect := strings.TrimSpace(manifest.Redirect) + current := strings.TrimSpace(currentRedirect) + + if knownRedirect == current { + return "", "" + } + if knownRedirect == "" && current != "" { + return current, "" + } + if knownRedirect != "" && current == "" { + return "", knownRedirect + } + // At this point both values are non-empty and differ after TrimSpace normalization, + // so treat the change as one removed redirect plus one added redirect. + return current, knownRedirect +} + // buildSafeUpdateError creates a clear, structured error message that names the -// offending secrets and actions and tells the user how to remediate. -func buildSafeUpdateError(secretViolations, addedActions, removedActions []string) error { +// offending secrets, actions, and redirects and tells the user how to remediate. +func buildSafeUpdateError(secretViolations, addedActions, removedActions []string, addedRedirect, removedRedirect string) error { var sb strings.Builder sb.WriteString("safe update mode detected unapproved changes\n") @@ -214,6 +244,14 @@ func buildSafeUpdateError(secretViolations, addedActions, removedActions []strin sb.WriteString("\nPreviously-approved action(s) removed:\n - ") sb.WriteString(strings.Join(removedActions, "\n - ")) } + if addedRedirect != "" { + sb.WriteString("\nNew redirect configured:\n - ") + sb.WriteString(addedRedirect) + } + if removedRedirect != "" { + sb.WriteString("\nPreviously-approved redirect removed:\n - ") + sb.WriteString(removedRedirect) + } sb.WriteString("\n\nRemediation options:\n 1. Use the --approve flag to allow the changes.\n 2. Revert the unapproved changes.\n 3. 
Use an interactive coding agent to review and approve the changes.") return fmt.Errorf("%s", sb.String()) @@ -233,7 +271,7 @@ func buildSafeUpdateWarningPrompt(violationMsg string) string { sb.WriteString(" 1. Review every new secret and action listed above for potential malicious use\n") sb.WriteString(" (e.g. credential exfiltration, supply-chain attacks, unexpected network access).\n") sb.WriteString(" 2. Add a security review note to the pull request description that:\n") - sb.WriteString(" - Lists the new secrets and actions introduced by this compilation.\n") + sb.WriteString(" - Lists any new or removed secrets, actions, and redirect changes from this compilation.\n") sb.WriteString(" - Confirms you have reviewed each one and explains why it is safe to add.\n") sb.WriteString(" - Flags any item you cannot fully verify so a human reviewer can inspect it.\n") return sb.String() diff --git a/pkg/workflow/safe_update_enforcement_test.go b/pkg/workflow/safe_update_enforcement_test.go index 3d141a9fd82..869827da29f 100644 --- a/pkg/workflow/safe_update_enforcement_test.go +++ b/pkg/workflow/safe_update_enforcement_test.go @@ -15,6 +15,7 @@ func TestEnforceSafeUpdate(t *testing.T) { manifest *GHAWManifest secretNames []string actionRefs []string + redirect string wantErr bool wantErrMsgs []string }{ @@ -285,11 +286,60 @@ func TestEnforceSafeUpdate(t *testing.T) { actionRefs: []string{"github/gh-aw-actions/setup@abc1234 # v0.68.1"}, wantErr: false, }, + { + name: "new redirect causes failure", + manifest: &GHAWManifest{Version: 1}, + secretNames: []string{}, + actionRefs: []string{}, + redirect: "owner/repo/workflows/new.md@main", + wantErr: true, + wantErrMsgs: []string{"New redirect configured", "owner/repo/workflows/new.md@main"}, + }, + { + name: "removed redirect causes failure", + manifest: &GHAWManifest{ + Version: 1, + Redirect: "owner/repo/workflows/old.md@main", + }, + secretNames: []string{}, + actionRefs: []string{}, + redirect: "", + wantErr: true, + 
wantErrMsgs: []string{"Previously-approved redirect removed", "owner/repo/workflows/old.md@main"}, + }, + { + name: "changed redirect reports add and remove", + manifest: &GHAWManifest{ + Version: 1, + Redirect: "owner/repo/workflows/old.md@main", + }, + secretNames: []string{}, + actionRefs: []string{}, + redirect: "owner/repo/workflows/new.md@main", + wantErr: true, + wantErrMsgs: []string{ + "New redirect configured", + "Previously-approved redirect removed", + "owner/repo/workflows/new.md@main", + "owner/repo/workflows/old.md@main", + }, + }, + { + name: "redirect whitespace differences are normalized", + manifest: &GHAWManifest{ + Version: 1, + Redirect: "owner/repo/workflows/new.md@main", + }, + secretNames: []string{}, + actionRefs: []string{}, + redirect: " owner/repo/workflows/new.md@main ", + wantErr: false, + }, } for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - err := EnforceSafeUpdate(tt.manifest, tt.secretNames, tt.actionRefs) + err := EnforceSafeUpdate(tt.manifest, tt.secretNames, tt.actionRefs, tt.redirect) if tt.wantErr { require.Error(t, err, "expected safe update enforcement error") for _, msg := range tt.wantErrMsgs { @@ -305,7 +355,7 @@ func TestEnforceSafeUpdate(t *testing.T) { func TestBuildSafeUpdateError(t *testing.T) { t.Run("secrets only", func(t *testing.T) { violations := []string{"NEW_SECRET", "ANOTHER_SECRET"} - err := buildSafeUpdateError(violations, nil, nil) + err := buildSafeUpdateError(violations, nil, nil, "", "") require.Error(t, err, "should return an error") msg := err.Error() @@ -316,7 +366,7 @@ func TestBuildSafeUpdateError(t *testing.T) { }) t.Run("added actions only", func(t *testing.T) { - err := buildSafeUpdateError(nil, []string{"evil-org/bad-action"}, nil) + err := buildSafeUpdateError(nil, []string{"evil-org/bad-action"}, nil, "", "") require.Error(t, err, "should return an error") msg := err.Error() assert.Contains(t, msg, "evil-org/bad-action", "action in message") @@ -324,24 +374,44 @@ func 
TestBuildSafeUpdateError(t *testing.T) { }) t.Run("removed actions only", func(t *testing.T) { - err := buildSafeUpdateError(nil, nil, []string{"actions/setup-node"}) + err := buildSafeUpdateError(nil, nil, []string{"actions/setup-node"}, "", "") require.Error(t, err, "should return an error") msg := err.Error() assert.Contains(t, msg, "actions/setup-node", "action in message") assert.Contains(t, msg, "Previously-approved action", "section header in message") }) + t.Run("added redirect only", func(t *testing.T) { + err := buildSafeUpdateError(nil, nil, nil, "owner/repo/workflows/new.md@main", "") + require.Error(t, err, "should return an error") + msg := err.Error() + assert.Contains(t, msg, "New redirect configured", "added redirect section header in message") + assert.Contains(t, msg, "owner/repo/workflows/new.md@main", "added redirect in message") + }) + + t.Run("removed redirect only", func(t *testing.T) { + err := buildSafeUpdateError(nil, nil, nil, "", "owner/repo/workflows/old.md@main") + require.Error(t, err, "should return an error") + msg := err.Error() + assert.Contains(t, msg, "Previously-approved redirect removed", "removed redirect section header in message") + assert.Contains(t, msg, "owner/repo/workflows/old.md@main", "removed redirect in message") + }) + t.Run("mixed violations", func(t *testing.T) { err := buildSafeUpdateError( []string{"MY_SECRET"}, []string{"evil-org/bad-action"}, []string{"actions/checkout"}, + "owner/repo/workflows/new.md@main", + "owner/repo/workflows/old.md@main", ) require.Error(t, err, "should return an error") msg := err.Error() assert.Contains(t, msg, "MY_SECRET", "secret in message") assert.Contains(t, msg, "evil-org/bad-action", "added action in message") assert.Contains(t, msg, "actions/checkout", "removed action in message") + assert.Contains(t, msg, "New redirect configured", "added redirect section in message") + assert.Contains(t, msg, "Previously-approved redirect removed", "removed redirect section in message") 
}) } diff --git a/pkg/workflow/safe_update_manifest.go b/pkg/workflow/safe_update_manifest.go index a42ce5a580f..47f37d60ade 100644 --- a/pkg/workflow/safe_update_manifest.go +++ b/pkg/workflow/safe_update_manifest.go @@ -44,6 +44,7 @@ type GHAWManifest struct { Secrets []string `json:"secrets"` Actions []GHAWManifestAction `json:"actions"` Containers []GHAWManifestContainer `json:"containers,omitempty"` // container images used, with digest when available + Redirect string `json:"redirect,omitempty"` // frontmatter redirect target for moved workflows } // NewGHAWManifest builds a GHAWManifest from the raw secret names, action reference @@ -56,7 +57,7 @@ type GHAWManifest struct { // "actions/checkout@abc1234 # v4" // // containers is the list of container image entries with full digest info (when available). -func NewGHAWManifest(secretNames []string, actionRefs []string, containers []GHAWManifestContainer) *GHAWManifest { +func NewGHAWManifest(secretNames []string, actionRefs []string, containers []GHAWManifestContainer, redirect string) *GHAWManifest { safeUpdateManifestLog.Printf("Building gh-aw-manifest: raw_secrets=%d, raw_actions=%d, containers=%d", len(secretNames), len(actionRefs), len(containers)) // Normalize secret names to full "secrets.NAME" form and deduplicate. 
@@ -94,6 +95,7 @@ func NewGHAWManifest(secretNames []string, actionRefs []string, containers []GHA Secrets: secrets, Actions: actions, Containers: sortedContainers, + Redirect: strings.TrimSpace(redirect), } } diff --git a/pkg/workflow/safe_update_manifest_test.go b/pkg/workflow/safe_update_manifest_test.go index 67745e62b70..d3d01d5768b 100644 --- a/pkg/workflow/safe_update_manifest_test.go +++ b/pkg/workflow/safe_update_manifest_test.go @@ -15,10 +15,12 @@ func TestNewGHAWManifest(t *testing.T) { secretNames []string actionRefs []string containers []GHAWManifestContainer + redirect string wantVersion int wantSecrets []string wantActionRepos []string wantContainerImages []string + wantRedirect string }{ { name: "empty inputs", @@ -101,15 +103,24 @@ func TestNewGHAWManifest(t *testing.T) { { name: "nil containers produces empty containers field", containers: nil, + redirect: "", wantVersion: 1, wantSecrets: []string{}, wantContainerImages: []string{}, + wantRedirect: "", + }, + { + name: "redirect is included when configured", + redirect: "owner/repo/workflows/new.md@main", + wantVersion: 1, + wantSecrets: []string{}, + wantRedirect: "owner/repo/workflows/new.md@main", }, } for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - m := NewGHAWManifest(tt.secretNames, tt.actionRefs, tt.containers) + m := NewGHAWManifest(tt.secretNames, tt.actionRefs, tt.containers, tt.redirect) require.NotNil(t, m, "manifest should not be nil") assert.Equal(t, tt.wantVersion, m.Version, "manifest version") if tt.wantSecrets != nil { @@ -129,6 +140,7 @@ func TestNewGHAWManifest(t *testing.T) { } assert.Equal(t, tt.wantContainerImages, images, "container images") } + assert.Equal(t, tt.wantRedirect, m.Redirect, "manifest redirect") }) } } @@ -144,7 +156,7 @@ func TestNewGHAWManifestContainerDigest(t *testing.T) { Image: "alpine:3.14", // no digest }, } - m := NewGHAWManifest(nil, nil, containers) + m := NewGHAWManifest(nil, nil, containers, "") require.Len(t, m.Containers, 2, 
"should have two containers") // Sorted: alpine before node @@ -175,6 +187,7 @@ func TestGHAWManifestToJSON(t *testing.T) { Actions: []GHAWManifestAction{ {Repo: "actions/checkout", SHA: "abc123", Version: "v4"}, }, + Redirect: "owner/repo/workflows/new.md@main", } json, err := m.ToJSON() @@ -185,16 +198,18 @@ func TestGHAWManifestToJSON(t *testing.T) { assert.Contains(t, json, `"actions/checkout"`, "action repo in JSON") assert.Contains(t, json, `"abc123"`, "action SHA in JSON") assert.Contains(t, json, `"v4"`, "action version in JSON") + assert.Contains(t, json, `"redirect":"owner/repo/workflows/new.md@main"`, "redirect in JSON") } func TestExtractGHAWManifestFromLockFile(t *testing.T) { tests := []struct { - name string - content string - wantNil bool - wantErr bool - wantVersion int - wantSecrets []string + name string + content string + wantNil bool + wantErr bool + wantVersion int + wantSecrets []string + wantRedirect string }{ { name: "no manifest line returns nil", @@ -226,6 +241,13 @@ name: my-workflow`, wantVersion: 1, wantSecrets: []string{"FOO"}, }, + { + name: "manifest with redirect field", + content: `# gh-aw-manifest: {"version":1,"secrets":[],"actions":[],"redirect":"owner/repo/workflows/new.md@main"}`, + wantVersion: 1, + wantSecrets: []string{}, + wantRedirect: "owner/repo/workflows/new.md@main", + }, } for _, tt := range tests { @@ -243,6 +265,7 @@ name: my-workflow`, require.NotNil(t, m, "manifest should not be nil") assert.Equal(t, tt.wantVersion, m.Version, "manifest version") assert.Equal(t, tt.wantSecrets, m.Secrets, "manifest secrets") + assert.Equal(t, tt.wantRedirect, m.Redirect, "manifest redirect") }) } } diff --git a/pkg/workflow/workflow_builder.go b/pkg/workflow/workflow_builder.go index 3818e6b4bfe..3c812a3ecee 100644 --- a/pkg/workflow/workflow_builder.go +++ b/pkg/workflow/workflow_builder.go @@ -37,6 +37,7 @@ func (c *Compiler) buildInitialWorkflowData( FrontmatterYAML: strings.Join(result.FrontmatterLines, "\n"), Description: 
c.extractDescription(result.Frontmatter), Source: c.extractSource(result.Frontmatter), + Redirect: c.extractRedirect(result.Frontmatter), TrackerID: toolsResult.trackerID, ImportedFiles: importsResult.ImportedFiles, ImportedMarkdown: toolsResult.importedMarkdown, // Only imports WITH inputs
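The redirect-resolution behavior this patch tests (follow a chain of frontmatter `redirect` declarations, stop at the canonical location, and fail on cycles such as `a.md → b.md → a.md`) can be sketched in isolation. This is an illustrative sketch only, not the PR's actual `resolveRedirectedUpdateLocation`: `followRedirects`, `lookup`, and `maxRedirectDepth` are hypothetical names, with the depth cap of 20 taken from the ADR text.

```go
package main

import (
	"errors"
	"fmt"
)

// maxRedirectDepth mirrors the ADR's stated cap of 20 hops.
const maxRedirectDepth = 20

// followRedirects walks a redirect chain starting from a workflow spec.
// lookup returns the redirect target declared in a spec's frontmatter,
// or "" when the spec declares no redirect (i.e. it is canonical).
// A visited set detects cycles; the depth cap bounds long chains.
func followRedirects(start string, lookup func(spec string) string) (string, error) {
	visited := map[string]bool{start: true}
	current := start
	for depth := 0; depth < maxRedirectDepth; depth++ {
		next := lookup(current)
		if next == "" {
			// No redirect declared: current is the canonical location,
			// which the update command would write back to the source field.
			return current, nil
		}
		if visited[next] {
			return "", fmt.Errorf("redirect loop detected: %s -> %s", current, next)
		}
		visited[next] = true
		current = next
	}
	return "", errors.New("redirect chain exceeds maximum depth")
}

func main() {
	// Mirrors the loop fixture in the tests: a.md redirects to b.md and back.
	chain := map[string]string{
		"owner/repo/workflows/a.md@main": "owner/repo/workflows/b.md@main",
		"owner/repo/workflows/b.md@main": "owner/repo/workflows/a.md@main",
	}
	_, err := followRedirects("owner/repo/workflows/a.md@main", func(spec string) string {
		return chain[spec]
	})
	fmt.Println(err) // the A -> B -> A cycle is reported rather than followed forever
}
```

The visited-set approach reports the loop on the first revisited spec, which matches the tested error text ("redirect loop detected") and avoids the unbounded fetching the test fixtures guard against.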