From 2dd7da562ab2412109b6721b02d0963c63990410 Mon Sep 17 00:00:00 2001
From: James
Date: Tue, 21 Apr 2026 19:11:14 +0000
Subject: [PATCH 1/7] feat(codex-plugin): add @hyperframes/codex-plugin package

Bundles the 5 hyperframes skills (hyperframes, hyperframes-cli,
hyperframes-registry, gsap, website-to-hyperframes) as an OpenAI Codex
plugin per developers.openai.com/codex/plugins/build. Users can install
directly:

  codex plugin marketplace add heygen-com/hyperframes --sparse packages/codex-plugin

The skills/ directory under packages/codex-plugin/ is a built copy of the
repo-root skills/. A lefthook pre-commit hook auto-rebuilds when any skill
file changes, and `bun run lint` runs a drift check so CI fails if the
committed plugin copy diverges.
---
 README.md                                     |   6 +
 bun.lock                                      |  23 +-
 lefthook.yml                                  |   5 +
 package.json                                  |   4 +-
 .../codex-plugin/.codex-plugin/plugin.json    |  42 ++
 packages/codex-plugin/README.md               |  54 ++
 packages/codex-plugin/assets/icon.png         | Bin 0 -> 42811 bytes
 packages/codex-plugin/assets/logo.png         | Bin 0 -> 42018 bytes
 packages/codex-plugin/build.mts               |  78 +++
 packages/codex-plugin/package.json            |  20 +
 packages/codex-plugin/skills/gsap/SKILL.md    | 211 +++++++
 .../skills/gsap/references/effects.md         | 297 +++++++++
 .../skills/gsap/scripts/extract-audio-data.py | 188 ++++++
 .../skills/hyperframes-cli/SKILL.md           | 114 ++++
 .../skills/hyperframes-registry/SKILL.md      | 104 +++
 .../references/demo-html-pattern.md           |  54 ++
 .../references/discovery.md                   |  53 ++
 .../references/install-locations.md           |  45 ++
 .../references/wiring-blocks.md               |  91 +++
 .../references/wiring-components.md           |  77 +++
 .../codex-plugin/skills/hyperframes/SKILL.md  | 347 ++++++++++
 .../skills/hyperframes/data-in-motion.md      |  19 +
 .../skills/hyperframes/house-style.md         |  71 +++
 .../hyperframes/palettes/bold-energetic.md    |  14 +
 .../hyperframes/palettes/clean-corporate.md   |  14 +
 .../hyperframes/palettes/dark-premium.md      |  14 +
 .../skills/hyperframes/palettes/jewel-rich.md |  14 +
 .../skills/hyperframes/palettes/monochrome.md |  14 +
 .../hyperframes/palettes/nature-earth.md      |  14 +
 .../hyperframes/palettes/neon-electric.md     |  14 +
 .../hyperframes/palettes/pastel-soft.md       |  14 +
 .../hyperframes/palettes/warm-editorial.md    |  14 +
 .../skills/hyperframes/patterns.md            | 118 ++++
 .../hyperframes/references/audio-reactive.md  |  76 +++
 .../skills/hyperframes/references/captions.md | 132 ++++
 .../hyperframes/references/css-patterns.md    | 373 +++++++++++
 .../references/dynamic-techniques.md          |  90 +++
 .../references/motion-principles.md           |  69 ++
 .../references/transcript-guide.md            | 151 +++++
 .../hyperframes/references/transitions.md     | 112 ++++
 .../references/transitions/catalog.md         | 117 ++++
 .../references/transitions/css-3d.md          |  12 +
 .../references/transitions/css-blur.md        |  51 ++
 .../references/transitions/css-cover.md       |  43 ++
 .../references/transitions/css-destruction.md |  95 +++
 .../references/transitions/css-dissolve.md    |  66 ++
 .../references/transitions/css-distortion.md  |  45 ++
 .../references/transitions/css-grid.md        |  10 +
 .../references/transitions/css-light.md       |  49 ++
 .../references/transitions/css-mechanical.md  |  30 +
 .../references/transitions/css-other.md       |  25 +
 .../references/transitions/css-push.md        |  41 ++
 .../references/transitions/css-radial.md      |  37 ++
 .../references/transitions/css-scale.md       |  24 +
 .../skills/hyperframes/references/tts.md      |  75 +++
 .../hyperframes/references/typography.md      | 175 +++++
 .../hyperframes/scripts/animation-map.mjs     | 596 ++++++++++++++++++
 .../hyperframes/scripts/contrast-report.mjs   | 335 ++++++++++
 .../skills/hyperframes/visual-styles.md       | 211 +++++++
 .../skills/website-to-hyperframes/SKILL.md    | 121 ++++
 .../references/step-1-capture.md              |  74 +++
 .../references/step-2-design.md               | 178 ++++++
 .../references/step-3-script.md               |  96 +++
 .../references/step-4-storyboard.md           | 247 ++++++++
 .../references/step-5-vo.md                   |  42 ++
 .../references/step-6-build.md                | 166 +++++
 .../references/step-7-validate.md             |  88 +++
 .../references/techniques.md                  | 387 ++++++++++++
 68 files changed, 6578 insertions(+), 8 deletions(-)
 create mode 100644 packages/codex-plugin/.codex-plugin/plugin.json
 create mode 100644 packages/codex-plugin/README.md
 create mode 100644 packages/codex-plugin/assets/icon.png
 create mode 100644 packages/codex-plugin/assets/logo.png
 create mode 100644 packages/codex-plugin/build.mts
 create mode 100644 packages/codex-plugin/package.json
 create mode 100644 packages/codex-plugin/skills/gsap/SKILL.md
 create mode 100644 packages/codex-plugin/skills/gsap/references/effects.md
 create mode 100644 packages/codex-plugin/skills/gsap/scripts/extract-audio-data.py
 create mode 100644 packages/codex-plugin/skills/hyperframes-cli/SKILL.md
 create mode 100644 packages/codex-plugin/skills/hyperframes-registry/SKILL.md
 create mode 100644 packages/codex-plugin/skills/hyperframes-registry/references/demo-html-pattern.md
 create mode 100644 packages/codex-plugin/skills/hyperframes-registry/references/discovery.md
 create mode 100644 packages/codex-plugin/skills/hyperframes-registry/references/install-locations.md
 create mode 100644 packages/codex-plugin/skills/hyperframes-registry/references/wiring-blocks.md
 create mode 100644 packages/codex-plugin/skills/hyperframes-registry/references/wiring-components.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/SKILL.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/data-in-motion.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/house-style.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/bold-energetic.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/clean-corporate.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/dark-premium.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/jewel-rich.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/monochrome.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/nature-earth.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/neon-electric.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/pastel-soft.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/palettes/warm-editorial.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/patterns.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/audio-reactive.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/captions.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/css-patterns.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/dynamic-techniques.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/motion-principles.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transcript-guide.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/catalog.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-3d.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-blur.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-cover.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-destruction.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-dissolve.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-distortion.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-grid.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-light.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-mechanical.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-other.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-push.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-radial.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/transitions/css-scale.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/tts.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/references/typography.md
 create mode 100644 packages/codex-plugin/skills/hyperframes/scripts/animation-map.mjs
 create mode 100644 packages/codex-plugin/skills/hyperframes/scripts/contrast-report.mjs
 create mode 100644 packages/codex-plugin/skills/hyperframes/visual-styles.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/SKILL.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/step-1-capture.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/step-2-design.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/step-3-script.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/step-4-storyboard.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/step-5-vo.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/step-6-build.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/step-7-validate.md
 create mode 100644 packages/codex-plugin/skills/website-to-hyperframes/references/techniques.md

diff --git a/README.md b/README.md
index b25001e62..4622c23c3 100644
--- a/README.md
+++ b/README.md
@@ -33,6 +33,12 @@ npx skills add heygen-com/hyperframes
 
 This teaches your agent (Claude Code, Cursor, Gemini CLI, Codex) how to write correct compositions and GSAP animations. In Claude Code, the skills register as slash commands — invoke `/hyperframes` to author compositions, `/hyperframes-cli` for CLI commands, and `/gsap` for animation help.
 
+For Codex specifically, the same skills are also packaged as an [OpenAI Codex plugin](./packages/codex-plugin/):
+
+```bash
+codex plugin marketplace add heygen-com/hyperframes --sparse packages/codex-plugin
+```
+
 #### Try it: example prompts
 
 Copy any of these into your agent to get started. The `/hyperframes` prefix loads the skill context explicitly so you get correct output the first time.
diff --git a/bun.lock b/bun.lock
index db108df88..2243bd99c 100644
--- a/bun.lock
+++ b/bun.lock
@@ -21,7 +21,7 @@
     },
     "packages/cli": {
       "name": "@hyperframes/cli",
-      "version": "0.4.5",
+      "version": "0.4.11",
      "bin": {
        "hyperframes": "./dist/cli.js",
      },
@@ -60,9 +60,16 @@
        "@google/genai": "^1.50.0",
      },
    },
+    "packages/codex-plugin": {
+      "name": "@hyperframes/codex-plugin",
+      "version": "0.1.0",
+      "devDependencies": {
+        "tsx": "^4.19.2",
+      },
+    },
    "packages/core": {
      "name": "@hyperframes/core",
-      "version": "0.4.5",
+      "version": "0.4.11",
      "dependencies": {
        "@chenglou/pretext": "^0.0.5",
      },
@@ -88,7 +95,7 @@
    },
    "packages/engine": {
      "name": "@hyperframes/engine",
-      "version": "0.4.5",
+      "version": "0.4.11",
      "dependencies": {
        "@hono/node-server": "^1.13.0",
        "@hyperframes/core": "workspace:^",
@@ -106,7 +113,7 @@ },
    "packages/player": {
      "name": "@hyperframes/player",
-      "version": "0.4.5",
+      "version": "0.4.11",
      "devDependencies": {
        "tsup": "^8.0.0",
        "typescript": "^5.0.0",
      },
@@ -115,7 +122,7 @@
    },
    "packages/producer": {
      "name": "@hyperframes/producer",
-      "version": "0.4.5",
+      "version": "0.4.11",
      "dependencies": {
        "@fontsource/archivo-black": "^5.2.8",
        "@fontsource/eb-garamond": "^5.2.7",
@@ -154,7 +161,7 @@
    },
    "packages/shader-transitions": {
      "name": "@hyperframes/shader-transitions",
-      "version": "0.4.5",
+      "version": "0.4.11",
      "dependencies": {
        "html2canvas": "^1.4.1",
      },
@@ -166,7 +173,7 @@
    },
    "packages/studio": {
      "name": "@hyperframes/studio",
-      "version": "0.4.5",
+      "version": "0.4.11",
      "dependencies": {
        "@codemirror/autocomplete": "^6.20.1",
        "@codemirror/commands": "^6.10.3",
@@ -438,6 +445,8 @@
    "@hyperframes/cli": ["@hyperframes/cli@workspace:packages/cli"],
 
+    "@hyperframes/codex-plugin": ["@hyperframes/codex-plugin@workspace:packages/codex-plugin"],
+
    "@hyperframes/core": ["@hyperframes/core@workspace:packages/core"],
 
    "@hyperframes/engine": ["@hyperframes/engine@workspace:packages/engine"],
 
diff --git a/lefthook.yml b/lefthook.yml
index ac6c70ef0..e211b91dc 100644
--- a/lefthook.yml
+++ b/lefthook.yml
@@ -12,6 +12,11 @@ pre-commit:
     typecheck:
       glob: "*.{ts,tsx}"
       run: cd packages/core && bunx tsc --noEmit && cd ../studio && bunx tsc --noEmit
+    codex-plugin-sync:
+      # Rebuild the Codex plugin's skills/ copy whenever a source skill changes.
+      # Auto-stages the rebuilt output so the committed plugin never drifts.
+      glob: "skills/**/*"
+      run: bun run build:codex-plugin && git add packages/codex-plugin/skills
 
 commit-msg:
   commands:
diff --git a/package.json b/package.json
index 6a7c897c0..cfad59682 100644
--- a/package.json
+++ b/package.json
@@ -20,7 +20,9 @@
     "set-version": "tsx scripts/set-version.ts",
     "sync-schemas": "tsx scripts/sync-schemas.ts",
     "sync-schemas:check": "tsx scripts/sync-schemas.ts --check",
-    "lint": "oxlint . && tsx scripts/lint-skills.ts",
+    "build:codex-plugin": "bun run --cwd packages/codex-plugin build",
+    "check:codex-plugin": "bun run --cwd packages/codex-plugin check",
+    "lint": "oxlint . && tsx scripts/lint-skills.ts && bun run check:codex-plugin",
     "lint:skills": "tsx scripts/lint-skills.ts",
     "lint:fix": "oxlint --fix .",
     "format": "oxfmt .",
diff --git a/packages/codex-plugin/.codex-plugin/plugin.json b/packages/codex-plugin/.codex-plugin/plugin.json
new file mode 100644
index 000000000..7e9aea197
--- /dev/null
+++ b/packages/codex-plugin/.codex-plugin/plugin.json
@@ -0,0 +1,42 @@
+{
+  "name": "hyperframes",
+  "version": "0.1.0",
+  "description": "Write HTML, render video. Compositions, GSAP animations, captions, voiceovers, audio-reactive visuals, and website-to-video capture for HyperFrames.",
+  "author": {
+    "name": "HeyGen",
+    "url": "https://github.com/heygen-com"
+  },
+  "homepage": "https://hyperframes.dev",
+  "repository": "https://github.com/heygen-com/hyperframes",
+  "license": "Apache-2.0",
+  "keywords": [
+    "hyperframes",
+    "video",
+    "html",
+    "gsap",
+    "animation",
+    "composition",
+    "rendering",
+    "captions",
+    "tts",
+    "audio-reactive"
+  ],
+  "skills": "./skills/",
+  "interface": {
+    "displayName": "HyperFrames",
+    "shortDescription": "Write HTML, render video",
+    "longDescription": "Build videos from HTML with HyperFrames. Author compositions with HTML + CSS + GSAP, use the CLI for init/preview/render/transcribe/tts, install reusable registry blocks and components, follow the GSAP animation reference, and turn any website into a video with the 7-step capture-to-video pipeline.",
+    "developerName": "HeyGen",
+    "category": "Design",
+    "capabilities": ["Read", "Write"],
+    "websiteURL": "https://hyperframes.dev",
+    "defaultPrompt": [
+      "Turn this website into a 20-second product promo",
+      "Create an animated title card with kinetic type",
+      "Add synced captions to this voiceover"
+    ],
+    "brandColor": "#0a0a0a",
+    "composerIcon": "./assets/icon.png",
+    "logo": "./assets/logo.png"
+  }
+}
diff --git a/packages/codex-plugin/README.md b/packages/codex-plugin/README.md
new file mode 100644
index 000000000..290a19b94
--- /dev/null
+++ b/packages/codex-plugin/README.md
@@ -0,0 +1,54 @@
+# @hyperframes/codex-plugin
+
+[OpenAI Codex](https://developers.openai.com/codex/plugins) plugin that bundles the HyperFrames skills for AI-assisted video authoring.
+
+## Install (direct from this repo)
+
+```bash
+codex plugin marketplace add heygen-com/hyperframes --sparse packages/codex-plugin
+```
+
+Then enable the `hyperframes` plugin in Codex.
+
+## Requirements
+
+The skills invoke the `hyperframes` CLI via `npx hyperframes`, which needs:
+
+- **Node.js >= 22**
+- **FFmpeg** on `PATH`
+
+See [hyperframes.dev/quickstart](https://hyperframes.dev/quickstart) for full setup.
+
+## What's inside
+
+Five skills, source-of-truth in `skills/` at the repo root, copied verbatim by [`build.mts`](./build.mts):
+
+- **hyperframes** — composition authoring (HTML + GSAP + CSS), house style, visual styles, palettes
+- **hyperframes-cli** — `hyperframes init / lint / preview / render / transcribe / tts / doctor`
+- **hyperframes-registry** — `hyperframes add` to install registry blocks and components
+- **gsap** — GSAP tween/timeline/performance reference
+- **website-to-hyperframes** — 7-step pipeline turning any URL into a video
+
+## Structure
+
+```
+packages/codex-plugin/
+  .codex-plugin/plugin.json  # Codex manifest
+  assets/                    # icon + logo (PNGs)
+  skills/                    # built output — DO NOT edit directly
+  build.mts                  # copies ../../skills// → ./skills//
+```
+
+## Editing skills
+
+Edit the source in [`skills/`](../../skills/) at the repo root. Run the build to refresh the plugin copy:
+
+```bash
+bun run --cwd packages/codex-plugin build
+```
+
+CI runs `bun run --cwd packages/codex-plugin check` on every PR to fail if `skills/` here has drifted from the sources.
+
+## Publishing
+
+The plugin ships in the main HyperFrames release. To submit to OpenAI's plugin directory, we fork [openai/plugins](https://github.com/openai/plugins), copy the contents of this package into `plugins/hyperframes/`, append an entry to `.agents/plugins/marketplace.json`, and open a PR.
diff --git a/packages/codex-plugin/assets/icon.png b/packages/codex-plugin/assets/icon.png new file mode 100644 index 0000000000000000000000000000000000000000..05a8356a43f5abb76325dcb33be8e23b3356e706 GIT binary patch literal 42811 zcmeEt_fu1C&^AbyswhRkfFMCBp(8ak1r!M?(joLBO}f<3iy&2c2}MDrN$-T-A@mNB z-aDZL623g|%=<5VKfN=VN#;K1%)Qxr_uAcS_e5%Hs8HX1beDjDfLcve`8@%_9sJKb z1mypH`8-l8At0b5P*YaW`jWZlN0#|i#rXn(DwrQokjioy-g*F0yid*%L`AM#6-cWt z$P_H3%gw@K&SLIT(H79x#-L{rX%oUF59HwZ?y3P?%n~T{!|o5Od+NCBAbos1uRRV6 zaA2Fs$zk=62(L0lowpG1gS(-MqpD|0xtcr=zJ*8;gevCR@JtZ$Jh=N(DVIhbzx?Q~ zJHdZ9Y5pR@-}5z0ku5+mS5x%C4gvnzyj)GP9Q-nM=>I+Xe`@+allcGOIbXBrL@wa6 z*MGe#X);BcY`RAT#z{9J88y2{Terp489EM2XT9BT2PEWNhB|o5^jcQ_gikHC-2M8h zT|Re4QqV6<@`k5!KyfhJZezIXh^koMR;RSJV_>h@bJoC7nMiQ6_v%_^w^bH72eWCb zIzZ}sUZgj+`0Qq(n!LUTrpsTX|VWCP!1t-7eB&Gb7mv?oe7azRZVw%AWn^XAx);#KA$ld2TE z=hJh$+xt(j_z~~1aSZf$Zt7WKY^f#7YMxJ^K+~Q49+#jXkDg+v^hqDB z)72m>ru2a+S?AUhWW1BC>7n)G0E$+O&$KbFAnI|;8D64Kc(mJN+VYt#x`z72%=$r2 zepWB~1r|}|R^bi?U5v=TT;>jyeKEeSK$(TN6THPDSKIqo`xoOIB0muOqTwUi8NNcM%gc;eg;#HVP?P}0m5em4Ej;AY?p->Rmqu#ug=UrS1$yW zn%yL9tP@OAuYDj`X5x-=i_(4=^cHq(%?Z9DEMUTfT?4ScMLZ28pzl7cId~xSkek8L|?FlJSIhzg8;BmGxKG zn{(HNIKp$RC96_!9;=n`kl%s0G@@AS+y^T_#oJg<2T%JiCX4G%4O_XW1==neni4`D zZhEWbytdlgfht zE?T>%<-T)`+uJ*aiq=zDrK|v-s5RB=R_re11##lHRq*9Zb9OnnKyYdS(`#9VIEvqO z-;H2)|M10qH|`_1)qmu%ncEN% zp%Ee~`nmq4!m8iw` zYsmG)CK>$p$zZyXGVAaw9kSN(u+f*wfUQfWjFBntv{1APy0|RKf+I!kJB;w1 ztAvF=H+1whXSdt>SIc&q`yLAS-L7o9_e%#j4|FLLIotS|u*;+JmB)$(A*9YvjAZbao)8jVP;~v$oWATovkMcA+`{8G*+T^4< z;iMk|k!_neC9|^c*cnIoOl5Wy_7e@>y@z+bdFm4EIRLk2ZH2p;##+mi>uNcL^RRH_ zf$~qdN^9w!_zAsHed0|nshke^?w=QDw~B_p&s)~(3L@F!r@4&E@AgmlxCa(b_L8^U zHOEby6Sfu;ZDyYD7d}4O$yc{~UC8$~Op2h(BS5ci{>Vp8aFvmfT4sOSyj)yS=3u*@ z%q?P9q}^DYAPMlRMPB=w9y!|ThoG^w)%rpU@`<}P^LC1}` zr|fEv_2G3>a89b(f2Z;~m06@9>ohEl5h`_-Qr10^1Kp&sJLBn=zxF1GJf;AwEhGpe z%g(){;n<`vU!bWVjub888Lr?>&JUkKm-}!;o`uf9>_xfGC4N|${KzzUo#Rw@o9YR7 z87q7YPT!Ijvdp287xJf~p=n+LspR8wU6*Fk#dv27!E>yO0_kW6eU<>}u#qWv)Q+K6 
zOPj~{bqp-X&^VFZnj$GO2z9WmO_yJOd-K%ia`Bsb{fd)~xcnu@xkMoJ2DwRWiHJ)x zdl>Z6de5)>${Hl&0O2Y4FT8gLx3sh|HhrEd12OyeNDtk3wY9gqNJW#IKb%gfVLJHo zCI&g`VhFKh))N>4hNLSySmycdi)}sJV5`kw-Yep3advKgYkn)BG6o8n`u>NGia*uk z#2Wtj3?ZYRpkUj!+6$a;;AO+znxt7VO>KPcmoXOx-9I#X!3hjH+-41yP; zwknn+qlgWinccb@m$6S8y{?L^mP)GZm9TIXLKhsM1Kk67`B3Jp(_ z{*Y1-(|7x&Hlm)lkJHbpULQl{>ZH0~3SdRZexpHVx7yt2o7IH zAA*VVAW_d_fx_E=+t;|+HqA3uJd_B%(z&wsy;-dU)tUS|E+opz?pBm_xJp-_A6KIx z6eBc} zf=Kj*pK>bI8RQVWVFZNV%mP0p+Qj1P-g_hZb)PXlDCrAE1e45e@n4CGzpHoj$}XgN zwM1XDy!Q0|Yf*wq(9oRsN+Ru}x6l&Uu@#u^cNrh8*$<|zIc%Y9Zf{z>2(abQpLEV9 z5HF!l)#Rl~3TXFD@rfAw0zV1GhVER+cD0Y6e`zPNj*ZbgyNmye=7O%98n7Pla6N*w zb>M%wfXh5v&=)z3zF8CG{NuUZnL|S`OM*;*j8WGP@keE+xHHeYRcPnt0pM1>R_Xc` zF>G$&KS@jmoInb+<^R^qE?`c(YP2?%jU<48_8XS>w1OdkX^Q7KCjB4cBSS4pvR(gf zZ1cTZZNx_;imAUg2S^1JbPT2n-FR*eUJrGR<5ocWO2PiHD&bOqbbHUqtFODR)qynT zRlzJsj!(7*d&jMx?)UbFrZxtw(`)N8?yO^w(b$c1B=ZTwf)$_iH*D7<0%5FT#3u@N zUxHT=*u@<>?jmg&ecCTKwApfa%rOhfrOBxm3=SU0og}$}uFz))KQ=3MkY_(%zCA*s zB6j7VtSvhi1A(!*-tW@6N4>**b z=$gHKIbfF>Hw@y9&i|NN%XzL1^g^Yj06%K%@HK@clI87*WyIJdc8U4lNP3i z{w)7VLN?X{F;{PQ(nG}34N?7z0bS5l3;2-VbfQ|bUfLY_bIjw#!ROAfn+tf$EhTNJ z*_|Ak90ews)nixK4K@vM{Tbzd-Cgd>wpf~HuVm(>w1hIFo22-|)6&nugNlI)l&LX2 zE3M_gf8c+aJmp3YjxosAOxy8`tWEy1$n4F#3{4MvaGgOal7W^;+VS+j->XF1JBh8d zas;vlY&#iKwU=;+_n9?Zu*&4ogZYPeRRrQzW74j%hl99w*ag5}HzL0|>zgD;DWuz^ zjnZI&=3evDq~I@Af79xg`tg8z8u{5Vj(3`LOV#BLsP==U*}%_cR{~0nS?P+B_;nKb z-P(eRMgAU%_%4-h`reyQU0|&{O=OB|fj6(ePbd1^QJ3+Mc0h^@gMCJseH6;EIW#6f zm0Q#B;j5F)4gXCRA{D^HSwF#suVp1s2hIG+Hwqcov>D~sD_iHTrM_W9>vZTV;_iFX zDKzbeoDNEI@!H(4Uey?SJ@pQ)cf|G>%m?uox3ZUawFQMI0dk?eS`Sxe=-s|z^8Go? 
z6R@Gr;onbyRKc#{8kh;@o|erC-1Zz-gCAas!mceT;Y@NH2?hhcU0Gb_Mut+@FK~iYdU069po6M&(9& z5M9e`w)Wg^?OpZk-|Z4tVr$iin95tUYQ9}O^>@aN=)r?uTNjJ z82nW5gKCn=?XseA`Hm=#oJRlLN7!0$!J*`;(Csxa{C2S%uaj>h>YoY~?Ql)ou+yDo z#%6-xVDsD3J(GG7_4?_iDvc)w1`iq)s``3e$Q_krjc(S8A}Zc?N8g%oE~s*CoKnbX z$CMh>t8&Fy-nVk4r6h2TK@pWOEdEB!98b(^4wvmY>9HNp2DqCK9P>wBlt3q+ftjtb zR~e*1;LkVZz>tztQT8N)I=;Xy+ubwJbc80_&;gQ~EV(wRvVUGDo6>@7udXc%GAqhq zZq>LW;<^zn(d3AG35MesL2du!|2msEOZ|@Mw&SG32-U7u?w%&^87)b%lw2uSYd18Qy=+;K?$r5`U{qlD zYGjlSWlrp04Cav?J>(HcS4K+xL(&H0MWC3Vp%r=Y;--QT`DKWAXhCvuapbvjnOXO9 zwajBVgVCnGN^;c4L9p5Zpf7b%mzsjV_q%E>Zp(%G>o7>fvTjx8w z3%ofz-Q+)(~>kD2TG10h|Wky+8+e>g3`dlJLA11kTfs_p`r)D@zac+KSb`|nUANZ?i zJj^%f+%cUC9>7=2zbSxr3BBPwd%@Aiu{AS*28>Uq`z&PpQnRfz`YLRNApcYsTkWdQ zUr-YM07aa9%SDnAsA;gDx?WRO3!kRP&9#;S4{;%s!%O%=OJ3;lmQR|eI+MtLn)MKN zttSkQjh-50{>)nvR?-|4^Xqzu8PJ%tTYq+`vFedvM`id+pWo8j7S=RA#YV3P6K-&O ziV2pEie|xGS8_@$g=Cu?IQ#4!7e;LZErj9D{9w21OQN^`{=VDLfvc;9XQ@Qd&cxk3 z+hf_OsCC{}lMLHNU`He-YB$HvGs-UpEg4bMOABFnls2S)S53DfW1*jMwVVEXety@< zl;oDg`%6DRj&wOqygV}(M~)NBU*-JRrhmV@@FJXjGl=!jwgNz8FZ_CX?>Rg`{o1J6*}*B$6xH^GZ2lJrU{24NHc%{X~HAQGOY1mdD+Q#Y4fl2=gM( zFM202hAz>t$1kEo*)KPL1gtq6qRrh)pKc~BW*2di)H`zNu)vj7-z#gx+3ang&pF^0G<(B1&!8GN}4IOonrxhQlTHA=z=|Xonrz z)~zpY)r2Oj)r8OBzb>_hIk?g%fcCXwkuDz9F6+KQBcm+5U6>Fgl{jeMh*VIY;2%~z zY{pC}&%6gRHB#worDRUYa!Sv_ruUhCn$+c*@}69RT%LV$@bMm~Dv7&EY~tcqEl3j0 zYppTwAoYM^LPXIY6wU4N0KW}KYc>w_M9L>A6!V5@U*_Hvm|i}4l`yxP z;L@7Jo^E$bss;H->0`mp#Qn=HV@Utu%5G*4o;fP}GSGE_UDMzp7bQy77ZkA`^j@`N zG$nBNd;Nm|8xrWyk3#?rT%U3p8*lIZtx zh~L)Kf&wL-p<_Cn7jqd^h1nx_Xc68t&r z;r_XSX+%K(o3mB$*$kd^)O@jJEdPPTYHlPtrTOzl5p*hDWWLMCGGtyLe{bbiu~C7S z%RP>oKHW{COp=Rx6)7)FU2Of@6V%&l*!DjWWs*wP{*`;lXP7zpGBoG=M>Rg=AFgu` zto7WIE15^u_McgQvd0D`c{y`NJ$eRjv~9@&*{COJamS6vagDZ-=vwaLdwOOL4qpaz z|0(^O|9hyJi?|?5E~ZiQrIXW7^J+t;Z$H~cxKp$lLr}03de6qaOenYsPnP4=cUKUL z6*0akY{X*1E0<0h-t8V1KaMQp*W0a49*^4+|EUl|Cb=};^oXwqt@cqbVc?X4>x%f> zZqzNATpzIbsiZ=oRA3w=U^LP&9ij!R&H(M368;PE%Pe_tzEPDngR(MPYgOsoJUsd7 
zOOCv&H8nLFpxH(3{iF;wC}CkNGktHvvSAFGdiX+6pX$q?4$^mGUc-j^7kd0r>ssFA zBf0BmmA40FnDgxoSrwW05czrvhlLGLa%qMq+3_DS7wTT`=ETXLhR!ht&r*z|jAavS zBpiK@{;rN5yz98kJ(RPH1l0d*GGu+!%CoTt)-)VVtIt?&lWXExZjbHA!Nl*phyl*FaQpzTN)_w|K<<^wa=I) zZLd#@cS}E6wXBaWaB;(k%+G?5^-(fnf zJCs4^=&;VN3F@-7a+!SY6KeFo9iYTr2NQ)qj`I+Gm7k=t8)M5EogC9X`KLPmvtvc_ z{XKIt35U4dVwstSyOje*x?Wdh%8pqiDpu+;DhA1_GKQdO80iKL1pHaYYP=*;=goW6 zzdffevfVkpqB?T7sN%IcU=v+z#EVFz>JBojSmhmxo(*&*$gQJY@Wo!!9WGHh?FmI|a1;abA`;dO$wi zig|i`HBnYcD}OptQlw?iiJ_hG95yhPz5&L9ZfzGm#KK0p0GjTG-|Y&!7uloIN&N2w zuPvM%*|=GUNhx?jsr)_iG)wxsS7a5M|Cb90duVL_lNS#~N!iTxq>d!aocm-nVB0FH zfb4&*web4s0l>EdpZ%hbXkUoz>UAtK0U7F__XMSfSu^R0P1BPbM|jKU{GiTV{z_yt z0d_gPkB$aD-PQ~D0o9%MBM=s+f8N?q;XV5E*?zeVE_|te9@!#y11t;|^CrS^zhvxt zqx?|J2NuRXOE=jl`B+gu(bfugzu9ZAA$@N}Oyz^wr$~8h1PInTpa%z>o zT!r`WG&TE{otK0B8F0zf)*i^T0}$mQ@@`WNcg6Ns0W z?Tvj2UhA%ROpS|i9Ow6(u~^-3+2MuF=u_kp`upN2lN0{}3L0@BFSESf}4e5;x2HI@UVNz6^JL*`?;&ENeO0FJ`c~T6I1SHAFiB6d@yz zc3^{(w4Ti zgCxJxtj>fk9B0AX4hqMEGO(NL>0M9U4+eBA>xJ# z)npeuv^qC7qf6fqWq0#nn6$YAa6Y8Z^XjXlapgGO$%FQ$5*fQ(t|D&FX2XX&hG~ep zJVMjNMNg4%>1!wBDLf-g(c4C(!?>fx+Z2(4#hR5x#fH@RH@V-TOi=pHTdBa1xEtds zM&aS8EfAV?O3N+rZrm}p?y=FP04*!F?z~@RgnsX$)_U($qA=xTou~S2Hs&0_WTO!7 z8y8m*c;XZ`5)Dubx%iE}N zimWwC(6NM1*;hhEP%p6b@~@)Yy@s&j{pX$>oC7|u^$4z5zdEmN82DIW zRfMiX{0Pfq>G4^V9TBN5lsXUhZVp6jF*OqZo0@$bq;(;%Vo`jVeAGAr-YA?6cu~f% zJb+bHyK0ZZQYoJ_7J>LH!nt5zfZ6&k$tnn5#y@@?Vp+SapQ}R@m zs}t$25KX8 z*Eh*`cgu)Aj^7@?k4CE7H=JE;JyeI`;>32z^4cb8tfLdH14K683 zbfSl@jQ&*NlEjGTv2Or{Tr1tK?aZ;o{?B;{I^B~88NXmwKO>q0*`+}UQcGcs%IP(0}%Y zNhm?%)@9LIm&v}W49k>eJte@^`N0<15GOq>r?2S;fAr6DOv8*j;oco_Dr2U zcHODYX4dJTWBASxK@Dke6z%*z3w&n32*CWb9i!A>)jp-BY6xQGEdHi-EY{dk9m$uG zH2G0Z;go=3bv!+!@0?9izgR6UqGctj>TEBF!x>w*p2t#eaM09fJq3GE30VMt&7zLN zUforH!1=*{>;v+0w4Xw=78TRV=)J=X@9YgLgI~T|8DC`6MbJfJ^$?nw^z88605tjJ zaiD!^+=kQ`PEo0m27dAr$d4^OJ*!Q z!iGebr?(+6S4DBn4yfz(p8;doGUv2F4-L%_LBq@2$Bp|Z%dmyhO(?Vf*zYdS_`}?d zRqvS-*h7c7vE&Is%DIX^w&{SJBHue9fbo}ZPF`6#QR&KBz&N-Ofn>raot9KlD0{q} 
zxt?S_-KsYM+bIRiSIw!R$`eAM4z@A9T`1r0)@@SDd(+|hnfJkitG;6k<)*L$yFxjE zujskw!Byc#5${94JWu+` z34DRWLcS~|8#t;!{%(-~*)B`19S@n(gzmLGY?*FpkYzKohQ;O2;uQH_ElFWdy z$wd4;jB#%?F<fnx-N3~4$jf%?66-5H1s;rQi7s)#N?#I*C&6oHV zd)X5^hUxq~hf%1i4_$mWdRB;jnv;Hd+&3?xOwFOSuR6{5Nh!VVn%#Yh#SP!*jI$Vg zL2%EVwaaC@*Ok0orJ#pGbrrid8!9v>ccO|czU8q3&BD<|%0ey*DneFGH3E3!?dpAx zpo`72Gh4gCDVhkm6sD$c5dW~E+prYIYg&Dewj4_C?7&`&&~!8oJCfeG>kH$ME0a{= z|F84Va(GJ99z#JCN-QGdFbtJ(IxjLFk+8F>G6t6p1cX+&=cvr=Tp;#cr1QU%BsbRe zhfqWnwXd7&Wx0?`Y5Ouvt4M}yYr?lmHR2>au~!J+t&dI$WuM#2P~ufR3qgBK`*+pt zjgn3Dng(TfB2zj_5_ibD*gJb4f|KQ1lFl@IIX2_gr?%i(DXxSJqnl0c7}RZ4V=2Ba z&Cot~G;iC-gMwLVVBjpsZ+M9@OZxB|Y*fPM)fXLydgEBSj1*Lue)=SZ{4l?=9$iCp z4#nH`xze>qPipfhbIs)Nv~P}REUF}~P4^u`g71@&1xzcJcr~4vy$ijz@B$w9>Ao!u zo~UI>F^)7g{k*)__HGxF_l8Rr7y`hI3^0itc|l~0r6nTMDwzMth5d|G-9n{e;29Wa`AvaQI0PN$mRa)gNo(pvej_l(Y`Uv~kH#?FB6#o+qb zr!YLN4tK`QQbOkGr5{i|cw917Yz_80kOb|cT;J_Gzf)xnc5(2nV&24wlk&xBJe}yJ zUZ#$pwDeQ_KCd@)a)F!VMz9UsVuk67$hF=zyKfu+>{!jf~h)X!r1motMR7 zn#(QG?PKD(3Zc2!rKvvQf0D}IVZ@J?Q`75~+BfO5cIiirDYd1-w3O|&5|7+irwc_Z z&=hu`cIhqoXk7$S`7(;sX1=jVV2{1e?pCI-5{&rqcF@-c!p|nz{}2-UJHj64BIJ~@ zeUH4uklvep`d%@#M>4IFcOy~nU}>Of(g1yIqQ2?~%CW0O4Uabw*)dBOSK)HFaBHOO zC&oQ;K^vJcfKU641h#8rrDNo!A)Hys5R8kR$k}z}j8s0DQrlyQ?{wI4p0lby)7N{) z5&rpp9|enmG3MW`51V(W`7mttESS-hRmkOmPZ%P7t@rq}2tvkTNm)h5nL&k^5E=2f z?mQ6&<07PBj;q;>MNz3o4zsVZ`>NO7_za>OVrZ$-aOtQ1_)!)iz6NuPb@Fw)03kf* zR+Q{e@dl&5wBC}XWf*m-H9Ed-vSs#2JR8u6O5(Ur9iOpAoCz9p5a5k}v!%*VqSlF~ z;2c=-eG+vPAI<$oEua(KyC#sVI=Dv9g6EV6sE$TmG`lD{>pEV2X&{4|-np+wa&c zy`KX@@bC`_rt9J6e!s9 ze7;J$`i5RSBwHq#3trY+Yk^Ce-1M~;4%E1&uywzZZtZB7e!5QuK`C*})a8o=T1V4R&(tq187A0$;xE zmDlAu%_k6o1ffdkw@7n~eu$X`3wq*1ix#aa=2OEs>i&@Q;F}rfFArln__x1mP|aen zDk4!7S|{0!)kxmj-ompKDF-^J=)kU9my>+(#gAN)aY3}n2-6vVCivDJl#RV9%t6r7 zGf%6S57n>6v*c?#YH={7bUJV8an_O{)m2^ck%q0^yruQGdS!}bN57n!{WpZJ({oCn zKKC%RJ0JY2x;hkdqh8xm&L>Rn;^1Aq^9(Wets?2{dAIecOLt{NeB-!v(~a{EIL1qD zR=y8}7Lt3#TAQ7r?bvj}&$x)kXThYA2&x=0pN*i}W)~DCLUi+`b#B?mr9)v2P|)yP 
z-O}cE*YVSDU!4?O$RXEG!Isl~6BG?e29KR~*Mrssw0mjyYBuR3;5y`zZBsIhbX(3~ z5stoUwsl8C?}y%@QzNa>`R9?CaS^*MGlu{)W#qB&X865dD}IR+r2R7yPeuQi3t*a6 zt|oqS$zThJS7-`*tTHs&6ula}b$B_s$Afw~q(;p?09u0EC+L&I8MDgbuofF1WgPI4 z@rPH}#hXEwyK?hT8+vuyJ5(%fBgUuBhDzE#9juLsyuLNR7s#Mj zHSXYMa@n>$6@XjkZBxzrjxmsaLjkMVV99=U;l)rtl34YfM>gtrVawMLx~}00Twv)P zdo{=(*oF6d|C@VhZtpe33K02?sf^R`5x&~hPnxpROVyYY1QwB`BOCH8AWNJ`kO=Dv z#T8=(FegC>A2qv;&tA9vy;7ailRQn<{XG*m4%$N0+p-@r>A#&_iuWFCISK`933!FI z)_Bn<*pjKex+uixVrgW?XS@LaK)>gia|+O#`VuWvBhZ>5Q-lY%dzL zT4jCCiITE~ervLpozWLrF(mBWxN8f2TTp0hs8n_-{@E~^Z@;bYvcw*t`3Lsz6ze5_ zNq5CvglNT6lue*sOK}sdqWl~v#P`4JoA^_(>z1noT$dCAFcLi`cUw)YJNGu_Y$pyb z4$I#-gGz+B>s&jYyyt}m;QX2Z_oSy4d%3vk4S?kxdSWiXv)RyeJ&28^;`f)l#2HRa z=xiY3L36t))O#qihjqtX&9tj28|o$3(af!WG%_RJ;tuD=%ITjxn?5Om>D~?`Y*>zC zRDMD(rDEQ|q#b2@WXpuLc|Ine0WR?IaEcTQXjoesFus6g0MZ8cgFR}fOy-2b_fPX7 ze3(`ij3-p^-07B|Z`Dr4fFl{;X-J#qaXlK``%)bK?|!0GpcXvB@>nV`(NV!QLOm;> zZ$}Oumz{Q0$zU-$YW3-Uz5nyx9ZTRz|<%A{>q>6Zurs0C>POk9oDZ?n;suLb>AMOJB zT9mkG*-u^O+CM(vTkbvmbQPU@VAN^RGId|JuW4cHI#hla(3n}VxeB{1#!oSJt(k5* z6ZlbX2w;vf&T;4A1#IPr(6VjD^Df=Z$vzshq_Wc7B3FK|n%-_CL)$=}+mqTI6{^iA zoYV7-71`UD6Qv{??oB0Y+8%%3+Gwl1ol?)liY512_RNH6tcAG_Ky>4xq79v}`C_aA zWeQJOGqyddL(Kac67waD=cLI5S0kDS{{=W}Q%#Lj-&-w~d_52}L;r=PpO0_$y4Y}D zNUb_iS4!!U`$!=syi${Wa;}w@q5ehY&bGGwuVzlM_w7#cUe!OmBmw5_`fpQH-foi6 zBd6;g|5d&{`S8crls7$vLoRH`Nf}Ftcsf{nv|ElVZ6DmU3&M59D>c3aqZbPKU)bK` zrjQOM!gKhiP;B8f^s-(L>Dy$>+`=gHv(X#IXVI{~ZT4-+V=h=O!OpzCx0mG0+xBS+ zzP{(5PAG&tw{_DZTp81qG#5i21LuX{pM9lHFvLF=NIEMjS%a=9puoQj%z74vw;4;# zB=({=g~>D)CCq(3e$T+$MUf7?t2#C6JGG|AE`Os*&BWPHLj5_)Dwi?gTdBwt<)L>? 
[GIT binary patch data omitted]
zrnA7(h$C|=bZ535Sf(+ZQ=8X$Py?FfyVs;u>Ku`GMS6ad>}vA_#r8?Y2YCD6*{Xoq zjrX?#i8A`4K-&UWG}AwB@{oUE9)@n}zlMQ(bqh)Uh+xGxBQ4EHsSXwzE^0|R+zqe+ zf-BbNPt_jH)@9d@M~T-w+hYHnJilN5Rdi?GD!+J zs{foV63X5OfCJo2qb=-nT-u(O;ZIK7##s`QPO6r-pKBQGOx|yXE>(>5zv`z}{{EEo zZPcn$LwQ|O9&b~yHy?T!caDF`>`H-z19*%f;9lR_wn3Hi;r_vKu*ZD+pRZMcC;g{t zJAMrM8^dHKn$KD#qZQt~w@T8X-AHD$a9r^%eWz5Qo+RT#(-qZ+;3D_1PYsHeT;RjcZRdw93K; zKpw^GCCq?{WVIZ*skRo=IIVt{Z%J3b-s6CEsq>-7csOa|!`@`4?9xNsy-9$wKD*If z^t4f{c>>StUfe5|N?;Erdwz7-V5;H5aTK$_?1MNCe||Zw zqo|MwOtqBHOQNjY$FdF(SbwI3hu&jHqi*=bY__+fN^xW>6%%J!6AU07M=^OkyKl7Zc{EJ^^9_jL?Q`HznbZbBGkH|5h zhes!#+%r3dbU?MXI>SWq@qF(hD@*?>8`9%c_8c6go-BKi^)5r8MoK<6w%4(a<88cK z94erI_%~KjJ0|o#sO)DemHN&ehb@?Og(x3oy49KrWM}~wF=oG~`rok@qJCP~!SaD^ z9}nCGGMsyTMy}oeS+sCh;B&ujDaS+Hnb;S!Ey)cH`!K<9NYQ%XVvq!b+;iSxR{Ef2 zuc6T)s&$Ygk+0C2nSx?c=AntB!NX$iNUR+?mVz0J|4pz5ymS4Jh7s zHZ;C@wODjAHv9{=AJYX@jCZ5}b3#3zBQW!xGAEsw7Dabd^njJC&;8>gJMkZ=Sf6XU z`;>-&{KFuo<8jRlx#bVov5(Zexckq$e0fKSNKES)>5c#@;kxh$QCub$qalgE5USY= z@A%U|2S#pPGrpf1Z2Q2t1HJci#4P6fzd}Tcj2+$ zcn2T@{9xL0A|xl;DLfKuc$)ZAc#$NYQQ*i@`DN#pvmtLi)zueBR2O8gl2iA*dtI7+8l+_t*Ps z)^Ky0qZ_44F<^pNxoLj()Wr^roy@e8N54AOn(ljh>JZb*vxFS*VCtI~NM@03=)q%4 zGnJ$Fv}6gO)ENO^qHQwnME<{e`{)#Mqj}&SsNJs)N0QusQ2Y*IP^$BMin&EG^6wW# zn-o#zqBUpSe(1>Z%niR3H*6 zy70MIMb*2}6EA8%P|6HWeR_k&>17kL=pV8RR1SEJf|ouwk^JVo+V#bcZaX;H@^W6h zOoThK))^qqtv)2$f%CC(k9qaa%=cj%XwHG~z%#QfBFZMtAGkJ08hU!Pet85+lk`6Dz9Pjp&+D#cv+hz3d{Y8|$AB_>A&XI0#M>)&XI3xfDPtE}e})Hs4gR$T zA$Q}hB-3*4U__jAt?q-s5|sZ2{hAKcnUuf)iyPf!tm*qZ8D%R6n)wm&NrQB9vwJ)G z*Cyp)yn^|BLzT+donP+f?tfHBzjh8!+8fA2z#E>nUSWKD=+((B^qZW3uhhj2(i9Q> zF-rOBM%0P_(sY0pU|{sbXFJ2?u+`g#X&T#VP@qKMLnW zVS9$Y?=^@GpD9oPWc8OGmk#OsC$35R)Iry|w+1Vp`t?nI2!G!NCjgFvh+au3PZaKi z_CFtRIg0`Sj6E_2v1>_s98rhXLb7dF@7q6Le!)6)-V@Mm6Tu|^7PMeOMc)^{5{t=yAlqhY^U6O4rC>u6w}j2+|ijwq;8qC|~9& z`pwxBrypyv6Yl3U_o@`Vi6qwP>*=@rGH|{$$IVn1pujT7fo9{hZ-#U3t=1!>D@PI z_rJvPjL3}Vau+zX-9s@;O=l3*(#TR~8eh{JZ@XT+n`^X5_HPjaVp2>^rOS%yKU&KQ 
z&8GB!o_g&8zj51AN@B}FTJZ5Wq2M_atjc^K^!pV&X?DUj^Hh8{7%8*cF8gGR<8wsW4i<75L)wah{gYrGL+Dox#MK@GvO)(EqoS%OWW<+LrEcEEsbzx?1<= zSR(e%zvjIlB%%uM^rLaP)S?R&GY!ix^`hNDrOWA~d0#ye8jPs2lmY9A4(TC`nTVAC zW-|2ne{R-Nz>Oy`ELK+R8#%P1J2Ss}*g1|jKixo8m$7gWqCAt@*>5=wFpKBeWpktC zm%~Txc5)e)?)hr8t9u=lOJP61%<<3?VgVW|&2^Rb{X_X1LyEZY|9<}{h}!D>$E|`E zFU79CUli%-stnfI!#Sqid0z%(T7-HOa>8vplt5r5%Tx z`+G;_j*yhyJ~*I2{ER@Q;J?2IYWhsl4FCK0<*cCpb7#~woln5{{kLy{jb8Nscqju% zJb<$TTHpVf+~og%|HT``73X{sjr=fTv*=bFb#|JDB>mI13)Yq(^&GyQ#v*wEemUTc zR)lbs@6ZAFjo3V9&(gk-kny^wg~tF@;*E`P-Lf{6g(Eluxs&+Z zbHxTasZeV(UQHJ|;Fb6kN#bahO=X4xX^a|h%BgI-uCzlHV=j!I@*cPRQ7lGNC~-9Z z!2INU`|WWJS{s+;TIu^{duLp&@G>ERAl&Lq|3X$#B87?aZbL|9?eLw z4_Y$fg3vN?J@~l&8-_a*3~)hZ+ujtA*uX8{InHszO^Q%m9%AS9mq|znn+jcL35CZB zWFFH|__)y}MqLx;+HA|q9h`aEfXri8xpGXMk=8+xsJ@6{(T8U{tY@mPV3PMYphlOn z0&-nfkWj@tA*g<(!bB^l726to4|aqnK%tiDIMB}JN%x8fCh8Bwo~~35an5vHkXO&@ zy{rBcKr8|x71p5Ih*b1<4j**n#ey3Jj`Nfj%n%u#8+;k63sV|W!V&aWV7Xm`&;`J}y;XcPWq(S!1eZ={-0e;A?u?``nG+NzNWV{cb7{-6rbnXgc zKRR#&L%O!Fp?;4c)Tb~-rgSCjr&UM?ySDbR_@Ex!XDipHdB;_eyi?2eBq%X5;IGl6 zz#(Su0%XQBzm+<*c??+%GTHvx(D?^Mm_x*(8lshYY#sPy06Rl|EM|~&E-+ptVWhsjGn*P@OYQaLyX~PXAvUjP_+#e9O(0TTe=X>Eqr+2B zjNYod$s^6lNY-z##&-gy24gO^@{eWb|GY*YqOo=W5I z_^NZ)DJfU{POpi1xuCyteEtWEb2v~MAF4ovT`J;dD8t>X=iMx#AxQrFzX`hcWvcy28iR<)2Cowy$gMz z&Kcd!c+)3=5i9VTteYKtc`fSbp0+m5P3?AyRkdBgM+wT#SQCp>y$|P#3|rqSZ}faQK_UbBy=W2{Q_TqO;32(mY?EZ_M=-kXwoa8Vh+7%J34Fry-^9bOR#hwvPPmKC!ANPpCTZ^*UD8ZO3Mhgvf8j$0mJVq(`NXxY;Vj| z3(P*DlD(zPMdzd!9;^LIg!;BXl74V0ZyG=2*f{oXAmxH0>0R6y$ssd5e*ZaARPe?? zy8cR^T}fk*vN6`2f|8wVj3On@_+6aS038)-K-Mh!o)!;L47U~P-(X!rxUa*PCTl+zA z-_xIeAxt}3@qfb@CoeODxBLt*-H1ct1-nQ!zANu_G^0@8lJJR8BhYohO{wd#rt9Aa z-+e8)^%K*20M=f6#m22yz>Tcc)~CEz3p>_PyQa4yQ-e*Hy|6doj{Zb6f_m2|DRjZ! 
zklnnF&d9eAq%c|wK+|v3_cZez_j=gX(Yz;CF?U$dYREj0`WD0=lW)~w>JdJk7`7;t zc$>>2RUwdOHlcrqbmRGOmFM$oA)m`G3$fGf%PJwSvVtK#Qo5n8&m%KAhH}IJ7pBFxEs9zjsL4pmh%+> zmu-J@9en9XnT@m;bwTHvu)in3fJqh7bUo`(b_CVaWfh7{HM#*Qc=mMw3sfBXI2lb@ zAV_$S?d5q|#oYZF;ijnI{9cQ=TWpf8+%os#oY2a1Y+;al@gej50>hQYySJoITRe*P zZ7*|!mOG~G*ZCA)pXXP`tsy_k=Ej{}Pd-ytXXA2Q8FH1hbbPWRr_MVN_RjatRq38- zC&{H4Hd{d*F!V?1k)ROO{vV%X11xI4)qWDtY>k-V0CI%WuPMDqvqL5W9P|j#46! z%Jx5pb3GT*Yp=(CFmwr(`FE{Y7j;<|T>iaoYqh8JWO0i1!b9_7feqFX*djTnMPBm0 z*5fZJ`!LSs$e<7>`i9#>!lKV49?BL!S*)4ZbunP1uP=;s3f6y@F4kKep*c9x(KtM= zAo10_cxfB3@pwk@_c|MIr~R#;WV-O!`%WN)&!SK^Q-N=<3!G~>i>2>#t_#xZv}2sw7VO12l_JmoV6rDD0OfT3^lwrKz;m-fB`z zd=~}zmfU&>>O8@+?X0%f9NF-^?%@j{2Rs&+O(<*bg0-?Cw%tJj7I2jKwD6Jmrp<4Z zp&;Xdz3Zhb#h6|ieD@)rMb-_rDlo;GoxE36`XykM#7g8%uGE5K^RvOXwbgRsvrz z)U%#yq+#Hk?ZSPxvq4w{QG@;t=)=krMBw$O7PFp#dkWsPtqjYF=dgPkF{bX^X_4nP zeUEy*`wB%Vt~@yhO$dmWV*qy~yDsGfSo%lO8S|u@*n^wv z3uKBq3UT_52H7CXLiWkXoVRCOhDx0-Og=a7zI!Yz-yj4rQGdkOS zlYD+$41`m!m+IopTIZFj5JltPrg}ChJhx8znj)GH`QZo70s*?86`qSrLck?^U5uid zBaWtDzm|}_>8^_N_xy+>m1`0{AWRS*o7fsmhZdtjeUDXdUSZCbi&A7Iwv`ch@{q=$ z+s9%}$MP8#S1CyM%2Z8?%2G={`}^D}$YmMmtWBoR6Cm(q*p^GAv=r*m7RQ1^3qy)I zF4navMF;WxX+zX@IjW3Qz^KGY7~oapW0}aq1UHxOmZDGc*oaYL2*q?VzSquO!5UNc z$OfCO+*kZYdEKu%F%&aiCWBbpw;s-Dnj_`6voF*U6!)i0gVbjGx)aqY$G#W1#+Y&u z?o*O=l1blqg!TO+;iA*hEe%f}D)m$Agb(uI7 ziYMI-x$vj(dtcs4&@yC5-8?ZI@355SPWRPwlA8{>ysV19|LIm6=tx@TfTHA@_acUC zWnkFoSPu8}nrtu^qQ_C`ky4^jZG;hU74z$}-XNoQ*aHXY0`wM8jgxwot+;W@SX7y$ z5c_d@c>0L_Pn(B@u*$fQwbj1Xg|B-LEspnSQyTe1 zlOITwib+)}XSH9Ey<9ffO3{_9>hW$k2-y1zAG_=%o9;3PXF8Gfh1jt$h`#MEI-gK2 z=397{0>dzkHdmjGJ&t|(uNJeIo#0Goc6R`U>wd61B=n#bLN}`DXuU91#If=#gd#<| zGP2q1hvyIFcSk9_5obVv5Ed-C=`@Ws^0w2s8lB?3`cZ@-OLC+$b$S`|0ik}mPM2Jl zVE-yBb}i-bsAa_F{d=s2JDFhC*64UgpEwNgZvBMVfQW&>w*h*gtZd3Ts_1yIS@v9Ka9WMz9r>*dMDeU>03@f}t!9XADrH zQgnM|xj=*2u@Wv!zC%yuD@A_!1m{}sPK`DeNJ5pikGLPv$|@@$H{#!Z&ls^9f4W;B zcGU^f+Wj?QUq9sZYD(h)4DTE$$5768l8BwmbaInDJE{_tmXP)CZ+k zXx`{YSIlqnJNits*6GE+P|aJ={$DIWY2zg;K_^#5G4@t}xIl~?k*de_ob34yd0Dsb 
zjZMA5(51^~IIGEbhBPwASof9hH8c6CWMU18f|muCTUje9v}6qd=Kz;T%~y>~yh}n9 zchOk}-qp2Zb(A}0e+Z1a8o_sMHlHSXObQQM)vp;z#($6s>Y%iXC*%MSe9b(#Y&?q8~{}xf`np?3!86>|8~#n zF5=3)vb9mpc=}A0u1IF;mE`@GaY#6a3WMV7z5%Z5!baIP?MmC~)r|sF{>Rp~%Fgj2`P6o721w$&4VW@u)ED`DMpP@1Kfly(?0s8xUZBOdUmA7~jy*w;}{a zzgD1MA@q}OrMc-ONBfWOBKDu9VUHS7XZyV1k`7I5_EpX+X=(P}OTDsg)b_3B-;l1g z>fPtUH@r*xLf`5)FqJNH?&y3v%GR9aZyQ=qbiEWOVt!o$Dbk?KT)cFgTT0#Y?a5`e z9iM9k!|FaHB1@fatz@xlBjuY#azayX6n#X`FLE&YRbrB2q$63{DoDh)C(S-!Ee>tJO87a_EOgdCLy2CzylZqwq71y7ffp z4cMYRL$Ct1`&Dx+UOl15ML}fYDQxcMTZ3BOz1V%NUEQopB3%w>>=^z#OX1tqs1GLt z4E$v8Oac3Vd6XJ=_|0wk9@_%d>X-|gYt(~LrVP(4Tm3J$H4IDO<~1yrv3EW(VEi_@ zb<|X->}pS#-s=_`SlaKtS+9~zwnIFU)L}<{{FbI3cPwh9;``y#22%jpmkaFZR5rLyJ@!Vq}4#J(R1)m4o;%!<$I)NfYdXCvH(jY~Ym*JT--_Bpv{f+Uay{g(&*HFRt`Dzi$(|SU^s}>Z zmr20hAQ^{^R59Pk9OZl{-K(^L!s|yjbsgl!6T(Q38tMZtmpPLp#-o&F z0(o^%JuZaV$0dancC{(IMaG3Hf<7{=Ls^ehHJFZMGdw1{I6$dMw|u3NZ=c3|wGtP# z6TY-_%ax}rO!Xdj#DYK>l zczwZH`bz4~#ZW`(ld<}nm-)oC2PV7);eo0Afn>9><8&(QzPhg*jJ4|!ssq0jN?Dt` zY_%`+->op1GIUg4G6{^-$(WV(1jazUlO|03z4leS24mfNGh1t}(~Q1M>Wt*YR3n-< z%T|1%h!pRBeebVsqE+Nt@aW~QLwDNr@2wh-(8?O)M8-SDV#hrQY3wr4;_1}!_LS0u zu#7mRc9B(!QBS9v)ZRFKM?K>46DDuflvnOI8fGaK+n$X2p_6}KE-D^@QyvrCHKz9p zX(&Rd^Mf6Yb{A{dL1qH}rk%}`ic!8yMeTH0H(U-F`Mb6Ycd4ExtM0-&t-}J^9=FE! 
zlZ@}DT%Z=<9PIoU8(&uOcSG7D;Xb7&qWHlh1}rWBMkKW+TbwS{WXet12yhUF~F6!$fNTqQKgbFTctQSiM* zD93gFx=E#xEayYv9LpwOP=97RSzb-<9T4~Sx8!)#xy??(+QT{*gE4l{jqPBn-aCcg z;J4LMHfVWun&nShBz~^OSbr?pT~BA|xJ3V6C->%7sanmoD088#9sHRU66ET6MEwVp zX;^G>?_sXnUH|L>j`u$O9i~#Xf{Rvbn5LU%W@6#@esz9D*LNC-^<7p7<}UtmILd!` z9Fs#bV|^B;l*8}Mym?nR^bPWq&Dhb?TIT%Y7Ypu(LYNbQ89pCWnsvS?^ug~Bm7S8; z;aQ7jftLy#mo%<7;(|C`Jv{UNpQ64z9?JIne-J}4ma+_LjIm@1G02kLn2AxAltN($ zS+XVT2wBH6BpI?3Dntp{CmBg&DG&# z+4l^?xc=MYCnAqKo_v%QnMOz7B^h0N7Qm-^Ue}iJqH>6ylr~9}%40Ghm~tJhxN{?O zK+3$TEa~7vCEcYwwu!TkM+8!c80zs*Il`6SzLY5U2y1Wki;Yzz=cDI@h^jugoi<2I zk5qi;_W7&v8*`thV%&Z=x^@t{xWE$Z=1fcZ@3;`5*ouWF#RbEn-(KaZV*6a1e<<*P zv5nAmuF-(osDp8noZlI5lb7N`Y(9@3>x1|HP;1f7Kf$)oN~xi2UVGrdGuN5lFMnBw z40&th#Lub1TJUL(f9kIr*l+Uls=UUpVB=bMMuKbk3kYyyn~`)v#-p+!MQDR6qR-$V zwC0Z65vT3_!*_f%i)3W{u+MoH5!^l1hW_Y{#>t@U5_)RW{wF)PA;Tr*VKPEVCztK* z7`xn7hJLf~rqHpq162Z|?&!PtF!0#@{XEr!APoq$Yi((j)!NjGE$vPdaoyr%`n(^4 zJBK=>Mrpg6!2#MrH@h22IURP+^$PNi0wsE) zf%-Qy?qO=7eZ1AImKz&pUh~b(8xurT-{#lz@Y)o!Ovf(@KG9^`pU@i|gRWzD@+q4A zihAMNNBdSa0HZOYbAiqAE5Y#Q^OTK!sjOT3ER`mrqFD2{G6qF$ftoC^XxC)>e8)V= z4>#d^lee(u34CwS5%|{ohnebXiIn`3gchzou`Uo8h^EHw$vnGbo`K3U`j(RmY!KoA zdQG(cgID&3zxeX~kcy$uNdeqZs_U<|+$%%oycoF`8{o``1|qq)YPWV&%TvKcP+U)M znev;&9Tm^LT_s!N%l!eXZyn3YO#Rl90omzmx2!K(t@!W+g5~RzabrL_s5acO3l^PE zi=ns#KiiabC+j0mL1sEmOE#BJyYkegxVHtXEf&*Xn@f6CJ%9LAiV;Dwn=OR38L4>OlrL-vBwA;?sY9QW(cJ zwd>S2>1-1!VGb2>4-9`H8J%)Iq3HVBQufh}! 
zzq-N-evZV!_c;6Irj?iOE|(S@GKlusYwId$z6JA~k&42N;LTc9QX$6hEeuP4k>bZt zy$mi-`L&64i@kn{8{5*Rzx`27EKuoVkoV`V)8LmZvUV>BkjVLh9?Hm6_||9p{flKr zVbGaWA(fHhC&Ooh6paS5c9CWEs$DJMC7-qJ*b2ss;Sg04yN^Py_A9bo%BjS^&2`2} zk{m_zl6>?wk*|$B>>-ik1Liv)ExB_b6(h9th0Z#lnk_$|5&>de1Egz1fgGKeN4FEV zJ|_qYZf;w}ZQ~V?ubV#!+#exbc<}_qc2J%G;c3rYx|FOg z_lMkyM5&QrSZtZi0<^2a+ey>W9)Noe~1hg9(3Mb9Qoo=+@ zJP8_#{-!-r2+i}rIp4ZY59e7Zop0238N3T_=XZH;zOe zV9(z<;KJp)tu>Y;RQO@TP3#ly`m2&0`4W$0tj18~Hfz7LHvx9i7ISvI_+8O&>_@Gn zV1cpg6ti>ueV0ltb|n!j1}c{a7*q<_5pTzB(t`d;zIaIod%nE>fT^@%GtdwU!Wxg1 z5%t+3tRQu2)Ih0VQqx6T-mT-A9XZ!ZOP1FuSBjw|KWo0uxy&u`*Xnt%sFChas7AUq*3=R|#$gmlLuw>H zP|VMO7fhCdL0qGg56RE_+EgSiT5b2ty2X~Y7Q6l88&ue7wp5JHvke0<#5b>MRbo<* zegZ!mM9%@r4TG8ZH* z`dF(I$C;~N2fmoZ(0(%g&7pJ>A9{b9e~lhCP?aC&I_2v8^%~YiVItP`Dl8^VZQvc> zhoKL4WuYyn`mo1SiB6GDlOHuRfb7}xw4ukRD#NO=Pyw%PKJ{5pgX*7aR@@^)6b9Rp z&-8X#g1a5g8@XvWINUNg95j5IwT4tsL5&__#aR6bF4bW>U%g@2ihv;fcuBMOYc@fy zKrC*k2ZK`RqWMczo&kM%2><@@{(1T6N72s`ug8TYTaydSa->c@-xX1hK)OV#y4+FA zEPcg0hP9e;7FKgVs#87OZI?CF+C@2DRFC=2w#Y67)e*%?^EKwh%16rXOkkw{iv>KQ z-)0dhnE2@KFeuvKMHxY>a$fsfnlPCob!;O2YX@g!$mpqMpfbF&*P*EN=th62r<9(R z+#~k2P#G&Gl!6!r(BU-~v#l_M?cNHw{85?shabr4ZC)FJ$SYb)f(l> z3vNfty#)eh6BJ}5X5((s6ob`FA-$NbEm`^Mf(Y5(k9K31r$hhI}Aw8$TcOqy3hzFoju^$nSG=5fx~&<9TgG1wTeT6S)5BS5^gWG7G~a{B2wpP zuACO+(EhN#qB~&sq)J_6S3CA&1W^Ai>}bXu=<)D0UV&=-R}^GvvKZbbhqWdz z+vM%20SJ_H3i>J(mc=PI*hMgWsOLesU=OnnUkPWGvfrU|$C)D%oI-StvnYF-5x>n~ zHiI~YJxK_0^V@qzck^y+5r0P$}^)$ zCBNtRv+IL>(rV8Q0JQa#!Y5Aw1j5wJW6=e>?=?M6`p5diW-aoiy8Z9whtCx`NQA%k z`UEvVhiZWPRdL#|m^A&KqCb^kqGTsz4UK3C4NUJ}85@eJXKomJkr-be( zl*NqzJox1w`lu}f74&MG%HEPsFnOIaWICM*0`*p_ftY=Ds!oi=Z5qfi+t z+}^AA!RV+)Uc4($&JQ^G^dOVok+TJ@F{>uu$rF!H(YOy<+`!l`_G!#&2Z?!TmoMWM ziYrF`LvVKZOt(7_R{?#=))-0U7O`cCF;fo|{22xkdUZMnR-|!Cg}`?bB!Nm~f?U?9 zEk!r=Jyi;C7B&p|D@*HgyQ@(#2^m$~4xG>}Uj#VCz+4EK8~RgTmChB~ZXx}Y*x^{* znTURhrkKlsHCtp&fRLmn2u+aB87b-ZsM!i=uvfD&_-Orh>xragaih(2V z99=NURT&3B{cQfiB;(qKdLtEd{jmA~mvZ*{_=Z0weJAu>JLLWG@625wn*@0k}kO3(miq4|-yzc*9r) 
ztyF=6W7nAhLG!O;LYHQXqINNf`hmLSbF@CM|_2FN@^04Ez$ zD_y4;Gm+x3{L5X=$@^rAO}RkAj_H=okg8|cj-A<9Ky5Zln6u`!AnVj zw%o&jeJLa6i+nw@90LIj3Z$U?DU)s5&X}LL@j8$TzpM<~8EVUT@yX_seSSk?z;Oc| zPj%KNc*D{BX!%jG`Uyq?3L*4CMx|c@u5wZ2AWxEO1Vdj0B`b$qVYYNm)Z%6kl zsHyu&c+Ot)W%lMYKu;id`(XwPy=Y&99>}+o#faD#d44%#h>PkYlSm$APQiA~UJ{K# zuydj*y-JNH>_gR#2f*ewE%N##dwuZ_dqWA@pg!e0tjvevh&Y~Sc$tVF$6o#_D(%3Q z^GQ->eeKRJ3K18g$(PJaoi7#-Kf5J~;@a5ry{WoadTUx%_1v+qDzL5cQd6-Cr;%#$74 zuDtt^HCrQLRXq=OGI&qqE()e)>m~Aim~7QXd;_eM5n?jx30=;-C(h9u-O)T#+$!Mw zJARb`LH72MPxrCF9ZQP#3*FSQ@&|Lu-Vh{a2A@zj!De``{-aCFHB4Vf3X#<v808GW zvBf1e(M3^SsysCA^%1e5cwEFjf1OG5D9m`>|COBtA0rkm!F-QzDfFSDpG?!aagTQ8 zO=qo9m}J5S%u%1~bKm(4-%`}Vx1?{@K0W+<>`1a=uyY>_2;Vv=hiCdrvHL=Z%5N?e+SzT-j+DlZE;)a7K>DMWCEu_7%GP zD-}0hCzXe{hf_OGQ{^*u!u+R|F61s6?ZF4WTr3W!C}V;m%|wDaY&_2--4Nadc8ZcG z6l7^8a4n&g&bs5oGYHFW4c}Bs1SUa0jCWbNUh?(R`DuI!v}A9oy%dX*a9GJZK@NP$ z;U3#dXcmd#af%kW!32GW)U6BixoQlUKQUaZZ_XnLO=$7upAs1zlNQ(su%&MUH$>mKG$zr_QR{h#uP;f;SnP3<1 zK|K1l8-R$8b@QY0xGWDb{`Ga=C>f56;>5R$^VXzJp{oF;78)|ET z!{j8wK9D+NPzTqQzI{_OEi*zV%Js)XX=O;RBMo$CPyD6mZQ8yC1OUf!_EHCyzJ0J$9IhL~P{7UB?Q1?agD za_$ciHW;lZ4T6`1e{-dA)C4j~C7TkOmG*+Hh>ewreSDVuYtQ`(h?!zTv1U{qXdM6q zRz9i%1Yd$ZX9n@uV90Fq8qba!LQct+Vx4h*AgpWga}7~VrqY0_2`|A?w&~uePhGS> zx`>N}YlRh)SAdcYzO-AYQ|>gsH20|hM0L0Pl;BC|gDhcN?#-bur3_NyrFV0g5A(Y_ zt6X$B!UiZA{dpCq_&Cy48(>WIaz2BxE<9l=rKf^J%&Pj4#yFPI%BSp&-lBa%rLaIB z`5uHa)R+9yo}D!Xl#8rPrQEjA<=bUwjA1gMptFJq}XJL0@sKa-f)o^D&>hx;!#$Y(Auy?c1uPoga#(`*@N2exPd7t0y^k)aQ#jfY0{5h9?fE4? 
z{7h%HBi_0#a(+q*5Ehv5iBA+;A#OSjCy2l+j82wKUNLyjn=K_VBrA!yX<3#CAwPMn zE%BML3*;wCKm0)SiJ!m~tv~05KYo|dR-dDiKFL`uun?M$E~AtMaKF>#?UhsBmwwBr zHGgf^(g4cn#V0qj)i)`@OAaWadvP5Cu^#I_bK^JLbQwb?anC?JmJrM1dU;b_m~$1X z3bgR>UW(1*=IPPe>7?PmvVp)e7JT+(#LlA_;0fjxq#~v8{m5g3s5sMj4c>Y{sD5eDc zV?@aotv1h+44@m7>5}E!cdyR%xu1p}nipEH6Kt4T-T$eS@%{n6BjWDb13U-GxXh2j zGARxy4{bFZ@mBcjv8}jY3Iyr3Z7KhR74X}=ju&Dv1i9%unVps6ou9zoe7jsByh~vW zd^FGCi8asD_@r&sHHXpp;+c5uTNqSySQ|!dKKSuTlTMxddKu>nwThF@xZ8*obw734 zL-mM%Qw$wz+Q>H@F9YqMX*R85S?xBqRay&l?on%Mmxb0m+bpoB$E)4nG|G?KLRcx` z0$F6lWwN1`*f731D&X6peXmxqE0|x9sSG{2U}$qGSa`{&zr^??2lQ46Xim#9pNiA4 z=n>e10e5fU+1(3ybrS8WBhw@5fJljkMtK>fDK=JAJ!x|58*Fm(G5+dlF`$06kxCe| znq=t<3Ji4|!iQ<}>V-`iT&0Fjj7(h^4c7L`et)}e9<2km+aCmb?&)KQ#4|;@!w?0CA%Ef!{@nJN7T+=;`-b2Ct!50tBjXLH^O0;YoS8q zbvuB=&SSpL;(pw^8-t2C@^{?s8TO;c(-N2yu9bNc!Aj6c*-}Y?fM6A!`#hs0rnTq> z*5_6~z8i{$@_O<%Uwv1z;#3<10MGe{g5?w%8 zF;b{jt`%4?{+VPdYn~$~)NK~UQUIGin2<mCgvKpGI?F`3ENXNTK9GO_GNCfEC4m%s{+ zOT&KrRL$oxQ;TBkv^Y}VZZ+)X0TTr0X(qOlB_mppAy8B0pu4w zq?}Q%0-M-suIGqh@ty{R$Gw#2W>!VCtybY=|hPPaKUIUu?Fn? 
+    (n) => !SKILLS.includes(n) && statSync(join(repoSkillsDir, n)).isDirectory(),
+  );
+  if (extras.length > 0) {
+    console.log(
+      `\nNote: skills/ contains directories not in the plugin allowlist: ${extras.join(", ")}.`,
+    );
+    console.log("Add to SKILLS in build.mts if they should ship.");
+  }
+}
+
+if (checkMode) {
+  const tmp = resolve(__dirname, ".skills-check");
+  console.log("Assembling into temp dir for drift check...");
+  assemble(tmp);
+  try {
+    execSync(`diff -r "${outDir}" "${tmp}"`, { stdio: "pipe" });
+    console.log("\n✓ Codex plugin skills/ is up to date.");
+    rmSync(tmp, { recursive: true });
+  } catch (err) {
+    const e = err as { stdout?: Buffer; stderr?: Buffer };
+    const out = (e.stdout?.toString() || "") + (e.stderr?.toString() || "");
+    rmSync(tmp, { recursive: true });
+    console.error("\n✗ Codex plugin skills/ is out of sync with repo skills/.\n");
+    console.error(out.slice(0, 2000));
+    console.error("\nRun `bun run --cwd packages/codex-plugin build` and commit the result.");
+    process.exit(1);
+  }
+} else {
+  console.log(`Assembling Codex plugin skills/ from ${repoSkillsDir}\n`);
+  assemble(outDir);
+  console.log(`\n✓ Wrote ${SKILLS.length} skills to ${outDir}`);
+}
diff --git a/packages/codex-plugin/package.json b/packages/codex-plugin/package.json
new file mode 100644
index 000000000..5ac298103
--- /dev/null
+++ b/packages/codex-plugin/package.json
@@ -0,0 +1,20 @@
+{
+  "name": "@hyperframes/codex-plugin",
+  "version": "0.1.0",
+  "private": true,
+  "description": "HyperFrames packaged as an OpenAI Codex plugin",
+  "license": "Apache-2.0",
+  "repository": {
+    "type": "git",
+    "url": "https://github.com/heygen-com/hyperframes",
+    "directory": "packages/codex-plugin"
+  },
+  "type": "module",
+  "scripts": {
+    "build": "tsx build.mts",
+    "check": "tsx build.mts --check"
+  },
+  "devDependencies": {
+    "tsx": "^4.19.2"
+  }
+}
diff --git a/packages/codex-plugin/skills/gsap/SKILL.md b/packages/codex-plugin/skills/gsap/SKILL.md
new file mode 100644
index 000000000..84fbdacbe
--- /dev/null
+++ b/packages/codex-plugin/skills/gsap/SKILL.md
@@ -0,0 +1,211 @@
+---
+name: gsap
+description: GSAP animation reference for HyperFrames. Covers gsap.to(), from(), fromTo(), easing, stagger, defaults, timelines (gsap.timeline(), position parameter, labels, nesting, playback), and performance (transforms, will-change, quickTo). Use when writing GSAP animations in HyperFrames compositions.
+---
+
+# GSAP
+
+## Core Tween Methods
+
+- **gsap.to(targets, vars)** — animate from current state to `vars`. Most common.
+- **gsap.from(targets, vars)** — animate from `vars` to current state (entrances).
+- **gsap.fromTo(targets, fromVars, toVars)** — explicit start and end.
+- **gsap.set(targets, vars)** — apply immediately (duration 0).
+
+Always use **camelCase** property names (e.g. `backgroundColor`, `rotationX`).
+
+## Common vars
+
+- **duration** — seconds (default 0.5).
+- **delay** — seconds before start.
+- **ease** — `"power1.out"` (default), `"power3.inOut"`, `"back.out(1.7)"`, `"elastic.out(1, 0.3)"`, `"none"`.
+- **stagger** — number `0.1` or object: `{ amount: 0.3, from: "center" }`, `{ each: 0.1, from: "random" }`.
+- **overwrite** — `false` (default), `true`, or `"auto"`.
+- **repeat** — number or `-1` for infinite. **yoyo** — alternates direction with repeat.
+- **onComplete**, **onStart**, **onUpdate** — callbacks.
+- **immediateRender** — default `true` for from()/fromTo(). Set `false` on later tweens targeting the same property+element to avoid overwrite.
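The `each` vs. `amount` stagger options above can be sketched as plain math. A minimal, simplified model (assumption: the linear `from: "start"` case only; GSAP's real distribution also handles `from: "center"`, `"random"`, and grids — `staggerStarts` is a hypothetical helper, not a GSAP API):

```javascript
// Hypothetical helper: start times for n staggered targets.
// `each` is a fixed gap per target; `amount` is a total budget
// split evenly across the n-1 gaps.
function staggerStarts(n, { each, amount }) {
  const gap = each !== undefined ? each : amount / (n - 1);
  return Array.from({ length: n }, (_, i) => i * gap);
}
```

So `stagger: 0.1` and `stagger: { each: 0.1 }` both start targets at 0s, 0.1s, 0.2s, ..., while `{ amount: 0.3 }` squeezes all starts into a 0.3s window regardless of how many targets there are.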
+
+## Transforms and CSS
+
+Prefer GSAP's **transform aliases** over raw `transform` string:
+
+| GSAP property               | Equivalent          |
+| --------------------------- | ------------------- |
+| `x`, `y`, `z`               | translateX/Y/Z (px) |
+| `xPercent`, `yPercent`      | translateX/Y in %   |
+| `scale`, `scaleX`, `scaleY` | scale               |
+| `rotation`                  | rotate (deg)        |
+| `rotationX`, `rotationY`    | 3D rotate           |
+| `skewX`, `skewY`            | skew                |
+| `transformOrigin`           | transform-origin    |
+
+- **autoAlpha** — prefer over `opacity`. At 0: also sets `visibility: hidden`.
+- **CSS variables** — `"--hue": 180`.
+- **svgOrigin** _(SVG only)_ — global SVG coordinate space origin. Don't combine with `transformOrigin`.
+- **Directional rotation** — `"360_cw"`, `"-170_short"`, `"90_ccw"`.
+- **clearProps** — `"all"` or comma-separated; removes inline styles on complete.
+- **Relative values** — `"+=20"`, `"-=10"`, `"*=2"`.
+
+## Function-Based Values
+
+```javascript
+gsap.to(".item", {
+  x: (i, target, targets) => i * 50,
+  stagger: 0.1,
+});
+```
+
+## Easing
+
+Built-in eases: `power1`–`power4`, `back`, `bounce`, `circ`, `elastic`, `expo`, `sine`. Each has `.in`, `.out`, `.inOut`.
+
+## Defaults
+
+```javascript
+gsap.defaults({ duration: 0.6, ease: "power2.out" });
+```
+
+## Controlling Tweens
+
+```javascript
+const tween = gsap.to(".box", { x: 100 });
+tween.pause();
+tween.play();
+tween.reverse();
+tween.kill();
+tween.progress(0.5);
+tween.time(0.2);
+```
+
+## gsap.matchMedia() (Responsive + Accessibility)
+
+Runs setup only when a media query matches; auto-reverts when it stops matching.
+
+```javascript
+let mm = gsap.matchMedia();
+mm.add(
+  {
+    isDesktop: "(min-width: 800px)",
+    reduceMotion: "(prefers-reduced-motion: reduce)",
+  },
+  (context) => {
+    const { isDesktop, reduceMotion } = context.conditions;
+    gsap.to(".box", {
+      rotation: isDesktop ? 360 : 180,
+      duration: reduceMotion ? 0 : 2,
+    });
+  },
+);
+```
+
+---
+
+## Timelines
+
+### Creating a Timeline
+
+```javascript
+const tl = gsap.timeline({ defaults: { duration: 0.5, ease: "power2.out" } });
+tl.to(".a", { x: 100 }).to(".b", { y: 50 }).to(".c", { opacity: 0 });
+```
+
+### Position Parameter
+
+Third argument controls placement:
+
+- **Absolute**: `1` — at 1s
+- **Relative**: `"+=0.5"` — after end; `"-=0.2"` — before end
+- **Label**: `"intro"`, `"intro+=0.3"`
+- **Alignment**: `"<"` — same start as previous; `">"` — after previous ends; `"<0.2"` — 0.2s after previous starts
+
+```javascript
+tl.to(".a", { x: 100 }, 0);
+tl.to(".b", { y: 50 }, "<"); // same start as .a
+tl.to(".c", { opacity: 0 }, "<0.2"); // 0.2s after .b starts
+```
+
+### Labels
+
+```javascript
+tl.addLabel("intro", 0);
+tl.to(".a", { x: 100 }, "intro");
+tl.addLabel("outro", "+=0.5");
+tl.play("outro");
+tl.tweenFromTo("intro", "outro");
+```
+
+### Timeline Options
+
+- **paused: true** — create paused; call `.play()` to start.
+- **repeat**, **yoyo** — apply to whole timeline.
+- **defaults** — vars merged into every child tween.
+
+### Nesting Timelines
+
+```javascript
+const master = gsap.timeline();
+const child = gsap.timeline();
+child.to(".a", { x: 100 }).to(".b", { y: 50 });
+master.add(child, 0);
+```
+
+### Playback Control
+
+`tl.play()`, `tl.pause()`, `tl.reverse()`, `tl.restart()`, `tl.time(2)`, `tl.progress(0.5)`, `tl.kill()`.
+
+---
+
+## Performance
+
+### Prefer Transform and Opacity
+
+Animating `x`, `y`, `scale`, `rotation`, `opacity` stays on the compositor. Avoid `width`, `height`, `top`, `left` when transforms achieve the same effect.
+
+### will-change
+
+```css
+will-change: transform;
+```
+
+Only on elements that actually animate.
+
+### gsap.quickTo() for Frequent Updates
+
+```javascript
+let xTo = gsap.quickTo("#id", "x", { duration: 0.4, ease: "power3" }),
+  yTo = gsap.quickTo("#id", "y", { duration: 0.4, ease: "power3" });
+container.addEventListener("mousemove", (e) => {
+  xTo(e.pageX);
+  yTo(e.pageY);
+});
+```
+
+### Stagger > Many Tweens
+
+Use `stagger` instead of separate tweens with manual delays.
+
+### Cleanup
+
+Pause or kill off-screen animations.
+
+---
+
+## References (loaded on demand)
+
+- **[references/effects.md](references/effects.md)** — Drop-in effects: typewriter text, audio visualizer. Read when needing ready-made effect patterns for HyperFrames.
+
+## Best Practices
+
+- Use camelCase property names; prefer transform aliases and autoAlpha.
+- Prefer timelines over chaining with delay; use the position parameter.
+- Add labels with `addLabel()` for readable sequencing.
+- Pass defaults into timeline constructor.
+- Store tween/timeline return value when controlling playback.
+
+## Do Not
+
+- Animate layout properties (width/height/top/left) when transforms suffice.
+- Use both svgOrigin and transformOrigin on the same SVG element.
+- Chain animations with delay when a timeline can sequence them.
+- Create tweens before the DOM exists.
+- Skip cleanup — always kill tweens when no longer needed.
diff --git a/packages/codex-plugin/skills/gsap/references/effects.md b/packages/codex-plugin/skills/gsap/references/effects.md
new file mode 100644
index 000000000..82c0ebafb
--- /dev/null
+++ b/packages/codex-plugin/skills/gsap/references/effects.md
@@ -0,0 +1,297 @@
+# GSAP Effects for HyperFrames
+
+Drop-in animation patterns for HyperFrames compositions. Each effect is self-contained with HTML, CSS, and code.
+
+All effects follow HyperFrames composition rules — deterministic, no randomness, timelines registered via `window.__timelines`.
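The registration convention named above can be sketched in a few lines. This is an assumption about the contract (an array at `window.__timelines` read by the capture engine after load); `registerTimeline` is a hypothetical helper, not a HyperFrames API:

```javascript
// `globalThis` stands in for `window` so the sketch also runs under Node.
const g = globalThis;

// Build timelines synchronously and push them onto the shared registry
// (assumption: the capture engine reads this array once the page has loaded).
g.__timelines = g.__timelines || [];
function registerTimeline(tl) {
  g.__timelines.push(tl);
  return tl;
}
```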
+
+## Table of Contents
+
+- [Typewriter](#typewriter)
+- [Audio Visualizer](#audio-visualizer)
+
+---
+
+## Typewriter
+
+Reveal text character by character using GSAP's TextPlugin.
+
+### Required Plugin
+
+```html
+<script src="https://cdn.jsdelivr.net/npm/gsap@3/dist/gsap.min.js"></script>
+<script src="https://cdn.jsdelivr.net/npm/gsap@3/dist/TextPlugin.min.js"></script>
+<script>gsap.registerPlugin(TextPlugin);</script>
+```
+
+### Basic Typewriter
+
+```js
+const text = "Hello, world!";
+const cps = 10; // chars per second: 3-5 dramatic, 8-12 conversational, 15-20 energetic
+tl.to(
+  "#typed-text",
+  { text: { value: text }, duration: text.length / cps, ease: "none" },
+  startTime,
+);
+```
+
+### With Blinking Cursor
+
+Three rules:
+
+1. **One cursor visible at a time** — hide previous before showing next.
+2. **Cursor must blink when idle** — after typing, during pauses.
+3. **No gap between text and cursor** — elements must be flush in HTML.
+
+```html
+<span id="typed-text"></span><span id="cursor" class="cursor-blink">|</span>
+```
+
+```css
+@keyframes blink {
+  0%,
+  100% {
+    opacity: 1;
+  }
+  50% {
+    opacity: 0;
+  }
+}
+.cursor-blink {
+  animation: blink 0.8s step-end infinite;
+}
+.cursor-solid {
+  animation: none;
+  opacity: 1;
+}
+.cursor-hide {
+  animation: none;
+  opacity: 0;
+}
+```
+
+Pattern: blink → solid (typing starts) → type → solid → blink (typing done).
+
+```js
+tl.call(() => cursor.classList.replace("cursor-blink", "cursor-solid"), [], startTime);
+tl.to("#typed-text", { text: { value: text }, duration: dur, ease: "none" }, startTime);
+tl.call(() => cursor.classList.replace("cursor-solid", "cursor-blink"), [], startTime + dur);
+```
+
+### Backspacing
+
+TextPlugin removes from front — wrong for backspace. Use manual substring removal:
+
+```js
+function backspace(tl, selector, word, startTime, cps) {
+  const el = document.querySelector(selector);
+  const interval = 1 / cps;
+  for (let i = word.length - 1; i >= 0; i--) {
+    tl.call(
+      () => {
+        el.textContent = word.slice(0, i);
+      },
+      [],
+      startTime + (word.length - i) * interval,
+    );
+  }
+  return word.length * interval;
+}
+```
+
+### Spacing with Static Text
+
+When a typewriter word sits next to static text, use `margin-left` on a wrapper span. Don't use flex gap (spaces cursor from text) or trailing space in static text (collapses when dynamic is empty).
+
+```html
+<div>
+  Ship something
+  <span style="margin-left: 0.25em"><span id="typed-text"></span><span id="cursor" class="cursor-blink">|</span></span>
+</div>
+```
+
+### Word Rotation
+
+Type → hold → backspace → next word. Cursor blinks during every idle moment (holds, after backspace).
+
+```js
+words.forEach((word, i) => {
+  const typeDur = word.length / 10;
+  // Solid while typing
+  tl.call(() => cursor.classList.replace("cursor-blink", "cursor-solid"), [], offset);
+  tl.to("#typed-text", { text: { value: word }, duration: typeDur, ease: "none" }, offset);
+  // Blink during hold
+  tl.call(() => cursor.classList.replace("cursor-solid", "cursor-blink"), [], offset + typeDur);
+  offset += typeDur + 1.5; // hold
+
+  if (i < words.length - 1) {
+    tl.call(() => cursor.classList.replace("cursor-blink", "cursor-solid"), [], offset);
+    const clearDur = backspace(tl, "#typed-text", word, offset, 20);
+    tl.call(() => cursor.classList.replace("cursor-solid", "cursor-blink"), [], offset + clearDur);
+    offset += clearDur + 0.3;
+  }
+});
+```
+
+### Appending Words
+
+Build a sentence word-by-word into the same element:
+
+```js
+let accumulated = "";
+words.forEach((word) => {
+  const target = accumulated + (accumulated ? " " : "") + word;
+  const newChars = target.length - accumulated.length;
+  tl.to("#typed-text", { text: { value: target }, duration: newChars / 10, ease: "none" }, offset);
+  accumulated = target;
+  offset += newChars / 10 + 0.3;
+});
+```
+
+### Multi-Line Cursor Handoff
+
+When handing off between typewriter lines: hide previous → blink new → pause → solid when typing. Never go hidden→solid (skips idle state).
+
+```js
+tl.call(
+  () => {
+    prevCursor.classList.replace("cursor-blink", "cursor-hide");
+    nextCursor.classList.replace("cursor-hide", "cursor-blink");
+  },
+  [],
+  handoffTime,
+);
+
+const typeStart = handoffTime + 0.5; // brief blink pause
+tl.call(() => nextCursor.classList.replace("cursor-blink", "cursor-solid"), [], typeStart);
+tl.to("#next-text", { text: { value: text }, duration: dur, ease: "none" }, typeStart);
+tl.call(() => nextCursor.classList.replace("cursor-solid", "cursor-blink"), [], typeStart + dur);
+```
+
+### Timing Guide
+
+| CPS   | Feel             | Good for                   |
+| ----- | ---------------- | -------------------------- |
+| 3-5   | Slow, deliberate | Dramatic reveals, suspense |
+| 8-12  | Natural typing   | Dialogue, narration        |
+| 15-20 | Fast, energetic  | Tech demos, code           |
+| 30+   | Near-instant     | Filling long blocks        |
+
+---
+
+## Audio Visualizer
+
+Pre-extract audio data, drive canvas/DOM rendering from GSAP timeline.
+
+### Extract Audio Data
+
+```bash
+python scripts/extract-audio-data.py audio.mp3 -o audio-data.json
+python scripts/extract-audio-data.py video.mp4 --fps 30 --bands 16 -o audio-data.json
+```
+
+Requires ffmpeg and numpy.
+
+### Data Format
+
+```json
+{
+  "fps": 30,
+  "totalFrames": 5415,
+  "frames": [{ "time": 0.0, "rms": 0.42, "bands": [0.8, 0.6, 0.3, ...] }]
+}
+```
+
+- **rms** (0-1): overall loudness, normalized across track
+- **bands[]** (0-1): frequency magnitudes. Index 0 = bass, higher = treble. Each normalized independently.
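The band-to-frequency relationship can be made concrete with a small sketch. This assumes the extractor's logarithmic spacing from 30 Hz to 16 kHz; `bandRange` is a hypothetical helper, not part of any HyperFrames API:

```javascript
// Sketch: the frequency range covered by band i, assuming log-spaced
// edges from 30 Hz to 16 kHz (MIN_FREQ / MAX_FREQ are the extractor's
// constants, restated here as an assumption).
const MIN_FREQ = 30;
const MAX_FREQ = 16000;
function bandRange(i, nBands) {
  const edge = (k) => MIN_FREQ * Math.pow(MAX_FREQ / MIN_FREQ, k / nBands);
  return [edge(i), edge(i + 1)];
}
```

With 16 bands, band 0 spans roughly 30–44 Hz (sub-bass) while the last band tops out at 16 kHz, which is why index 0 reads as bass and the top indices as treble.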
+
+### Loading the Data
+
+```js
+// Option A: inline (small files, under ~500KB)
+var AUDIO_DATA = {
+  /* paste audio-data.json contents */
+};
+
+// Option B: sync XHR (large files — must be synchronous for deterministic timeline construction)
+var xhr = new XMLHttpRequest();
+xhr.open("GET", "audio-data.json", false);
+xhr.send();
+var AUDIO_DATA = JSON.parse(xhr.responseText);
+```
+
+**Do NOT use async `fetch()` to load audio data.** HyperFrames requires synchronous timeline construction — the capture engine reads `window.__timelines` synchronously after page load. Building timelines inside `.then()` callbacks means the timeline isn't ready when capture starts.
+
+### Rendering Approaches
+
+**Canvas 2D** (most common — bars, waveforms, circles, gradients):
+
+```js
+for (let f = 0; f < AUDIO_DATA.totalFrames; f++) {
+  tl.call(
+    () => {
+      const frame = AUDIO_DATA.frames[f];
+      ctx.clearRect(0, 0, canvas.width, canvas.height);
+      // draw using frame.rms and frame.bands
+    },
+    [],
+    f / AUDIO_DATA.fps,
+  );
+}
+```
+
+**WebGL / Three.js** — HyperFrames patches `THREE.Clock` for deterministic time. Update uniforms from audio data each frame.
+
+**DOM Elements** — fine for < 20 elements, less performant than Canvas for many.
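When a renderer needs the frame record for an arbitrary timeline time rather than a fixed frame index, a clamped lookup keeps it in bounds. A minimal sketch, assuming the `AUDIO_DATA` shape shown above (`frameAt` is a hypothetical helper):

```javascript
// Sketch: resolve the frame record nearest time t (seconds), clamped so
// times before 0 or past the end still return a valid record.
function frameAt(data, t) {
  const i = Math.max(0, Math.min(data.totalFrames - 1, Math.round(t * data.fps)));
  return data.frames[i];
}
```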
+ +### Spatial Mapping + +- **Horizontal**: bass left, treble right (iterate bands left-to-right) +- **Vertical**: bass bottom, treble top +- **Circular**: bass at 12 o'clock, wrap clockwise; mirror for full circle + +### Smoothing + +```js +let prev = null; +const smoothing = 0.25; // 0.1-0.2 snappy, 0.3-0.5 flowing +function smooth(f) { + const raw = AUDIO_DATA.frames[f]; + if (!prev) { + prev = { rms: raw.rms, bands: [...raw.bands] }; + return prev; + } + prev = { + rms: prev.rms * smoothing + raw.rms * (1 - smoothing), + bands: raw.bands.map((b, i) => prev.bands[i] * smoothing + b * (1 - smoothing)), + }; + return prev; +} +``` + +### Motion Principles + +- **Bass drives big moves** — scale, glow, position shifts +- **Treble drives detail** — shimmer, flicker, edge effects +- **RMS drives globals** — background brightness, overall energy +- Pick 2-3 properties to animate. More looks noisy. +- Keep minimums above zero — quiet sections need life. + +### Band Count + +| Bands | Detail | Good for | +| ----- | --------- | -------------------------- | +| 4 | Low | Background glow, pulsing | +| 8 | Medium | Bar charts, basic spectrum | +| 16 | High | Detailed EQ (default) | +| 32 | Very high | Dense radial layouts | + +### Layering + +Layer multiple canvases with CSS z-index for depth — a background layer driven by bass/rms and a foreground layer driven by individual bands creates depth without complexity. + +```html + + +``` diff --git a/packages/codex-plugin/skills/gsap/scripts/extract-audio-data.py b/packages/codex-plugin/skills/gsap/scripts/extract-audio-data.py new file mode 100644 index 000000000..b4efba788 --- /dev/null +++ b/packages/codex-plugin/skills/gsap/scripts/extract-audio-data.py @@ -0,0 +1,188 @@ +#!/usr/bin/env python3 +""" +Extract per-frame audio visualization data from an audio or video file. + +Outputs JSON with RMS amplitude and frequency band data at the target FPS, +ready to embed in a HyperFrames composition. 
+ +Usage: + python extract-audio-data.py input.mp3 -o audio-data.json + python extract-audio-data.py input.mp4 --fps 30 --bands 16 -o audio-data.json + +Requirements: + - Python 3.9+ + - ffmpeg (for decoding audio) + - numpy (pip install numpy) +""" + +import argparse +import json +import subprocess +import sys + +import numpy as np + +# --------------------------------------------------------------------------- +# FFT parameters +# +# A 4096-sample window gives ~10.8 Hz per bin at 44100Hz — enough to resolve +# low-frequency bands cleanly. The per-frame audio slice (44100/30 = 1470 +# samples at 30fps) is too small and causes low bands to map to the same bins. +# +# Frequency range 30Hz–16kHz covers the useful range for music. Below 30Hz is +# sub-bass most speakers can't reproduce; above 16kHz is noise/harmonics that +# don't contribute to perceived rhythm or melody. +# --------------------------------------------------------------------------- + +SAMPLE_RATE = 44100 +FFT_SIZE = 4096 +MIN_FREQ = 30.0 +MAX_FREQ = 16000.0 + + +def decode_audio(path: str) -> np.ndarray: + """Decode audio to mono float32 samples via ffmpeg.""" + cmd = [ + "ffmpeg", "-i", path, + "-vn", "-ac", "1", "-ar", str(SAMPLE_RATE), + "-f", "s16le", "-acodec", "pcm_s16le", + "-loglevel", "error", + "pipe:1", + ] + result = subprocess.run(cmd, capture_output=True) + if result.returncode != 0: + print(f"ffmpeg error: {result.stderr.decode()}", file=sys.stderr) + sys.exit(1) + return np.frombuffer(result.stdout, dtype=np.int16).astype(np.float32) / 32768.0 + + +def compute_band_edges(n_bands: int) -> np.ndarray: + """Logarithmically-spaced frequency band edges from MIN_FREQ to MAX_FREQ.""" + return np.array([ + MIN_FREQ * (MAX_FREQ / MIN_FREQ) ** (i / n_bands) + for i in range(n_bands + 1) + ]) + + +def compute_fft_bands( + windowed: np.ndarray, freq_per_bin: float, n_bins: int, + band_edges: np.ndarray, n_bands: int, +) -> np.ndarray: + """Compute peak magnitude in logarithmically-spaced 
frequency bands.""" + magnitudes = np.abs(np.fft.rfft(windowed)) + + bands = np.zeros(n_bands) + for b in range(n_bands): + low_bin = max(0, int(band_edges[b] / freq_per_bin)) + high_bin = min(n_bins, int(band_edges[b + 1] / freq_per_bin)) + if high_bin <= low_bin: + high_bin = low_bin + 1 + # Clamp to valid range to avoid empty slices + low_bin = min(low_bin, n_bins - 1) + high_bin = min(high_bin, n_bins) + bands[b] = np.max(magnitudes[low_bin:high_bin]) + + return bands + + +def extract(path: str, fps: int, n_bands: int) -> dict: + """Extract per-frame audio data.""" + print(f"Decoding audio from {path}...", file=sys.stderr) + samples = decode_audio(path) + duration = len(samples) / SAMPLE_RATE + frame_step = SAMPLE_RATE // fps + total_frames = int(duration * fps) + + print(f"Duration: {duration:.1f}s, {total_frames} frames at {fps}fps", file=sys.stderr) + print(f"FFT window: {FFT_SIZE} samples ({SAMPLE_RATE / FFT_SIZE:.1f} Hz/bin)", file=sys.stderr) + print(f"Frequency range: {MIN_FREQ:.0f}-{MAX_FREQ:.0f} Hz, {n_bands} bands", file=sys.stderr) + + # Precompute constants + hann = np.hanning(FFT_SIZE) + band_edges = compute_band_edges(n_bands) + freq_per_bin = SAMPLE_RATE / FFT_SIZE + n_bins = FFT_SIZE // 2 + 1 + half_fft = FFT_SIZE // 2 + + # Pass 1: extract raw values + rms_values = np.zeros(total_frames) + band_values = np.zeros((total_frames, n_bands)) + + for f in range(total_frames): + # RMS from the frame's audio slice + rms_start = f * frame_step + rms_end = rms_start + frame_step + frame_slice = samples[rms_start:min(rms_end, len(samples))] + if len(frame_slice) > 0: + rms_values[f] = np.sqrt(np.mean(frame_slice ** 2)) + + # FFT from a centered 4096-sample window + center = rms_start + frame_step // 2 + win_start = center - half_fft + win_end = center + half_fft + + if win_start >= 0 and win_end <= len(samples): + window = samples[win_start:win_end] * hann + else: + # Zero-pad at edges + padded = np.zeros(FFT_SIZE) + src_start = max(0, win_start) + 
src_end = min(len(samples), win_end) + dst_start = src_start - win_start + dst_end = dst_start + (src_end - src_start) + padded[dst_start:dst_end] = samples[src_start:src_end] + window = padded * hann + + band_values[f] = compute_fft_bands(window, freq_per_bin, n_bins, band_edges, n_bands) + + # Pass 2: normalize + peak_rms = rms_values.max() if total_frames > 0 else 1.0 + if peak_rms > 0: + rms_values /= peak_rms + + # Per-band normalization so treble is visible alongside louder bass + band_peaks = band_values.max(axis=0) + band_peaks[band_peaks == 0] = 1.0 + band_values /= band_peaks + + # Build output + frames = [] + for f in range(total_frames): + frames.append({ + "time": round(f / fps, 4), + "rms": round(float(rms_values[f]), 4), + "bands": [round(float(b), 4) for b in band_values[f]], + }) + + return { + "duration": round(duration, 4), + "fps": fps, + "bands": n_bands, + "totalFrames": total_frames, + "frames": frames, + } + + +def main(): + parser = argparse.ArgumentParser(description="Extract per-frame audio visualization data") + parser.add_argument("input", help="Audio or video file") + parser.add_argument("-o", "--output", default="audio-data.json", help="Output JSON path") + parser.add_argument("--fps", type=int, default=30, help="Frames per second (default: 30)") + parser.add_argument("--bands", type=int, default=16, help="Number of frequency bands (default: 16)") + args = parser.parse_args() + + if args.fps < 1: + parser.error("--fps must be at least 1") + if args.bands < 1: + parser.error("--bands must be at least 1") + + data = extract(args.input, args.fps, args.bands) + + with open(args.output, "w") as f: + json.dump(data, f) + + print(f"Wrote {args.output} ({data['totalFrames']} frames, {data['bands']} bands)", file=sys.stderr) + + +if __name__ == "__main__": + main() diff --git a/packages/codex-plugin/skills/hyperframes-cli/SKILL.md b/packages/codex-plugin/skills/hyperframes-cli/SKILL.md new file mode 100644 index 000000000..ba14cfe64 --- 
/dev/null +++ b/packages/codex-plugin/skills/hyperframes-cli/SKILL.md @@ -0,0 +1,114 @@ +--- +name: hyperframes-cli +description: HyperFrames CLI tool — hyperframes init, lint, preview, render, transcribe, tts, doctor, browser, info, upgrade, compositions, docs, benchmark. Use when scaffolding a project, linting or validating compositions, previewing in the studio, rendering to video, transcribing audio, generating TTS, or troubleshooting the HyperFrames environment. +--- + +# HyperFrames CLI + +Everything runs through `npx hyperframes`. Requires Node.js >= 22 and FFmpeg. + +## Workflow + +1. **Scaffold** — `npx hyperframes init my-video` +2. **Write** — author HTML composition (see the `hyperframes` skill) +3. **Lint** — `npx hyperframes lint` +4. **Preview** — `npx hyperframes preview` +5. **Render** — `npx hyperframes render` + +Lint before preview — catches missing `data-composition-id`, overlapping tracks, unregistered timelines. + +## Scaffolding + +```bash +npx hyperframes init my-video # interactive wizard +npx hyperframes init my-video --example warm-grain # pick an example +npx hyperframes init my-video --video clip.mp4 # with video file +npx hyperframes init my-video --audio track.mp3 # with audio file +npx hyperframes init my-video --non-interactive # skip prompts (CI/agents) +``` + +Templates: `blank`, `warm-grain`, `play-mode`, `swiss-grid`, `vignelli`, `decision-tree`, `kinetic-type`, `product-promo`, `nyt-graph`. + +`init` creates the right file structure, copies media, transcribes audio with Whisper, and installs AI coding skills. Use it instead of creating files by hand. + +## Linting + +```bash +npx hyperframes lint # current directory +npx hyperframes lint ./my-project # specific project +npx hyperframes lint --verbose # info-level findings +npx hyperframes lint --json # machine-readable +``` + +Lints `index.html` and all files in `compositions/`. Reports errors (must fix), warnings (should fix), and info (with `--verbose`). 
+ +## Previewing + +```bash +npx hyperframes preview # serve current directory +npx hyperframes preview --port 4567 # custom port (default 3002) +``` + +Hot-reloads on file changes. Opens the studio in your browser automatically. + +## Rendering + +```bash +npx hyperframes render # standard MP4 +npx hyperframes render --output final.mp4 # named output +npx hyperframes render --quality draft # fast iteration +npx hyperframes render --fps 60 --quality high # final delivery +npx hyperframes render --format webm # transparent WebM +npx hyperframes render --docker # byte-identical +``` + +| Flag | Options | Default | Notes | +| -------------- | --------------------- | -------------------------- | --------------------------- | +| `--output` | path | renders/name_timestamp.mp4 | Output path | +| `--fps` | 24, 30, 60 | 30 | 60fps doubles render time | +| `--quality` | draft, standard, high | standard | draft for iterating | +| `--format` | mp4, webm | mp4 | WebM supports transparency | +| `--workers` | 1-8 or auto | auto | Each spawns Chrome | +| `--docker` | flag | off | Reproducible output | +| `--gpu` | flag | off | GPU-accelerated encoding | +| `--strict` | flag | off | Fail on lint errors | +| `--strict-all` | flag | off | Fail on errors AND warnings | + +**Quality guidance:** `draft` while iterating, `standard` for review, `high` for final delivery. 
+
+## Transcription
+
+```bash
+npx hyperframes transcribe audio.mp3
+npx hyperframes transcribe video.mp4 --model medium.en --language en
+npx hyperframes transcribe subtitles.srt   # import existing
+npx hyperframes transcribe subtitles.vtt
+npx hyperframes transcribe openai-response.json
+```
+
+## Text-to-Speech
+
+```bash
+npx hyperframes tts "Text here" --voice af_nova --output narration.wav
+npx hyperframes tts script.txt --voice bf_emma
+npx hyperframes tts --list   # show all voices
+```
+
+## Troubleshooting
+
+```bash
+npx hyperframes doctor    # check environment (Chrome, FFmpeg, Node, memory)
+npx hyperframes browser   # manage bundled Chrome
+npx hyperframes info      # version and environment details
+npx hyperframes upgrade   # check for updates
+```
+
+Run `doctor` first if rendering fails. Common issues: missing FFmpeg, missing Chrome, low memory.
+
+## Other
+
+```bash
+npx hyperframes compositions   # list compositions in project
+npx hyperframes docs           # open documentation
+npx hyperframes benchmark .    # benchmark render performance
+```
diff --git a/packages/codex-plugin/skills/hyperframes-registry/SKILL.md b/packages/codex-plugin/skills/hyperframes-registry/SKILL.md
new file mode 100644
index 000000000..005666575
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes-registry/SKILL.md
@@ -0,0 +1,104 @@
+---
+name: hyperframes-registry
+description: Install and wire registry blocks and components into HyperFrames compositions. Use when running hyperframes add, installing a block or component, wiring an installed item into index.html, or working with hyperframes.json. Covers the add command, install locations, block sub-composition wiring, component snippet merging, and registry discovery.
+---
+
+# HyperFrames Registry
+
+The registry provides reusable blocks and components installable via `hyperframes add <name>`.
+
+- **Blocks** — standalone sub-compositions (own dimensions, duration, timeline). Included via `data-composition-src` in a host composition.
+- **Components** — effect snippets (no own dimensions). Pasted directly into a host composition's HTML.
+
+## When to use this skill
+
+- User mentions `hyperframes add`, "block", "component", or `hyperframes.json`
+- Output from `hyperframes add` appears in the session (file paths, clipboard snippet)
+- You need to wire an installed item into an existing composition
+- You want to discover what's available in the registry
+
+## Quick reference
+
+```bash
+hyperframes add data-chart                 # install a block
+hyperframes add grain-overlay              # install a component
+hyperframes add shimmer-sweep --dir .      # target a specific project
+hyperframes add data-chart --json          # machine-readable output
+hyperframes add data-chart --no-clipboard  # skip clipboard (CI/headless)
+```
+
+After install, the CLI prints which files were written and a snippet to paste into your host composition. The snippet is a starting point — you'll need to add `data-composition-id` (must match the block's internal composition ID), `data-start`, and `data-track-index` attributes when wiring blocks.
+
+Note: `hyperframes add` only works for blocks and components. For examples, use `hyperframes init --example <name>` instead.
+
+## Install locations
+
+Blocks install to `compositions/<name>.html` by default.
+Components install to `compositions/components/<name>.html` by default.
+
+These paths are configurable in `hyperframes.json`:
+
+```json
+{
+  "registry": "https://raw.githubusercontent.com/heygen-com/hyperframes/main/registry",
+  "paths": {
+    "blocks": "compositions",
+    "components": "compositions/components",
+    "assets": "assets"
+  }
+}
+```
+
+See [install-locations.md](./references/install-locations.md) for full details.
+
+## Wiring blocks
+
+Blocks are standalone compositions — include them via `data-composition-src` in your host `index.html`:
+
+```html
+
+<div
+  data-composition-src="compositions/data-chart.html"
+  data-composition-id="data-chart"
+  data-start="2"
+  data-duration="6"
+  data-width="1920"
+  data-height="1080"
+  data-track-index="1"
+></div>
+```
+
+Key attributes:
+
+- `data-composition-src` — path to the block HTML file
+- `data-composition-id` — must match the block's internal ID
+- `data-start` — when the block appears in the host timeline (seconds)
+- `data-duration` — how long the block plays
+- `data-width` / `data-height` — block canvas dimensions
+- `data-track-index` — layer ordering (higher = in front)
+
+See [wiring-blocks.md](./references/wiring-blocks.md) for full details.
+
+## Wiring components
+
+Components are snippets — paste their HTML into your composition's markup, their CSS into your style block, and their JS into your script (if any):
+
+1. Read the installed file (e.g., `compositions/components/grain-overlay.html`)
+2. Copy the HTML elements into your composition's `
`
+3. Copy the `<style>` rules into your composition's `<style>` block
+4. Copy any `<script>` logic into your script (if any)
+
+See [wiring-components.md](./references/wiring-components.md) for full details.
diff --git a/packages/codex-plugin/skills/hyperframes-registry/references/demo-html-pattern.md b/packages/codex-plugin/skills/hyperframes-registry/references/demo-html-pattern.md
new file mode 100644
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes-registry/references/demo-html-pattern.md
@@ -0,0 +1,54 @@
+# Demo HTML Pattern
+
+Every component in the registry ships a `demo.html` — a self-contained composition that showcases the effect:
+
+```html
+<!-- self-contained demo composition: component HTML, CSS, and JS inlined;
+     GSAP timeline registered on window.__timelines -->
+```
+
+Key conventions:
+
+- `data-composition-id` is `<name>-demo` to avoid collisions
+- The demo is self-contained — all CSS and JS from the snippet is inlined
+- The GSAP timeline is registered on `window.__timelines`
+- Duration should be long enough to showcase the effect (typically 5-8 seconds)
+
+## Blocks don't need demo.html
+
+Blocks are already standalone compositions that can be rendered directly. Only components need the demo wrapper.
+
+## Demos are not installed
+
+The `demo.html` is NOT installed by `hyperframes add` — it exists only in the registry for preview generation and as a reference.
diff --git a/packages/codex-plugin/skills/hyperframes-registry/references/discovery.md b/packages/codex-plugin/skills/hyperframes-registry/references/discovery.md
new file mode 100644
index 000000000..65a8d27f9
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes-registry/references/discovery.md
@@ -0,0 +1,53 @@
+# Registry Discovery
+
+## Reading the registry manifest
+
+The top-level `registry.json` lists all available items:
+
+```bash
+curl -s https://raw.githubusercontent.com/heygen-com/hyperframes/main/registry/registry.json
+```
+
+Each entry has `name` and `type` (`hyperframes:example`, `hyperframes:block`, or `hyperframes:component`).
+
+## Reading an item's manifest
+
+Each item has a `registry-item.json` with full metadata:
+
+```
+<registry>/<type>/<name>/registry-item.json
+```
+
+Where `<type>` is `examples`, `blocks`, or `components`.
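As a sketch, the manifest URL can be assembled from the registry base URL in `hyperframes.json` (the helper name is ours, not a CLI API):

```js
// Build the registry-item.json URL from the path pattern above.
function itemManifestUrl(registry, type, name) {
  return registry + "/" + type + "/" + name + "/registry-item.json";
}

var url = itemManifestUrl(
  "https://raw.githubusercontent.com/heygen-com/hyperframes/main/registry",
  "blocks",
  "data-chart",
);
```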
+ +## Item manifest fields + +| Field | Type | Required | Description | +| ---------------------- | -------- | -------- | ---------------------------------------------- | +| `name` | string | yes | Kebab-case identifier | +| `type` | string | yes | `hyperframes:block` or `hyperframes:component` | +| `title` | string | yes | Human-readable title | +| `description` | string | yes | One-line description | +| `tags` | string[] | no | Filter tags (e.g., `["data", "chart"]`) | +| `dimensions` | object | blocks | `{ width, height }` — blocks only | +| `duration` | number | blocks | Duration in seconds — blocks only | +| `files` | array | yes | Files to install (`path`, `target`, `type`) | +| `registryDependencies` | string[] | no | Other registry items this depends on | + +## Available items + +### Blocks + +| Name | Description | Tags | +| ------------ | ----------------------------------------------- | ------------------------------- | +| `data-chart` | Animated bar + line chart with staggered reveal | data, chart, statistics | +| `flowchart` | Decision tree with SVG connectors and cursor | diagram, flowchart, interactive | +| `logo-outro` | Cinematic logo reveal with tagline | branding, outro, logo | + +### Components + +| Name | Description | Tags | +| -------------------- | --------------------------------------- | -------------------------------- | +| `grain-overlay` | Animated film grain texture overlay | texture, grain, overlay, film | +| `shimmer-sweep` | CSS gradient light sweep for AI accents | text, shimmer, highlight, effect | +| `grid-pixelate-wipe` | Grid dissolve transition between scenes | transition, wipe, grid, pixelate | diff --git a/packages/codex-plugin/skills/hyperframes-registry/references/install-locations.md b/packages/codex-plugin/skills/hyperframes-registry/references/install-locations.md new file mode 100644 index 000000000..65e3d34e5 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes-registry/references/install-locations.md @@ -0,0 
+1,45 @@
+# Install Locations
+
+## Default paths
+
+| Item type | Default install path                  | Configured by                       |
+| --------- | ------------------------------------- | ----------------------------------- |
+| Block     | `compositions/<name>.html`            | `hyperframes.json#paths.blocks`     |
+| Component | `compositions/components/<name>.html` | `hyperframes.json#paths.components` |
+
+## How path remapping works
+
+The `target` field in each item's `registry-item.json` specifies a default install path. The `add` command remaps the prefix based on `hyperframes.json#paths`:
+
+- Block targets starting with `compositions/` get remapped to `<paths.blocks>/<file>`
+- Component targets starting with `compositions/components/` get remapped to `<paths.components>/<file>`
+
+## hyperframes.json
+
+Created automatically by `hyperframes init`. If it doesn't exist when you run `add`, the CLI creates it with defaults:
+
+```json
+{
+  "$schema": "https://hyperframes.heygen.com/schema/hyperframes.json",
+  "registry": "https://raw.githubusercontent.com/heygen-com/hyperframes/main/registry",
+  "paths": {
+    "blocks": "compositions",
+    "components": "compositions/components",
+    "assets": "assets"
+  }
+}
+```
+
+## Custom layouts
+
+To install blocks into a `scenes/` directory instead of `compositions/`:
+
+```json
+{
+  "paths": {
+    "blocks": "scenes"
+  }
+}
+```
+
+Then `hyperframes add data-chart` writes to `scenes/data-chart.html` instead of `compositions/data-chart.html`. The snippet output reflects the remapped path.
diff --git a/packages/codex-plugin/skills/hyperframes-registry/references/wiring-blocks.md b/packages/codex-plugin/skills/hyperframes-registry/references/wiring-blocks.md
new file mode 100644
index 000000000..957fa0786
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes-registry/references/wiring-blocks.md
@@ -0,0 +1,91 @@
+# Wiring Blocks
+
+Blocks are standalone compositions with their own `data-composition-id`, dimensions, duration, and GSAP timeline. Include them in a host composition using `data-composition-src` on a `<div>
`.
+
+## Basic wiring
+
+After `hyperframes add data-chart`, wire it into your `index.html`:
+
+```html
+
+<div
+  data-composition-src="compositions/data-chart.html"
+  data-composition-id="data-chart"
+  data-start="2"
+  data-duration="6"
+  data-width="1920"
+  data-height="1080"
+  data-track-index="1"
+></div>
+``` + +## Required attributes + +| Attribute | Description | +| ---------------------- | -------------------------------------------------------------------- | +| `data-composition-src` | Path to the block HTML file (relative to index.html) | +| `data-composition-id` | Unique ID matching the block's internal composition ID | +| `data-start` | When the block appears in the host timeline (seconds) | +| `data-duration` | How long the block plays (seconds, at most the block's own duration) | +| `data-track-index` | Layer ordering — higher numbers render in front | +| `data-width` | Block canvas width (match the block's dimensions) | +| `data-height` | Block canvas height (match the block's dimensions) | + +## Timeline coordination + +The block's internal GSAP timeline runs independently from the host timeline. The HyperFrames runtime loads the sub-composition, finds its `window.__timelines` registration, and seeks the block in sync with the host, offset by `data-start`. You do NOT need to reference the block's timeline in your host's GSAP code. + +## Positioning blocks + +To position a block in a specific area of the screen, add CSS: + +```html +
+<div
+  data-composition-src="compositions/data-chart.html"
+  data-composition-id="data-chart"
+  data-start="2"
+  data-duration="6"
+  data-width="960"
+  data-height="540"
+  data-track-index="2"
+  style="position: absolute; top: 80px; right: 80px"
+></div>
+```
+
+## Multiple blocks
+
+Include multiple blocks sequentially or overlapping:
+
+```html
+
+<div data-composition-src="compositions/data-chart.html" data-composition-id="data-chart" data-start="0" data-duration="5" data-width="1920" data-height="1080" data-track-index="1"></div>
+<div data-composition-src="compositions/logo-outro.html" data-composition-id="logo-outro" data-start="5" data-duration="4" data-width="1920" data-height="1080" data-track-index="1"></div>
+``` diff --git a/packages/codex-plugin/skills/hyperframes-registry/references/wiring-components.md b/packages/codex-plugin/skills/hyperframes-registry/references/wiring-components.md new file mode 100644 index 000000000..28216208c --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes-registry/references/wiring-components.md @@ -0,0 +1,77 @@ +# Wiring Components + +Components are effect snippets — HTML, CSS, and optionally JS that you merge directly into an existing composition. Unlike blocks, components have no standalone timeline; they participate in the host composition's timeline. + +## General process + +1. Run `hyperframes add ` +2. Open the installed file (e.g., `compositions/components/grain-overlay.html`) +3. Read the comment header for usage instructions +4. Copy the parts into your host composition: + - **HTML elements** — inside your `
`
+   - **CSS styles** — into your composition's `<style>` block
+   - **JS logic** — into your composition's `<script>` block
+ +``` + +Load in root: `
` + +## Video and Audio + +Video must be `muted playsinline`. Audio is always a separate `
+```html
+<span class="mh-scribble-wrap">
+  <span class="mh-scribble-text">key phrase</span>
+  <svg class="mh-scribble-svg" viewBox="0 0 200 24" preserveAspectRatio="none">
+    <path
+      id="scribble-1"
+      d="M2,12 Q14,0 27,12 Q39,24 52,12 Q64,0 77,12 Q89,24 102,12 Q114,0 127,12 Q139,24 152,12 Q164,0 177,12 Q189,24 198,12"
+      fill="none"
+      stroke="#e53935"
+      stroke-width="3"
+      stroke-linecap="round"
+    />
+  </svg>
+</span>
+```
+
+```css
+.mh-scribble-wrap {
+  position: relative;
+  display: inline;
+}
+.mh-scribble-text {
+  position: relative;
+  z-index: 1;
+}
+.mh-scribble-svg {
+  position: absolute;
+  left: 0;
+  bottom: -6px;
+  width: 100%;
+  height: 24px;
+  z-index: 0;
+}
+```
+
+```js
+// Measure path length and set initial dash state
+var path = document.querySelector("#scribble-1");
+var len = path.getTotalLength();
+gsap.set(path, { strokeDasharray: len, strokeDashoffset: len });
+
+// Draw the line
+tl.to(
+  "#scribble-1",
+  {
+    strokeDashoffset: 0,
+    duration: 0.8,
+    ease: "power1.inOut",
+  },
+  0.7,
+);
+```
+
+### Strikethrough Variant
+
+Position the SVG at `top: 50%; transform: translateY(-50%)` instead of `bottom: -6px`.
+
+### Wavy Path Generator
+
+Scale the path's viewBox width to match text width. The wave pattern `Q x1,y1 x2,y2` alternates between `y=0` and `y=24` for a natural wobble. Adjust the control points for tighter or looser waves:
+
+- **Tight waves**: smaller x-increments (25px per half-wave)
+- **Loose waves**: larger x-increments (50px per half-wave)
+- **Amplitude**: change the y range (0-24 for standard, 0-16 for subtle)
+
+## 5. Sketchout Mode
+
+Cross-hatch lines over de-emphasized text. Multiple angled lines create a "crossed out" effect.
+
+```html
+<span class="mh-sketchout-wrap" id="sketchout-1">
+  <span class="mh-sketchout-text">old price</span>
+  <span class="mh-sketchout-lines">
+    <span class="mh-sketchout-line mh-sketchout-fwd"></span>
+    <span class="mh-sketchout-line mh-sketchout-bwd"></span>
+  </span>
+</span>
+```
+
+```css
+.mh-sketchout-wrap {
+  position: relative;
+  display: inline;
+}
+.mh-sketchout-text {
+  position: relative;
+  z-index: 0;
+}
+.mh-sketchout-lines {
+  position: absolute;
+  top: 0;
+  left: -4px;
+  right: -4px;
+  bottom: 0;
+  overflow: hidden;
+  z-index: 1;
+}
+.mh-sketchout-line {
+  position: absolute;
+  display: block;
+  top: 50%;
+  left: 0;
+  width: 100%;
+  height: 2px;
+  background: #e53935;
+  transform-origin: left center;
+  transform: scaleX(0);
+}
+.mh-sketchout-fwd {
+  transform: scaleX(0) rotate(-12deg);
+}
+.mh-sketchout-bwd {
+  transform: scaleX(0) rotate(12deg);
+}
+```
+
+```js
+// Forward slash draws first
+tl.to(
+  "#sketchout-1 .mh-sketchout-fwd",
+  {
+    scaleX: 1,
+    duration: 0.3,
+    ease: "power2.out",
+  },
+  1.0,
+);
+
+// Backward slash follows
+tl.to(
+  "#sketchout-1 .mh-sketchout-bwd",
+  {
+    scaleX: 1,
+    duration: 0.3,
+    ease: "power2.out",
+  },
+  1.15,
+);
+```
+
+## Combining Modes in Captions
+
+Use mode cycling for visual variety across caption groups:
+
+```js
+var MODES = ["highlight", "circle", "burst", "scribble"];
+
+GROUPS.forEach(function (group, gi) {
+  var mode = MODES[gi % MODES.length];
+  // Apply the mode's CSS pattern to emphasis words in this group
+  group.emphasisWords.forEach(function (word) {
+    applyMode(word.el, mode, tl, word.start);
+  });
+});
+```
+
+Cycle every 2-3 groups for high energy, every 3-4 for medium, every 4-5 for low.
diff --git a/packages/codex-plugin/skills/hyperframes/references/dynamic-techniques.md b/packages/codex-plugin/skills/hyperframes/references/dynamic-techniques.md
new file mode 100644
index 000000000..c0cab609f
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes/references/dynamic-techniques.md
@@ -0,0 +1,90 @@
+# Dynamic Caption Techniques
+
+You are here because SKILL.md told you to read this file before writing animation code.
Pick your technique combination from the table below based on the energy level you detected from the transcript, then implement using standard GSAP patterns. + +## Technique Selection by Energy + +| Energy level | Highlight | Exit | Cycle pattern | +| ------------ | ------------------------------------- | ------------------- | ----------------------------------------- | +| High | Karaoke with accent glow + scale pop | Scatter or drop | Alternate highlight styles every 2 groups | +| Medium-high | Karaoke with color pop | Scatter or collapse | Alternate every 3 groups | +| Medium | Karaoke (subtle, white only) | Fade + slide | Alternate every 3 groups | +| Medium-low | Karaoke (minimal scale change) | Fade | Single style, vary ease per group | +| Low | Karaoke (warm tones, slow transition) | Collapse | Alternate every 4 groups | + +**All energy levels use karaoke highlight as the baseline.** The difference is intensity — high energy gets accent color + glow + 15% scale pop on active words, low energy gets a gentle white shift with 3% scale. + +**Emphasis words always break the pattern.** When a word is flagged as emphasis (emotional keyword, ALL CAPS, brand name), give it a stronger animation than surrounding words (larger scale, accent color, overshoot ease). This creates contrast. + +**Marker highlight modes add a visual layer on top of karaoke.** For emphasis words that need more than color/scale, add a marker-style effect — highlight sweep, circle, burst, or scribble — using the `/marker-highlight` skill. Match mode to energy: burst for hype, circle for key terms, highlight for standard, scribble for subtle. + +## Audio-Reactive Captions (Mandatory for Music) + +**If the source audio is music (vocals over instrumentation, beats, any musical content), you MUST extract audio data and add audio-reactive animations.** This is not optional — music without audio reactivity looks disconnected. Even low-energy ballads get subtle bass pulse and treble glow. 
+ +No special wiring is needed. The group loop already iterates over every caption group to build entrance, karaoke, and exit tweens. At that point, read the audio data for each group's time range and use it to modulate the group's animation intensity with regular GSAP tweens. + +```js +// Load audio data inline (same pattern as TRANSCRIPT) +var AUDIO = JSON.parse(audioDataJson); // { fps, totalFrames, frames: [{ bands: [...] }] } + +GROUPS.forEach(function (group, gi) { + var groupEl = document.getElementById("cg-" + gi); + if (!groupEl) return; + + // Read peak energy for this group's time range + var startFrame = Math.floor(group.start * AUDIO.fps); + var endFrame = Math.min(Math.floor(group.end * AUDIO.fps), AUDIO.totalFrames - 1); + var peakBass = 0; + var peakTreble = 0; + for (var f = startFrame; f <= endFrame; f++) { + var frame = AUDIO.frames[f]; + if (!frame) continue; + peakBass = Math.max(peakBass, frame.bands[0] || 0, frame.bands[1] || 0); + peakTreble = Math.max(peakTreble, frame.bands[6] || 0, frame.bands[7] || 0); + } + + // Modulate entrance — louder groups enter bigger and glowier + tl.to( + groupEl, + { + scale: 1 + peakBass * 0.06, + textShadow: + "0 0 " + Math.round(peakTreble * 12) + "px rgba(255,255,255," + peakTreble * 0.4 + ")", + duration: 0.3, + ease: "power2.out", + }, + group.start, + ); + + // Reset at exit so audio-driven values don't persist + tl.set(groupEl, { scale: 1, textShadow: "none" }, group.end - 0.15); +}); +``` + +This shapes the animation at build time, not playback time — no per-frame callbacks, no `tl.call()` loops, no async fetch timing issues. Loud groups come in with more weight and glow; quiet groups come in soft. The audio data modulates _how much_, the content determines _what_. + +Keep audio reactivity subtle — 3-6% scale variation and soft glow. Heavy pulsing makes text unreadable. 
+ +To generate the audio data file: + +```bash +python3 skills/gsap-effects/scripts/extract-audio-data.py audio.mp3 --fps 30 --bands 8 -o audio-data.json +``` + +## Combining Techniques + +Don't use the same highlight animation on every group — cycle through styles using the group index. Don't combine multiple competing animations on the same word at the same timestamp. Vary techniques across groups to match the content's pace changes. + +**Marker highlight effects** (from the `/marker-highlight` skill) layer well with karaoke — use karaoke for the word-by-word reveal, then add a marker effect on emphasis words only. For example: karaoke highlights each word in white, but brand names get a yellow highlight sweep and stats get a red circle. Cycle marker modes across groups for visual variety (see the mode-to-energy mapping in the marker-highlight skill). + +## Available Tools + +These tools are available in the HyperFrames runtime. Use them when they solve a real problem — not every composition needs all of them. + +| Tool | What it does | Access | When it's useful | +| ------------------- | ------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------- | +| **pretext** | Pure-arithmetic text measurement without DOM reflow. 0.0002ms per call. | `window.__hyperframes.pretext.prepare(text, font)` / `.layout(prepared, maxWidth, lineHeight)` | Per-frame text reflow, shrinkwrap containers, computing layout before render | +| **fitTextFontSize** | Finds the largest font size that fits text on one line. Built on pretext. | `window.__hyperframes.fitTextFontSize(text, { maxWidth, fontFamily, fontWeight })` | Overflow prevention for long phrases, portrait mode, large base sizes | +| **audio data** | Pre-extracted per-frame RMS energy and frequency bands. 
| Extract with `extract-audio-data.py`, load inline or via `fetch("audio-data.json")` | Audio-reactive visuals — modulate intensity based on the music | +| **GSAP** | Animation timeline with tweens and callbacks. | `gsap.to()`, `gsap.set()`, `tl.to()`, `tl.set()` | All caption animation | diff --git a/packages/codex-plugin/skills/hyperframes/references/motion-principles.md b/packages/codex-plugin/skills/hyperframes/references/motion-principles.md new file mode 100644 index 000000000..f02a3bd51 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/motion-principles.md @@ -0,0 +1,69 @@ +# Motion Principles + +## Guardrails + +You know these rules but you violate them. Stop. + +- **Don't use the same ease on every tween.** You default to `power2.out` on everything. Vary eases like you vary font weights — no more than 2 independent tweens with the same ease in a scene. +- **Don't use the same speed on everything.** You default to 0.4-0.5s for everything. The slowest scene should be 3× slower than the fastest. Vary duration deliberately. +- **Don't enter everything from the same direction.** You default to `y: 30, opacity: 0` on every element. Vary: from left, from right, from scale, opacity-only, letter-spacing. +- **Don't use the same stagger on every scene.** Each scene needs its own rhythm. +- **Don't use ambient zoom on every scene.** Pick different ambient motion per scene: slow pan, subtle rotation, scale push, color shift, or nothing. Stillness after motion is powerful. +- **Don't start at t=0.** Offset the first animation 0.1-0.3s. Zero-delay feels like a jump cut. + +## What You Don't Do Without Being Told + +### Easing is emotion, not technique + +The transition is the verb. The easing is the adverb. A slide-in with `expo.out` = confident. With `sine.inOut` = dreamy. With `elastic.out` = playful. Same motion, different meaning. Choose the adverb deliberately. + +**Direction rules — these are not optional:** + +- `.out` for elements entering. 
Starts fast, decelerates. Feels responsive. This is your default. +- `.in` for elements leaving. Starts slow, accelerates away. Throws them off. +- `.inOut` for elements moving between positions. + +You get this backwards constantly. Ease-in for entrances feels sluggish. Ease-out for exits feels reluctant. + +### Speed communicates weight + +- Fast (0.15-0.3s) — energy, urgency, confidence +- Medium (0.3-0.5s) — professional, most content +- Slow (0.5-0.8s) — gravity, luxury, contemplation +- Very slow (0.8-2.0s) — cinematic, emotional, atmospheric + +### Scene structure: build / breathe / resolve + +Every scene has three phases. You dump everything in the build and leave nothing for breathe or resolve. + +- **Build (0-30%)** — elements enter, staggered. Don't dump everything at once. +- **Breathe (30-70%)** — content visible, alive with ONE ambient motion. +- **Resolve (70-100%)** — exit or decisive end. Exits are faster than entrances. + +### Transitions are meaning + +- **Crossfade** = "this continues" +- **Hard cut** = "wake up" / disruption +- **Slow dissolve** = "drift with me" + +You crossfade everything. Use hard cuts for disruption and register shifts. + +### Choreography is hierarchy + +The element that moves first is perceived as most important. Stagger in order of importance, not DOM order. Don't wait for completion — overlap entries. Total stagger sequence under 500ms regardless of item count. + +### Asymmetry + +Entrances need longer than exits. A card takes 0.4s to appear but 0.25s to disappear. + +## Visual Composition + +You build for the web. Video frames are not pages. + +- **Two focal points minimum per scene.** The eye needs somewhere to travel. Never a single text block floating in empty space. +- **Fill the frame.** Hero text: 60-80% of width. You will try to use web-sized elements. Don't. +- **Three layers minimum per scene.** Background treatment (glow, oversized faded type, color panel). Foreground content. 
Accent elements (dividers, labels, data bars). +- **Background is not empty.** Radial glows, oversized faded type bleeding off-frame, subtle border panels, hairline rules. Pure solid #000 reads as "nothing loaded." +- **Anchor to edges.** Pin content to left/top or right/bottom. Centered-and-floating is a web pattern. +- **Split frames.** Data panel on the left, content on the right. Top bar with metadata, full-width below. Zone-based layouts, not centered stacks. +- **Use structural elements.** Rules, dividers, border panels. They create paths for the eye and animate well (scaleX from 0). diff --git a/packages/codex-plugin/skills/hyperframes/references/transcript-guide.md b/packages/codex-plugin/skills/hyperframes/references/transcript-guide.md new file mode 100644 index 000000000..5bbf764a2 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transcript-guide.md @@ -0,0 +1,151 @@ +# Transcript Guide + +## How Transcripts Are Generated + +`hyperframes transcribe` handles both transcription and format conversion: + +```bash +# Transcribe audio/video (uses whisper.cpp locally, no API key needed) +npx hyperframes transcribe audio.mp3 + +# Use a larger model for better accuracy +npx hyperframes transcribe audio.mp3 --model medium.en + +# Filter to English only (skips non-English speech) +npx hyperframes transcribe audio.mp3 --language en + +# Import an existing transcript from another tool +npx hyperframes transcribe captions.srt +npx hyperframes transcribe captions.vtt +npx hyperframes transcribe openai-response.json +``` + +## Supported Input Formats + +The CLI auto-detects and normalizes these formats: + +| Format | Extension | Source | Word-level? 
| +| --------------------- | --------- | --------------------------------------------------------------------------- | ----------------- | +| whisper.cpp JSON | `.json` | `hyperframes init --video`, `hyperframes transcribe` | Yes | +| OpenAI Whisper API | `.json` | `openai.audio.transcriptions.create({ timestamp_granularities: ["word"] })` | Yes | +| SRT subtitles | `.srt` | Video editors, subtitle tools, YouTube | No (phrase-level) | +| VTT subtitles | `.vtt` | Web players, YouTube, transcription services | No (phrase-level) | +| Normalized word array | `.json` | Pre-processed by any tool | Yes | + +**Word-level timestamps produce better captions.** SRT/VTT give phrase-level timing, which works but can't do per-word animation effects. + +## Whisper Model Guide + +The default model (`small.en`) balances accuracy and speed. For better results, use a larger model: + +| Model | Size | Speed | Accuracy | When to use | +| ---------- | ------ | -------- | --------- | ------------------------------------- | +| `tiny` | 75 MB | Fastest | Low | Quick previews, testing pipeline | +| `base` | 142 MB | Fast | Fair | Short clips, clear audio | +| `small` | 466 MB | Moderate | Good | **Default** — good for most content | +| `medium` | 1.5 GB | Slow | Very good | Important content, noisy audio, music | +| `large-v3` | 3.1 GB | Slowest | Best | Production quality | + +**Only add `.en` suffix when the user explicitly says the audio is English.** `.en` models are slightly more accurate for English but will TRANSLATE non-English audio instead of transcribing it. + +**Critical: `.en` models translate non-English audio into English** — they don't transcribe it. If the audio might not be English, always use a model without the `.en` suffix and pass `--language` to specify the source language. If you're unsure of the language, use `small` (not `small.en`) without `--language` — whisper will auto-detect. 
+ +```bash +# Spanish audio +npx hyperframes transcribe audio.mp3 --model small --language es + +# Unknown language — let whisper auto-detect +npx hyperframes transcribe audio.mp3 --model small +``` + +**Music and vocals over instrumentation**: `small.en` will misidentify lyrics — use `medium.en` as the minimum, or import lyrics manually. Even `medium.en` struggles with heavily produced tracks; for music videos, providing known lyrics as an SRT/VTT and importing with `hyperframes transcribe lyrics.srt` will always beat automated transcription. + +## Transcript Quality Check (Mandatory) + +After every transcription, **read the transcript and check for quality issues before proceeding.** Bad transcripts produce nonsensical captions. Never skip this step. + +### What to look for + +| Signal | Example | Cause | +| ---------------------------- | -------------------------------------- | ---------------------------------------------------------------------------- | +| Music note tokens (`♪`, `�`) | `{ "text": "♪" }` or `{ "text": "�" }` | Whisper detected music, not speech | +| Garbled / nonsense words | "Do a chin", "Get so gay", "huh" | Model misheard lyrics or background noise | +| Long gaps with no words | 20+ seconds of only `♪` tokens | Instrumental section — expected, but high ratio means speech is being missed | +| Repeated filler | Many "huh", "uh", "oh" entries | Model is hallucinating on music | +| Very short word spans | Words with `end - start < 0.05` | Unreliable timestamp alignment | + +### Automatic retry rules + +**If more than 20% of entries are `♪`/`�` tokens, or the transcript contains obvious nonsense words, the transcription failed.** Do not proceed with the bad transcript. Instead: + +1. **Retry with `medium.en`** if the original used `small.en` or smaller: + ```bash + npx hyperframes transcribe audio.mp3 --model medium.en + ``` +2. 
**If `medium.en` also fails** (still >20% music tokens or garbled), tell the user the audio is too noisy for local transcription and suggest: + - Providing lyrics manually as an SRT/VTT file + - Using an external API (OpenAI or Groq Whisper — see below) +3. **Always clean the transcript** before building captions — filter out `♪`/`�` tokens and entries where `text` is a single non-word character. Only real words should reach the caption composition. + +### Cleaning a transcript + +After transcription (even with a good model), strip non-word entries: + +```js +var raw = JSON.parse(transcriptJson); +var words = raw.filter(function (w) { + if (!w.text || w.text.trim().length === 0) return false; + if (/^[♪�\u266a\u266b\u266c\u266d\u266e\u266f]+$/.test(w.text)) return false; + if (/^(huh|uh|um|ah|oh)$/i.test(w.text) && w.end - w.start < 0.1) return false; + return true; +}); +``` + +### When to use which model (decision tree) + +1. **Is this speech over silence/light background?** → `small.en` is fine +2. **Is this speech over music, or music with vocals?** → Start with `medium.en` +3. **Is this a produced music track (vocals + full instrumentation)?** → Start with `medium.en`, expect to need manual lyrics or an external API +4. 
**Is this multilingual?** → Use `medium` or `large-v3` (no `.en` suffix) + +## Using External Transcription APIs + +For the best accuracy, use an external API and import the result: + +**OpenAI Whisper API** (recommended for quality): + +```bash +# Generate with word timestamps, then import +curl https://api.openai.com/v1/audio/transcriptions \ + -H "Authorization: Bearer $OPENAI_API_KEY" \ + -F file=@audio.mp3 -F model=whisper-1 \ + -F response_format=verbose_json \ + -F "timestamp_granularities[]=word" \ + -o transcript-openai.json + +npx hyperframes transcribe transcript-openai.json +``` + +**Groq Whisper API** (fast, free tier available): + +```bash +curl https://api.groq.com/openai/v1/audio/transcriptions \ + -H "Authorization: Bearer $GROQ_API_KEY" \ + -F file=@audio.mp3 -F model=whisper-large-v3 \ + -F response_format=verbose_json \ + -F "timestamp_granularities[]=word" \ + -o transcript-groq.json + +npx hyperframes transcribe transcript-groq.json +``` + +## If No Transcript Exists + +1. Check the project root for `transcript.json`, `.srt`, or `.vtt` files +2. If none found, run transcription — pick the starting model based on the content type: + - Speech/voiceover → `small.en` + - Music with vocals → `medium.en` + ```bash + npx hyperframes transcribe --model medium.en + ``` +3. **Read the transcript and run the quality check** (see above). If it fails, retry with a larger model or suggest manual lyrics. diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions.md b/packages/codex-plugin/skills/hyperframes/references/transitions.md new file mode 100644 index 000000000..3b5404bba --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions.md @@ -0,0 +1,112 @@ +# Scene Transitions + +A transition tells the viewer how two scenes relate. A crossfade says "this continues." A push slide says "next point." A blur crossfade says "drift with me." 
Choose transitions that match what the content is doing emotionally, not just technically. + +## Animation Rules for Multi-Scene Compositions + +These are non-negotiable for every multi-scene composition: + +1. **Every composition uses transitions.** No exceptions. Scenes without transitions feel like jump cuts. +2. **Every scene uses entrance animations.** Elements animate IN via `gsap.from()` — opacity, position, scale, etc. No scene should pop fully-formed onto screen. +3. **Exit animations are BANNED** except on the final scene. Do NOT use `gsap.to()` to animate elements out before a transition fires. The transition IS the exit. Outgoing scene content must be fully visible when the transition starts — the transition handles the visual handoff. +4. **Final scene exception:** The last scene MAY fade elements out (e.g., fade to black at the end of the composition). This is the only scene where exit animations are allowed. + +## Energy → Primary Transition + +| Energy | CSS Primary | Shader Primary | Accent | Duration | Easing | +| ---------------------------------------- | ---------------------------- | ------------------------------------ | ------------------------------ | --------- | ---------------------- | +| **Calm** (wellness, brand story, luxury) | Blur crossfade, focus pull | Cross-warp morph, thermal distortion | Light leak, circle iris | 0.5-0.8s | `sine.inOut`, `power1` | +| **Medium** (corporate, SaaS, explainer) | Push slide, staggered blocks | Whip pan, cinematic zoom | Squeeze, vertical push | 0.3-0.5s | `power2`, `power3` | +| **High** (promos, sports, music, launch) | Zoom through, overexposure | Ridged burn, glitch, chromatic split | Staggered blocks, gravity drop | 0.15-0.3s | `power4`, `expo` | + +Pick ONE primary (60-70% of scene changes) + 1-2 accents. Never use a different transition for every scene. + +## Mood → Transition Type + +Think about what the transition _communicates_, not just what it looks like. 
+ +| Mood | Transitions | Why it works | +| ------------------------ | ------------------------------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------- | +| **Warm / inviting** | Light leak, blur crossfade, focus pull, film burn · **Shader:** thermal distortion, light leak, cross-warp morph | Soft edges, warm color washes. Nothing sharp or mechanical. | +| **Cold / clinical** | Squeeze, zoom out, blinds, shutter, grid dissolve · **Shader:** gravitational lens | Content transforms mechanically — compressed, shrunk, sliced, gridded. | +| **Editorial / magazine** | Push slide, vertical push, diagonal split, shutter · **Shader:** whip pan | Like turning a page or slicing a layout. Clean directional movement. | +| **Tech / futuristic** | Grid dissolve, staggered blocks, blinds, chromatic aberration · **Shader:** glitch, chromatic split | Grid dissolve is the core "data" transition. Shader glitch adds posterization + scan lines. | +| **Tense / edgy** | Glitch, VHS, chromatic aberration, ripple · **Shader:** ridged burn, glitch, domain warp | Instability, distortion, digital breakdown. Ridged burn adds sharp lightning-crack edges. | +| **Playful / fun** | Elastic push, 3D flip, circle iris, morph circle, clock wipe · **Shader:** ripple waves, swirl vortex | Overshoot, bounce, rotation, expansion. Swirl vortex adds organic spiral distortion. | +| **Dramatic / cinematic** | Zoom through, zoom out, gravity drop, overexposure, color dip to black · **Shader:** cinematic zoom, gravitational lens, domain warp | Scale, weight, light extremes. Shader transitions add per-pixel depth. | +| **Premium / luxury** | Focus pull, blur crossfade, color dip to black · **Shader:** cross-warp morph, thermal distortion | Restraint. Cross-warp morph flows both scenes into each other organically. 
| +| **Retro / analog** | Film burn, light leak, VHS, clock wipe · **Shader:** light leak | Organic imperfection. Warm color bleeds, scan line displacement. | + +## Narrative Position + +| Position | Use | Why | +| -------------------------- | -------------------------------------------------------------------------- | ----------------------------------------------------- | +| **Opening** | Your most distinctive transition. Match the mood. 0.4-0.6s | Sets the visual language for the entire piece. | +| **Between related points** | Your primary transition. Consistent. 0.3s | Don't distract — the content is continuing. | +| **Topic change** | Something different from your primary. Staggered blocks, shutter, squeeze. | Signals "new section" — the viewer's brain resets. | +| **Climax / hero reveal** | Your boldest accent. Fastest or most dramatic. | This is the payoff — spend your best transition here. | +| **Wind-down** | Return to gentle. Blur crossfade, crossfade. 0.5-0.7s | Let the viewer exhale after the climax. | +| **Outro** | Slowest, simplest. Crossfade, color dip to black. 0.6-1.0s | Closure. Don't introduce new energy at the end. | + +## Blur Intensity by Energy + +| Energy | Blur | Duration | Hold at peak | +| ---------- | ------- | -------- | ------------ | +| **Calm** | 20-30px | 0.8-1.2s | 0.3-0.5s | +| **Medium** | 8-15px | 0.4-0.6s | 0.1-0.2s | +| **High** | 3-6px | 0.2-0.3s | 0s | + +## Presets + +| Preset | Duration | Easing | +| ---------- | -------- | ----------------- | +| `snappy` | 0.2s | `power4.inOut` | +| `smooth` | 0.4s | `power2.inOut` | +| `gentle` | 0.6s | `sine.inOut` | +| `dramatic` | 0.5s | `power3.in` → out | +| `instant` | 0.15s | `expo.inOut` | +| `luxe` | 0.7s | `power1.inOut` | + +## Implementation + +Read [transitions/catalog.md](transitions/catalog.md) for GSAP code and hard rules for every transition type. 
+ +| Category | CSS | Shader (WebGL) | +| ----------- | -------------------------------------------------------------- | ------------------------------------------------------------------------- | +| Push/slide | Push slide, vertical push, elastic push, squeeze | Whip pan | +| Scale/zoom | Zoom through, zoom out, gravity drop, 3D flip | Cinematic zoom, gravitational lens | +| Reveal/mask | Circle iris, diamond iris, diagonal split, clock wipe, shutter | SDF iris | +| Dissolve | Crossfade, blur crossfade, focus pull, color dip | Cross-warp morph, domain warp | +| Cover | Staggered blocks, horizontal blinds, vertical blinds | — | +| Light | Light leak, overexposure burn, film burn | Light leak (shader), thermal distortion | +| Distortion | Glitch, chromatic aberration, ripple, VHS tape | Glitch (shader), chromatic split, ridged burn, ripple waves, swirl vortex | +| Pattern | Grid dissolve, morph circle | — | + +## Transitions That Don't Work in CSS + +Avoid: star iris, tilt-shift, lens flare, hinge/door. See catalog.md for why. + +## CSS vs Shader + +CSS transitions animate scene containers with opacity, transforms, clip-path, and filters. Shader transitions composite both scene textures per-pixel on a WebGL canvas — they can warp, dissolve, and morph in ways CSS cannot. + +**Both are first-class options.** Shaders are provided by the `@hyperframes/shader-transitions` package — import from the package instead of writing raw GLSL. CSS transitions are simpler to set up. Choose based on the effect you want, not based on which is easier. + +When a composition uses shader transitions, ALL transitions in that composition should be shader-based (the WebGL canvas replaces DOM-based scene switching). Don't mix CSS and shader transitions in the same composition. + +## Shader-Compatible CSS Rules + +Shader transitions capture DOM scenes to WebGL textures via html2canvas. The canvas 2D rendering pipeline doesn't match CSS exactly. 
Follow these rules to avoid visible artifacts at transition boundaries:
+
+1. **No `transparent` keyword in gradients.** Canvas interpolates `transparent` as `rgba(0,0,0,0)` (black at zero alpha), creating dark fringes. Always use the target color at zero alpha: `rgba(200,117,51,0)` not `transparent`.
+2. **No gradient backgrounds on elements thinner than 4px.** Canvas can't match CSS gradient rendering on 1-2px elements. Use solid `background-color` on thin accent lines.
+3. **No CSS variables (`var()`) on elements visible during capture.** html2canvas doesn't reliably resolve custom properties. Use literal color values in inline styles.
+4. **Mark uncapturable decorative elements with `data-no-capture`.** The capture function skips these. They're present on the live DOM but absent from the shader texture. Use for elements that can't follow the rules above.
+5. **No gradient opacity below 0.15.** Gradient elements below ~10% opacity render differently in canvas vs CSS, so keep a safety margin: increase to 0.15+ or use a solid color at equivalent brightness.
+6. **Every `.scene` div must have explicit `background-color`, AND pass the same color as `bgColor` in the `init()` config.** The package captures scene elements via html2canvas. Both the CSS `background-color` on `.scene` and the `bgColor` config must match. Without either, the texture renders as black.
+
+These rules only apply to shader transition compositions. CSS-only compositions have no restrictions.
+
+## Visual Pattern Warning
+
+Avoid transitions that create visible repeating geometric patterns — grids of tiles, hexagonal cells, uniform dot arrays, evenly-spaced blob circles. These look cheap and artificial regardless of the math behind them. Organic noise (FBM, domain warping) is good because it's irregular. Geometric repetition is bad because the eye instantly sees the grid.
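Rule 1 is easy to enforce mechanically when generating styles. A sketch — the helper name `zeroAlphaStop` is illustrative, not a HyperFrames API — that derives the zero-alpha form of a color so gradients never fall back to the `transparent` keyword:

```js
// Build the zero-alpha form of an rgb()/rgba() color so gradients fade to
// "the same color at alpha 0" instead of `transparent`, which canvas
// interpolates as rgba(0,0,0,0) and renders as a dark fringe.
function zeroAlphaStop(rgb) {
  var m = rgb.match(/^rgba?\((\d+)\s*,\s*(\d+)\s*,\s*(\d+)/);
  if (!m) throw new Error("expected rgb()/rgba() color: " + rgb);
  return "rgba(" + m[1] + "," + m[2] + "," + m[3] + ",0)";
}

// Fades copper to copper-at-zero-alpha — no black fringe in the capture.
var gradient =
  "linear-gradient(rgba(200,117,51,1), " + zeroAlphaStop("rgb(200, 117, 51)") + ")";
```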
diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/catalog.md b/packages/codex-plugin/skills/hyperframes/references/transitions/catalog.md
new file mode 100644
index 000000000..4918b77bc
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes/references/transitions/catalog.md
@@ -0,0 +1,117 @@
+# Transition Catalog
+
+Hard rules, scene template, and routing to implementation code. Read the reference file for the transition type you need — don't load all of them.
+
+## Hard Rules (CSS)
+
+These cause real bugs if violated.
+
+**Scene visibility:** Scene 1 visible by default (no `opacity: 0`). Scenes 2+ have `opacity: 0` on the CONTAINER div. GSAP reveals them. No visibility shim (`timedEls`).
+
+**Fonts:** Just write the `font-family` you want — the compiler embeds supported fonts automatically via `@font-face` with inline data URIs. No need for `<link>` tags or `@import`. Works in all contexts including sandboxed iframes.
+
+**Element structure:** No `class="clip"` on scene divs in standalone compositions. Only the root div gets `data-composition-id`/`data-start`/`data-duration`.
+
+**Overlay elements:** Staggered blocks = full-screen 1920x1080, NOT thin strips. Glitch RGB overlays = normal blending at 35% opacity, NOT `mix-blend-mode: multiply` (invisible on dark backgrounds). Light leak overlays = larger than the frame (2400px+), never a visible shape. Overexposure = use `filter: brightness()` on the scene, not just a white overlay.
+
+**VHS tape:** Clone actual scene content with `cloneNode(true)`, NOT colored bars. Each strip: wider than frame (2020px at left:-50px). Red+blue chromatic copies at z-index above main strip. Seeded PRNG for deterministic random offsets.
+
+**Z-index:** Gravity drop, zoom out, diagonal split need outgoing scene ON TOP (`zIndex: 10`) so it exits while revealing the new scene behind (`zIndex: 1`).
+
+**Page burn:** Content burns with the page — no falling debris.
Hide scene1 via `tl.set` at burn end, NEVER `onComplete` (not reversible). `onUpdate` must restore `clipPath: "none"` when `wp <= 0` for rewind support. Incoming scene fades from black at 90% through burn. + +**Clock wipe:** 9-point polygon with intermediate edge positions. Step through 4 quadrants with separate tweens. + +**Grid dissolve:** Cycle 5 palette colors per cell, not monochrome. + +**Blinds count by energy:** Calm: 4h/6v. Medium: 6-8h/8v. High: 12-16h/16v. + +**Don't use:** Star iris (polygon interpolation broken), tilt-shift (no selective CSS blur), lens flare (visible shape, not optical), hinge/door (distorts too fast). + +## Shader Transitions + +Shader setup, WebGL init, capture, and fragment shaders are handled by `@hyperframes/shader-transitions` (`packages/shader-transitions/`). Read the package source for API details. Compositions using shaders must follow the CSS rules in [transitions.md](../transitions.md) § "Shader-Compatible CSS Rules". + +## Scene Template + +```html + + + + + + + + +
+<!-- Root: only this div gets the data attributes -->
+<div data-composition-id="my-comp" data-start="0" data-duration="10">
+  <!-- Scene 1: visible by default — no opacity: 0 -->
+  <div class="scene" id="scene1" style="position: absolute; inset: 0; background-color: #111;">
+    <div class="scene-inner" id="s1-inner"><!-- scene 1 content --></div>
+  </div>
+  <!-- Scenes 2+: opacity: 0 on the container div; GSAP reveals them -->
+  <div class="scene" id="scene2" style="position: absolute; inset: 0; background-color: #111; opacity: 0;">
+    <div class="scene-inner" id="s2-inner"><!-- scene 2 content --></div>
+  </div>
+</div>
+
+
+```
+
+Every transition follows: position new scene → animate outgoing → swap → animate incoming → clean up overlays.
+
+## CSS Transitions
+
+All code examples use `old` as a placeholder for the outgoing scene-inner selector and `new` for the incoming, with `T` as the transition start time. Substitute real selector strings (e.g. `"#s1-inner"`) when implementing — don't use the placeholders as variable names, since `new` is a reserved word in JavaScript. Read the reference file for the type you need.
+
+| Type           | Transitions                                          | Reference                                  |
+| -------------- | ---------------------------------------------------- | ------------------------------------------ |
+| Push           | Push slide, vertical push, elastic push, squeeze     | [css-push.md](./css-push.md)               |
+| Radial / Shape | Circle iris, diamond iris, diagonal split            | [css-radial.md](./css-radial.md)           |
+| 3D             | 3D card flip                                         | [css-3d.md](./css-3d.md)                   |
+| Scale / Zoom   | Zoom through, zoom out                               | [css-scale.md](./css-scale.md)             |
+| Dissolve       | Crossfade, blur crossfade, focus pull, color dip     | [css-dissolve.md](./css-dissolve.md)       |
+| Cover          | Staggered blocks, horizontal blinds, vertical blinds | [css-cover.md](./css-cover.md)             |
+| Light          | Light leak, overexposure burn, film burn             | [css-light.md](./css-light.md)             |
+| Distortion     | Glitch, chromatic aberration, ripple, VHS tape       | [css-distortion.md](./css-distortion.md)   |
+| Mechanical     | Shutter, clock wipe                                  | [css-mechanical.md](./css-mechanical.md)   |
+| Grid           | Grid dissolve                                        | [css-grid.md](./css-grid.md)               |
+| Other          | Gravity drop, morph circle                           | [css-other.md](./css-other.md)             |
+| Blur           | Blur through, directional blur                       | [css-blur.md](./css-blur.md)               |
+| Destruction    | Page burn                                            | [css-destruction.md](./css-destruction.md) |
+
+## Shader Transitions
+
+WebGL shader transitions are provided by `@hyperframes/shader-transitions` (`packages/shader-transitions/`). The package handles setup, capture, WebGL init, render loop, and GSAP integration. Read the package source for available shaders and API — do not copy raw GLSL manually.
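The five-step sequence ("position new scene → animate outgoing → swap → animate incoming → clean up") can be made concrete with the simplest case, a crossfade. A sketch — the selectors and the 0.4s `smooth` preset are illustrative — that expresses the steps as a plain schedule, where each entry maps onto one timeline call:

```js
// Build the dissolve-family "crossfade" schedule at transition time T.
// Each entry corresponds to one call: tl[entry.method](entry.target, entry.vars, entry.at).
function crossfadeSchedule(T, oldSel, newSel, duration) {
  return [
    // position the new scene (present but invisible)
    { method: "set", target: newSel, vars: { opacity: 0 }, at: T },
    // animate outgoing and incoming together — the transition IS the exit
    { method: "to", target: oldSel, vars: { opacity: 0, duration: duration, ease: "power2.inOut" }, at: T },
    { method: "to", target: newSel, vars: { opacity: 1, duration: duration, ease: "power2.inOut" }, at: T },
    // clean up: pin the outgoing scene hidden after the handoff
    { method: "set", target: oldSel, vars: { opacity: 0 }, at: T + duration },
  ];
}

var schedule = crossfadeSchedule(4.0, "#s1-inner", "#s2-inner", 0.4);
// Replay onto a real timeline:
// schedule.forEach(function (s) { tl[s.method](s.target, s.vars, s.at); });
```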
diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-3d.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-3d.md new file mode 100644 index 000000000..b86b52012 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-3d.md @@ -0,0 +1,12 @@ +## 3D + +### 3D Card Flip + +180° Y-axis rotation. Requires CSS: `backface-visibility: hidden; transform-style: preserve-3d;` on both scene-inners. Parent needs `perspective: 1200px`. + +```js +tl.set(new, { rotationY: -180, opacity: 1 }, T); +tl.to(old, { rotationY: 180, duration: 0.6, ease: "power2.inOut" }, T); +tl.to(new, { rotationY: 0, duration: 0.6, ease: "power2.inOut" }, T); +tl.set(old, { opacity: 0 }, T + 0.6); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-blur.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-blur.md new file mode 100644 index 000000000..2698ba90e --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-blur.md @@ -0,0 +1,51 @@ +## Blur + +All blur transitions scale with energy. See SKILL.md "Blur Intensity by Energy" for the full table. + +### Blur Through + +Content becomes fully abstract before resolving. The heaviest blur transition. 
+ +**Calm (default for this type — it's inherently heavy):** + +```js +tl.to(old, { filter: "blur(30px)", scale: 1.08, duration: 0.5, ease: "power1.in" }, T); +tl.to(old, { opacity: 0, duration: 0.3, ease: "power1.in" }, T + 0.3); +// Hold: both scenes in abstract blur state +tl.fromTo(new, + { filter: "blur(30px)", scale: 0.92, opacity: 0 }, + { filter: "blur(30px)", scale: 0.92, opacity: 1, duration: 0.2, ease: "none" }, T + 0.5); +// Slow resolve +tl.to(new, { filter: "blur(0px)", scale: 1, duration: 0.7, ease: "power1.out" }, T + 0.7); +``` + +**Medium:** + +```js +tl.to(old, { filter: "blur(15px)", scale: 1.05, opacity: 0, duration: 0.4, ease: "power2.in" }, T); +tl.fromTo(new, + { filter: "blur(15px)", scale: 0.95, opacity: 0 }, + { filter: "blur(0px)", scale: 1, opacity: 1, duration: 0.4, ease: "power2.out" }, T + 0.2); +``` + +### Directional Blur + +Blur + skew simulating motion in one direction. Scale blur and skew with energy. + +**Medium (default):** + +```js +tl.to(old, { filter: "blur(12px)", skewX: -8, x: -200, opacity: 0, duration: 0.4, ease: "power3.in" }, T); +tl.fromTo(new, + { filter: "blur(12px)", skewX: 8, x: 200, opacity: 0 }, + { filter: "blur(0px)", skewX: 0, x: 0, opacity: 1, duration: 0.4, ease: "power3.out" }, T + 0.15); +``` + +**Calm (heavier blur, gentler motion):** + +```js +tl.to(old, { filter: "blur(20px)", skewX: -4, x: -100, opacity: 0, duration: 0.6, ease: "power1.in" }, T); +tl.fromTo(new, + { filter: "blur(20px)", skewX: 4, x: 100, opacity: 0 }, + { filter: "blur(0px)", skewX: 0, x: 0, opacity: 1, duration: 0.6, ease: "power1.out" }, T + 0.3); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-cover.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-cover.md new file mode 100644 index 000000000..6ec60b80c --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-cover.md @@ -0,0 +1,43 @@ +## Cover + +### Staggered Color Blocks + +Full-screen 
(1920x1080) colored divs slide across staggered. Scene swaps while covered. + +**2-block** (standard): + +```js +tl.set("#wipe-a", { x: -1920 }, T - 0.01); +tl.set("#wipe-b", { x: -1920 }, T - 0.01); +tl.to("#wipe-a", { x: 0, duration: 0.25, ease: "power3.inOut" }, T); +tl.to("#wipe-b", { x: 0, duration: 0.25, ease: "power3.inOut" }, T + 0.06); +tl.set(old, { opacity: 0 }, T + 0.2); +tl.set(new, { opacity: 1 }, T + 0.2); +tl.to("#wipe-a", { x: 1920, duration: 0.25, ease: "power3.inOut" }, T + 0.28); +tl.to("#wipe-b", { x: 1920, duration: 0.25, ease: "power3.inOut" }, T + 0.34); +``` + +**5-block** (dense variant): same pattern with 5 blocks at 0.04s stagger. Use composition palette colors. + +### Horizontal Blinds + +Full-width strips slide across staggered. Each strip: `width: 1920px; height: Xpx`. + +**6 strips** (180px each): `0.03s` stagger +**12 strips** (90px each): `0.018s` stagger + +```js +for (var i = 0; i < N; i++) { + tl.set("#blind-h-" + i, { x: -1920 }, T - 0.01); + tl.fromTo("#blind-h-" + i, { x: -1920 }, { x: 0, duration: 0.2, ease: "power3.inOut" }, T + i * stagger); +} +tl.set(old, { opacity: 0 }, T + coverTime); +tl.set(new, { opacity: 1 }, T + coverTime); +for (var i = 0; i < N; i++) { + tl.to("#blind-h-" + i, { x: 1920, duration: 0.2, ease: "power3.inOut" }, T + exitStart + i * stagger); +} +``` + +### Vertical Blinds + +Same as horizontal but strips are tall and narrow, moving on Y axis. diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-destruction.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-destruction.md new file mode 100644 index 000000000..9229cffc4 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-destruction.md @@ -0,0 +1,95 @@ +## Destruction + +### Page Burn + +The outgoing scene literally burns away from a corner. 
A fire front expands with noise-based irregular edges, a canvas draws the scorched char line at the burn boundary, and content burns away with the page as the front reaches it. The incoming scene fades in from black near the end of the burn.
+
+This transition has three systems working together:
+
+1. **Fire geometry** — a radial front expanding from a corner (e.g., bottom-right) with noise-based irregularity for organic edges
+2. **Scene clipping** — the outgoing scene uses an SVG clip-path (with `fill-rule: evenodd`) that cuts a hole matching the fire front. As the fire expands, more of the scene is clipped away. All content (text, images, lines) burns with the page — no separate debris.
+3. **Scorched edge** — a `<canvas>` overlay draws a radial gradient fringe at the fire boundary to simulate charring
+
+**When to use:** Dramatic reveals, edgy/destructive mood, gaming, cyberpunk. This is the most dramatic transition in the catalog — reserve it for hero moments.
+
+**Requirements:**
+
+- A `<canvas>` element for the burn edge overlay
+- A noise function for organic fire edge geometry
+- SVG clip-path with evenodd fill-rule for the inverted clip
+
+**Fire geometry (deterministic noise):**
+
+```js
+function noise(x) {
+  var ix = Math.floor(x),
+    fx = x - ix;
+  var a = Math.sin(ix * 127.1 + 311.7) * 43758.5453;
+  var b = Math.sin((ix + 1) * 127.1 + 311.7) * 43758.5453;
+  var t = fx * fx * (3 - 2 * fx);
+  return a - Math.floor(a) + (b - Math.floor(b) - (a - Math.floor(a))) * t;
+}
+
+function fireRadiusAtAngle(angle, progress) {
+  var base = progress * maxRadius;
+  return (
+    base +
+    noise(angle * 3 + progress * 4) * 50 +
+    noise(angle * 8 + progress * 9) * 20 +
+    noise(angle * 15 + progress * 15) * 8
+  );
+}
+```
+
+**Incoming scene timing:** The incoming scene should NOT be visible during the burn. As the fire consumes the outgoing scene, **black shows through the holes** — this is the dramatic part. The viewer watches content being destroyed against blackness.
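The burn's `onUpdate` later calls a `buildClipPath(progress)` helper that this reference doesn't define. A hedged sketch of one possible implementation, assuming a 1920x1080 frame, a bottom-right burn origin, and CSS `path()` with an `evenodd` fill-rule — the segment count and `maxRadius` choice are illustrative, not part of the skill's API:

```javascript
// Sketch of buildClipPath(progress) — assumes 1920x1080 frame, bottom-right origin.
var maxRadius = Math.hypot(1920, 1080); // far corner fully consumed at progress 1

function noise(x) {
  var ix = Math.floor(x),
    fx = x - ix;
  var a = Math.sin(ix * 127.1 + 311.7) * 43758.5453;
  var b = Math.sin((ix + 1) * 127.1 + 311.7) * 43758.5453;
  var t = fx * fx * (3 - 2 * fx);
  return a - Math.floor(a) + (b - Math.floor(b) - (a - Math.floor(a))) * t;
}

function fireRadiusAtAngle(angle, progress) {
  var base = progress * maxRadius;
  return (
    base +
    noise(angle * 3 + progress * 4) * 50 +
    noise(angle * 8 + progress * 9) * 20 +
    noise(angle * 15 + progress * 15) * 8
  );
}

function buildClipPath(progress) {
  var SEGS = 48; // illustrative resolution for the fire edge
  var pts = [];
  for (var i = 0; i <= SEGS; i++) {
    // Sweep the quarter-circle facing into the frame: 180° (left) to 270° (up)
    var angle = Math.PI + (i / SEGS) * (Math.PI / 2);
    var r = fireRadiusAtAngle(angle, progress);
    var x = 1920 + Math.cos(angle) * r;
    var y = 1080 + Math.sin(angle) * r;
    pts.push(x.toFixed(1) + " " + y.toFixed(1));
  }
  // Outer frame rect + fire polygon; evenodd turns the fire shape into a hole
  return 'path(evenodd, "M0 0 H1920 V1080 H0 Z M' + pts.join(" L") + ' L1920 1080 Z")';
}
```

Per frame, the burn tween would assign the result to `scene1.style.clipPath`; browser support for the `path()` fill-rule argument varies, so an SVG `<clipPath>` element with `clip-rule: evenodd` is an equivalent fallback.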
+ +At ~90% through the burn, the incoming scene fades in SLOWLY from black — the background first, then content staggered. Use long, gentle fades (`power1.out`, 0.8-1.2s durations) so it feels like the new scene materializes from darkness, not a hard swap. + +```js +// Scene 2 stays at opacity: 0 during the burn — black behind the fire +tl.set("#s2-title", { opacity: 0 }, T); +tl.set("#s2-subtitle", { opacity: 0 }, T); + +// At 90% through, scene bg fades in slowly from black +var contentReveal = T + BURN_DURATION * 0.9; +tl.to("#scene2", { opacity: 1, duration: 1.2, ease: "power1.out" }, contentReveal); + +// Content fades in staggered on top, even slower +tl.to("#s2-title", { opacity: 1, duration: 1.0, ease: "power1.out" }, contentReveal + 0.5); +tl.to("#s2-subtitle", { opacity: 1, duration: 0.8, ease: "power1.out" }, contentReveal + 0.7); +``` + +**Content burns with the page — no falling debris.** The clip-path on scene1 IS the effect — as the fire shape expands, everything behind the fire edge (text, images, lines) disappears naturally. Don't clone elements, don't create falling debris. The content is part of the page being consumed. The scorched canvas edge provides the visual char line at the burn boundary. + +**Hide scene1 via `tl.set` at burn end — NEVER in `onComplete`.** Using `onComplete` to hide scene1 is not reversible when scrubbing. 
Instead, use a `tl.set` at the exact burn end time:
+
+```js
+tl.to(
+  burnState,
+  {
+    progress: 1,
+    duration: BURN_DURATION,
+    ease: "none",
+    onUpdate: function () {
+      var wp = burnState.progress;
+      var scene1 = document.getElementById("scene1");
+      if (wp <= 0) {
+        scene1.style.clipPath = "none"; // fully visible when rewound
+      } else if (wp < 1) {
+        scene1.style.clipPath = buildClipPath(wp);
+      }
+      drawEdge(wp);
+    },
+    // NO onComplete — use tl.set instead
+  },
+  T,
+);
+
+// Hide scene1 at exact burn end — reversible via timeline
+tl.set("#scene1", { opacity: 0 }, T + BURN_DURATION);
+tl.set("#scene1", { clipPath: "none" }, T + BURN_DURATION);
+```
+
+The `onUpdate` callback runs every frame, advancing the clip-path and canvas edge in sync with the timeline. The `tl.set` handles the final hide — and GSAP automatically reverses it when scrubbing backward, restoring scene1 to `opacity: 1`.
diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-dissolve.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-dissolve.md
new file mode 100644
index 000000000..3966fa9aa
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-dissolve.md
@@ -0,0 +1,66 @@
+## Dissolve
+
+### Crossfade
+
+Simple opacity swap. The baseline.
+
+```js
+tl.to(old, { opacity: 0, duration: 0.5, ease: "power2.inOut" }, T);
+tl.fromTo(new, { opacity: 0 }, { opacity: 1, duration: 0.5, ease: "power2.inOut" }, T);
+```
+
+### Blur Crossfade
+
+Dissolve with blur + scale shift. **Scale blur amount by energy** — see SKILL.md "Blur Intensity by Energy" section. The examples below show the medium (default) version. For calm compositions, increase to 20-30px with a 0.3-0.5s hold at peak blur. For high-energy, decrease to 3-6px with no hold.
+ +**Medium (default):** + +```js +tl.to(old, { filter: "blur(10px)", scale: 1.03, opacity: 0, duration: 0.5, ease: "power2.inOut" }, T); +tl.fromTo(new, + { filter: "blur(10px)", scale: 0.97, opacity: 0 }, + { filter: "blur(0px)", scale: 1, opacity: 1, duration: 0.5, ease: "power2.inOut" }, T + 0.1); +``` + +**Calm (wellness, luxury) — heavy blur, holds at abstract color:** + +```js +tl.to(old, { filter: "blur(25px)", scale: 1.05, duration: 0.6, ease: "power1.in" }, T); +tl.to(old, { opacity: 0, duration: 0.4, ease: "power1.in" }, T + 0.4); +tl.fromTo(new, + { filter: "blur(25px)", scale: 0.95, opacity: 0 }, + { filter: "blur(25px)", scale: 0.95, opacity: 1, duration: 0.3, ease: "power1.inOut" }, T + 0.5); +tl.to(new, { filter: "blur(0px)", scale: 1, duration: 0.6, ease: "power1.out" }, T + 0.8); +``` + +### Focus Pull + +Outgoing slowly blurs while incoming fades in sharp. Depth-of-field feel. **Scale blur amount and hold duration by energy.** + +**Medium:** + +```js +tl.to(old, { filter: "blur(15px)", duration: 0.5, ease: "power1.in" }, T); +tl.to(old, { opacity: 0, duration: 0.3, ease: "power2.in" }, T + 0.25); +tl.fromTo(new, { opacity: 0 }, { opacity: 1, duration: 0.3, ease: "power2.out" }, T + 0.25); +``` + +**Calm — slow rack focus with long hold at peak defocus:** + +```js +tl.to(old, { filter: "blur(30px)", duration: 0.8, ease: "power1.in" }, T); +tl.to(old, { opacity: 0, duration: 0.5, ease: "power1.in" }, T + 0.6); +tl.fromTo(new, { opacity: 0, filter: "blur(20px)" }, + { opacity: 1, filter: "blur(20px)", duration: 0.3, ease: "power1.inOut" }, T + 0.7); +tl.to(new, { filter: "blur(0px)", duration: 0.6, ease: "power1.out" }, T + 1.0); +``` + +### Color Dip + +Fade to solid color, hold, fade up new scene. 
+ +```js +tl.to(old, { opacity: 0, duration: 0.2, ease: "power2.in" }, T); +// Background color shows through +tl.fromTo(new, { opacity: 0 }, { opacity: 1, duration: 0.2, ease: "power2.out" }, T + 0.25); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-distortion.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-distortion.md new file mode 100644 index 000000000..44627f7c2 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-distortion.md @@ -0,0 +1,45 @@ +## Distortion + +### Glitch + +RGB-tinted overlays (NOT multiply blend — use normal blending at 35% opacity) jitter with large offsets. Scene itself also jitters. + +```js +tl.set("#glitch-r", { opacity: 1, x: 40, y: -8 }, T); +tl.set("#glitch-g", { opacity: 1, x: -30, y: 12 }, T); +tl.set("#glitch-b", { opacity: 1, x: 15, y: -20 }, T); +tl.set(old, { x: -15 }, T); +// 6 jitter frames at 0.03s intervals with big offsets (±30-60px) +// ... swap and clear at T + 0.2 +``` + +### Chromatic Aberration + +RGB overlays start aligned then spread apart (±80px), scene fades, converge on new scene. + +```js +tl.set("#glitch-r", { opacity: 0.6, x: 0 }, T); +tl.set("#glitch-g", { opacity: 0.6, x: 0 }, T); +tl.set("#glitch-b", { opacity: 0.6, x: 0 }, T); +tl.to("#glitch-r", { x: -80, opacity: 0.8, duration: 0.3, ease: "power2.in" }, T); +tl.to("#glitch-b", { x: 80, opacity: 0.8, duration: 0.3, ease: "power2.in" }, T); +tl.to("#glitch-g", { y: 30, duration: 0.3, ease: "power2.in" }, T); +// Swap at T + 0.3, converge back at T + 0.3 +``` + +### Ripple + +Rapid oscillation (±30px) + scale distortion (0.97-1.03) + increasing blur. Swap at peak distortion. + +```js +tl.to(old, { x: 30, scale: 1.02, duration: 0.04, ease: "none" }, T); +tl.to(old, { x: -25, scale: 0.98, filter: "blur(4px)", duration: 0.04, ease: "none" }, T + 0.04); +// ... 
more oscillations with increasing blur +// Swap at peak, incoming stabilizes with decreasing wobble +``` + +### VHS Tape + +Clone scene into 20 horizontal strips (each 54px, clip-path'd). Each strip shifts x independently with seeded pseudo-random offsets at per-bar random intervals. Add red+blue chromatic offset copies on each strip (z-index above main, 35% opacity). Make strips wider than frame (2020px at left:-50px) so edges never show. + +See SKILL.md for clone-based implementation pattern. diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-grid.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-grid.md new file mode 100644 index 000000000..ee1670068 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-grid.md @@ -0,0 +1,10 @@ +## Grid + +### Grid Dissolve + +Grid of colored cells covers the frame in a ripple from center. Scene swaps at 50% coverage. Cells fade out in ripple. + +**12-cell** (4x3, each 480x270): standard +**120-cell** (12x10, each 160x108): dense variant — lower opacity (0.75), tighter ripple + +Cells are created dynamically in JS, sorted by distance from center for ripple stagger. diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-light.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-light.md new file mode 100644 index 000000000..08d4d5293 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-light.md @@ -0,0 +1,49 @@ +## Light + +### Light Leak + +Multiple warm-colored overlays wash across frame. Needs: a flat warm tint layer + 2-3 bright radial gradient divs, all larger than the frame so edges are never visible. 
+ +```js +// Warm tint washes over entire frame +tl.to("#leak-warm", { opacity: 0.4, duration: 0.3, ease: "power1.in" }, T); +// Bright leak elements drift in +tl.to("#leak-1", { opacity: 0.9, x: 300, duration: 0.5, ease: "sine.inOut" }, T + 0.05); +tl.to("#leak-2", { opacity: 0.8, x: 200, duration: 0.6, ease: "sine.inOut" }, T + 0.1); +// Peak warmth then swap +tl.to("#leak-warm", { opacity: 0.6, duration: 0.15, ease: "power2.in" }, T + 0.35); +tl.set(old, { opacity: 0 }, T + 0.45); +tl.set(new, { opacity: 1 }, T + 0.45); +// Leak fades +tl.to("#leak-warm", { opacity: 0, duration: 0.4, ease: "power2.out" }, T + 0.5); +tl.to("#leak-1", { opacity: 0, x: 600, duration: 0.35, ease: "power1.out" }, T + 0.5); +``` + +### Overexposure Burn + +Scene progressively blows out to white using CSS `filter: brightness()`, then white overlay fades in. Swap at peak white. White recedes to reveal new scene. + +```js +tl.to(old, { filter: "brightness(1.5)", scale: 1.03, duration: 0.2, ease: "power1.in" }, T); +tl.to(old, { filter: "brightness(3)", scale: 1.06, duration: 0.2, ease: "power2.in" }, T + 0.2); +tl.to("#flash-overlay", { opacity: 0.5, duration: 0.25, ease: "power1.in" }, T + 0.15); +tl.to("#flash-overlay", { opacity: 1, duration: 0.15, ease: "power2.in" }, T + 0.4); +tl.set(old, { opacity: 0, filter: "brightness(1)", scale: 1 }, T + 0.55); +tl.set(new, { opacity: 1 }, T + 0.55); +tl.to("#flash-overlay", { opacity: 0, duration: 0.35, ease: "power2.out" }, T + 0.55); +``` + +### Film Burn + +Staggered warm overlays (amber, orange, red) bleed from one edge. Each overlay is a large radial gradient div at high z-index. 
+ +```js +tl.to("#burn-a", { opacity: 1, x: -300, duration: 0.4, ease: "power1.in" }, T); +tl.to("#burn-b", { opacity: 1, x: -500, duration: 0.5, ease: "power1.in" }, T + 0.05); +tl.to("#burn-c", { opacity: 1, x: -200, duration: 0.45, ease: "power1.in" }, T + 0.1); +tl.set(old, { opacity: 0 }, T + 0.35); +tl.set(new, { opacity: 1 }, T + 0.35); +tl.to("#burn-a", { opacity: 0, duration: 0.3, ease: "power2.out" }, T + 0.45); +tl.to("#burn-b", { opacity: 0, duration: 0.3, ease: "power2.out" }, T + 0.5); +tl.to("#burn-c", { opacity: 0, duration: 0.3, ease: "power2.out" }, T + 0.55); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-mechanical.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-mechanical.md new file mode 100644 index 000000000..fa119e7de --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-mechanical.md @@ -0,0 +1,30 @@ +## Mechanical + +### Shutter + +Two full-screen halves close from top and bottom, meet in the middle. Swap while closed. Open again. + +```js +tl.to("#shutter-top", { y: 0, duration: 0.25, ease: "power3.in" }, T); +tl.to("#shutter-bot", { y: 0, duration: 0.25, ease: "power3.in" }, T); +tl.set(old, { opacity: 0 }, T + 0.25); +tl.set(new, { opacity: 1 }, T + 0.25); +tl.to("#shutter-top", { y: -540, duration: 0.25, ease: "power3.out" }, T + 0.3); +tl.to("#shutter-bot", { y: 540, duration: 0.25, ease: "power3.out" }, T + 0.3); +``` + +### Clock Wipe + +Radial polygon sweep stepping through quadrants. Use 9-point polygon with intermediate edge positions for smooth sweep. 
+ +```js +tl.set(new, { opacity: 1, zIndex: 10 }, T); +var d = 0.1; // duration per quadrant +tl.set(new, { clipPath: "polygon(50% 50%, 50% 0%, 50% 0%, 50% 0%, 50% 0%, 50% 0%, 50% 0%, 50% 0%, 50% 0%)" }, T); +tl.to(new, { clipPath: "polygon(50% 50%, 50% 0%, 100% 0%, 100% 50%, 100% 50%, 100% 50%, 100% 50%, 100% 50%, 100% 50%)", duration: d, ease: "none" }, T); +tl.to(new, { clipPath: "polygon(50% 50%, 50% 0%, 100% 0%, 100% 50%, 100% 100%, 50% 100%, 50% 100%, 50% 100%, 50% 100%)", duration: d, ease: "none" }, T + d); +tl.to(new, { clipPath: "polygon(50% 50%, 50% 0%, 100% 0%, 100% 50%, 100% 100%, 50% 100%, 0% 100%, 0% 50%, 0% 50%)", duration: d, ease: "none" }, T + d*2); +tl.to(new, { clipPath: "polygon(50% 50%, 50% 0%, 100% 0%, 100% 50%, 100% 100%, 50% 100%, 0% 100%, 0% 50%, 0% 0%)", duration: d, ease: "none" }, T + d*3); +tl.set(new, { clipPath: "none", zIndex: "auto" }, T + d*4 + 0.02); +tl.set(old, { opacity: 0, zIndex: "auto" }, T + d*4 + 0.02); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-other.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-other.md new file mode 100644 index 000000000..698368a38 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-other.md @@ -0,0 +1,25 @@ +## Other + +### Gravity Drop + +Old scene falls down with slight rotation. New scene was behind it. Needs z-index. + +```js +tl.set(new, { opacity: 1, zIndex: 1 }, T); +tl.set(old, { zIndex: 10 }, T); +tl.to(old, { y: 1200, rotation: 4, duration: 0.5, ease: "power3.in" }, T); +tl.set(old, { opacity: 0, zIndex: "auto" }, T + 0.5); +tl.set(new, { zIndex: "auto" }, T + 0.5); +``` + +### Morph Circle + +A circle scales up from center to fill frame (becoming the new scene's background color). New scene content fades in on top. 
+ +```js +tl.set("#morph-circle", { background: newBgColor, opacity: 1, scale: 0 }, T); +tl.to("#morph-circle", { scale: 30, duration: 0.5, ease: "power3.in" }, T); +tl.set(old, { opacity: 0 }, T + 0.4); +tl.set(new, { opacity: 1 }, T + 0.4); +tl.to("#morph-circle", { opacity: 0, duration: 0.15, ease: "power2.out" }, T + 0.5); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-push.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-push.md new file mode 100644 index 000000000..b7f550351 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-push.md @@ -0,0 +1,41 @@ +## Linear / Push + +### Push Slide + +Both scenes move together — new pushes old out. + +```js +tl.to(old, { x: -1920, duration: 0.5, ease: "power3.inOut" }, T); +tl.fromTo(new, { x: 1920, opacity: 1 }, { x: 0, duration: 0.5, ease: "power3.inOut" }, T); +``` + +### Vertical Push + +Same as push slide but vertical. + +```js +tl.to(old, { y: -1080, duration: 0.5, ease: "power3.inOut" }, T); +tl.fromTo(new, { y: 1080, opacity: 1 }, { y: 0, duration: 0.5, ease: "power3.inOut" }, T); +``` + +### Elastic Push + +Push with overshoot bounce on the incoming scene. + +```js +tl.to(old, { x: -1920, duration: 0.5, ease: "power3.in" }, T); +tl.fromTo(new, { x: 1920, opacity: 1 }, { x: 30, duration: 0.4, ease: "power4.out" }, T + 0.1); +tl.to(new, { x: -15, duration: 0.15, ease: "sine.inOut" }, T + 0.5); +tl.to(new, { x: 0, duration: 0.1, ease: "sine.out" }, T + 0.65); +``` + +### Squeeze + +Old compresses, new expands from opposite side. 
+ +```js +tl.to(old, { scaleX: 0, transformOrigin: "left center", duration: 0.4, ease: "power3.inOut" }, T); +tl.fromTo(new, { scaleX: 0, transformOrigin: "right center", opacity: 1 }, + { scaleX: 1, duration: 0.4, ease: "power3.inOut" }, T + 0.1); +tl.set(old, { opacity: 0 }, T + 0.5); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-radial.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-radial.md new file mode 100644 index 000000000..040dad2f1 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-radial.md @@ -0,0 +1,37 @@ +## Radial / Shape + +### Circle Iris + +Expanding circle from center reveals new scene. + +```js +tl.set(new, { opacity: 1 }, T); +tl.fromTo(new, + { clipPath: "circle(0% at 50% 50%)" }, + { clipPath: "circle(75% at 50% 50%)", duration: 0.5, ease: "power2.out" }, T); +tl.set(old, { opacity: 0 }, T + 0.5); +``` + +### Diamond Iris + +Expanding diamond shape from center. + +```js +tl.set(new, { opacity: 1 }, T); +tl.fromTo(new, + { clipPath: "polygon(50% 50%, 50% 50%, 50% 50%, 50% 50%)" }, + { clipPath: "polygon(50% -20%, 120% 50%, 50% 120%, -20% 50%)", duration: 0.5, ease: "power2.out" }, T); +tl.set(old, { opacity: 0 }, T + 0.5); +``` + +### Diagonal Split + +Old scene shrinks to a triangle in one corner. 
+ +```js +tl.set(new, { opacity: 1, zIndex: 1 }, T); +tl.set(old, { zIndex: 10, clipPath: "polygon(0% 0%, 100% 0%, 100% 100%, 0% 100%)" }, T); +tl.to(old, { clipPath: "polygon(60% 0%, 100% 0%, 100% 40%, 60% 0%)", duration: 0.5, ease: "power3.inOut" }, T); +tl.set(old, { opacity: 0, zIndex: "auto", clipPath: "none" }, T + 0.5); +tl.set(new, { zIndex: "auto" }, T + 0.5); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/transitions/css-scale.md b/packages/codex-plugin/skills/hyperframes/references/transitions/css-scale.md new file mode 100644 index 000000000..b16e26469 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/transitions/css-scale.md @@ -0,0 +1,24 @@ +## Scale / Zoom + +### Zoom Through + +Old zooms past camera + blurs, new zooms in from behind. + +```js +tl.to(old, { scale: 2.5, opacity: 0, filter: "blur(8px)", duration: 0.4, ease: "power3.in" }, T); +tl.fromTo(new, + { scale: 0.5, opacity: 0, filter: "blur(8px)" }, + { scale: 1, opacity: 1, filter: "blur(0px)", duration: 0.4, ease: "power3.out" }, T + 0.15); +``` + +### Zoom Out + +Old shrinks away, new was behind it. Needs z-index management. + +```js +tl.set(new, { opacity: 1, zIndex: 1 }, T); +tl.set(old, { zIndex: 10, transformOrigin: "50% 50%" }, T); +tl.to(old, { scale: 0.3, opacity: 0, duration: 0.4, ease: "power3.in" }, T); +tl.set(old, { zIndex: "auto" }, T + 0.4); +tl.set(new, { zIndex: "auto" }, T + 0.4); +``` diff --git a/packages/codex-plugin/skills/hyperframes/references/tts.md b/packages/codex-plugin/skills/hyperframes/references/tts.md new file mode 100644 index 000000000..c403564d8 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/references/tts.md @@ -0,0 +1,75 @@ +# Text-to-Speech + +Generate speech audio locally using Kokoro-82M (no API key, runs on CPU). + +## Voice Selection + +Match voice to content. Default is `af_heart`. 
+ +| Content type | Voice | Why | +| ------------- | --------------------- | -------------------------- | +| Product demo | `af_heart`/`af_nova` | Warm, professional | +| Tutorial | `am_adam`/`bf_emma` | Neutral, easy to follow | +| Marketing | `af_sky`/`am_michael` | Energetic or authoritative | +| Documentation | `bf_emma`/`bm_george` | Clear British English | +| Casual | `af_heart`/`af_sky` | Approachable, natural | + +Run `npx hyperframes tts --list` for all 54 voices (8 languages). + +## Multilingual Phonemization + +Kokoro voice IDs encode language in the first letter: `a`=American English, `b`=British English, `e`=Spanish, `f`=French, `h`=Hindi, `i`=Italian, `j`=Japanese, `p`=Brazilian Portuguese, `z`=Mandarin. The CLI auto-detects the phonemizer locale from that prefix — you don't need to pass `--lang` when the voice matches the text. + +```bash +npx hyperframes tts "La reunión empieza a las nueve" --voice ef_dora --output es.wav +npx hyperframes tts "今日はいい天気ですね" --voice jf_alpha --output ja.wav +``` + +Use `--lang` only to override auto-detection (e.g. stylized accents): + +```bash +npx hyperframes tts "Hello there" --voice af_heart --lang fr-fr --output accented.wav +``` + +Valid `--lang` codes: `en-us`, `en-gb`, `es`, `fr-fr`, `hi`, `it`, `pt-br`, `ja`, `zh`. + +Non-English phonemization requires `espeak-ng` installed system-wide (`brew install espeak-ng` on macOS, `apt-get install espeak-ng` on Debian/Ubuntu). 
+
+## Speed Tuning
+
+- **0.7-0.8** — Tutorial, complex content
+- **1.0** — Natural pace (default)
+- **1.1-1.2** — Intros, upbeat content
+- **1.5+** — Rarely appropriate
+
+## Usage
+
+```bash
+npx hyperframes tts "Your script here" --voice af_nova --output narration.wav
+npx hyperframes tts script.txt --voice bf_emma --output narration.wav
+```
+
+In compositions:
+
+```html
+<audio src="narration.wav"></audio>
+```
+
+## TTS + Captions Workflow
+
+```bash
+npx hyperframes tts script.txt --voice af_heart --output narration.wav
+npx hyperframes transcribe narration.wav  # → transcript.json with word-level timestamps
+```
+
+## Requirements
+
+- Python 3.8+ with `kokoro-onnx` and `soundfile`
+- Model downloads on first use (~311 MB + ~27 MB voices, cached in `~/.cache/hyperframes/tts/`)
diff --git a/packages/codex-plugin/skills/hyperframes/references/typography.md b/packages/codex-plugin/skills/hyperframes/references/typography.md
new file mode 100644
index 000000000..2a2ee52c5
--- /dev/null
+++ b/packages/codex-plugin/skills/hyperframes/references/typography.md
@@ -0,0 +1,175 @@
+# Typography
+
+The compiler embeds supported fonts — just write `font-family` in CSS.
+
+## Banned
+
+Training-data defaults that every LLM reaches for. These produce monoculture across compositions.
+
+Inter, Roboto, Open Sans, Noto Sans, Arimo, Lato, Source Sans, PT Sans, Nunito, Poppins, Outfit, Sora, Playfair Display, Cormorant Garamond, Bodoni Moda, EB Garamond, Cinzel, Prata, Syne
+
+**Syne in particular** is the most overused "distinctive" display font. It is an instant AI design tell.
+
+## Guardrails
+
+You know these rules but you violate them. Stop.
+
+- **Don't pair two sans-serifs.** You do this constantly — one for headlines, one for body. Cross the boundary: serif + sans, or sans + mono.
+- **One expressive font per scene.** You pick two interesting fonts trying to make it "better." One performs, one recedes.
+- **Weight contrast must be extreme.** You default to 400 vs 700. Video needs 300 vs 900.
The difference must be visible in motion at a glance. +- **Video sizes, not web sizes.** Body: 20px minimum. Headlines: 60px+. Data labels: 16px. You will try to use 14px. Don't. + +## What You Don't Do Without Being Told + +- **Tension should mean something.** Don't pattern-match pairings. Ask WHY these two fonts disagree. The pairing should embody the content's contradiction — mechanical vs human, public vs private, institutional vs personal. If you can't articulate the tension, it's arbitrary. +- **Register switching.** Assign different fonts to different communicative modes — one voice for statements, another for data, another for attribution. Not hierarchy on a page. Voices in a conversation. +- **Tension can live inside a single font.** A font that looks familiar but is secretly strange creates tension with the viewer's expectations, not with another font. +- **One variable changed = dramatic contrast.** Same letterforms, monospaced vs proportional. Same family at different optical sizes. Changing only rhythm while everything else stays constant. +- **Double personality works.** Two expressive fonts can coexist if they share an attitude (both irreverent, both precise) even when their forms are completely different. +- **Time is hierarchy.** The first element to appear is the most important. In video, sequence replaces position. +- **Motion is typography.** How a word enters carries as much meaning as the font. A 0.1s slam vs a 2s fade — same font, completely different message. +- **Fixed reading time.** 3 seconds on screen = must be readable in 2. Fewer words, larger type. +- **Tracking tighter than web.** -0.03em to -0.05em on display sizes. Video encoding compresses letter detail. + +## Finding Fonts + +Don't default to what you know. If the content is luxury, a grotesque sans might create more tension than the expected Didone serif. Decide the register first, then search. 
+ +Save this script to `/tmp/fontquery.py` and run with `curl -s 'https://fonts.google.com/metadata/fonts' > /tmp/gfonts.json && python3 /tmp/fontquery.py /tmp/gfonts.json`: + +```python +import json, sys, random +from collections import OrderedDict + +random.seed() # true random each run + +with open(sys.argv[1]) as f: + data = json.load(f) +fonts = data.get("familyMetadataList", []) + +ban = {"Inter","Roboto","Open Sans","Noto Sans","Lato","Poppins","Source Sans 3", + "PT Sans","Nunito","Outfit","Sora","Playfair Display","Cormorant Garamond", + "Bodoni Moda","EB Garamond","Cinzel","Prata","Arimo","Source Sans Pro","Syne"} +skip_pfx = ("Roboto","Noto ","Google Sans","Bpmf","Playwrite","Anek","BIZ ", + "Nanum","Shippori","Sawarabi","Zen ","Kaisei","Kiwi ","Yuji ","Radio ") + +def ok(f): + if f["family"] in ban: return False + if any(f["family"].startswith(b) for b in skip_pfx): return False + if "latin" not in (f.get("subsets") or []): return False + return True + +seen = set() +R = OrderedDict() + +# Trending Sans — recent (2022+), popular (<300) +R["Trending Sans"] = [] +for f in fonts: + if not ok(f) or f["family"] in seen: continue + if f.get("category") in ("Sans Serif","Display") and f.get("dateAdded","") >= "2022-01-01" and f.get("popularity",9999) < 300: + R["Trending Sans"].append(f); seen.add(f["family"]) + +# Trending Serif — recent (2018+), popular (<600) +R["Trending Serif"] = [] +for f in fonts: + if not ok(f) or f["family"] in seen: continue + if f.get("category") == "Serif" and f.get("dateAdded","") >= "2018-01-01" and f.get("popularity",9999) < 600: + R["Trending Serif"].append(f); seen.add(f["family"]) + +# Monospace — recent (2018+), popular (<600) +R["Monospace"] = [] +for f in fonts: + if not ok(f) or f["family"] in seen: continue + if f.get("category") == "Monospace" and f.get("dateAdded","") >= "2018-01-01" and f.get("popularity",9999) < 600: + R["Monospace"].append(f); seen.add(f["family"]) + +# Impact & Condensed — heavy display fonts with 
800+ weight +R["Impact & Condensed"] = [] +for f in fonts: + if not ok(f) or f["family"] in seen: continue + has_heavy = any(k in list(f.get("fonts",{}).keys()) for k in ("800","900")) + is_display = f.get("category") in ("Sans Serif","Display") + if has_heavy and is_display and f.get("popularity",9999) < 400: + R["Impact & Condensed"].append(f); seen.add(f["family"]) + +# Script & Handwriting — popular (<300) +R["Script & Handwriting"] = [] +for f in fonts: + if not ok(f) or f["family"] in seen: continue + if f.get("category") == "Handwriting" and f.get("popularity",9999) < 300: + R["Script & Handwriting"].append(f); seen.add(f["family"]) + + +# Randomize the top 5 in each category so the LLM doesn't always pick the same first result +for cat in R: + R[cat].sort(key=lambda x: x.get("popularity",9999)) + top5 = R[cat][:5] + rest = R[cat][5:] + random.shuffle(top5) + R[cat] = top5 + rest +limits = {"Trending Sans":15,"Trending Serif":12,"Monospace":8, + "Impact & Condensed":12,"Script & Handwriting":10} +for cat in R: + items = R[cat][:limits.get(cat,10)] + if not items: continue + print(f"--- {cat} ({len(items)}) ---") + for ff in items: + var = "VAR" if ff.get("axes") else " " + print(f' {ff.get("popularity"):4d} | {var} | {ff["family"]}') + print() +``` + +Five categories: trending sans, trending serif, monospace, impact/condensed, script/handwriting. All dynamically filtered from Google Fonts metadata — no hardcoded font names. Cross classification boundaries when pairing. + +## Selection Thinking + +Don't pick fonts by category reflex (editorial → serif, tech → mono, modern → geometric sans). That's pattern matching, not design. + +1. **Name the register.** What voice is the content speaking in? Institutional authority? Personal confession? Technical precision? Casual irreverence? The register narrows the field more than the category. +2. 
**Think physically.** Imagine the font as a physical object the brand could ship — a museum exhibit caption, a hand-painted shop sign, a 1970s mainframe terminal manual, a fabric label inside a coat, a children's book printed on cheap newsprint, a tax form. Whichever physical object fits the register is pointing at the right _kind_ of typeface. +3. **Reject your first instinct.** The first font that feels right is usually your training-data default for that register. If you picked it last time too, find something else. +4. **Cross-check the assumption.** An editorial brief does NOT need a serif. A technical brief does NOT need a sans. A children's product does NOT need a rounded display font. The most distinctive choice often contradicts the category expectation. + +## Similar-Font Pairing + +Never pair two fonts that are similar but not identical — two geometric sans-serifs, two transitional serifs, two humanist sans. They create visual friction without clear hierarchy. The viewer senses something is "off" but can't articulate it. Either use one font at two weights, or pair fonts that contrast on multiple axes: serif + sans, condensed + wide, geometric + humanist. + +## Dark Backgrounds + +Light text on dark backgrounds creates two optical illusions you need to compensate for: + +- **Increased apparent weight.** Light-on-dark reads heavier than dark-on-light at the same `font-weight`. Use 350 instead of 400 for body text. Headlines are less affected because size compensates. +- **Tighter apparent spacing.** Light halos around letterforms reduce perceived gaps. Increase `line-height` by 0.05-0.1 beyond your light-background value. For display sizes, add 0.01em `letter-spacing` to counteract. + +## OpenType Features for Data + +Most fonts ship with OpenType features that are off by default. 
Turn them on for data compositions: + +```css +/* Tabular numbers — digits align vertically in columns */ +.stat-value, +.timer, +.data-column { + font-variant-numeric: tabular-nums; +} + +/* Diagonal fractions — renders 1/2 as ½ */ +.recipe-amount, +.ratio { + font-variant-numeric: diagonal-fractions; +} + +/* Small caps for abbreviations — less visual shouting */ +.abbreviation, +.unit { + font-variant-caps: all-small-caps; +} + +/* Disable ligatures in code — fi, fl, ffi should stay separate */ +code, +.code { + font-variant-ligatures: none; +} +``` + +`tabular-nums` is essential any time numbers are stacked vertically — stat callouts, timers, scoreboards, data tables. Without it, digits have proportional widths and columns don't align. diff --git a/packages/codex-plugin/skills/hyperframes/scripts/animation-map.mjs b/packages/codex-plugin/skills/hyperframes/scripts/animation-map.mjs new file mode 100644 index 000000000..55a96607c --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/scripts/animation-map.mjs @@ -0,0 +1,596 @@ +#!/usr/bin/env node +// animation-map.mjs — HyperFrames animation map for agents +// +// Reads every GSAP timeline registered in window.__timelines, enumerates +// tweens, samples bboxes at N points per tween, computes flags and +// human-readable summaries. Outputs a single animation-map.json. +// +// Usage: +// node skills/hyperframes/scripts/animation-map.mjs \ +// [--frames N] [--out ] [--min-duration S] [--width W] [--height H] [--fps N] + +import { mkdir, writeFile } from "node:fs/promises"; +import { resolve, join } from "node:path"; + +import { + createFileServer, + createCaptureSession, + initializeSession, + closeCaptureSession, + getCompositionDuration, +} from "@hyperframes/producer"; + +// ─── CLI ───────────────────────────────────────────────────────────────────── + +const args = parseArgs(process.argv.slice(2)); +if (!args.composition) die("missing "); + +const FRAMES = Number(args.frames ?? 
6); +const OUT_DIR = resolve(args.out ?? ".hyperframes/anim-map"); +const MIN_DUR = Number(args["min-duration"] ?? 0.15); +const WIDTH = Number(args.width ?? 1920); +const HEIGHT = Number(args.height ?? 1080); +const FPS = Number(args.fps ?? 30); +const COMP_DIR = resolve(args.composition); + +await mkdir(OUT_DIR, { recursive: true }); + +// ─── Main ──────────────────────────────────────────────────────────────────── + +const server = await createFileServer({ projectDir: COMP_DIR, port: 0 }); +const session = await createCaptureSession( + server.url, + OUT_DIR, + { width: WIDTH, height: HEIGHT, fps: FPS, format: "png" }, + null, +); +await initializeSession(session); + +try { + const duration = await getCompositionDuration(session); + const tweens = await enumerateTweens(session); + const kept = tweens.filter((tw) => tw.end - tw.start >= MIN_DUR); + + const report = { + composition: COMP_DIR, + duration, + totalTweens: tweens.length, + mappedTweens: kept.length, + skippedMicroTweens: tweens.length - kept.length, + tweens: [], + }; + + for (let i = 0; i < kept.length; i++) { + const tw = kept[i]; + const times = Array.from( + { length: FRAMES }, + (_, k) => +(tw.start + ((k + 0.5) / FRAMES) * (tw.end - tw.start)).toFixed(3), + ); + + const bboxes = []; + for (const t of times) { + await seekTo(session, t); + const bbox = await measureTarget(session, tw.selectorHint); + bboxes.push({ t, ...bbox }); + } + + const animProps = tw.props.filter( + (p) => !["parent", "overwrite", "immediateRender", "startAt", "runBackwards"].includes(p), + ); + const flags = computeFlags(tw, bboxes, { width: WIDTH, height: HEIGHT }); + const summary = describeTween(tw, animProps, bboxes, flags); + + report.tweens.push({ + index: i + 1, + selector: tw.selectorHint, + targets: tw.targetCount, + props: animProps, + start: +tw.start.toFixed(3), + end: +tw.end.toFixed(3), + duration: +(tw.end - tw.start).toFixed(3), + ease: tw.ease, + bboxes, + flags, + summary, + }); + } + + 
markCollisions(report.tweens); + + for (const tw of report.tweens) { + if (tw.flags.includes("collision") && !tw.summary.includes("collision")) { + tw.summary += " Overlaps another animated element."; + } + } + + // ── Composition-level analysis ── + report.choreography = buildTimeline(report.tweens, duration); + report.density = computeDensity(report.tweens, duration); + report.staggers = detectStaggers(report.tweens); + report.elements = buildElementLifecycles(report.tweens); + report.deadZones = findDeadZones(report.density, duration); + report.snapshots = await captureSnapshots(session, report.tweens, duration); + + await writeFile(join(OUT_DIR, "animation-map.json"), JSON.stringify(report, null, 2)); + + printSummary(report); +} finally { + await closeCaptureSession(session).catch(() => {}); + server.close(); +} + +// ─── Seek helper ──────────────────────────────────────────────────────────── + +async function seekTo(session, t) { + await session.page.evaluate((time) => { + if (window.__hf && typeof window.__hf.seek === "function") { + window.__hf.seek(time); + return; + } + const tls = window.__timelines; + if (tls) { + for (const tl of Object.values(tls)) { + if (typeof tl.seek === "function") tl.seek(time); + } + } + }, t); + await new Promise((r) => setTimeout(r, 100)); +} + +// ─── Timeline introspection ────────────────────────────────────────────────── + +async function enumerateTweens(session) { + return await session.page.evaluate(() => { + const results = []; + const registry = window.__timelines || {}; + + const selectorOf = (el) => { + if (!el || !(el instanceof Element)) return null; + if (el.id) return `#${el.id}`; + const cls = [...el.classList].slice(0, 2).join("."); + return cls ? `${el.tagName.toLowerCase()}.${cls}` : el.tagName.toLowerCase(); + }; + + const walk = (node, parentOffset = 0) => { + if (!node) return; + if (typeof node.getChildren === "function") { + const offset = parentOffset + (node.startTime?.() ?? 
0); + for (const child of node.getChildren(true, true, true)) { + walk(child, offset); + } + return; + } + const targets = (node.targets?.() ?? []).filter((t) => t instanceof Element); + if (!targets.length) return; + const vars = node.vars ?? {}; + const props = Object.keys(vars).filter( + (k) => + ![ + "duration", + "ease", + "delay", + "repeat", + "yoyo", + "onStart", + "onUpdate", + "onComplete", + "stagger", + ].includes(k), + ); + const start = parentOffset + (node.startTime?.() ?? 0); + const end = start + (node.duration?.() ?? 0); + results.push({ + selectorHint: selectorOf(targets[0]) ?? "(unknown)", + targetCount: targets.length, + props, + start, + end, + ease: typeof vars.ease === "string" ? vars.ease : (vars.ease?.toString?.() ?? "none"), + }); + }; + + for (const tl of Object.values(registry)) walk(tl, 0); + results.sort((a, b) => a.start - b.start); + return results; + }); +} + +async function measureTarget(session, selector) { + return await session.page.evaluate((sel) => { + const el = document.querySelector(sel); + if (!el) return { x: 0, y: 0, w: 0, h: 0, missing: true }; + const r = el.getBoundingClientRect(); + const cs = getComputedStyle(el); + return { + x: Math.round(r.x), + y: Math.round(r.y), + w: Math.round(r.width), + h: Math.round(r.height), + opacity: parseFloat(cs.opacity), + visible: cs.visibility !== "hidden" && cs.display !== "none", + }; + }, selector); +} + +// ─── Tween description (the key output for agents) ────────────────────────── + +function describeTween(tw, props, bboxes, flags) { + const dur = (tw.end - tw.start).toFixed(2); + const parts = []; + + parts.push(`${tw.selectorHint} animates ${props.join("+")} over ${dur}s (${tw.ease})`); + + // Movement + const first = bboxes[0]; + const last = bboxes[bboxes.length - 1]; + if (first && last) { + const dx = last.x - first.x; + const dy = last.y - first.y; + if (Math.abs(dx) > 3 || Math.abs(dy) > 3) { + const dirs = []; + if (Math.abs(dy) > 3) dirs.push(dy < 0 ? 
`${Math.abs(dy)}px up` : `${Math.abs(dy)}px down`); + if (Math.abs(dx) > 3) + dirs.push(dx < 0 ? `${Math.abs(dx)}px left` : `${Math.abs(dx)}px right`); + parts.push(`moves ${dirs.join(" and ")}`); + } + } + + // Opacity + if (first && last && first.opacity !== undefined && last.opacity !== undefined) { + const o1 = first.opacity; + const o2 = last.opacity; + if (Math.abs(o2 - o1) > 0.1) { + if (o1 < 0.1 && o2 > 0.5) parts.push("fades in"); + else if (o1 > 0.5 && o2 < 0.1) parts.push("fades out"); + else parts.push(`opacity ${o1.toFixed(1)}→${o2.toFixed(1)}`); + } + } + + // Scale (from props) + if (props.includes("scale") || props.includes("scaleX") || props.includes("scaleY")) { + parts.push("scales"); + } + + // Size changes + if (first && last) { + const dw = last.w - first.w; + const dh = last.h - first.h; + if (Math.abs(dw) > 5) parts.push(`width ${first.w}→${last.w}px`); + if (Math.abs(dh) > 5) parts.push(`height ${first.h}→${last.h}px`); + } + + // Visibility + if (first && last && first.visible !== last.visible) { + parts.push(last.visible ? "becomes visible" : "becomes hidden"); + } + + // Final position + if (last && !last.missing) { + parts.push(`ends at (${last.x}, ${last.y}) ${last.w}×${last.h}px`); + } + + // Flags + if (flags.length > 0) { + parts.push(`FLAGS: ${flags.join(", ")}`); + } + + return parts.join(". 
") + "."; +} + +// ─── Flag computation ─────────────────────────────────────────────────────── + +function computeFlags(tw, bboxes, { width, height }) { + const flags = []; + const dur = tw.end - tw.start; + + if (bboxes.every((b) => b.w === 0 || b.h === 0)) flags.push("degenerate"); + + const anyOffscreen = bboxes.some( + (b) => + b.x + b.w <= 0 || + b.y + b.h <= 0 || + b.x >= width || + b.y >= height || + b.x < -b.w * 0.5 || + b.y < -b.h * 0.5 || + b.x + b.w > width + b.w * 0.5 || + b.y + b.h > height + b.h * 0.5, + ); + if (anyOffscreen) flags.push("offscreen"); + + if (bboxes.every((b) => b.opacity !== undefined && b.opacity < 0.01 && b.visible)) { + flags.push("invisible"); + } + + if (dur < 0.2 && tw.props.some((p) => ["y", "x", "opacity", "scale"].includes(p))) { + flags.push("paced-fast"); + } + if (dur > 2.0) flags.push("paced-slow"); + + return flags; +} + +function markCollisions(tweens) { + for (let i = 0; i < tweens.length; i++) { + for (let j = i + 1; j < tweens.length; j++) { + const a = tweens[i]; + const b = tweens[j]; + if (a.end <= b.start || b.end <= a.start) continue; + for (const ba of a.bboxes) { + const bb = b.bboxes.find((x) => Math.abs(x.t - ba.t) < 0.05); + if (!bb) continue; + const overlap = rectOverlapArea(ba, bb); + const aArea = ba.w * ba.h; + if (aArea > 0 && overlap / aArea > 0.3) { + if (!a.flags.includes("collision")) a.flags.push("collision"); + if (!b.flags.includes("collision")) b.flags.push("collision"); + break; + } + } + } + } +} + +function rectOverlapArea(a, b) { + const x1 = Math.max(a.x, b.x); + const y1 = Math.max(a.y, b.y); + const x2 = Math.min(a.x + a.w, b.x + b.w); + const y2 = Math.min(a.y + a.h, b.y + b.h); + return Math.max(0, x2 - x1) * Math.max(0, y2 - y1); +} + +// ─── Composition-level analysis ───────────────────────────────────────────── + +function buildTimeline(tweens, duration) { + const cols = 60; + const lines = []; + const secPerCol = duration / cols; + + lines.push("Timeline (" + 
duration.toFixed(1) + "s, each char ≈ " + secPerCol.toFixed(2) + "s):"); + lines.push(" " + "0s" + " ".repeat(cols - 8) + duration.toFixed(0) + "s"); + lines.push(" " + "┼" + "─".repeat(cols - 1) + "┤"); + + for (const tw of tweens) { + const startCol = Math.floor(tw.start / secPerCol); + const endCol = Math.min(cols, Math.ceil(tw.end / secPerCol)); + const bar = + " ".repeat(startCol) + + "█".repeat(Math.max(1, endCol - startCol)) + + " ".repeat(Math.max(0, cols - endCol)); + const label = tw.selector + " " + tw.props.join("+"); + lines.push(" " + bar + " " + label); + } + + return lines.join("\n"); +} + +function computeDensity(tweens, duration) { + const buckets = []; + for (let t = 0; t < duration; t += 0.5) { + const active = tweens.filter((tw) => tw.start <= t + 0.5 && tw.end >= t); + buckets.push({ t: +t.toFixed(1), activeTweens: active.length }); + } + return buckets; +} + +function findDeadZones(density, duration) { + const zones = []; + let zoneStart = null; + for (const d of density) { + if (d.activeTweens === 0) { + if (zoneStart === null) zoneStart = d.t; + } else { + if (zoneStart !== null) { + const zoneEnd = d.t; + if (zoneEnd - zoneStart >= 1.0) { + zones.push({ + start: zoneStart, + end: zoneEnd, + duration: +(zoneEnd - zoneStart).toFixed(1), + note: + "No animation for " + + (zoneEnd - zoneStart).toFixed(1) + + "s. Intentional hold or missing entrance?", + }); + } + zoneStart = null; + } + } + } + if (zoneStart !== null && duration - zoneStart >= 1.0) { + zones.push({ + start: zoneStart, + end: +duration.toFixed(1), + duration: +(duration - zoneStart).toFixed(1), + note: + "No animation for " + + (duration - zoneStart).toFixed(1) + + "s at end. 
Final hold or missing outro?", + }); + } + return zones; +} + +function detectStaggers(tweens) { + const groups = []; + const used = new Set(); + + for (let i = 0; i < tweens.length; i++) { + if (used.has(i)) continue; + const tw = tweens[i]; + const group = [tw]; + used.add(i); + + for (let j = i + 1; j < tweens.length; j++) { + if (used.has(j)) continue; + const other = tweens[j]; + const sameProps = tw.props.join(",") === other.props.join(","); + const sameDuration = Math.abs(tw.duration - other.duration) < 0.05; + const closeInTime = other.start - tw.start < tw.duration * 4; + if (sameProps && sameDuration && closeInTime) { + group.push(other); + used.add(j); + } + } + + if (group.length >= 3) { + const intervals = []; + for (let k = 1; k < group.length; k++) { + intervals.push(+(group[k].start - group[k - 1].start).toFixed(3)); + } + const avgInterval = intervals.reduce((a, b) => a + b, 0) / intervals.length; + const maxDrift = Math.max(...intervals.map((iv) => Math.abs(iv - avgInterval))); + const consistent = maxDrift < avgInterval * 0.3; + + groups.push({ + elements: group.map((g) => g.selector), + props: tw.props, + count: group.length, + intervals, + avgInterval: +avgInterval.toFixed(3), + consistent, + note: consistent + ? 
group.length + + " elements stagger at " + + (avgInterval * 1000).toFixed(0) + + "ms intervals" + : group.length + + " elements stagger with uneven intervals (" + + intervals.map((iv) => (iv * 1000).toFixed(0) + "ms").join(", ") + + ")", + }); + } + } + + return groups; +} + +function buildElementLifecycles(tweens) { + const elements = {}; + for (const tw of tweens) { + const sel = tw.selector; + if (!elements[sel]) { + elements[sel] = { firstTween: tw.start, lastTween: tw.end, tweenCount: 0, props: new Set() }; + } + elements[sel].firstTween = Math.min(elements[sel].firstTween, tw.start); + elements[sel].lastTween = Math.max(elements[sel].lastTween, tw.end); + elements[sel].tweenCount++; + tw.props.forEach((p) => elements[sel].props.add(p)); + } + + const result = {}; + for (const [sel, data] of Object.entries(elements)) { + const lastBbox = findLastBbox(tweens, sel); + result[sel] = { + firstAppears: +data.firstTween.toFixed(3), + lastAnimates: +data.lastTween.toFixed(3), + tweenCount: data.tweenCount, + props: [...data.props], + endsVisible: lastBbox ? lastBbox.opacity > 0.1 && lastBbox.visible : null, + finalPosition: lastBbox + ? 
{ x: lastBbox.x, y: lastBbox.y, w: lastBbox.w, h: lastBbox.h } + : null, + }; + } + return result; +} + +function findLastBbox(tweens, selector) { + for (let i = tweens.length - 1; i >= 0; i--) { + if (tweens[i].selector === selector && tweens[i].bboxes?.length > 0) { + return tweens[i].bboxes[tweens[i].bboxes.length - 1]; + } + } + return null; +} + +async function captureSnapshots(session, tweens, duration) { + const times = [0, duration * 0.25, duration * 0.5, duration * 0.75, duration - 0.1]; + const snapshots = []; + + for (const t of times) { + await seekTo(session, t); + const visible = await session.page.evaluate(() => { + const out = []; + const els = document.querySelectorAll("[id]"); + for (const el of els) { + const cs = getComputedStyle(el); + if (cs.display === "none") continue; + const opacity = parseFloat(cs.opacity); + if (opacity < 0.01) continue; + const rect = el.getBoundingClientRect(); + if (rect.width < 1 || rect.height < 1) continue; + out.push({ + id: el.id, + x: Math.round(rect.x), + y: Math.round(rect.y), + w: Math.round(rect.width), + h: Math.round(rect.height), + opacity: +opacity.toFixed(2), + }); + } + return out; + }); + + const activeTweens = tweens + .filter((tw) => tw.start <= t && tw.end >= t) + .map((tw) => tw.selector); + + snapshots.push({ + t: +t.toFixed(2), + visibleElements: visible.length, + animatingNow: activeTweens, + elements: visible, + }); + } + + return snapshots; +} + +// ─── Output ───────────────────────────────────────────────────────────────── + +function printSummary(report) { + console.log( + `\nAnimation map: ${report.mappedTweens}/${report.totalTweens} tweens (skipped ${report.skippedMicroTweens} micro-tweens)`, + ); + + const flagCounts = {}; + for (const tw of report.tweens) { + for (const f of tw.flags) flagCounts[f] = (flagCounts[f] ?? 
0) + 1; + } + if (Object.keys(flagCounts).length > 0) { + for (const [f, n] of Object.entries(flagCounts)) console.log(` ${f}: ${n}`); + } + if (report.staggers?.length > 0) { + console.log(` staggers: ${report.staggers.map((s) => s.note).join("; ")}`); + } + if (report.deadZones?.length > 0) { + console.log( + ` dead zones: ${report.deadZones.map((z) => z.start + "-" + z.end + "s").join(", ")}`, + ); + } + + console.log(report.choreography); +} + +function parseArgs(argv) { + const out = {}; + let positional = 0; + for (let i = 0; i < argv.length; i++) { + const a = argv[i]; + if (a.startsWith("--")) { + const k = a.slice(2); + const v = argv[i + 1]?.startsWith("--") ? true : argv[++i]; + out[k] = v; + } else if (positional === 0) { + out.composition = a; + positional++; + } + } + return out; +} + +function die(msg) { + console.error(`animation-map: ${msg}`); + process.exit(2); +} diff --git a/packages/codex-plugin/skills/hyperframes/scripts/contrast-report.mjs b/packages/codex-plugin/skills/hyperframes/scripts/contrast-report.mjs new file mode 100644 index 000000000..63fa6259c --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/scripts/contrast-report.mjs @@ -0,0 +1,335 @@ +#!/usr/bin/env node +// contrast-report.mjs — HyperFrames contrast audit +// +// Reads a composition, seeks to N sample timestamps, walks the DOM for text +// elements, measures the WCAG 2.1 contrast ratio between each element's +// declared foreground color and the pixels behind it, and emits: +// +// - contrast-report.json (machine-readable, one entry per text element × sample) +// - contrast-overlay.png (sprite grid; magenta=fail AA, yellow=pass AA only, green=AAA) +// +// Usage: +// node skills/hyperframes/scripts/contrast-report.mjs \ +// [--samples N] [--out ] [--width W] [--height H] [--fps N] +// +// The composition directory must contain an index.html. Raw authoring HTML +// works — the producer's file server auto-injects the runtime at serve time. 
+// Exits 1 if any text element fails WCAG AA. + +import { mkdir, writeFile } from "node:fs/promises"; +import { resolve } from "node:path"; + +import sharp from "sharp"; + +// Use the producer's file server — it auto-injects the HyperFrames runtime +// and render-seek bridge, so raw authoring HTML works without a build step. +import { + createFileServer, + createCaptureSession, + initializeSession, + closeCaptureSession, + captureFrameToBuffer, + getCompositionDuration, +} from "@hyperframes/producer"; + +// ─── CLI ───────────────────────────────────────────────────────────────────── + +const args = parseArgs(process.argv.slice(2)); +if (!args.composition) die("missing "); + +const SAMPLES = Number(args.samples ?? 10); +const OUT_DIR = resolve(args.out ?? ".hyperframes/contrast"); +const WIDTH = Number(args.width ?? 1920); +const HEIGHT = Number(args.height ?? 1080); +const FPS = Number(args.fps ?? 30); +const COMP_DIR = resolve(args.composition); + +// ─── Main ──────────────────────────────────────────────────────────────────── + +await mkdir(OUT_DIR, { recursive: true }); + +const server = await createFileServer({ projectDir: COMP_DIR, port: 0 }); +const session = await createCaptureSession( + server.url, + OUT_DIR, + { width: WIDTH, height: HEIGHT, fps: FPS, format: "png" }, + null, +); +await initializeSession(session); + +try { + const duration = await getCompositionDuration(session); + const times = Array.from( + { length: SAMPLES }, + (_, i) => +(((i + 0.5) / SAMPLES) * duration).toFixed(3), + ); + + const allEntries = []; + const overlayFrames = []; + + for (let i = 0; i < times.length; i++) { + const t = times[i]; + const { buffer: pngBuf } = await captureFrameToBuffer(session, i, t); + const elements = await probeTextElements(session, t); + const annotated = await annotateFrame(pngBuf, elements); + overlayFrames.push({ t, png: annotated }); + for (const el of elements) allEntries.push({ time: t, ...el }); + } + + const report = { + composition: 
COMP_DIR, + width: WIDTH, + height: HEIGHT, + duration, + samples: times, + entries: allEntries, + summary: summarize(allEntries), + }; + + await writeFile(resolve(OUT_DIR, "contrast-report.json"), JSON.stringify(report, null, 2)); + await writeOverlaySprite(overlayFrames, resolve(OUT_DIR, "contrast-overlay.png")); + + printSummary(report); + process.exitCode = report.summary.failAA > 0 ? 1 : 0; +} finally { + await closeCaptureSession(session).catch(() => {}); + server.close(); +} + +// ─── DOM probe (runs in the page) ──────────────────────────────────────────── + +async function probeTextElements(session, _t) { + // `session.page` is the Puppeteer Page owned by the capture session. + // We pass a pure function to `evaluate`: it walks the DOM and returns + // enough info for us to compute a ratio in Node using the frame buffer. + return await session.page.evaluate(() => { + /** @type {Array<{selector: string, text: string, fg: [number,number,number,number], fontSize: number, fontWeight: number, bbox: {x:number,y:number,w:number,h:number}}>} */ + const out = []; + const walker = document.createTreeWalker(document.body, NodeFilter.SHOW_ELEMENT); + const parseColor = (c) => { + const m = c.match(/rgba?\(([^)]+)\)/); + if (!m) return [0, 0, 0, 1]; + const parts = m[1].split(",").map((s) => parseFloat(s.trim())); + return [parts[0], parts[1], parts[2], parts[3] ?? 1]; + }; + const selectorOf = (el) => { + if (el.id) return `#${el.id}`; + const cls = [...el.classList].slice(0, 2).join("."); + return cls ? 
`${el.tagName.toLowerCase()}.${cls}` : el.tagName.toLowerCase(); + }; + let el; + while ((el = walker.nextNode())) { + // must have direct text + const direct = [...el.childNodes].some( + (n) => n.nodeType === 3 && n.textContent.trim().length, + ); + if (!direct) continue; + const cs = getComputedStyle(el); + if (cs.visibility === "hidden" || cs.display === "none") continue; + if (parseFloat(cs.opacity) <= 0.01) continue; + const rect = el.getBoundingClientRect(); + if (rect.width < 8 || rect.height < 8) continue; + out.push({ + selector: selectorOf(el), + text: el.textContent.trim().slice(0, 60), + fg: parseColor(cs.color), + fontSize: parseFloat(cs.fontSize), + fontWeight: Number(cs.fontWeight) || 400, + bbox: { x: rect.x, y: rect.y, w: rect.width, h: rect.height }, + }); + } + return out; + }); +} + +// ─── Pixel sampling + WCAG math ────────────────────────────────────────────── + +async function annotateFrame(pngBuf, elements) { + const img = sharp(pngBuf); + const meta = await img.metadata(); + const { width, height } = meta; + const raw = await img.ensureAlpha().raw().toBuffer(); + const channels = 4; + + const measured = []; + for (const el of elements) { + const bg = sampleRingMedian(raw, width, height, channels, el.bbox); + const fg = compositeOver(el.fg, bg); // flatten any alpha against measured bg + const ratio = wcagRatio(fg, bg); + const large = isLargeText(el.fontSize, el.fontWeight); + el.bg = bg; + el.ratio = +ratio.toFixed(2); + el.wcagAA = large ? ratio >= 3 : ratio >= 4.5; + el.wcagAALarge = ratio >= 3; + el.wcagAAA = large ? ratio >= 4.5 : ratio >= 7; + measured.push(el); + } + + // Draw boxes + ratio labels as an SVG overlay (sharp composite). + const svg = buildOverlaySVG(measured, width, height); + return await sharp(pngBuf) + .composite([{ input: Buffer.from(svg), top: 0, left: 0 }]) + .png() + .toBuffer(); +} + +function sampleRingMedian(raw, width, height, channels, bbox) { + // 4-px ring immediately outside the element bbox. 
Median of each channel. + const r = [], + g = [], + b = []; + const x0 = Math.max(0, Math.floor(bbox.x) - 4); + const x1 = Math.min(width - 1, Math.ceil(bbox.x + bbox.w) + 4); + const y0 = Math.max(0, Math.floor(bbox.y) - 4); + const y1 = Math.min(height - 1, Math.ceil(bbox.y + bbox.h) + 4); + const pushPixel = (x, y) => { + const i = (y * width + x) * channels; + r.push(raw[i]); + g.push(raw[i + 1]); + b.push(raw[i + 2]); + }; + for (let x = x0; x <= x1; x++) { + pushPixel(x, y0); + pushPixel(x, y1); + } + for (let y = y0; y <= y1; y++) { + pushPixel(x0, y); + pushPixel(x1, y); + } + return [median(r), median(g), median(b), 1]; +} + +function median(arr) { + const s = [...arr].sort((a, b) => a - b); + return s[Math.floor(s.length / 2)]; +} + +function compositeOver([fr, fg, fb, fa], [br, bg, bb]) { + return [ + Math.round(fr * fa + br * (1 - fa)), + Math.round(fg * fa + bg * (1 - fa)), + Math.round(fb * fa + bb * (1 - fa)), + 1, + ]; +} + +function relLum([r, g, b]) { + const ch = (v) => { + const s = v / 255; + return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4; + }; + return 0.2126 * ch(r) + 0.7152 * ch(g) + 0.0722 * ch(b); +} + +function wcagRatio(a, b) { + const la = relLum(a); + const lb = relLum(b); + const [L1, L2] = la > lb ? [la, lb] : [lb, la]; + return (L1 + 0.05) / (L2 + 0.05); +} + +function isLargeText(fontSize, fontWeight) { + return fontSize >= 24 || (fontSize >= 19 && fontWeight >= 700); +} + +// ─── Overlay rendering ─────────────────────────────────────────────────────── + +function buildOverlaySVG(elements, w, h) { + const rects = elements + .map((el) => { + const color = !el.wcagAA ? "#ff00aa" : !el.wcagAAA ? 
"#ffcc00" : "#00e08a"; + const { x, y, w: bw, h: bh } = el.bbox; + return ` + + + + ${el.ratio.toFixed(1)}:1 + `; + }) + .join(""); + return `${rects}`; +} + +async function writeOverlaySprite(frames, outPath) { + if (!frames.length) return; + const cols = Math.min(frames.length, 5); + const rows = Math.ceil(frames.length / cols); + const { width, height } = await sharp(frames[0].png).metadata(); + const scale = 0.25; + const cellW = Math.round(width * scale); + const cellH = Math.round(height * scale); + + const cells = await Promise.all( + frames.map(async (f) => ({ + input: await sharp(f.png).resize(cellW, cellH).png().toBuffer(), + time: f.t, + })), + ); + + const composites = cells.map((c, i) => ({ + input: c.input, + top: Math.floor(i / cols) * cellH, + left: (i % cols) * cellW, + })); + + await sharp({ + create: { + width: cols * cellW, + height: rows * cellH, + channels: 3, + background: { r: 16, g: 16, b: 20 }, + }, + }) + .composite(composites) + .png() + .toFile(outPath); +} + +// ─── Summary ──────────────────────────────────────────────────────────────── + +function summarize(entries) { + const total = entries.length; + const failAA = entries.filter((e) => !e.wcagAA).length; + const passAAonly = entries.filter((e) => e.wcagAA && !e.wcagAAA).length; + const passAAA = entries.filter((e) => e.wcagAAA).length; + return { total, failAA, passAAonly, passAAA }; +} + +function printSummary({ summary, entries }) { + const { total, failAA, passAAonly, passAAA } = summary; + console.log(`\nContrast report: ${total} text-element samples`); + console.log(` fail WCAG AA: ${failAA}`); + console.log(` pass AA, not AAA: ${passAAonly}`); + console.log(` pass AAA: ${passAAA}`); + if (failAA) { + console.log("\nFailures:"); + for (const e of entries.filter((x) => !x.wcagAA)) { + console.log(` t=${e.time}s ${e.selector.padEnd(24)} ${e.ratio.toFixed(2)}:1 "${e.text}"`); + } + } +} + +// ─── Utilities ────────────────────────────────────────────────────────────── + +function 
parseArgs(argv) { + const out = {}; + let positional = 0; + for (let i = 0; i < argv.length; i++) { + const a = argv[i]; + if (a.startsWith("--")) { + const k = a.slice(2); + const v = argv[i + 1]?.startsWith("--") ? true : argv[++i]; + out[k] = v; + } else if (positional === 0) { + out.composition = a; + positional++; + } + } + return out; +} + +function die(msg) { + console.error(`contrast-report: ${msg}`); + process.exit(2); +} diff --git a/packages/codex-plugin/skills/hyperframes/visual-styles.md b/packages/codex-plugin/skills/hyperframes/visual-styles.md new file mode 100644 index 000000000..d1e8025c8 --- /dev/null +++ b/packages/codex-plugin/skills/hyperframes/visual-styles.md @@ -0,0 +1,211 @@ +# Visual Style Library + +Named visual identities for HyperFrames videos. Each style is grounded in a real graphic design tradition. Use them to give your video a specific visual personality, not just generic "clean" or "bold." + +**How to pick:** Match mood first, content second. Ask: _"What should the viewer FEEL?"_ + +**How to use:** Reference the style in your scene plan. Translate the style's principles into concrete composition decisions — palette choice, font selection, entrance patterns, transition type, ambient motion feel. 
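+The mood-first rule can be sketched as a tiny lookup — a hypothetical helper, not part of any HyperFrames package — that maps the dominant feeling of a brief to a style name from this library, mirroring the Mood → Style table at the end of this document:
+
+```javascript
+// Hypothetical sketch — mood keywords and style names come from the
+// Mood → Style table in this document; nothing here is a real API.
+const MOOD_TO_STYLE = {
+  analytical: "Swiss Pulse",
+  premium: "Velvet Standard",
+  raw: "Deconstructed",
+  hype: "Maximalist Type",
+  futuristic: "Data Drift",
+  warm: "Soft Signal",
+  festive: "Folk Frequency",
+  dramatic: "Shadow Cut",
+};
+
+function pickStyle(mood) {
+  const style = MOOD_TO_STYLE[mood];
+  // Unknown mood: fail loudly rather than falling back to a generic style.
+  if (!style) throw new Error(`unknown mood: ${mood}`);
+  return style;
+}
+```
+
+Match mood first; content category ("SaaS", "wellness") only breaks ties between styles with similar moods.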
+ +## Quick Reference + +| Style | Mood | Best for | Primary shader | +| --------------- | --------------------- | ---------------------------------- | --------------------------------- | +| Swiss Pulse | Clinical, precise | SaaS, data, dev tools, metrics | Cinematic Zoom or SDF Iris | +| Velvet Standard | Premium, timeless | Luxury, enterprise, keynotes | Cross-Warp Morph | +| Deconstructed | Industrial, raw | Tech launches, security, punk | Glitch or Whip Pan | +| Maximalist Type | Loud, kinetic | Big announcements, launches | Ridged Burn | +| Data Drift | Futuristic, immersive | AI, ML, cutting-edge tech | Gravitational Lens or Domain Warp | +| Soft Signal | Intimate, warm | Wellness, personal stories, brand | Thermal Distortion | +| Folk Frequency | Cultural, vivid | Consumer apps, food, communities | Swirl Vortex or Ripple Waves | +| Shadow Cut | Dark, cinematic | Dramatic reveals, security, exposé | Domain Warp | + +--- + +## 1. Swiss Pulse — Josef Müller-Brockmann + +**Mood:** Clinical, precise | **Best for:** SaaS dashboards, developer tools, APIs, metrics + +- Black (`#1a1a1a`), white, ONE accent — electric blue (`#0066FF`) or amber (`#FFB300`) +- Helvetica or Inter Bold for headlines, Regular for labels. Numbers large (80–120px) +- Grid-locked compositions. Every element snaps to an invisible 12-column grid +- Animated counters count up from 0. Hard cuts, no decorative transitions +- Transitions: Cinematic Zoom or SDF Iris (precise, geometric) + +**GSAP signature:** `expo.out`, `power4.out`. Entries are fast and snap into place. Nothing floats. + +``` +Swiss Pulse: Black/white + one electric accent. Grid-locked compositions. +Numbers dominate the frame at 80-120px. Counter animations from 0. +Hard cuts or geometric transitions. Nothing decorative. +``` + +--- + +## 2. 
Velvet Standard — Massimo Vignelli + +**Mood:** Premium, timeless | **Best for:** Luxury products, enterprise software, keynotes, investor decks + +- Black, white, ONE rich accent — deep navy (`#1a237e`) or gold (`#c9a84c`) +- Thin sans-serif, ALL CAPS, wide letter-spacing (`0.15em+`) +- Generous negative space. Symmetrical, centered, architectural precision +- Slow, deliberate. Sequential reveals with long holds. No frantic motion +- Transitions: Cross-Warp Morph (elegant, organic flow between scenes) + +**GSAP signature:** `sine.inOut`, `power1`. Nothing snaps — everything glides with intention. + +``` +Velvet Standard: Black, white, one rich accent. Thin ALL CAPS type with wide tracking. +Generous negative space. Sequential reveals, long holds. +Cross-Warp Morph transitions. Slow and deliberate — luxury takes its time. +``` + +--- + +## 3. Deconstructed — Neville Brody + +**Mood:** Industrial, raw | **Best for:** Tech news, developer launches, security products, punk-energy reveals + +- Dark grey (`#1a1a1a`), rust orange (`#D4501E`), raw white (`#f0f0f0`) +- Type at angles, overlapping edges, escaping frames. Bold industrial weight +- Gritty textures: scan-line effects, glitch artifacts baked into the design +- Text SLAMS and SHATTERS. Letters scramble then snap to final position +- Transitions: Glitch shader or Whip Pan (breaks the rules, feels aggressive) + +**GSAP signature:** `back.out(2.5)`, `steps(8)`, `elastic.out(1.2, 0.4)`. Intentional irregularity. + +``` +Deconstructed: Dark grey #1a1a1a + rust orange #D4501E. Type at angles, escaping frames. +Scan-line glitch overlays. Text SLAMS and scrambles into place. +Glitch shader transitions. Industrial and raw — nothing should feel polished. +``` + +--- + +## 4. 
Maximalist Type — Paula Scher + +**Mood:** Loud, kinetic | **Best for:** Big product launches, milestone announcements, high-energy hype videos + +- Bold saturated: red (`#E63946`), yellow (`#FFD60A`), black, white — maximum contrast +- Text IS the visual. Overlapping type layers at different scales and angles, filling 50–80% of frame +- Everything is kinetic: slamming, sliding, scaling. 2–3 second rapid-fire scenes +- Text layered OVER footage — never empty backgrounds +- Transitions: Ridged Burn (explosive, dramatic, impossible to ignore) + +**GSAP signature:** `expo.out`, `back.out(1.8)`. Fast arrivals, hard stops. + +``` +Maximalist Type: Red, yellow, black, white — max contrast. Text IS the visual. +Overlapping at different scales, 50-80% of frame. Everything in motion. +Ridged Burn transitions. No static moments — kinetic energy throughout. +``` + +--- + +## 5. Data Drift — Refik Anadol + +**Mood:** Futuristic, immersive | **Best for:** AI products, ML platforms, data companies, speculative tech + +- Iridescent: deep black (`#0a0a0a`), electric purple (`#7c3aed`), cyan (`#06b6d4`) +- Thin futuristic sans-serif — floating, weightless, minimal +- Fluid morphing compositions. Extreme scale shifts (micro → macro) +- Particles coalesce into numbers. Light traces data paths through the frame +- Transitions: Gravitational Lens or Domain Warp (otherworldly distortion) + +**GSAP signature:** `sine.inOut`, `power2.out`. Smooth, continuous, organic. Nothing hard. + +``` +Data Drift: Deep black #0a0a0a with electric purple #7c3aed and cyan #06b6d4. +Thin futuristic type, minimal text. Particles coalesce into numbers. +Gravitational Lens or Domain Warp transitions. Fluid, immersive, otherworldly. +``` + +--- + +## 6. 
Soft Signal — Stefan Sagmeister + +**Mood:** Intimate, warm | **Best for:** Wellness brands, personal stories, lifestyle products, human-centered apps + +- Warm amber (`#F5A623`), cream (`#FFF8EC`), dusty rose (`#C4A3A3`), sage green (`#8FAF8C`) +- Handwritten-style or humanist serif fonts. Personal, lowercase, delicate +- Close-up framing feel: single element fills the frame. Nothing feels corporate +- Slow drifts and floats, never snaps. Soft organic motion throughout +- Transitions: Thermal Distortion (warm, flowing, like heat shimmer) + +**GSAP signature:** `sine.inOut`, `power1.inOut`. Everything breathes. + +``` +Soft Signal: Warm amber, cream, dusty rose, sage green. Humanist or handwritten type. +Single elements fill the frame — intimate, never corporate. +Slow drifts and floats throughout. Thermal Distortion transitions. +Nothing should feel hurried or polished. +``` + +--- + +## 7. Folk Frequency — Eduardo Terrazas + +**Mood:** Cultural, vivid | **Best for:** Consumer apps, food platforms, community products, festive launches + +- Vivid folk: hot pink (`#FF1493`), cobalt blue (`#0047AB`), sun yellow (`#FFE000`), emerald (`#009B77`) +- Bold warm rounded type. Pattern and repetition — folk art rhythm and density +- Layered compositions with rich visual texture. Every frame feels handcrafted +- Colorful motion: elements bounce, pop, and spin into place with joy +- Transitions: Swirl Vortex or Ripple Waves (hypnotic, celebratory) + +**GSAP signature:** `back.out(1.6)`, `elastic.out(1, 0.5)`. Overshoots feel intentional. + +``` +Folk Frequency: Hot pink #FF1493, cobalt blue, sun yellow, emerald. Bold rounded type. +Pattern and repetition throughout. Layered, dense, handcrafted feeling. +Swirl Vortex or Ripple Waves transitions. Joyful, celebratory energy. +``` + +--- + +## 8. 
Shadow Cut — Hans Hillmann + +**Mood:** Dark, cinematic | **Best for:** Security products, dramatic reveals, investigative content, intense launches + +- Near-monochrome: deep blacks (`#0a0a0a`), cold greys (`#3a3a3a`), stark white + blood red (`#C1121F`) or toxic green (`#39FF14`) +- Sharp angular text like film noir title cards. Heavy contrast, no softness +- Heavy shadow — elements emerge from darkness. Reveal is the narrative +- Slow creeping push-ins, dramatic scale reveals, silence before the hit +- Transitions: Domain Warp (dissolves reality itself before revealing the next scene) + +**GSAP signature:** `power4.in` for exits, `power3.out` for dramatic reveals. The pause before the hit matters. + +``` +Shadow Cut: Deep blacks #0a0a0a, cold greys, stark white + one accent (blood red or toxic green). +Sharp angular type, film noir aesthetic. Elements emerge from darkness. +Slow creeping push-ins. Domain Warp transitions. The reveal IS the story. +``` + +--- + +## Mood → Style Guide + +| If the content feels... | Use... | +| ---------------------------------- | --------------- | +| Data-driven, analytical, technical | Swiss Pulse | +| Premium, enterprise, luxury | Velvet Standard | +| Raw, punk, aggressive, rebellious | Deconstructed | +| Hype, loud, high-energy launch | Maximalist Type | +| AI, ML, speculative, futuristic | Data Drift | +| Human, warm, personal, wellness | Soft Signal | +| Cultural, fun, consumer, festive | Folk Frequency | +| Dark, dramatic, intense, cinematic | Shadow Cut | + +--- + +## Creating Custom Styles + +These 8 styles are examples — not constraints. Create your own by: + +1. **Name it** after a designer, art movement, or cultural reference +2. **Palette**: 2-3 colors max. Declare explicit hex values +3. **Typography**: One family, two weights. State the role of each +4. **Motion rules**: How fast? Snappy or fluid? Overshoot or precision? +5. **Transition**: Which shader matches the energy? +6. 
**What NOT to do**: 2-3 explicit anti-patterns for this style + +The pattern: **named style → palette → typography → motion rules → transition → avoids.** diff --git a/packages/codex-plugin/skills/website-to-hyperframes/SKILL.md b/packages/codex-plugin/skills/website-to-hyperframes/SKILL.md new file mode 100644 index 000000000..8a96b5bb1 --- /dev/null +++ b/packages/codex-plugin/skills/website-to-hyperframes/SKILL.md @@ -0,0 +1,121 @@ +--- +name: website-to-hyperframes +description: | + Capture a website and create a HyperFrames video from it. Use when: (1) a user provides a URL and wants a video, (2) someone says "capture this site", "turn this into a video", "make a promo from my site", (3) the user wants a social ad, product tour, or any video based on an existing website, (4) the user shares a link and asks for any kind of video content. Even if the user just pastes a URL — this is the skill to use. +--- + +# Website to HyperFrames + +Capture a website, then produce a professional video from it. + +Users say things like: + +- "Capture https://... and make me a 25-second product launch video" +- "Turn this website into a 15-second social ad for Instagram" +- "Create a 30-second product tour from https://..." + +The workflow has 7 steps. Each produces an artifact that gates the next. + +--- + +## Step 1: Capture & Understand + +**Read:** [references/step-1-capture.md](references/step-1-capture.md) + +Run the capture, read the extracted data, and build a working summary using the write-down-and-forget method. + +**Gate:** Print your site summary (name, top colors, fonts, key assets, one-sentence vibe). + +--- + +## Step 2: Write DESIGN.md + +**Read:** [references/step-2-design.md](references/step-2-design.md) + +Write a simple brand reference for the captured website. 6 sections, ~90 lines. This is a cheat sheet, not the creative plan — that comes in Step 4. + +**Gate:** `DESIGN.md` exists in the project directory. 
+ +--- + +## Step 3: Write SCRIPT + +**Read:** [references/step-3-script.md](references/step-3-script.md) + +Write the narration script. The story backbone. Scene durations come from the narration, not from guessing. + +**Gate:** `SCRIPT.md` exists in the project directory. + +--- + +## Step 4: Write STORYBOARD + +**Read:** [references/step-4-storyboard.md](references/step-4-storyboard.md) + +Write per-beat creative direction: mood, camera, animations, transitions, assets, depth layers, SFX. This is the creative north star — the document the engineer follows to build each composition. + +**Gate:** `STORYBOARD.md` exists with beat-by-beat direction and an asset audit table. + +--- + +## Step 5: Generate VO + Map Timing + +**Read:** [references/step-5-vo.md](references/step-5-vo.md) + +Generate TTS audio, transcribe for word-level timestamps, and map timestamps to beats. Update STORYBOARD.md with real durations. + +**Gate:** `narration.wav` (or .mp3) + `transcript.json` exist. Beat timings in STORYBOARD.md updated. + +--- + +## Step 6: Build Compositions + +**Read:** The `hyperframes` skill (load it — every rule matters) +**Read:** [references/step-6-build.md](references/step-6-build.md) + +Build each composition following the storyboard. After each one: self-review for layout, asset placement, and animation quality. + +**Gate:** Every composition has been self-reviewed. No overlapping elements, no misplaced assets, no static images without motion. + +--- + +## Step 7: Validate & Deliver + +**Read:** [references/step-7-validate.md](references/step-7-validate.md) + +Lint, validate, snapshot, preview. Deliver the preview to the user first — only render to MP4 on explicit request. + +**Gate:** `npx hyperframes lint` and `npx hyperframes validate` pass with zero errors. 
+ +--- + +## Quick Reference + +### Video Types + +| Type | Duration | Beats | Narration | +| --------------------- | -------- | ----- | ---------------------- | +| Social ad (IG/TikTok) | 10-15s | 3-4 | Optional hook sentence | +| Product demo | 30-60s | 5-8 | Full narration | +| Feature announcement | 15-30s | 3-5 | Full narration | +| Brand reel | 20-45s | 4-6 | Optional, music focus | +| Launch teaser | 10-20s | 2-4 | Minimal, high energy | + +### Format + +- **Landscape**: 1920x1080 (default) +- **Portrait**: 1080x1920 (Instagram Stories, TikTok) +- **Square**: 1080x1080 (Instagram feed) + +### Reference Files + +| File | When to read | +| ------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| [step-1-capture.md](references/step-1-capture.md) | Step 1 — reading captured data | +| [step-2-design.md](references/step-2-design.md) | Step 2 — writing DESIGN.md | +| [step-3-script.md](references/step-3-script.md) | Step 3 — writing the narration script | +| [step-4-storyboard.md](references/step-4-storyboard.md) | Step 4 — per-beat creative direction | +| [step-5-vo.md](references/step-5-vo.md) | Step 5 — TTS, transcription, timing | +| [step-6-build.md](references/step-6-build.md) | Step 6 — building compositions with self-review | +| [step-7-validate.md](references/step-7-validate.md) | Step 7 — lint, validate, snapshot, preview | +| [techniques.md](references/techniques.md) | Steps 4 & 6 — 11 visual techniques with code patterns (SVG drawing, Canvas 2D, 3D, typography, Lottie, video, typing, variable fonts, MotionPath, transitions, audio-reactive) | diff --git a/packages/codex-plugin/skills/website-to-hyperframes/references/step-1-capture.md b/packages/codex-plugin/skills/website-to-hyperframes/references/step-1-capture.md new file mode 100644 index 000000000..162ffdae1 --- /dev/null +++ 
b/packages/codex-plugin/skills/website-to-hyperframes/references/step-1-capture.md @@ -0,0 +1,74 @@ +# Step 1: Capture & Understand + +## Run the capture + +Create a project directory for your video, then capture the website into a `capture/` subfolder within it: + +```bash +npx hyperframes capture <url> -o <project>/capture +``` + +Example: `npx hyperframes capture https://stripe.com -o videos/stripe-launch/capture` + +Keeping the capture artifacts (`screenshots/`, `assets/`, `extracted/`, `AGENTS.md`, `CLAUDE.md`) in a dedicated `capture/` subfolder isolates them from the later build files (`SCRIPT.md`, `STORYBOARD.md`, `DESIGN.md`, `compositions/`, `index.html`, `narration.wav`, `transcript.json`, `renders/`, `snapshots/`), which all live at the `<project>/` root. + +For exploratory captures that aren't becoming a video yet, `-o captures/` at the repo root is fine — the isolation convention only matters when you're building a video on top of the capture. + +No API keys are required. The capture extracts design tokens, screenshots, fonts, and assets with DOM-context descriptions automatically. + +**Optional:** Set `GEMINI_API_KEY` (or `GOOGLE_API_KEY`) in a `.env` file at the repo root for richer AI-powered image descriptions via Gemini 3.1 Flash Lite vision (~$0.001/image). + +Wait for it to complete. Print how many screenshots, assets, sections, and fonts were extracted. + +## Read and summarize + +Read each file below. After reading each one, **write a 1-2 sentence summary** of what you learned. These summaries are your working memory — the raw file content may be cleared from context later. + +### Must read (do not skip) + +1. **View the scroll screenshots** — viewport-sized captures covering the full page height (the number depends on the page length). Start with: + - `capture/screenshots/scroll-000.png` — the hero section at full 1920x1080 resolution. This is the most important image. Describe: is the background light or dark? What's the dominant visual element? 
What colors jump out? + - Then scan through the rest to see the full page. Each screenshot overlaps the previous by ~30%. + + After viewing them, write 3-4 sentences describing the site's visual mood, layout patterns, color strategy, and overall feel. + +2. **`capture/extracted/tokens.json`** — Note the top 5-7 colors (HEX), all font families with their weights (e.g. `Inter (400,700)` or `Sohne (100-900 variable)`), number of sections, and number of headings/CTAs. + +3. **`capture/extracted/visible-text.txt`** — Each line is prefixed with the HTML tag: `[h1] Heading`, `[p] Body text`, `[a] Link text`. Use these tags to understand hierarchy — headings are key messages, paragraphs are supporting copy. Strip the `[tag]` prefix when quoting text in the script. + +4. **`capture/extracted/asset-descriptions.md`** — One-line-per-file summary of all downloaded assets. Note which assets are most visually striking or useful for video (hero images, logos, product screenshots). + +### Read if they exist + +5. **`capture/extracted/animations.json`** — Note if the site uses scroll-triggered animations, marquees, canvas/WebGL, or named CSS animations. + +6. **`capture/extracted/lottie-manifest.json`** — View each preview image at `capture/assets/lottie/previews/` to see what the animations look like. + +7. **`capture/extracted/video-manifest.json`** — View each preview at `capture/assets/videos/previews/` to see what each video shows. + +8. **`capture/extracted/shaders.json`** — If present, this contains the actual GLSL shader code that powers the site's WebGL visual effects (gradient waves, particle systems, noise fields). Read the fragment shaders to extract: color values used in gradients, noise algorithms, blend functions. You can recreate similar effects in your compositions using Canvas 2D or by embedding the shader patterns with a `<canvas>` + WebGL context. See the Canvas 2D and procedural art patterns in `techniques.md`. + +### On-demand (read when building scenes) + +9. 
**Individual images in `capture/assets/`** — Use `capture/extracted/asset-descriptions.md` as your index. View specific images when you need them for a beat. + +10. **`capture/extracted/assets-catalog.json`** — Use to find remote URLs when you need an asset that wasn't downloaded. + +### For rich captures (30+ images) + +Launch a sub-agent to view all images and SVGs: + +> "Read every image in capture/assets/ and every SVG in capture/assets/svgs/. For each, write one line: filename — what it shows, dominant colors, approximate size. Return the complete catalog." + +Use the sub-agent's catalog as your asset reference for the rest of the workflow. + +## Gate + +Print your site summary before proceeding to Step 2: + +- **Site:** [name] +- **Colors:** [top 3-5 HEX values with roles] +- **Fonts:** [font families] +- **Sections:** [count] sections, [count] headings, [count] CTAs +- **Key assets:** [3-5 most useful assets for video] +- **Vibe:** [one sentence describing the visual identity] diff --git a/packages/codex-plugin/skills/website-to-hyperframes/references/step-2-design.md b/packages/codex-plugin/skills/website-to-hyperframes/references/step-2-design.md new file mode 100644 index 000000000..90d312467 --- /dev/null +++ b/packages/codex-plugin/skills/website-to-hyperframes/references/step-2-design.md @@ -0,0 +1,178 @@ +# Step 2: Write DESIGN.md + +DESIGN.md is a **brand cheat sheet** for the captured website. It encodes the visual identity so you can reference exact colors, fonts, and patterns while writing the storyboard and compositions. + +DESIGN.md is NOT the creative plan. The STORYBOARD (Step 4) drives creative direction. DESIGN.md is a reference you consult, not a document you follow slavishly. + +## The 6 Sections + +### `## Overview` + +3-4 sentences. Describe the visual identity factually: layout patterns (bento grid, logo wall, hero section), color strategy, typography tone, overall feel. Be precise, not poetic. 
+ +### `## Colors` + +5-10 key colors with HEX values from `capture/extracted/tokens.json` and their roles: + +``` +- **Primary Surface**: `#020204` — deep black background +- **Primary Content**: `#FFFFFF` — high-purity white for text and borders +- **Accent Warm**: `#FB923C` — orange for CTAs and highlights +``` + +Include semantic colors if the site uses color to differentiate product areas. + +### `## Typography` + +Font families with weights, roles, and any distinctive usage: + +``` +- **Serif**: Cormorant Garamond (Italic). Major headings, brand identity. +- **Monospace**: Geist Mono. Subheaders, labels, terminal readouts. High tracking (0.1-0.3em), all-caps. +- **Sans-Serif**: Inter. Body copy, interface elements. Small sizes (9-14px). +``` + +Include sizing hierarchy if notable (hero: 64px, section: 32px, body: 16px). + +### `## Elevation` + +One paragraph on depth strategy: Does the site use borders, shadows, glassmorphism, or flat color shifts? Reference specific patterns (e.g., "1px borders at white/10 opacity" or "layered backdrop-blur with thin borders"). + +### `## Components` + +Name every notable UI component you see in the screenshot. Be specific: + +- "Cinematic Accordion" not "Cards" +- "Logo Marquee" not "Scrolling section" +- "Glass Cards with grain overlay" not "Content containers" + +For each, note the distinctive visual treatment (border-radius, spacing, hover behavior). + +### `## Do's and Don'ts` + +3-5 rules each, derived from what the site actually does and doesn't do: + +``` +### Do's +- Use thin subtle borders (white/10) to separate sections +- Keep imagery desaturated with dark gradients for text readability + +### Don'ts +- Do not use bright solid background colors — stay in "The Void" +- Do not use standard drop shadows — use radial glow or bloom effects +- Do not use sharp high-speed animations — all motion should be fluid +``` + +## Rules + +- Use **exact HEX values** from `capture/extracted/tokens.json`. Do not approximate. 
+- Name components by what you see in the screenshot, not generic terms. +- Keep it under 100 lines. This is a cheat sheet, not a design system document. +- No "Style Prompt" section — the storyboard handles creative direction. +- No "Assets" section — `capture/extracted/asset-descriptions.md` already covers this. +- No "Motion" section — the storyboard specifies motion per-beat. + +## Example + +This is a real DESIGN.md from a production capture (Soulscape 2026): + +```markdown +# Design System + +## Overview + +Soulscape 2026 is a cinematic, "high-signal" digital experience that positions itself as the vanguard of AI filmmaking. The visual personality is dark, technical, and premium, characterized by high-contrast "Flare" on "Void" (white on black) aesthetics. The layout is dense but organized, utilizing heavy horizontal layering and border-defined sections to evoke a wide-screen cinematic feel. Motion is a core tenet, with atmospheric grain overlays, shifting light leaks, and slow-moving marquees creating constant, breathing texture. + +## Colors + +- **Primary Surface**: `#020204` (Void) - Deep black for the entire background. +- **Primary Content**: `#FFFFFF` (Flare) - High-purity white for typography and primary borders. +- **Accent 1 (Warm)**: `#FB923C` - Orange for industry/executive tiers and primary CTAs. +- **Accent 2 (Cool)**: `#60A5FA` - Blue for creative voices and summit-focused components. +- **Subtle Overlays**: `rgba(255, 255, 255, 0.02)` to `0.08` for glass backgrounds. + +## Typography + +- **Serif**: Cormorant Garamond (Italic). Major headings and "Soul" brand identity. Classical cinematic contrast. +- **Monospace**: Geist Mono. Subheaders, labels, terminal readouts. High tracking (0.1-0.3em), all-caps. +- **Sans-Serif**: Inter. Body copy and interface elements. Small sizes (9-14px). + +## Elevation + +- **Glassmorphism**: Components use backdrop-filter blur(10px) with thin borders (1px solid rgba(255, 255, 255, 0.08)). 
+- **Layering**: Depth via fixed global grain-overlay and localized light-leak gradients rather than box-shadows. +- **Interaction**: Hover triggers subtle translateY(-5px) and increased border opacity. + +## Components + +- **Cinematic Accordion**: Expanding horizontal/vertical card system where panels expand from compressed state to reveal full-bleed imagery and large serif typography. +- **HUD Explorer**: Floating mobile navigation trigger styled as a "Lens" with pulsing glow and terminal readouts. +- **Slow Marquees**: Continuous horizontal tickers for partner logos and veteran listings. +- **Glass Cards**: Content containers with subtle gradients, rounded corners (2.5rem), and high-contrast iconography. +- **Grain & Flicker**: Global CSS noise filters and holographic flicker animations on UI labels. + +## Do's and Don'ts + +### Do's + +- Use thin subtle borders (white/10) to separate sections rather than solid color changes. +- Maintain high letter-spacing on all Geist Mono labels. +- Use serif italics for emotional or visionary statements. +- Keep imagery desaturated or stylized with dark gradients for readability. + +### Don'ts + +- Do not use bright solid background colors — the page must remain in "The Void." +- Do not use standard drop shadows — use radial glow or bloom effects instead. +- Do not use sharp high-speed animations — all motion should be fluid and breathing. +``` + +Here is a contrasting example from a light, corporate brand to show the range: + +```markdown +# Design System + +## Overview + +Stripe's visual personality is defined by high-precision, technical sophistication, and a fluid, forward-moving motion language. The layout is dense but expertly balanced, utilizing a "canary" grid system that favors high-density data visualizations and modular bento-style layouts. 
The tone is authoritative and innovative, characterized by smooth CSS animations, complex SVG graphics that mimic UI dashboards, and the iconic "hero wave" background that uses layered gradients to create depth and movement. + +## Colors + +- **Brand Primary**: #635bff (The signature Stripe Blurple) +- **Text Solid**: #0a2540 (Deep navy for primary headings) +- **Text Soft**: #424770 (Subdued slate for descriptions and secondary text) +- **Surface Background**: #ffffff (White primary surface) +- **Surface Subdued**: #f6f9fc (Light gray for section contrast) +- **Accent Green**: #212d45 (Used in high-converting success UI graphics) +- **Accent Orange**: #ff6118 (Used for specific product highlights like Connect) +- **Accent Yellow**: #fc5 (Warm highlight used in bento cards) +- **Border Quiet**: #e6ebf1 (Soft borders for cards and dividers) + +## Typography + +- **Primary Font**: Sohne (sohne-var), a custom neo-grotesque that balances technical precision with approachability. Used across all headers and body copy. +- **Monospace Font**: SourceCodePro-Medium, specifically for code snippets, tabular data, and technical UI identifiers. +- **Heading Scale**: hds-heading--xxl ~3rem, hds-heading--lg ~1.5rem, hds-heading--md ~1.125rem +- **Body Scale**: Standard body text centers around 1rem (16px) with a line-height of 1.5-1.6. + +## Elevation + +- **Shadows**: Multi-layered shadow system (e.g., 0 30px 60px -12px rgba(50,50,93,0.25)). Shadows are diffused and deep for a floating effect. +- **Borders**: Heavy use of 1px solid borders to define bento grid boundaries instead of shadows in flat sections. +- **Glass/Layering**: Navigation overlays use backdrop-filter blur(5px) with translucent white background. + +## Components + +- **Navigation Popover**: Animated dropdown spanning page margin with multi-column bento layouts. +- **Bento Cards**: Interactive grid-aligned containers with gradient hover effects that follow the cursor. 
+- **Customer Marquee**: Seamless horizontal scrolling loop of flat-colored SVG logos. +- **UI Graphics**: Custom HTML/CSS representations of the Stripe Dashboard with tabular numbers and mini-charts. +- **CTA Buttons**: Rounded-pill shapes with subtle scale transforms on hover. + +## Do's and Don'ts + +- **Do**: Use smooth cubic-bezier(.25, 1, .5, 1) transitions for all hover states and entering animations. +- **Do**: Maintain strict vertical alignment between iconography and text labels. +- **Don't**: Use sharp-cornered cards; always apply a border-radius. +- **Don't**: Over-saturate backgrounds; stick to white or #f6f9fc and let brand assets provide color pop. +``` diff --git a/packages/codex-plugin/skills/website-to-hyperframes/references/step-3-script.md b/packages/codex-plugin/skills/website-to-hyperframes/references/step-3-script.md new file mode 100644 index 000000000..28a0743cb --- /dev/null +++ b/packages/codex-plugin/skills/website-to-hyperframes/references/step-3-script.md @@ -0,0 +1,96 @@ +# Step 3: Write the Narration Script + +**Before writing, re-read DESIGN.md** — specifically the Overview and Components sections. The script should reference real product features, real stats, and real components that the website highlights. Use exact numbers from `capture/extracted/visible-text.txt`. + +The script is the backbone. Everything downstream — scene durations, animation timing, beat pacing — comes from the narration. Write it before the storyboard. + +Save as `SCRIPT.md` in the project directory. + +## Pacing + +- **2.5 words per second** is natural speaking pace +- 15s = ~37 words. 30s = ~75 words. 60s = ~150 words +- Leave room for pauses. 
Silence between sentences is a feature, not dead air +- The script should feel SHORTER than the video — visual breathing room matters + +## Tone + +Write like a person, not a brochure: + +- Use contractions: "it's", "you'll", "that's", "we've" +- Vary sentence length — short punchy phrases mixed with longer flowing ones +- Read it out loud. If it sounds robotic, rewrite it +- Avoid jargon unless the audience expects it + +## Number Pronunciation + +Write what you want the voice to say. TTS reads literally. + +| On the website | Write in script as | +| -------------- | --------------------------------- | +| 135+ | more than one hundred thirty five | +| $1.9T | nearly two trillion dollars | +| 99.999% | ninety nine point nine percent | +| 200M+ | over two hundred million | +| 10x | ten times | +| API | A P I | +| stripe.com | stripe dot com | + +The visual can show the exact figure while the voice rounds it. + +## Structure + +For product videos from a website capture: + +1. **Hook** — what's surprising or impressive about this product? A bold claim, a provocative question, a contrast, or a striking number. This is the opening line. **Vary the hook type** — don't default to a stat every time. +2. **Story** — what does the product do? Who uses it? Keep it concrete. +3. **Proof** — stats, customer names, social proof. Real numbers from the website. +4. **CTA** — what should the viewer do? "Start building at stripe dot com." + +Not every video needs all four. A 15-second social ad might be Hook + Proof + CTA. A 60-second product tour uses all four with more Story. + +## The Opening Line + +The most important sentence in the video. It must create tension, curiosity, or surprise in the first 3 seconds. + +Patterns that work: + +- **A bold claim**: "The financial infrastructure that powers the internet economy." +- **A question that provokes**: "What if your database could think?" +- **A contrast**: "Your AI agent already knows how to make videos. 
It just needs the right format." +- **A number that shocks**: "Nearly two trillion dollars." (Use sparingly — not every video should open with a stat.) + +If the opening is generic ("Welcome to Stripe" / "Introducing our product"), start over. + +## Example + +From a 62-second product launch video (team reference): + +``` +Your AI agent already knows how to make videos. +It just needs the right format. + +This is Hyperframes. An open source framework. HTML in, video out. + +A div is a keyframe. Data attributes are your timeline. +CSS is your look. G-Sap is your animation engine. + +Anything a browser can render can be a frame in your video. + +CSS animations. G-Sap. Lottie. Shaders. Three.js. + +Drop in music, sound effects, footage — it all composes together. + +No new framework for the agent to learn. +Just HTML. + +The agent writes it. The renderer captures every frame as MP4. +It's deterministic. Identical outputs, every time. + +Give your agent the CLI. Tell it what to make. +Watch it build. + +Hyperframes. Go make something. +``` + +Note: ~140 words for 62 seconds — that's 2.3 words/sec, leaving room for pauses and visual breathing. diff --git a/packages/codex-plugin/skills/website-to-hyperframes/references/step-4-storyboard.md b/packages/codex-plugin/skills/website-to-hyperframes/references/step-4-storyboard.md new file mode 100644 index 000000000..c802a4425 --- /dev/null +++ b/packages/codex-plugin/skills/website-to-hyperframes/references/step-4-storyboard.md @@ -0,0 +1,247 @@ +# Step 4: Write the Storyboard + +**Before writing anything, fully re-read these files:** + +- **DESIGN.md** — your color palette, font rules, components, Do's/Don'ts. Every creative decision must be grounded in this brand identity. If it says "white backgrounds with purple accent" — plan light scenes, not dark moody ones. +- **`capture/extracted/asset-descriptions.md`** — read EVERY line. This is your menu of available visuals. 
Each line describes what the image actually shows (e.g., "translucent ribbons in orange, pink, and purple on white background" or "a high-speed train under a dark starry sky"). Use these descriptions to decide which assets belong in which beat. Assets you don't understand from the description — view them directly before assigning. +- **[techniques.md](techniques.md)** — 11 visual techniques (SVG path drawing, Canvas 2D art, CSS 3D, per-word typography, Lottie, video compositing, typing effect, variable fonts, MotionPath, velocity transitions, audio-reactive). Pick 2-3 per beat and specify them in the storyboard. + +The storyboard is the creative north star. It tells the engineer exactly what to build for each beat — mood, camera, animations, transitions, assets, sound. Write it as if you're briefing a motion designer who's never seen the website. + +Save as `STORYBOARD.md` in the project directory. + +--- + +## Global Direction + +Every STORYBOARD.md starts with global settings: + +```markdown +**Format:** 1920×1080 +**Audio:** [TTS provider] voiceover + underscore + SFX +**VO direction:** [voice character — e.g., "mid-age male, calm confident delivery, +Apple keynote register — economy of words, silence between sentences is a feature"] +**Style basis:** DESIGN.md (brand colors, fonts, components from the captured site) +``` + +**Global guardrails** (adapt to the brand): + +- Push color presence. Muted is fine, flat is not. Every beat should have at least one color that pulls your eye. +- Motion should be visible and intentional. Err toward more movement than feels safe — subtle reads as static at 30fps. +- Use as many captured assets as the creative vision allows. Scatter framework icons around a dashboard. Layer enterprise photos behind stats. Use product screenshots as floating cards. The assets exist — use them generously. +- Aim for 8-10 visual elements per beat, not 2-3. 
A great beat has: background texture, midground content, foreground accents, floating decorative elements, animated icons, SVG path drawings, particle effects, typographic details. It should feel DENSE and alive. +- Use at least 2-3 different techniques from techniques.md per beat — not across the whole video, per beat. Don't default to basic fade/scale/opacity — mix in SVG path drawing, CSS 3D transforms, typing effects, counter animations, canvas procedural art. Each beat should feel like its own visual world. + +**Underscore/music direction** (if applicable): + +- Describe the mood, reference artists, when it swells or drops +- Example: "Minimal electronic. Warm sustained pad already playing when the video starts. Sits underneath everything, never competing with VO. Swells gently during the flex section, drops to near-nothing for the comparison, resolves on a final chord." + +--- + +## Asset Audit + +Before writing any beats, audit every captured asset. Print this table: + +| Asset | Type | Assign to Beat | Role | +| ------------------------------ | ---------- | -------------- | ------------------------------------- | +| wave-fallback-desktop.png | Hero image | Beat 1 | Full-bleed animated background | +| enterprise-accordion-hertz.png | Photo | Beat 3 | Enterprise credibility, Ken Burns pan | +| stripe-logo.svg | SVG | Beat 1, Beat 5 | Brand mark opener + closer | +| datavizstatic3x.png | Data viz | Beat 3 | Supporting visual behind stats | +| icon-3.svg | Icon | SKIP | Decorative, too small | + +**Minimum utilization:** + +- At least 50% of product screenshots and hero images must appear +- Brand logo appears in the first AND last beat +- The site's signature visual (gradient wave, hero illustration, key product UI) must appear — it's the most recognizable brand element +- Maximum 2 consecutive text-only beats. 
The 3rd must contain a visual asset +- Opening beat must contain a visual asset, not text-only + +--- + +## Per-Beat Direction + +Each beat is a WORLD, not a layout. Before writing CSS specs and GSAP instructions, describe what the viewer EXPERIENCES. The difference between a great storyboard and a mediocre one: + +**Mediocre:** "Dark navy background. '$1.9T' in white, 280px. Logo top-left. Wave image bottom-right." +**Great:** "Camera is already mid-flight over a vast dark canvas. The gradient wave sweeps across the frame like aurora borealis — alive, shifting. '$1.9T' SLAMS into existence with such force the wave ripples in response. This isn't a slide — it's a moment." + +The first describes pixels. The second describes an experience. Write the second, then figure out the pixels. + +Each beat should have: + +### Concept + +The big idea for this beat in 2-3 sentences. What visual WORLD are we in? What metaphor drives it? What should the viewer FEEL? This is the most important part — everything else flows from it. + +### VO cue + +Which narration line plays over this beat. + +### Visual description + +What the viewer sees — described cinematically, not as CSS specs. Use camera language (pan, zoom, drift, settle). Describe at least 5 visual elements, not just text + background. Think in layers — what's moving in the foreground, midground, background simultaneously? + +### Mood direction + +Cultural and design references, not hex codes: + +- "Geometric, rhythmic, precise. Think Josef Albers or Bauhaus color studies." +- "Warm workspace. Nice notebook energy, not technical blueprint." +- "Cinematic title sequence. The kind of opening where you lean forward." 
+ +### Assets + +Which captured files to use, referenced by filename: + +- "Background: `capture/assets/wave-fallback-desktop.png` — full-bleed, slow zoom 1→1.04 over beat duration" +- "Logo: `capture/assets/svgs/stripe-logo.svg` — centered, fades in at 0.5s" +- "Enterprise photo: `capture/assets/enterprise-accordion-hertz.png` — Ken Burns pan, 70% opacity overlay" + +### Animation choreography + +Specific motion verbs per element — not "it animates in" but HOW: + +| Energy | Verbs | Example | +| ------------- | --------------------------------------------- | ------------------------------------- | +| High impact | SLAMS, CRASHES, PUNCHES, STAMPS, SHATTERS | "$1.9T" SLAMS in from left at -5° | +| Medium energy | CASCADE, SLIDES, DROPS, FILLS, DRAWS | Three cards CASCADE in staggered 0.3s | +| Low energy | types on, FLOATS, morphs, COUNTS UP, fades in | Counter COUNTS UP from 0 to 135K | + +Every element gets a verb. If you can't name the verb, the element is not yet designed. + +### Transition + +How this beat hands off to the next. Specify the type and parameters. 
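These transition spec strings follow a consistent shorthand, so an engineer (or a build script) can translate them mechanically. A minimal sketch in JavaScript; the shorthand is this document's own, but the parser and its coverage are assumptions:

```javascript
// Sketch: parse a storyboard transition spec such as
// "y:-150, blur:30px, 0.33s power2.in" into GSAP-ready tween vars.
// Only the handful of fields used in this guide are covered.
function parseTransitionSpec(spec) {
  const vars = {};
  for (const part of spec.split(",").map((s) => s.trim())) {
    let m;
    if ((m = part.match(/^([xy]):(-?\d+)$/))) {
      vars[m[1]] = Number(m[2]); // pixel offset, e.g. y:-150
    } else if ((m = part.match(/^blur:(\d+)px$/))) {
      vars.filter = `blur(${m[1]}px)`; // CSS filter ramp
    } else if ((m = part.match(/^([\d.]+)s\s+(\S+)$/))) {
      vars.duration = Number(m[1]); // seconds
      vars.ease = m[2]; // GSAP ease name
    }
  }
  return vars;
}

console.log(parseTransitionSpec("y:-150, blur:30px, 0.33s power2.in"));
// → y: -150, filter: 'blur(30px)', duration: 0.33, ease: 'power2.in'
```

The result feeds directly into a `gsap.to(el, vars)` exit tween, keeping the storyboard text as the single source of truth.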
+ +**When to pick which:** + +| Choose shader transition for | Choose CSS transition for | Choose hard cut for | +| ------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------- | -------------------------------------------------------------- | +| Reveals, big reaction shots, product/logo unveils, energy shifts, "wow" moments | Continuous camera-motion beats where the scene feels like one move broken into cuts | Rapid-fire lists, percussive edits on the beat, comedic timing | +| Any moment the music/VO punctuates with a downbeat or SFX hit | Beats that ease from one composition into the next with shared motion vocabulary | Sequences of 3+ quick tempo-matched switches | +| Brand moments where the transition itself _is_ the visual | Minimal/editorial pacing | Anytime a 0.3-0.8s transition would feel too slow | + +Rule of thumb: if the beat is the _centerpiece_ of the video, shader-transition into it. If the beat is connective tissue, CSS-transition. A brand reel of 5-7 beats usually wants 1-2 shader transitions (the hero reveal + the CTA) and the rest CSS or hard cuts — too many shader transitions flatten their impact. 
+ +**CSS transitions** (choose from `skills/hyperframes/references/transitions/catalog.md`): + +- Velocity-matched upward: exit `y:-150, blur:30px, 0.33s power2.in` → entry `y:150→0, blur:30px→0, 1.0s power2.out` +- Whip pan: exit `x:-400, blur:24px, 0.3s power3.in` → entry `x:400→0, blur:24px→0, 0.3s power3.out` +- Blur through: exit `blur:20px, 0.3s` → entry `blur:20px→0, 0.25s power3.out` +- Zoom through: exit `scale:1→1.2, blur:20px, 0.2s power3.in` → entry `scale:0.75→1, blur:20px→0, 0.5s expo.out` +- Hard cut / smash cut (for rapid-fire sequences) + +**Shader transitions** (choose from `packages/shader-transitions/README.md`): + +- Cross-Warp Morph (organic, versatile) — 0.5-0.8s, power2.inOut +- Cinematic Zoom (professional momentum) — 0.4-0.6s, power2.inOut +- Gravitational Lens (otherworldly) — 0.6-1.0s, power2.inOut +- Glitch (aggressive, high energy) — 0.3-0.5s +- See `packages/shader-transitions/README.md` for the full API, available shaders, and setup + +**How velocity-matched CSS transitions work:** +Exit the outgoing beat with an accelerating ease (power2.in or power3.in) plus a blur ramp. Enter the incoming beat with a decelerating ease (power2.out or power3.out) plus blur clear. The fastest point of both easing curves meets at the cut — the viewer perceives continuous camera motion, not two discrete animations. Match exit velocity to entry velocity within ~5% tolerance. + +### Depth layers + +What's in foreground, midground, and background. Every beat should have at least 2 layers: + +- "BG: dark navy fill + subtle radial glow. MG: stat cards with drop shadow. FG: brand logo bottom-right." + +### SFX cues + +What sounds at what moment: + +- "On the capture pulse — a soft, warm analog shutter click." +- "Left side carries a faint low drone. On fold: drone cuts. Silence. Then a single clean chime." 
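The velocity-matching rule above can be sanity-checked with arithmetic before anything is built. A sketch, assuming pure power curves: for a `powerN.in` exit, velocity peaks at the cut; for a `powerN.out` entry, it peaks at the start; in both cases the peak is N × distance ÷ duration.

```javascript
// Sketch: numeric check for velocity-matched cuts (pure power curves assumed).
function peakVelocity({ power, distance, duration }) {
  // peak of powerN.in is at t=1, of powerN.out at t=0; both equal N*d/dur
  return (power * distance) / duration;
}

function velocityMatched(exit, entry, tolerance = 0.05) {
  const vExit = peakVelocity(exit);
  const vEntry = peakVelocity(entry);
  return Math.abs(vExit - vEntry) / Math.max(vExit, vEntry) <= tolerance;
}

// Whip pan: 400px over 0.3s with power3 on both sides, so peaks match exactly.
console.log(velocityMatched(
  { power: 3, distance: 400, duration: 0.3 },
  { power: 3, distance: 400, duration: 0.3 },
)); // true
```

If the check fails, adjust the entry distance or duration rather than the ease, so the motion vocabulary of the beat stays intact.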
+ +--- + +## Production Architecture + +Include this file tree at the bottom of the storyboard: + +``` +project/ +├── index.html root — VO + underscore + beat orchestration +├── DESIGN.md brand reference (from Step 2) +├── SCRIPT.md narration text (from Step 3) +├── STORYBOARD.md THIS FILE — creative north star +├── transcript.json word-level timestamps (from Step 5) +├── narration.wav TTS audio (from Step 5) +├── capture/ captured website data (from Step 1) +│ ├── screenshots/ +│ ├── assets/ +│ │ ├── svgs/ +│ │ ├── fonts/ +│ │ ├── lottie/ +│ │ └── videos/ +│ ├── extracted/ +│ │ ├── tokens.json +│ │ ├── visible-text.txt +│ │ ├── asset-descriptions.md +│ │ ├── animations.json +│ │ ├── assets-catalog.json +│ │ └── detected-libraries.json +│ ├── AGENTS.md +│ └── CLAUDE.md +└── compositions/ + ├── beat-1-hook.html + ├── beat-2-features.html + ├── ... + └── captions.html +``` + +--- + +## Example: Beat-by-Beat Format + +Here are three beats from a production storyboard showing the level of detail expected. + +### BEAT 1 — COLD OPEN (0:00–0:05) + +**VO:** "Your AI agent already knows how to make videos." + +**Concept:** We're already in motion when the video starts. No title card, no fade from black. We're mid-flight over an infinite creative workspace — dozens of living compositions scattered below us like a city seen from a drone. Each one is alive, running a different animation. The message is clear before any words: this tool makes videos. Lots of them. + +**Visual:** Slow smooth diagonal drift over a vast canvas (3600×2200px plane). Scattered across it: 25 composition cards at organic angles (±5-15° rotation), soft shadows, thin borders. Each card contains a DIFFERENT running animation — kinetic type, gradient morph, data viz, particle system, logo assembly, SVG drawing, shader noise, 3D rotating object. Depth-of-field: close cards slightly blurred, focal sweet-spot in mid-distance, far cards smaller and desaturated. 
+ +**Camera:** Diagonal drift top-left to bottom-right, slight 2-3° rotation over 5s. power1.inOut ease. Zoom accelerates in final second as we approach one specific card. + +**Assets:** Product screenshots and logo on cards. Each card is a mini-composition with its own animation. + +**SFX:** Ambient warmth pad already playing. Faint textured hum — overhearing creative activity from a distance. + +--- + +### BEAT 5 — THE THESIS (0:20–0:24) + +**VO:** "Anything a browser can render can be a frame in your video." + +**Mood:** Big statement. This sentence gets its own canvas. Clean, spacious, typographic. + +**Visual:** Words appear as staggered kinetic typography. "Anything a browser can render" — distinctive serif, gentle fade + rise (y: 24px → 0, opacity 0 → 1, 0.4s, power2.out). Held beat — one second of stillness. "can be a frame in your video." appears below. As the final word lands, the entire text pulses once — a brief warm flash, subtle scale bump to 101%. + +**Transition OUT:** Whip pan left — x:-400, blur:24px, opacity:0.4, 0.3s power3.in + +**SFX:** Silence under the first line. On the capture pulse — a soft analog shutter click. + +--- + +### BEAT 7 — THE CONTRAST (0:38–0:44) + +**VO:** "No new framework for the agent to learn. Just HTML." + +**Mood:** Clean comparison. Light base. Two worlds side by side. + +**Visual:** Left half: dense code, small, compressed, overwhelming. Scrolls slowly upward. Slightly desaturated. Right half: spacious HTML, syntax-highlighted, generous line spacing, inviting. On "Just HTML." — the left side folds inward along its center line, like a book closing. The right side expands to fill the frame. Warm glow rises behind it. + +**Transition IN:** Zoom through — scale 0.75→1, blur 20px→0, 0.5s expo.out +**Transition OUT:** Velocity-matched upward — y:-150, blur:30px, 0.33s power2.in + +**Assets:** Real framework code on the left (actual content, not lorem ipsum). Real HyperFrames HTML on the right. 
+ +**SFX:** Left side carries a faint low drone. On fold: drone cuts. Silence. Then a single clean chime as the right side expands. diff --git a/packages/codex-plugin/skills/website-to-hyperframes/references/step-5-vo.md b/packages/codex-plugin/skills/website-to-hyperframes/references/step-5-vo.md new file mode 100644 index 000000000..913081cc8 --- /dev/null +++ b/packages/codex-plugin/skills/website-to-hyperframes/references/step-5-vo.md @@ -0,0 +1,42 @@ +# Step 5: Generate VO + Map Timing + +## Audition voices + +Never use the first voice you find. Audition 2-3 voices with the first sentence of SCRIPT.md: + +- **Kokoro** (try first — free, no API key) — `npx hyperframes tts SCRIPT.md --voice af_nova --output narration.wav`. Runs locally on CPU. Requires Python 3.10+ (macOS system Python 3.9 won't work — if it fails with an onnxruntime error, move to the next option). +- **ElevenLabs** (best voice quality, widest selection) — `mcp__elevenlabs__search_voices` to browse, `mcp__elevenlabs__text_to_speech` to generate. Does not return timestamps — transcribe separately after. +- **HeyGen TTS** (returns word timestamps automatically — saves a transcribe step) — `mcp__claude_ai_HeyGen__text_to_speech`. Use when you want timestamps without a separate transcription pass. + +Pick the voice that sounds most natural and conversational. Listen for pacing — does it breathe between sentences? Does it sound like a person or a robot? + +## Generate full narration + +Generate the full script as `narration.wav` (or `.mp3`) in the project directory. + +**Also save the exact spoken text** — with pronunciation substitutions applied (e.g., `API` → `A P I`, `$2T` → `two trillion`) — as `narration.txt` in the same directory. This is the string passed to TTS, distinct from `SCRIPT.md` which is the human-readable creative doc. Having `narration.txt` makes it trivial to regenerate the audio later with a different voice without re-deriving the substitutions. Name it exactly `narration.txt`. 
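The substitution step can be scripted so the same table is reused whenever the audio is regenerated. A minimal sketch; the table entries below are the examples from above, and any real script would extend them:

```javascript
// Sketch: derive narration.txt (the exact TTS input) from SCRIPT.md text.
// The substitution table is illustrative; build yours from your own script's terms.
const substitutions = [
  [/\bAPI\b/g, "A P I"],        // spell out initialisms the voice would mangle
  [/\$2T\b/g, "two trillion"],  // expand figures into spoken words
];

function toNarrationText(scriptText) {
  return substitutions.reduce(
    (text, [pattern, replacement]) => text.replace(pattern, replacement),
    scriptText,
  );
}

console.log(toNarrationText("Our API moves $2T a year."));
// "Our A P I moves two trillion a year."
```

Keeping the table in code means swapping voices later is one command: rerun TTS on the output of this function, with no re-deriving of substitutions.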
+ +## Transcribe for word-level timestamps + +```bash +npx hyperframes transcribe narration.wav +``` + +Produces `transcript.json` with `[{ text, start, end }]` for every word. These timestamps are the source of truth for all beat durations. + +## Map timestamps to beats + +Go through STORYBOARD.md beat by beat. For each beat: + +1. Find the first word of that beat's VO cue in `transcript.json` +2. Find the last word of that beat's VO cue +3. Set `beat.start = firstWord.start`, `beat.end = lastWord.end` +4. Add 0.3-0.5s padding at the end for visual breathing room + +Update STORYBOARD.md with real durations. Replace estimated times (e.g., "0:00-0:05") with actual timestamps (e.g., "0.00-3.21s"). + +Beat boundaries land on word onsets — hard cuts to the VO. + +## Update index.html + +Update each scene slot's `data-start` and `data-duration` to match the real beat timings from the transcript. Also update the total composition duration and audio element duration. diff --git a/packages/codex-plugin/skills/website-to-hyperframes/references/step-6-build.md b/packages/codex-plugin/skills/website-to-hyperframes/references/step-6-build.md new file mode 100644 index 000000000..c04c5991e --- /dev/null +++ b/packages/codex-plugin/skills/website-to-hyperframes/references/step-6-build.md @@ -0,0 +1,166 @@ +# Step 6: Build Compositions + +**Before building, fully re-read these files:** + +- **DESIGN.md** — your color palette, fonts, components, and Do's/Don'ts. Every composition must use EXACT hex colors and font families from this file. If it says "white backgrounds" — use white, not dark. +- **STORYBOARD.md** — the beat-by-beat plan you're executing. Each beat specifies assets, animations, transitions, and which techniques to use. +- **`capture/extracted/asset-descriptions.md`** — when the storyboard assigns an asset to a beat, re-read the description to understand what it shows and how to position/style it correctly. 
+- **[techniques.md](techniques.md)** — code patterns for the 11 visual techniques. When the storyboard says "SVG path drawing" or "per-word kinetic typography" — read the code pattern from this file and adapt it. +- **transcript.json** — word-level timestamps that drive scene durations. + +**Split the work: spawn a sub-agent for each beat.** By this step your context is full of captured data, DESIGN.md, SCRIPT, STORYBOARD, and transcript. Building compositions on top of all that means the detailed rules below compete with thousands of tokens of prior work. Each sub-agent gets a fresh context focused on one beat — dramatically better output. + +**How to dispatch each sub-agent:** + +Pass file PATHS, not file contents. The #1 failure mode is reading an asset file and pasting its SVG/image data into the sub-agent prompt. The sub-agent then uses inline content instead of referencing the file on disk. Same with fonts — pass the local woff2 path, don't substitute Google Fonts. + +``` +Build the composition for beat 1. Save to compositions/beat-1-hook.html. + +STORYBOARD for this beat: +[paste the beat section from STORYBOARD.md] + +ASSETS — reference by path, do NOT read/inline the file contents: +- Logo: <logo path> (top-left, 40x40px) +- Hero image: <hero image path> (full-bleed background) +- Noise texture: ../capture/assets/noise.png (full-frame overlay, 3% opacity) + +FONTS — use @font-face with the captured font files, NOT Google Fonts: +@font-face { font-family: 'BrandFont'; src: url('../capture/assets/fonts/BrandFont-Regular.woff2'); } + +Read DESIGN.md for exact colors and Do's/Don'ts. +Read techniques.md for animation code patterns. +Load the `hyperframes` skill for composition structure rules. +``` + +After each sub-agent finishes, verify the composition references `../capture/assets/` — if it used inline SVGs or Google Fonts instead of the captured files, fix it before moving on.
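That verification can be a quick grep pass. A sketch, assuming the file layout used in this guide (`compositions/` as a sibling of `capture/`):

```shell
# Sketch: flag compositions that fetch Google Fonts or never reference
# the captured asset files on disk.
check_composition() {
  comp="$1"
  status=0
  if grep -q "fonts.googleapis.com" "$comp"; then
    echo "FAIL: $comp uses Google Fonts instead of captured fonts"
    status=1
  fi
  if ! grep -q '\.\./capture/assets/' "$comp"; then
    echo "FAIL: $comp never references ../capture/assets/"
    status=1
  fi
  return "$status"
}
```

Run it over every beat after the sub-agents finish, e.g. `for f in compositions/beat-*.html; do check_composition "$f"; done`, and fix any flagged file before moving on.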
+ +Load the `hyperframes` skill first — it has the rules for data attributes, timeline contracts, deterministic rendering, and layout. Everything below supplements those rules, not replaces them. + +--- + +## Per-Composition Process + +For each beat in the storyboard: + +### 1. Read the beat's storyboard section + +Know the mood, visual description, assets, animation choreography, transition, and SFX before writing any HTML. + +### 2. Build the static end-state first + +Position every element where it should be at its **most visible moment** — the frame where everything is fully entered and correctly placed. Write this as static HTML+CSS. No GSAP yet. + +This is the "Layout Before Animation" principle from the `hyperframes` skill. The CSS position is the ground truth. Animations describe the journey to and from it. + +### 3. Verify the static layout + +Look at it. Check: + +- Are elements where the storyboard says they should be? +- Are depth layers present (foreground / midground / background)? +- Do any elements overlap unintentionally? +- Are assets sized correctly? (hero images should fill 50-70% of frame, not sit at 100x100px) + +### 4. Add entrance animations + +Use `gsap.from()` — animate FROM offscreen/invisible TO the CSS position. The CSS position is where the element ends up. + +### 5. Add mid-scene activity + +Every visible element must have continuous motion. A still image on a still background is a JPEG with a progress bar.
+ +| Element type | Mid-scene activity | +| -------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------- | +| Image / screenshot | Slow zoom (scale 1→1.03), slow pan, or Ken Burns | +| Stat / number | Counter animates from 0 to target | +| Logo grid | Subtle shimmer sweep, or gentle scale pulse | +| Any persistent element | Subtle float (y ±4-6px, sine.inOut, yoyo) | +| Logo / CTA (with music or dramatic VO) | Audio-reactive scale/glow — bass pulses the logo (3–4%), treble glows the CTA. See technique #11 in `techniques.md` for the sampling pattern | + +### 6. Add exit / transition + +Check the storyboard's transition specification for this beat: + +- **CSS transition**: implement the exit animation (e.g., `y:-150, blur:30px, 0.33s power2.in`). The next composition handles its own entry. +- **Shader transition**: no exit animation needed — the shader handles the blend. Read `packages/shader-transitions/README.md` for the API, available shaders, and setup. The package handles WebGL init, capture, and GSAP integration — do not copy raw GLSL manually. +- **Hard cut**: no exit animation. The scene simply ends. + +For all CSS transition types and their GSAP implementations, read `skills/hyperframes/references/transitions/catalog.md`. + +### 7. Asset cross-reference + +Before self-review, verify you actually used the assets you planned to: + +1. Open STORYBOARD.md and find this beat's asset assignments +2. List every asset that was assigned to this beat +3. Search the composition HTML for each filename (e.g., grep for "wave-fallback-desktop") +4. If any assigned asset is missing from the HTML, add it now +5. Check for the inline anti-pattern: if the HTML contains `<svg` markup for an asset that exists as a file, replace it with an `<img>` referencing the file on disk +6.
Check fonts: if the HTML uses `fonts.googleapis.com` but there are captured fonts in `capture/assets/fonts/`, replace with `@font-face` pointing to the local files (e.g., `src: url('../capture/assets/fonts/BrandFont-Regular.woff2')`) + +This step catches the two most common failures: compositions ending up text-only, and assets being inlined instead of file-referenced. + +### 8. Self-review + +After building the composition, check WITH ACTUAL CODE: + +- [ ] Asset cross-reference passed (step 7 above — every assigned asset is in the HTML) +- [ ] Elements are where the storyboard says they should be (no misplacement) +- [ ] No overlapping text (text covering text is always ugly) +- [ ] Depth layers present (2+ layers minimum) +- [ ] Every visible element has mid-scene activity (not just entrance + exit) +- [ ] Font sizes above minimum (20px body text, 16px labels — sub-14px is unreadable after encoding) +- [ ] No full-screen dark linear gradients (H.264 creates visible banding — use solid + localized radial glows) +- [ ] Timeline registered: `window.__timelines["comp-id"] = tl` +- [ ] Colors match DESIGN.md exactly (paste the HEX value, don't approximate) +- [ ] **No duplicate media elements** — if the same asset must appear twice, give each copy appropriate z-layering, or stagger the `data-start` values. Linter: `duplicate_media_discovery_risk`. + +**If `skills/hyperframes-animation-map/` is installed**, run it: + +```bash +node skills/hyperframes-animation-map/scripts/animation-map.mjs +``` + +Read the summaries. Fix every flag: offscreen, collision, invisible, pacing issues. + +### 9. Move to the next composition + +--- + +## Asset Presentation + +Never embed a raw flat image. Every image must have motion treatment: + +- **Perspective tilt**: use `gsap.set(el, { transformPerspective: 1200, rotationY: -8 })` + `box-shadow` — creates depth. Do NOT use CSS `transform: perspective(...)` as GSAP will overwrite it.
+- **Slow zoom (Ken Burns)**: GSAP `scale: 1` → `1.04` over beat duration — makes photos cinematic +- **Device frame**: Wrap in a laptop/phone shape using CSS `border-radius` and `box-shadow` +- **Floating UI**: Extract a key element and animate it at a different z-depth for parallax +- **Scroll reveal**: Clip the image to a viewport window and animate `y` position + +--- + +## Audio Wiring + +In the root `index.html`: + +- **Narration**: `