
fix(framework): persist proxyFallbackOrigins across page loads #37

Open

nsyring wants to merge 1 commit into agent0ai:main from nsyring:fix/fetch-proxy-persistent-origin-cache


@nsyring nsyring commented Apr 27, 2026

fix(framework): persist proxyFallbackOrigins across page loads

Summary

installFetchProxy(...) already remembers cross-origin endpoints whose direct fetch was blocked, so subsequent calls to the same origin go straight through /api/proxy without re-trying the doomed direct fetch. The cache was an in-memory Set, which means every page load starts cold — the first call to chatgpt.com (Codex provider), to a polling status widget's external API, or to any other CORS-blocked upstream pays one failed-direct-fetch round trip and emits a red Access-Control-Allow-Origin error in the DevTools console before the wrapper transparently retries via the proxy.

This PR persists the origin set in localStorage with a 7-day TTL, so a known-CORS-blocked origin routes directly through the proxy from the first fetch of every page load. No upstream config change, no caller change, no new opt-in: the cache just grows past the page lifetime.

Why

Three concrete reproducers I observed in real usage:

  1. OpenAI Codex provider: every fresh tab logged a red CORS error for chatgpt.com/backend-api/codex/responses on the first chat call. The Codex hook now routes via space.proxy.buildUrl(...) directly (PR #22, "feat: native OpenAI Codex (ChatGPT Plus) OAuth provider", plus the Codex proxy fix), so this is fixed at the call site — but it shouldn't have to be fixed at every call site; the framework's fallback cache already has the right idea.

  2. Polling status widgets: a widget that polls a local LAN inference API every 3 s logs one CORS error per page load. The widget skill (SKILL.md) explicitly endorses bare fetch(url) for external HTTP because "the runtime already retries blocked origins through /api/proxy" — but that promise only holds after the in-memory cache is warm. With the cache cold every reload, the first three seconds of widget telemetry are noisy.

  3. installFetchProxy's own retry path: every fallback retry hits the slow filesystem on Linux and surfaces as a brief UI hitch when the user changes spaces rapidly. The hitch is small but real, and orthogonal to the "is the response correct" question.

In all three cases, the runtime knows the origin is CORS-blocked. Forgetting that fact when the page reloads is a regression we never wanted; the in-memory Set just happens to be the easy data structure.

What changed

app/L0/_all/mod/_core/framework/js/fetch-proxy.js:

  • The proxyFallbackOrigins data structure changes from Set<origin> to Map<origin, lastSeenAtMs>. The Map keys are still origin strings; the values are timestamps used for TTL eviction.
  • loadPersistedProxyFallbackOrigins() is called on module load. It reads the JSON object at localStorage["space.framework.proxy-fallback-origins"], drops stale entries (>7 days since last touch) and corrupt entries, and seeds the in-memory Map. Any storage error (sandboxed context, disabled localStorage) is silent — origin caching is an optimisation, not a correctness requirement, and the in-memory Map still serves the page lifetime.
  • persistProxyFallbackOrigins() writes the current Map back as a JSON object. Called from rememberProxyFallbackOrigin(...) after a new origin is added and from clearProxyFallbackOrigins(). Removes the localStorage entry entirely when the Map is empty.
  • hasProxyFallbackOrigin(targetUrl) now checks the entry's lastSeenAt against the TTL. Stale entries are evicted on read so a recovered upstream (a service that finally added Access-Control-Allow-Origin) can leave the cache naturally without manual intervention.
  • rememberProxyFallbackOrigin(targetUrl) writes the current timestamp; subsequent reads refresh the TTL window. So an actively-used origin stays cached indefinitely; an origin that hasn't been used in 7 days falls out.
  • clearProxyFallbackOrigins() (already exposed on the proxiedFetch closure for callers that want to force-refresh) now clears localStorage too.

The cache key remains the origin string returned by new URL(targetUrl, window.location.href).origin, so the existing semantics of "fall back per origin, not per URL path" are preserved.
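To illustrate that keying (the URLs and base below are hypothetical, chosen only to show the normalization — the base stands in for window.location.href):

```javascript
// Per-origin keying: path and query are discarded, so every URL on the
// same scheme+host+port shares a single fallback cache entry.
const base = 'https://app.example.test/page'; // stand-in for window.location.href
const a = new URL('https://chatgpt.com/backend-api/codex/responses', base).origin;
const b = new URL('https://chatgpt.com/other/path?x=1', base).origin;
const rel = new URL('/api/proxy', base).origin; // relative URLs resolve against the base
// a === b === 'https://chatgpt.com'; rel === 'https://app.example.test'
```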

No API change, no caller change, no new public surface. The helper functions on the proxiedFetch closure (hasProxyFallbackOrigin, rememberProxyFallbackOrigin, clearProxyFallbackOrigins) keep their signatures.
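A minimal sketch of the shape described above. The function names follow the PR; the bodies are illustrative, not the actual diff, and the storage object is injected here so the helpers stay testable outside a browser (in the real module it would be window.localStorage):

```javascript
// Hypothetical sketch of the persistence layer; `storage` is any
// localStorage-like object (getItem/setItem/removeItem).
const KEY = 'space.framework.proxy-fallback-origins';
const TTL_MS = 7 * 24 * 60 * 60 * 1000; // 7 days

function loadPersistedProxyFallbackOrigins(storage, now = Date.now()) {
  const map = new Map();
  try {
    const raw = storage.getItem(KEY);
    if (!raw) return map;
    for (const [origin, lastSeenAt] of Object.entries(JSON.parse(raw))) {
      // Drop corrupt and stale entries while seeding the in-memory Map.
      if (typeof lastSeenAt === 'number' && now - lastSeenAt <= TTL_MS) {
        map.set(origin, lastSeenAt);
      }
    }
  } catch {
    // Storage errors are silent: origin caching is an optimisation only.
  }
  return map;
}

function persistProxyFallbackOrigins(storage, map) {
  try {
    if (map.size === 0) storage.removeItem(KEY);
    else storage.setItem(KEY, JSON.stringify(Object.fromEntries(map)));
  } catch { /* ignore: same rationale as above */ }
}

function hasProxyFallbackOrigin(map, targetUrl, now = Date.now()) {
  const origin = new URL(targetUrl).origin;
  const lastSeenAt = map.get(origin);
  if (lastSeenAt === undefined) return false;
  if (now - lastSeenAt > TTL_MS) {
    map.delete(origin); // evict stale entries on read
    return false;
  }
  map.set(origin, now); // reads refresh the TTL window
  return true;
}
```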

Storage shape

```js
// localStorage["space.framework.proxy-fallback-origins"]:
{
  "https://chatgpt.com": 1761638400000,
  "https://api.example.local:9090": 1761735600000
}
```
  • Keys: origin strings (no path, no query)
  • Values: Date.now() of the most recent read OR write that touched the origin
  • Encoded as a single JSON object written in one setItem call, so each update replaces the whole cache at once; a failed write can't leave partial entries
  • TTL: 7 days from lastSeenAt. Values older than that are silently dropped.
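The load-path filtering over that exact shape can be sketched as follows (timestamps come from the example above; `now` is chosen to make the first entry just past its TTL):

```javascript
// Stale-entry filtering on load, using the documented storage shape.
const TTL_MS = 7 * 24 * 60 * 60 * 1000; // 7 days
const persisted = {
  'https://chatgpt.com': 1761638400000,
  'https://api.example.local:9090': 1761735600000,
};
const now = 1761638400000 + TTL_MS + 1; // first entry is 1 ms past the TTL
const fresh = Object.fromEntries(
  Object.entries(persisted).filter(([, lastSeenAt]) => now - lastSeenAt <= TTL_MS)
);
// fresh keeps only 'https://api.example.local:9090'
```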

Behavior matrix

| Scenario | Before this PR | After this PR |
| --- | --- | --- |
| First page load, first fetch to a CORS-blocked origin | Direct fetch fails → console error → fallback via `/api/proxy` succeeds | Same |
| Second fetch in same page lifetime to that origin | Direct via proxy (cache hit) | Same |
| Page reload, first fetch to that same origin | Direct fetch fails again → same console error → fallback succeeds | Direct via proxy from first fetch — no console error, no retry |
| Origin recovered from CORS-blocked → CORS-permitted, user fetches it | Cache hit forever (in-memory Set never expires) until tab close | TTL evicts after 7 days of inactivity; the next fetch tries direct again |
| Storage disabled (sandboxed iframe, localStorage error) | n/a | Falls back to in-memory Map; same behaviour as today |
| `clearProxyFallbackOrigins()` called | Clears in-memory Set | Clears in-memory Map and localStorage entry |

Test plan

  • node --check app/L0/_all/mod/_core/framework/js/fetch-proxy.js passes
  • Pure-helper sanity table verified: empty Map returns false; remembered origin returns true; stale entry returns false and is evicted; persisted JSON round-trips correctly through the load path
  • Manual verification in an npm run desktop:pack build:
    • First load with localStorage clean: first call to chatgpt.com logs the expected CORS error once; the next page reload is silent (cache hit on first fetch)
    • Polling widget that fetches a CORS-blocked LAN endpoint: first poll on a fresh load logs the error; subsequent polls and subsequent reloads are silent
    • localStorage["space.framework.proxy-fallback-origins"] is populated after a successful retry-via-proxy and stripped of stale entries on next read

Out of scope (possible follow-ups)

  • A first-class space.proxy.preregisterOrigin(url) API for callers that know an upstream is CORS-blocked at module load time and want to skip even the first failed direct fetch. Useful for known-third-party APIs (Codex chatgpt.com, OpenAI Platform). The current PR is sufficient to eliminate the console noise, and a preregister API would be additive on top of the persistence layer.
  • Telemetry on cache hits and stale evictions. The Map is module-private; future debugging could surface a space.proxy.getFallbackOriginStats() to see hit rates and TTL bucketing.
  • Cross-tab cache invalidation (storage change events). If one tab notices a recovered upstream, no other open tab knows. Acceptable for this PR — TTL closes the gap eventually — but worth noting.

Relationship to prior PRs


🤖 Generated with Claude Code

