Use new server endpoint v2 #791
Conversation
```js
history.replaceState(null, '', '/');

// Fetch the actual widget HTML using toolId
const response = await fetch('/api/mcp/openai/widget-content/${toolId}');
```
```js
(function() {
  'use strict';

  console.log('[OpenAI Widget] Initializing bridge for tool: ${toolName}');
```
Bug: XSS via Unescaped Tool Metadata in Templates
User-controlled toolName and toolId values, sourced from the /widget/store endpoint, are directly interpolated into JavaScript template literals on lines 208 and 216. This creates an XSS vulnerability, enabling attackers to inject and execute arbitrary JavaScript.
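As an illustration of the fix the reviewers suggest, here is a minimal sketch of serializing untrusted values before they land in a generated inline script. `buildContainerScript` is a hypothetical helper, not the PR's actual code:

```typescript
// Hypothetical server-side helper: serialize the untrusted toolId with
// JSON.stringify so the emitted script receives it as inert string data
// instead of raw text spliced into a string literal.
function buildContainerScript(toolId: string): string {
  const safeToolId = JSON.stringify(toolId); // emits a quoted, escaped JS literal
  return [
    "(async function() {",
    `  const toolId = ${safeToolId};`,
    "  // encodeURIComponent keeps the id from altering the URL path",
    "  const response = await fetch('/api/mcp/openai/widget-content/' + encodeURIComponent(toolId));",
    "  const html = await response.text();",
    "  document.open();",
    "  document.write(html);",
    "})();",
  ].join("\n");
}
```

A crafted id like `foo');alert(1);//` now stays inside a double-quoted, escaped literal instead of terminating the surrounding string.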
Walkthrough

This pull request refactors the OpenAI widget integration from client-side HTML embedding to a server-backed URL workflow. The client component now stores widget data on the server, retrieves a widget URL, and renders an iframe pointing to that URL instead of embedding HTML directly. The server implements new endpoints for widget storage (with TTL-based cleanup) and content retrieval, and injects an OpenAI Apps SDK bridge script that enables state management, tool calls, and parent-iframe communication. The changes include updated loading states, error handling, theme propagation via postMessage, and security headers for the widget container.
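The TTL-based widget storage described in the walkthrough can be sketched as follows. The names (`WidgetEntry`, `widgetStore`) and exact expiry policy are illustrative, not the PR's actual identifiers:

```typescript
// Illustrative in-memory widget store with TTL-based cleanup, mirroring the
// described 1-hour TTL with a periodic sweep.
interface WidgetEntry {
  html: string;
  expiresAt: number; // epoch ms
}

const WIDGET_TTL_MS = 60 * 60 * 1000; // 1 hour

const widgetStore = new Map<string, WidgetEntry>();

function storeWidget(toolId: string, html: string, now = Date.now()): void {
  widgetStore.set(toolId, { html, expiresAt: now + WIDGET_TTL_MS });
}

function getWidget(toolId: string, now = Date.now()): string | undefined {
  const entry = widgetStore.get(toolId);
  if (!entry) return undefined;
  if (entry.expiresAt <= now) {
    widgetStore.delete(toolId); // lazy expiry on read
    return undefined;
  }
  return entry.html;
}

function sweepExpired(now = Date.now()): void {
  for (const [id, entry] of widgetStore) {
    if (entry.expiresAt <= now) widgetStore.delete(id);
  }
}

// In a real server, run the sweep periodically, e.g. every 5 minutes:
// setInterval(sweepExpired, 5 * 60 * 1000);
```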
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- client/src/components/chat-v2/openai-app-renderer.tsx (12 hunks)
- server/routes/mcp/index.ts (2 hunks)
- server/routes/mcp/openai.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (11)
**/*.{ts,tsx}
📄 CodeRabbit inference engine (CLAUDE.md)
**/*.{ts,tsx}: Prefer interfaces for defining object shapes
Constrain generics with appropriate type bounds
Use type guards to narrow unknown or union types before usage
Enforce import ordering consistently
**/*.{ts,tsx}: Prefer named exports in TypeScript modules
Use 2-space indentation
Declare types and interfaces using PascalCase
Files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
- client/src/components/chat-v2/openai-app-renderer.tsx
server/routes/**/*.ts
📄 CodeRabbit inference engine (CLAUDE.md)
server/routes/**/*.ts: Validate server responses (schema, content, status codes, headers) in route handlers
Perform input validation on incoming requests (types, sanitization, size, format)
Files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
server/**/*.ts
📄 CodeRabbit inference engine (CLAUDE.md)
server/**/*.ts: Implement robust error handling with safe error messages and log sanitization
Set appropriate security headers for backend responses
Apply rate limiting and backoff strategies to protect APIs
Use TypeScript for all backend source files
Files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
server/{app.ts,routes/**/*.ts}
📄 CodeRabbit inference engine (server/CLAUDE.md)
Use Hono.js for API routing and middleware in the app setup and all route handlers
Files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
server/{routes/mcp/**,utils/mcp-utils.ts,types/mcp.ts}
📄 CodeRabbit inference engine (server/CLAUDE.md)
Implement MCP protocol using Mastra (@mastra/core, @mastra/mcp) for MCP handling, helpers, and types
Files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
server/routes/mcp/**/*.ts
📄 CodeRabbit inference engine (server/CLAUDE.md)
Place all MCP protocol handlers under server/routes/mcp
Files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
client/src/**/*.tsx
📄 CodeRabbit inference engine (CLAUDE.md)
Follow consistent React component structure in the frontend
Files:
client/src/components/chat-v2/openai-app-renderer.tsx
client/src/{app,components}/**/*.tsx
📄 CodeRabbit inference engine (client/CLAUDE.md)
client/src/{app,components}/**/*.tsx: Use React functional components with React.FC typing for all UI components and pages
Define explicit Props interfaces for components, including proper children prop handling
Type event handlers with React-provided types (e.g., React.MouseEvent, FormEvent)
Follow React 19 patterns: hooks-centric components, Suspense and Error Boundaries where appropriate, and concurrent features
Maintain component isolation: avoid excessive prop drilling, use Context wisely, favor composition, and optimize renders
Build responsive layouts using the established Tailwind breakpoint system, grids, flex, and container queries
Ensure accessibility: proper ARIA attributes, keyboard navigation, focus management, and screen reader support
Files:
client/src/components/chat-v2/openai-app-renderer.tsx
client/src/**/*.{ts,tsx}
📄 CodeRabbit inference engine (client/CLAUDE.md)
Implement strict TypeScript types: strict prop types, event types, state interfaces, and utility types
In client code, use the @/ alias for imports
Files:
client/src/components/chat-v2/openai-app-renderer.tsx
client/src/{app,components,hooks}/**/*.{ts,tsx}
📄 CodeRabbit inference engine (client/CLAUDE.md)
Use React hooks for local state: apply useState/useReducer patterns, create custom hooks, and always clean up effects
Files:
client/src/components/chat-v2/openai-app-renderer.tsx
client/src/components/**/*.tsx
📄 CodeRabbit inference engine (client/CLAUDE.md)
Use Radix UI primitives for dialogs, dropdowns, form controls, and tooltips
React components in client/src/components should be functional and saved as PascalCase.tsx files
Files:
client/src/components/chat-v2/openai-app-renderer.tsx
🧠 Learnings (16)
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/routes/llm/openai/**/*.ts : Keep OpenAI integration (client setup, model management, streaming, errors) under server/routes/llm/openai
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
- client/src/components/chat-v2/openai-app-renderer.tsx
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/{routes/mcp/**,utils/mcp-utils.ts,types/mcp.ts} : Implement MCP protocol using Mastra (@mastra/core, @mastra/mcp) for MCP handling, helpers, and types
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/routes/mcp/**/*.ts : Place all MCP protocol handlers under server/routes/mcp
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
📚 Learning: 2025-10-19T23:06:33.450Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: client/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:06:33.450Z
Learning: Applies to client/src/{lib/api,hooks/llm}/**/*.{ts,tsx} : Implement OpenAI integration with a dedicated API client, model configuration, robust response handling, and error recovery
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
- client/src/components/chat-v2/openai-app-renderer.tsx
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/utils/mcp-utils.ts : Put MCP helper utilities in server/utils/mcp-utils.ts
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
📚 Learning: 2025-10-19T23:06:33.450Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: client/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:06:33.450Z
Learning: Applies to client/src/{hooks/mcp,stores/{chat,servers}}/**/*.{ts,tsx} : Synchronize MCP state: track connections, requests, responses, and errors consistently
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/types/mcp.ts : Define MCP types in server/types/mcp.ts
Applied to files:
server/routes/mcp/index.ts
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/routes/mcp/tools/**/*.ts : Implement tool validation logic under server/routes/mcp/tools
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/routes/mcp/resources/**/*.ts : Implement resource handlers and validation under server/routes/mcp/resources
Applied to files:
server/routes/mcp/index.ts
📚 Learning: 2025-10-19T23:06:33.450Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: client/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:06:33.450Z
Learning: Applies to client/src/{hooks/llm,stores/chat}/**/*.{ts,tsx} : Manage AI model state: model selection, generation parameters, stream management, and history persistence
Applied to files:
- server/routes/mcp/index.ts
- server/routes/mcp/openai.ts
- client/src/components/chat-v2/openai-app-renderer.tsx
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/routes/mcp/prompts/**/*.ts : Implement prompt processing (context, tokens, format, streaming) under server/routes/mcp/prompts
Applied to files:
server/routes/mcp/index.ts
📚 Learning: 2025-10-19T23:07:08.982Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: server/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:07:08.982Z
Learning: Applies to server/routes/llm/ollama/**/*.ts : Keep Ollama support (local setup, model loading, inference config) under server/routes/llm/ollama
Applied to files:
server/routes/mcp/openai.ts
📚 Learning: 2025-10-19T23:06:33.450Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: client/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:06:33.450Z
Learning: Applies to client/src/{hooks/llm,stores/chat}/**/*.{ts,tsx} : Manage chat context: track window size/tokens, prune context, and persist state
Applied to files:
client/src/components/chat-v2/openai-app-renderer.tsx
📚 Learning: 2025-10-19T23:06:33.450Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: client/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:06:33.450Z
Learning: Applies to client/src/{app,components,hooks}/**/*.{ts,tsx} : Use React hooks for local state: apply useState/useReducer patterns, create custom hooks, and always clean up effects
Applied to files:
client/src/components/chat-v2/openai-app-renderer.tsx
📚 Learning: 2025-10-19T23:07:40.722Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: AGENTS.md:0-0
Timestamp: 2025-10-19T23:07:40.722Z
Learning: Applies to client/**/*.{tsx} : Prefer React functional components in the client UI
Applied to files:
client/src/components/chat-v2/openai-app-renderer.tsx
📚 Learning: 2025-10-19T23:06:33.450Z
Learnt from: CR
Repo: MCPJam/inspector PR: 0
File: client/CLAUDE.md:0-0
Timestamp: 2025-10-19T23:06:33.450Z
Learning: Applies to client/src/{app,components}/**/*.tsx : Follow React 19 patterns: hooks-centric components, Suspense and Error Boundaries where appropriate, and concurrent features
Applied to files:
client/src/components/chat-v2/openai-app-renderer.tsx
🔍 Remote MCP Context7
Additional Context for PR #791 Review
Based on the documentation retrieved, here is the relevant context to support your pull request review:
OpenAI Apps SDK Widget Integration Pattern
The OpenAI Apps SDK enables building applications that integrate with ChatGPT through custom widgets and Model Context Protocol (MCP) servers. The PR implements the server-backed widget URL workflow, which aligns with the recommended architecture:
- Widget Registration: Tools should be registered with metadata including "openai/outputTemplate", which points to the widget URI (e.g., "ui://widget/kanban-board.html")
- State Management: Widget state can be persisted using window.openai.setWidgetState(), which communicates state back to ChatGPT for preservation across conversation turns
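A small sketch of the widget-side state pattern described above. The real `window.openai` global is injected by the host; here it is modeled with a minimal mock so the flow is self-contained, and the kanban-style state shape is purely illustrative:

```typescript
// Illustrative widget state and a mock of the host-provided setWidgetState.
type WidgetState = { columns: string[]; cards: Record<string, string[]> };

interface OpenAIHost {
  setWidgetState(state: WidgetState): void;
}

// Test double standing in for the injected window.openai global.
function makeMockHost(): { host: OpenAIHost; saved: WidgetState[] } {
  const saved: WidgetState[] = [];
  return { host: { setWidgetState: (s) => saved.push(s) }, saved };
}

// Widget-side helper: persist after every mutation so the state survives
// across conversation turns, as the SDK docs describe.
function moveCard(
  host: OpenAIHost,
  state: WidgetState,
  card: string,
  from: string,
  to: string,
): WidgetState {
  const next: WidgetState = {
    columns: state.columns,
    cards: {
      ...state.cards,
      [from]: state.cards[from].filter((c) => c !== card),
      [to]: [...state.cards[to], card],
    },
  };
  host.setWidgetState(next); // preserved by ChatGPT for later turns
  return next;
}
```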
Model Context Protocol (MCP) Architecture
MCP establishes a standardized client-server communication protocol where the client (ChatGPT) discovers and invokes tools, resources, and prompts exposed by the MCP server. The new endpoint in server/routes/mcp/openai.ts follows this pattern by:
- Storing widget data server-side (1-hour TTL with 5-minute cleanup)
- Exposing retrieval endpoints that inject the OpenAI Apps SDK bridge script
- Managing widget lifecycle independently from the client
IFrame Security and CSP Configuration
The implementation includes critical security measures:
- Content Security Policy: Widgets require CSP configuration with connect_domains (for network requests) and resource_domains (for styles, images, fonts) to enable secure sandbox execution
- Frame Protection: The PR adds headers for frame/iframe protections and no-cache directives, which align with standard iframe security patterns documented in the SDK
Widget-Host Communication Pattern
The window.openai global API provides methods for widgets to call tools, send follow-up messages, open external links, and request display mode changes. The PR's postMessage-based iframe communication replaces direct DOM manipulation with this standardized API.
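A condensed sketch of that postMessage bridge on the host side. The message shapes and names are assumptions for illustration, not the PR's exact wire format:

```typescript
// Illustrative host-side handler: typed messages replace direct DOM
// manipulation, and the sender origin is checked before dispatch.
type BridgeMessage =
  | { type: "resize"; height: number }
  | { type: "setWidgetState"; state: unknown }
  | { type: "callTool"; name: string; args: Record<string, unknown> };

function handleBridgeMessage(msg: BridgeMessage, log: string[]): void {
  switch (msg.type) {
    case "resize":
      log.push(`resize:${msg.height}`); // e.g. adjust the iframe element height
      break;
    case "setWidgetState":
      log.push("setWidgetState"); // persist widget state for the conversation
      break;
    case "callTool":
      log.push(`callTool:${msg.name}`); // forward to the MCP tool-call endpoint
      break;
  }
}

// Reject messages from unexpected origins before trusting their payload.
function onWindowMessage(
  event: { origin: string; data: unknown },
  expectedOrigin: string,
  log: string[],
): void {
  if (event.origin !== expectedOrigin) return;
  const msg = event.data as BridgeMessage;
  if (msg && typeof msg === "object" && "type" in msg) {
    handleBridgeMessage(msg, log);
  }
}
```

In a browser this handler would be registered via `window.addEventListener("message", ...)`; the plain-object event here keeps the sketch runnable outside the DOM.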
Hono Framework for Server Routes
Hono is a small, simple, and ultrafast web framework that works on any JavaScript runtime, making it suitable for the lightweight MCP server integration added in server/routes/mcp/openai.ts.
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Cursor Bugbot
- GitHub Check: prettier-and-build
```js
(async function() {
  // Change URL to "/" BEFORE loading widget (for React Router)
  history.replaceState(null, '', '/');

  // Fetch the actual widget HTML using toolId
  const response = await fetch('/api/mcp/openai/widget-content/${toolId}');
  const html = await response.text();

  // Replace entire document with widget HTML
  document.open();
  document.write(html);
```
Escape toolId before embedding in container script
Embedding the raw toolId inside the inline <script> means a crafted identifier such as foo');alert(1);// will terminate the string literal and run arbitrary script as soon as the container page loads. Because /widget/store accepts arbitrary toolId, this becomes a stored XSS vector. Please encode the identifier before injecting it into the script.
```diff
- const response = await fetch('/api/mcp/openai/widget-content/${toolId}');
+ const storedToolId = ${JSON.stringify(toolId)};
+ const response = await fetch(`/api/mcp/openai/widget-content/${encodeURIComponent(storedToolId)}`);
```

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In server/routes/mcp/openai.ts around lines 104 to 114, the inline script embeds
the raw toolId into a JS string which allows crafted values (e.g.
foo');alert(1);//) to break out and execute script; fix by escaping/serializing
the identifier before injection — on the server side serialize the toolId with a
safe JS string encoder (e.g. JSON.stringify) or place the id into a
data-attribute or element textContent and read it from the DOM in the script,
then use that safe value when constructing the fetch URL; ensure you remove the
direct ${toolId} interpolation so untrusted input cannot terminate the script
string.
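The data-attribute alternative the prompt mentions can be sketched like this. The element id, helper names, and emitted markup are hypothetical:

```typescript
// Server side: emit the untrusted id as an escaped HTML attribute, not as a
// value spliced into JavaScript source.
function escapeHtmlAttr(value: string): string {
  return value
    .replace(/&/g, "&amp;") // escape & first to avoid double-encoding
    .replace(/"/g, "&quot;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

function renderContainer(toolId: string): string {
  return [
    `<div id="widget-root" data-tool-id="${escapeHtmlAttr(toolId)}"></div>`,
    "<script>",
    "  // Client side: read the id from the DOM instead of interpolating it.",
    "  const root = document.getElementById('widget-root');",
    "  const toolId = root.dataset.toolId;",
    "  fetch('/api/mcp/openai/widget-content/' + encodeURIComponent(toolId));",
    "</script>",
  ].join("\n");
}
```

Because the id only ever appears as attribute data, a payload cannot break out of the script's string context at all.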
```js
console.log('[OpenAI Widget] Initializing bridge for tool: ${toolName}');

const openaiAPI = {
```
Sanitize toolName when emitting the bridge script
The inline script logs toolName inside single quotes. A malicious name such as my-tool');alert(1);// will close the string and execute arbitrary JavaScript in every widget view (stored XSS). Ensure interpolated values are emitted via JSON.stringify (or similar) before embedding.
```diff
- console.log('[OpenAI Widget] Initializing bridge for tool: ${toolName}');
+ console.log('[OpenAI Widget] Initializing bridge for tool:', ${JSON.stringify(toolName)});
```

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In server/routes/mcp/openai.ts around lines 216 to 218, the inline bridge script
injects toolName directly into a single-quoted console.log which allows stored
XSS (e.g. toolName = "my-tool');alert(1);//"). Replace the direct interpolation
with a safe serialized value (use JSON.stringify(toolName) or an equivalent
escape function) when emitting the script so the value is emitted as a
JavaScript string literal; update the console.log and any other embedded
occurrences in this script to use the serialized value.
Note
Render OpenAI App widgets from server-hosted content using new MCP /openai endpoints with in-memory storage and an injected bridge, updating the client to iframe a URL and handle messaging/theme updates.
- Client: switch from srcDoc HTML embedding to an iframe src pointing to the server URL '/api/mcp/openai/widget/:toolId'.
- Store widget data via '/api/mcp/openai/widget/store'; handle loading/error states.
- Read outputTemplate from toolMetadata and support structuredContent from toolOutput.
- Generate toolId (toolName + timestamp) and manage display/height via postMessage events.
- Bridge API: resize, setWidgetState, callTool, sendFollowup, requestDisplayMode, openExternal.
- Propagate theme via openai:set_globals.
- Server: add openai router and mount at '/openai'.
- POST /openai/widget/store: save widget context in-memory (TTL ~1h, periodic cleanup).
- GET /openai/widget/:toolId: container page that rewrites URL then loads content.
- GET /openai/widget-content/:toolId: fetch widget HTML from MCP resource, inject Apps SDK bridge (window.openai/webplus), and serve with CSP/security headers.

Written by Cursor Bugbot for commit 286a4b2. This will update automatically on new commits.
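The widget-content step in that summary can be sketched as follows, under the assumption that the endpoint rewrites fetched widget HTML before serving it; `injectBridge` and the exact header values are illustrative:

```typescript
// Sketch: inject the Apps SDK bridge <script> into fetched widget HTML and
// pair the response with restrictive headers (example values only).
function injectBridge(widgetHtml: string, bridgeScript: string): string {
  const tag = `<script>${bridgeScript}</script>`;
  // Inject into <head> when present so the bridge global exists before any
  // widget code runs.
  if (widgetHtml.includes("<head>")) {
    return widgetHtml.replace("<head>", `<head>${tag}`);
  }
  return tag + widgetHtml;
}

// Example security headers in the spirit of the PR's frame-protection and
// no-cache directives.
const widgetSecurityHeaders: Record<string, string> = {
  "Content-Security-Policy": "frame-ancestors 'self'",
  "Cache-Control": "no-store",
  "X-Content-Type-Options": "nosniff",
};
```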