feat: enable codex models #1666
Changes from all commits
New changeset file:

```diff
@@ -0,0 +1,5 @@
+---
+"@browserbasehq/stagehand": patch
+---
+
+add support for codex models
```
`packages/core/lib/v3/llm/aisdk.ts`:

```diff
@@ -130,9 +130,11 @@ export class AISdkClient extends LLMClient {
     let objectResponse: Awaited<ReturnType<typeof generateObject>>;
     const isGPT5 = this.model.modelId.includes("gpt-5");
+    const isCodex = this.model.modelId.includes("codex");
     const usesLowReasoningEffort =
-      this.model.modelId.includes("gpt-5.1") ||
-      this.model.modelId.includes("gpt-5.2");
+      (this.model.modelId.includes("gpt-5.1") ||
+        this.model.modelId.includes("gpt-5.2")) &&
```
Comment on lines +135 to +136:

**Collaborator:** can these be consolidated into `includes("gpt-5.")`?

**Member (Author):** i think so, will test if `gpt-5-2025-08-07` also has the same reasoning requirements

**Member (Author):** gpt-5 reasoning requirement is different
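The consolidation the reviewer asks about could be sketched like this (a hypothetical standalone helper, not code from the PR): a single `"gpt-5."` substring check matches the dot-versioned ids, while plain `gpt-5` and dated ids such as `gpt-5-2025-08-07` put a hyphen after `gpt-5` and are excluded, consistent with the author's note that gpt-5's reasoning requirement is different.

```typescript
// Hypothetical consolidated check (a sketch, not the PR's code):
// "gpt-5." matches dot-versioned models (gpt-5.1, gpt-5.2, future gpt-5.x)
// but not plain "gpt-5" or dated ids like "gpt-5-2025-08-07",
// which use a hyphen rather than a dot after "gpt-5".
function usesLowReasoningEffort(modelId: string): boolean {
  return modelId.includes("gpt-5.") && !modelId.includes("codex");
}

console.log(usesLowReasoningEffort("gpt-5.1"));          // true
console.log(usesLowReasoningEffort("gpt-5-2025-08-07")); // false
console.log(usesLowReasoningEffort("gpt-5.1-codex"));    // false
```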
```diff
+      !isCodex;
     const isDeepSeek = this.model.modelId.includes("deepseek");
     // Kimi models only support temperature=1
     const isKimi = this.model.modelId.includes("kimi");
```
```diff
@@ -173,8 +175,12 @@ You must respond in JSON format. respond WITH JSON. Do not include any other tex
         providerOptions: isGPT5
           ? {
               openai: {
-                textVerbosity: "low", // Making these the default for gpt-5 for now
-                reasoningEffort: usesLowReasoningEffort ? "low" : "minimal",
+                textVerbosity: isCodex ? "medium" : "low", // codex models only support 'medium'
+                reasoningEffort: isCodex
+                  ? "medium"
+                  : usesLowReasoningEffort
+                    ? "low"
+                    : "minimal",
```
Comment on lines +179 to +183 (`packages/core/lib/v3/llm/aisdk.ts`):

**Contributor:** Logic bug: codex models that are `gpt-5` (not `gpt-5.1` or `gpt-5.2`) will get `reasoningEffort: "minimal"` instead of `"medium"`. The ternary checks `isCodex` first, but if the model is `gpt-5-codex`, `usesLowReasoningEffort` is false (doesn't match `gpt-5.1` or `gpt-5.2`), so it falls through to `"minimal"`.

Suggested change:

```suggestion
reasoningEffort: isCodex
  ? "medium"
  : usesLowReasoningEffort
    ? "low"
    : "minimal",
```
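The resolved selection logic is easiest to check in isolation as a small sketch (a hypothetical extracted helper; the PR keeps this inline in the `generateObject` options rather than as a separate function):

```typescript
// Sketch of the provider-option selection after the suggested fix
// (assumed extraction; not the PR's actual code structure).
type ReasoningEffort = "minimal" | "low" | "medium";

interface OpenAIOptions {
  textVerbosity: "low" | "medium";
  reasoningEffort: ReasoningEffort;
}

function selectOpenAIOptions(modelId: string): OpenAIOptions | undefined {
  const isGPT5 = modelId.includes("gpt-5");
  if (!isGPT5) return undefined; // providerOptions stays undefined for non-gpt-5 models

  const isCodex = modelId.includes("codex");
  const usesLowReasoningEffort =
    (modelId.includes("gpt-5.1") || modelId.includes("gpt-5.2")) && !isCodex;

  return {
    textVerbosity: isCodex ? "medium" : "low", // codex models only support 'medium'
    reasoningEffort: isCodex
      ? "medium" // checked first, so gpt-5-codex cannot fall through to "minimal"
      : usesLowReasoningEffort
        ? "low"
        : "minimal",
  };
}
```

With `isCodex` checked first, `selectOpenAIOptions("gpt-5-codex")` resolves to `medium`/`medium` rather than falling through to `"minimal"`, which is exactly the case this review flags.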
```diff
              },
            }
          : undefined,
```
`packages/evals/lib/AISdkClientWrapped.ts`:

```diff
@@ -134,14 +134,15 @@ export class AISdkClientWrapped extends LLMClient {
     let objectResponse: Awaited<ReturnType<typeof generateObject>>;
     const isGPT5 = this.model.modelId.includes("gpt-5");
+    const isCodex = this.model.modelId.includes("codex");
     const usesLowReasoningEffort =
-      this.model.modelId.includes("gpt-5.1") ||
-      this.model.modelId.includes("gpt-5.2");
+      (this.model.modelId.includes("gpt-5.1") ||
+        this.model.modelId.includes("gpt-5.2")) &&
+      !isCodex;
     const isDeepSeek = this.model.modelId.includes("deepseek");
     // Kimi models only support temperature=1
     const isKimi = this.model.modelId.includes("kimi");
     const temperature = isKimi ? 1 : options.temperature;

     if (options.response_model) {
       if (isDeepSeek || isKimi) {
         const parsedSchema = JSON.stringify(
```
```diff
@@ -164,8 +165,12 @@ You must respond in JSON format. respond WITH JSON. Do not include any other tex
         providerOptions: isGPT5
           ? {
               openai: {
-                textVerbosity: "low", // Making these the default for gpt-5 for now
-                reasoningEffort: usesLowReasoningEffort ? "low" : "minimal",
+                textVerbosity: isCodex ? "medium" : "low", // codex models only support 'medium'
+                reasoningEffort: isCodex
+                  ? "medium"
+                  : usesLowReasoningEffort
+                    ? "low"
+                    : "minimal",
```
Comment on lines +169 to +173 (`packages/evals/lib/AISdkClientWrapped.ts`):

**Contributor:** Same logic bug as in `aisdk.ts`: codex models that are `gpt-5` (not `gpt-5.1` or `gpt-5.2`) will get `reasoningEffort: "minimal"` instead of `"medium"`.

Suggested change:

```suggestion
reasoningEffort: isCodex
  ? "medium"
  : usesLowReasoningEffort
    ? "low"
    : "minimal",
```
```diff
              },
            }
          : undefined,
```