fix (telemetry): serialize UInt8Arrays as base64 for inner telemetry spans #6357

Merged

lgrammel merged 1 commit into vercel:main from lmnr-ai:fix/json-stringify-image on May 17, 2025
Conversation

dinmukhamedm commented on May 16, 2025
```javascript
...part,
image:
  part.image instanceof Uint8Array
    ? convertDataContentToBase64String(part.image)
```
Contributor
Author
This will keep the raw base64 data, e.g.

`IGcfqljA=`

and NOT `data:image/png;base64,IGcfqljA=`.

I am open to suggestions on this, but I decided not to add the data URL scheme (RFC 2397), because:

- `mimeType` is optional on the `ImagePart`, and not always known
- the current `Uint8Array` is raw data as well and is not aware of mime types
- telemetry backends can use the `mimeType` field, and there are plenty of other heuristics to infer that this is base64
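For illustration only: a backend that does know the mime type can rebuild the RFC 2397 data URL itself from the two fields. `toDataUrl` and the `part` shape below are assumptions for this sketch, not part of the SDK:

```javascript
// Sketch: reconstruct a data URL (RFC 2397) from a telemetry content part.
// mimeType is optional on ImagePart, so fall back to a generic media type.
function toDataUrl(part) {
  const mimeType = part.mimeType ?? 'application/octet-stream';
  return `data:${mimeType};base64,${part.image}`;
}

console.log(toDataUrl({ image: 'IGcfqljA=', mimeType: 'image/png' }));
// data:image/png;base64,IGcfqljA=
```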
lgrammel approved these changes on May 17, 2025
jacobkerber pushed a commit to jacobkerber/ai that referenced this pull request on Jul 15, 2025
fix (telemetry): serialize UInt8Arrays as base64 for inner telemetry spans (vercel#6357)

## Background

`generateObject`, `generateText`, `streamText`, and `streamObject` currently call `JSON.stringify` on the input messages. If the input messages contain an image, it is most likely normalized into a `Uint8Array`. `JSON.stringify` does not do the most obvious thing with TypedArrays, including `Uint8Array`:

```javascript
// this returns '{"0":1,"1":2,"2":3}', where I'd expect '[1,2,3]'
JSON.stringify(new Uint8Array([1, 2, 3]));
```

In practice, this bloats images by about 5-15x depending on the original image size. For Laminar, for example, a span with 3 average-sized images cannot be sent at all, because it exceeds the (reasonably high) gRPC payload size limit of our traces endpoint.

From the [MDN docs](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify#examples):

```javascript
// TypedArray
JSON.stringify([new Int8Array([1]), new Int16Array([1]), new Int32Array([1])]);
// '[{"0":1},{"0":1},{"0":1}]'
JSON.stringify([
  new Uint8Array([1]),
  new Uint8ClampedArray([1]),
  new Uint16Array([1]),
  new Uint32Array([1]),
]);
// '[{"0":1},{"0":1},{"0":1},{"0":1}]'
JSON.stringify([new Float32Array([1]), new Float64Array([1])]);
// '[{"0":1},{"0":1}]'
```

## Summary

Added a function that maps over the messages in a `LanguageModelV1Prompt` and over the content parts in each message, replacing `Uint8Array`s with raw base64 strings. This function is called when `recordSpan` is invoked for the inner (doStream/doGenerate) span in `generateObject`, `generateText`, `streamText`, and `streamObject`.

## Verification

Ran this small script against a local instance of Laminar and logged the telemetry payloads (span attributes) on the backend to verify that they are indeed base64.

```javascript
import { Laminar, getTracer } from '@lmnr-ai/lmnr';

Laminar.initialize();

import { openai } from '@ai-sdk/openai';
import { generateText, generateObject, streamText, streamObject, tool } from 'ai';
import { z } from 'zod';
import dotenv from 'dotenv';

dotenv.config();

const handle = async () => {
  const imageUrl =
    'https://upload.wikimedia.org/wikipedia/commons/b/bc/CoinEx.png';
  const imageData = await fetch(imageUrl)
    .then(response => response.arrayBuffer())
    .then(buffer => Buffer.from(buffer).toString('base64'));

  const o = streamObject({
    schema: z.object({
      text: z.string(),
      companyName: z.string().optional().nullable(),
    }),
    messages: [
      {
        role: 'user',
        content: [
          { type: 'text', text: 'Describe this image briefly' },
          { type: 'image', image: imageData, mimeType: 'image/png' },
        ],
      },
    ],
    model: openai('gpt-4.1-nano'),
    experimental_telemetry: { isEnabled: true, tracer: getTracer() },
  });

  for await (const chunk of o.fullStream) {
    console.log(chunk);
  }

  await Laminar.shutdown();
};

handle().then(r => {
  console.log(r);
});
```

## Related Issues

Fixes vercel#6210
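A minimal sketch of the mapping the Summary describes. The names here are illustrative assumptions, not the SDK's actual helper (which, per the review diff above, uses its own `convertDataContentToBase64String`; this sketch substitutes Node's `Buffer`):

```javascript
// Sketch: walk a prompt and replace Uint8Array image parts with raw
// base64 strings, so JSON.stringify no longer expands them into
// {"0":...,"1":...} objects. Node's Buffer stands in for the SDK's
// own base64 conversion helper.
function mapImagesToBase64(prompt) {
  return prompt.map(message => ({
    ...message,
    content: Array.isArray(message.content)
      ? message.content.map(part =>
          part.type === 'image' && part.image instanceof Uint8Array
            ? { ...part, image: Buffer.from(part.image).toString('base64') }
            : part,
        )
      : message.content,
  }));
}

const prompt = [
  {
    role: 'user',
    content: [
      { type: 'text', text: 'Describe this image briefly' },
      { type: 'image', image: new Uint8Array([1, 2, 3]), mimeType: 'image/png' },
    ],
  },
];

console.log(JSON.stringify(mapImagesToBase64(prompt)[0].content[1].image));
// "AQID"
```

Serializing the mapped prompt rather than the raw one keeps the span attribute close to the original image size: base64 adds roughly 33%, versus the 5-15x expansion of the numeric-object form.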
## Tasks

- … (run `pnpm changeset` in the project root)
- … (run `pnpm prettier-fix` in the project root)

## Future Work
## Related Issues

Fixes #6210