
feat: integrate OpenTelemetry #5245

Open
tianhuil wants to merge 3 commits into anomalyco:dev from tianhuil:dev

Conversation


tianhuil commented Dec 8, 2025

Hi @thdxr, @adamdotdevin, and @rekram1-node

I love opencode. I've been trying to run it with OpenTelemetry and wanted to contribute some code back to the project. If you generally like this commit, I can work on resolving the merge conflict (just imports, nothing big).

Summary

  • Integrate OpenTelemetry for enhanced tracing and monitoring capabilities
  • Add type annotations to execute and toModelOutput methods in SessionPrompt

Changes

  • Added OpenTelemetry dependencies for comprehensive observability
  • Fixed type annotations in SessionPrompt methods for better type safety
  • Updated package lock file with new dependencies

- Updated bun.lock and package.json to include OpenTelemetry dependencies.
- Implemented tracing in various components including session management and tool execution.
- Enhanced error handling and event logging for better observability.
tianhuil changed the title from "feat: integrate OpenTelemetry and fix type annotations" to "feat: integrate OpenTelemetry" on Dec 8, 2025
@rekram1-node
Collaborator

I think opencode already emits telemetry spans if you set experimental_opentelemetry to true in your opencode.json.

…lemetry

- Moved Tracing initialization to occur before the root CLI span is created.
- Wrapped CLI parsing in a span to include command-line arguments in telemetry.
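
(For illustration only, not this PR's actual code: a minimal sketch of what wrapping CLI parsing in a span could look like with @opentelemetry/api. The function names, span name, attribute keys, and the parseArgs stub are assumptions added for this sketch.)

// Illustrative sketch: wrap CLI parsing in a span so the command line
// is visible as attributes on the exported trace.
import { trace, SpanStatusCode } from "@opentelemetry/api"

const tracer = trace.getTracer("opencode-cli")

// Stub parser, only here to keep the sketch self-contained.
function parseArgs(argv: string[]): { command: string } {
  return { command: argv[0] ?? "help" }
}

export async function runCli(argv: string[]): Promise<void> {
  await tracer.startActiveSpan("cli.parse", async (span) => {
    try {
      span.setAttribute("cli.args", argv.join(" "))
      span.setAttribute("cli.command", parseArgs(argv).command)
    } catch (err) {
      span.recordException(err as Error)
      span.setStatus({ code: SpanStatusCode.ERROR })
      throw err
    } finally {
      span.end()
    }
  })
}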

tianhuil commented Dec 8, 2025

Thanks for the quick response @rekram1-node.

I set the following and it didn't work:

//.opencode/opencode.jsonc
{
  "$schema": "https://opencode.ai/config.json",
  "experimental": {
    "openTelemetry": true
  },
}

I had to modify the repo to get it to emit OpenTelemetry spans.


@didier-durand
Contributor

Hi,
Just setting:

  "$schema": "https://opencode.ai/config.json",
  "experimental": {
    "openTelemetry": true
  },
}

doesn't work for me either.

At a minimum, additional packages from the OpenTelemetry JavaScript SDK need to be installed, for example as sketched below.
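
(For illustration, a sketch of what that could look like, assuming the packages are added to the package.json in opencode's global config directory (~/.config/opencode, the path mentioned later in this thread) and installed with bun install there. The version numbers are examples, not recommendations.)

{
  "dependencies": {
    "@opentelemetry/sdk-node": "^0.200.0",
    "@opentelemetry/exporter-trace-otlp-http": "^0.200.0"
  }
}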


pai4451 commented Jan 13, 2026

@rekram1-node @tianhuil

I love this feature. Would it be possible for the OpenTelemetry support to also include productivity metrics (such as acceptance rate or lines of code), like Claude Code does?

https://code.claude.com/docs/en/monitoring-usage#lines-of-code-counter

@didier-durand
Contributor

Working on a PR to fully enable OTel for opencode in a container image. I hope to be able to share it soon.


Phantal commented Jan 16, 2026

If it helps: I submitted a pull request to add OTel support a while back, but it got very little traction:

#2735

@didier-durand
Contributor

@Phantal: Yes, I saw your PR. On my side, I am trying to take full advantage of Vercel's AI SDK OTel implementation. I want to add much less code than your PR and only minimally change the current OTel setup.


fiktor commented Feb 10, 2026

Let me document how OTel in OpenCode works without this PR.

I tested the current OpenCode (opencode-ai@1.1.51 and opencode-ai@1.1.53, without this PR), and spans are emitted/exported subject to the following prerequisites:

  1. a modern version of @opentelemetry is installed in the opencode bun environment. I installed "@opentelemetry/sdk-node" version "0.200". There are two ways to accomplish this:
    • global: add the package name to ~/.config/opencode/package.json and run bun install inside ~/.config/opencode.
    • It should also be possible to do this with a per-project config / plugins, but I didn't test it.
    • This step can be omitted if one uses an npm-installed plugin package (by specifying it in opencode.json) as opposed to a local module (by putting it in the plugins/ directory), provided @opentelemetry/... (e.g. @opentelemetry/sdk-node) is a dependency of that plugin's npm package: in that case opencode will install the dependency automatically before loading the plugin.
  2. experimental.openTelemetry = true is set in opencode.json.
  3. something (e.g. a plugin) must initialize and start the NodeSDK. For example, I implemented a plugin with the following key code in the main plugin function, and it worked:
  try {
    // JsonlSpanProcessor and stream are defined elsewhere in the plugin (see below).
    const processor = new JsonlSpanProcessor(stream);
    // Dynamic import so a missing package can be caught and logged here.
    const sdk_node_module = await import("@opentelemetry/sdk-node");
    sdk = new sdk_node_module.NodeSDK({ spanProcessors: [processor] });
    sdk.start();
  } catch (e) {
    ...
  }

Here:

  • the JsonlSpanProcessor I implemented just dumps spans to a file,
  • the import is intentionally inside the plugin function:
    • With a standard import statement at the top of the file, if opentelemetry is missing, OpenCode freezes due to the failure to load the plugin, which is hard to debug.
    • Inside the function, the log is already open and I can log the failure to import.
  • the next two lines (sdk = ... and sdk.start()) are essentially from the example in "add experimental.open_telemetry config option to enable OTEL spans" #4978 (comment)
  4. Finally, the OTel span processor (like the JsonlSpanProcessor in my example above) should do something to make the spans exported and visible to external observers (e.g. write them to a file, as in my example, or export them via HTTP).

There are 3 span types (i.e. 3 values of span.name) logged: ai.streamText, ai.toolCall, and ai.streamText.doStream. The spans do not come directly from OpenCode but from the Vercel AI SDK that opencode uses, and they contain the full text of the prompts. Here is one of the 3 lines where a span is created: https://github.com/vercel/ai/blob/c36a873ce00892a4c587c2e9492220b392aefd09/packages/ai/src/generate-text/stream-text.ts#L1142
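
(To make the above concrete, here is a minimal, self-contained sketch of such a plugin, under the assumptions described in this comment: @opentelemetry/sdk-node ~0.200 is resolvable in the opencode environment, and opencode calls the exported plugin function on startup. The JsonlSpanProcessor, the output path, and the plugin function's name and signature are illustrative, not opencode's or the commenter's exact code.)

// Illustrative plugin sketch: start a NodeSDK whose processor appends
// each finished span as one JSON line to a file.
import { createWriteStream, type WriteStream } from "node:fs"
import type { ReadableSpan, SpanProcessor } from "@opentelemetry/sdk-trace-base"

class JsonlSpanProcessor implements SpanProcessor {
  constructor(private stream: WriteStream) {}
  onStart(): void {}
  onEnd(span: ReadableSpan): void {
    this.stream.write(
      JSON.stringify({
        name: span.name,
        traceId: span.spanContext().traceId,
        spanId: span.spanContext().spanId,
        attributes: span.attributes,
        startTime: span.startTime,
        endTime: span.endTime,
      }) + "\n",
    )
  }
  async forceFlush(): Promise<void> {}
  async shutdown(): Promise<void> {
    this.stream.end()
  }
}

// Plugin entry point; the exact signature opencode expects is not shown here.
export async function OtelJsonlPlugin() {
  const stream = createWriteStream("/tmp/opencode-spans.jsonl", { flags: "a" })
  try {
    // Dynamic import so a missing @opentelemetry/sdk-node is logged here
    // instead of silently breaking plugin loading.
    const { NodeSDK } = await import("@opentelemetry/sdk-node")
    const sdk = new NodeSDK({ spanProcessors: [new JsonlSpanProcessor(stream)] })
    sdk.start()
  } catch (e) {
    console.error("Failed to start OpenTelemetry NodeSDK:", e)
  }
  return {}
}

With something like this in place, spans such as ai.streamText end up as JSON lines in the output file.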

