
Conversation

@JayaSurya-27
Collaborator

@JayaSurya-27 JayaSurya-27 commented Jun 5, 2025

Pull Request

Description

Describe the changes in this pull request:

  • What feature/bug does this PR address?
  • Provide any relevant links or screenshots.

Checklist

  • Code compiles correctly.
  • Created/updated tests.
  • Linting and formatting applied.
  • Documentation updated.

Related Issues

Closes #<issue_number>

Summary by CodeRabbit

  • New Features

    • Added a new example script demonstrating OpenTelemetry tracing integration with OpenAI in TypeScript.
    • Introduced support for the MCP framework with documentation and quickstart instructions.
  • Bug Fixes

    • Improved context and metadata handling for more accurate telemetry and trace attributes.
    • Enhanced shutdown handling and span tracking for more robust telemetry processing.
  • Refactor

    • Simplified TypeScript instrumentation class signatures for easier maintenance.
    • Reorganized imports and improved code formatting across multiple modules for clarity.
  • Chores

    • Updated and streamlined package dependencies for better compatibility and performance.
    • Changed project name identifier in tracing example to reflect a new version.
    • Updated framework dependencies to their latest versions.

@coderabbitai

coderabbitai bot commented Jun 5, 2025

Caution

Review failed

The pull request is closed.

Walkthrough

The updates enhance context propagation and metadata extraction in the Python LangChain tracer, refine TypeScript OpenTelemetry dependencies and instrumentation, and add a new OpenAI tracing example. TypeScript code also simplifies instrumentation type signatures. No major control flow changes are introduced, but context handling and dependency management are improved.

Changes

File(s) Change Summary
python/frameworks/langchain/traceai_langchain/_tracer.py Improved context attribute capturing/propagation, enhanced metadata extraction, updated locking dict, and method signatures.
typescript/packages/fi-core/package.json
typescript/packages/traceai_openai/package.json
Updated OpenTelemetry dependencies, added new OTEL packages, and adjusted version specifiers.
typescript/packages/traceai_openai/examples/basic-otel-test.ts Changed projectName in tracer registration from "v3" to "v4".
typescript/packages/traceai_openai/examples/chat.ts Added a new example demonstrating OpenTelemetry tracing with OpenAI and TraceAI instrumentation.
typescript/packages/traceai_openai/src/instrumentation.ts Removed explicit type parameters from the OpenAIInstrumentation class and related method signatures.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant OpenAIExample
    participant FITracerProvider
    participant OpenAIInstrumentation
    participant OpenAIClient

    User->>OpenAIExample: Run chat.ts
    OpenAIExample->>FITracerProvider: register({ projectName, ... })
    OpenAIExample->>OpenAIInstrumentation: register()
    OpenAIExample->>OpenAIClient: import and initialize
    OpenAIExample->>OpenAIClient: createChatCompletion()
    OpenAIClient-->>OpenAIExample: Return response
    OpenAIExample->>User: Log response

Suggested reviewers

  • NVJKKartik

Poem

In code and context, changes bloom,
Tracing rabbits hop from room to room.
Metadata mapped, dependencies neat,
Type hints trimmed for a cleaner seat.
With OpenAI chat, the journey’s begun—
Otel carrots for everyone! 🥕
Debugging in fields, our work is done.


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between baff05c and 3fd25fd.

📒 Files selected for processing (66)
  • python/fi_instrumentation/__init__.py (1 hunks)
  • python/fi_instrumentation/fi_types.py (13 hunks)
  • python/fi_instrumentation/instrumentation/_attributes.py (1 hunks)
  • python/fi_instrumentation/instrumentation/_protect_wrapper.py (6 hunks)
  • python/fi_instrumentation/instrumentation/_spans.py (2 hunks)
  • python/fi_instrumentation/instrumentation/_tracers.py (3 hunks)
  • python/fi_instrumentation/instrumentation/_types.py (1 hunks)
  • python/fi_instrumentation/instrumentation/config.py (2 hunks)
  • python/fi_instrumentation/otel.py (11 hunks)
  • python/fi_instrumentation/settings.py (3 hunks)
  • python/frameworks/anthropic/CHANGELOG.md (1 hunks)
  • python/frameworks/anthropic/examples/guardrail_example.py (4 hunks)
  • python/frameworks/anthropic/traceai_anthropic/__init__.py (4 hunks)
  • python/frameworks/anthropic/traceai_anthropic/_wrappers.py (7 hunks)
  • python/frameworks/autogen/CHANGELOG.md (1 hunks)
  • python/frameworks/autogen/traceai_autogen/__init__.py (10 hunks)
  • python/frameworks/autogen/traceai_autogen/utils.py (0 hunks)
  • python/frameworks/bedrock/CHANGELOG.md (1 hunks)
  • python/frameworks/bedrock/traceai_bedrock/__init__.py (2 hunks)
  • python/frameworks/crewai/CHANGELOG.md (1 hunks)
  • python/frameworks/dspy/CHANGELOG.md (1 hunks)
  • python/frameworks/dspy/traceai_dspy/__init__.py (1 hunks)
  • python/frameworks/groq/CHANGELOG.md (1 hunks)
  • python/frameworks/groq/traceai_groq/__init__.py (2 hunks)
  • python/frameworks/groq/traceai_groq/_request_attributes_extractor.py (2 hunks)
  • python/frameworks/guardrails/CHANGELOG.md (1 hunks)
  • python/frameworks/guardrails/traceai_guardrails/__init__.py (4 hunks)
  • python/frameworks/guardrails/traceai_guardrails/_wrap_guard_call.py (16 hunks)
  • python/frameworks/guardrails/traceai_guardrails/guardrails_example.py (3 hunks)
  • python/frameworks/haystack/CHANGELOG.md (1 hunks)
  • python/frameworks/haystack/traceai_haystack/__init__.py (2 hunks)
  • python/frameworks/instructor/CHANGELOG.md (1 hunks)
  • python/frameworks/instructor/traceai_instructor/__init__.py (2 hunks)
  • python/frameworks/langchain/CHANGELOG.md (1 hunks)
  • python/frameworks/langchain/examples/tool_calling_agent.py (3 hunks)
  • python/frameworks/langchain/traceai_langchain/__init__.py (3 hunks)
  • python/frameworks/langchain/traceai_langchain/_tracer.py (11 hunks)
  • python/frameworks/litellm/CHANGELOG.md (1 hunks)
  • python/frameworks/litellm/traceai_litellm/__init__.py (2 hunks)
  • python/frameworks/llama_index/CHANGELOG.md (1 hunks)
  • python/frameworks/llama_index/traceai_llamaindex/__init__.py (3 hunks)
  • python/frameworks/llama_index/traceai_llamaindex/_handler.py (1 hunks)
  • python/frameworks/mcp/CHANGELOG.md (1 hunks)
  • python/frameworks/mcp/README.md (1 hunks)
  • python/frameworks/mcp/traceai_mcp/__init__.py (7 hunks)
  • python/frameworks/mcp/traceai_mcp/package.py (1 hunks)
  • python/frameworks/mcp/traceai_mcp/version.py (1 hunks)
  • python/frameworks/mistralai/CHANGELOG.md (1 hunks)
  • python/frameworks/openai-agents/CHANGELOG.md (1 hunks)
  • python/frameworks/openai-agents/examples/handoffs/message_filter.py (2 hunks)
  • python/frameworks/openai-agents/examples/handoffs/message_filter_streaming.py (2 hunks)
  • python/frameworks/openai-agents/examples/hello_world.py (0 hunks)
  • python/frameworks/openai-agents/examples/tutor.py (0 hunks)
  • python/frameworks/openai-agents/tests/conftest.py (1 hunks)
  • python/frameworks/openai-agents/tests/test_span_attribute_helpers.py (14 hunks)
  • python/frameworks/openai-agents/traceai_openai_agents/__init__.py (2 hunks)
  • python/frameworks/openai-agents/traceai_openai_agents/_processor.py (9 hunks)
  • python/frameworks/openai/CHANGELOG.md (1 hunks)
  • python/frameworks/openai/traceai_openai/__init__.py (1 hunks)
  • python/frameworks/smolagents/CHANGELOG.md (1 hunks)
  • python/frameworks/smolagents/tests/openinference/instrumentation/smolagents/test_instrumentor.py (11 hunks)
  • python/frameworks/smolagents/traceai_smolagents/__init__.py (3 hunks)
  • python/frameworks/smolagents/traceai_smolagents/_wrappers.py (10 hunks)
  • python/frameworks/vertexai/CHANGELOG.md (1 hunks)
  • python/frameworks/vertexai/traceai_vertexai/__init__.py (1 hunks)
  • python/frameworks/vertexai/traceai_vertexai/_wrapper.py (2 hunks)



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

🧹 Nitpick comments (4)
typescript/packages/fi-core/package.json (1)

30-30: Consider the implications of exact version pinning.

The change from ^0.1.0 to 0.1.0 for @traceai/fi-semantic-conventions removes automatic patch updates. Ensure this exact pinning is intentional and consider whether patch updates should be allowed for internal packages.
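For context, npm's semver rules treat the two specifiers differently; an illustrative fragment (not the actual file):

```jsonc
{
  "dependencies": {
    // Exact pin (as in this PR): matches 0.1.0 only; every patch release
    // requires a manual version bump in this file.
    "@traceai/fi-semantic-conventions": "0.1.0"
    // Caret alternative: "^0.1.0" matches >=0.1.0 <0.2.0, so 0.1.x patches
    // are picked up automatically (for 0.x versions, caret pins the minor).
  }
}
```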

typescript/packages/traceai_openai/examples/chat.ts (1)

14-18: Consider moving setup inside the main function.

The current module-level setup could cause issues with initialization order. Consider moving the tracer registration and instrumentation setup inside the main() function for better control flow.

+async function main() {
+  // 1. Register FI Core TracerProvider (sets up exporter)
+  const tracerProvider = register({
+    projectName: "test4",
+    projectType: ProjectType.OBSERVE,
+    sessionName: "test-session-" + Date.now(),
+  });
+
+  // 2. Register OpenAI Instrumentation *BEFORE* importing/using OpenAI client
+  console.log("Registering OpenAI Instrumentation...");
+  registerInstrumentations({
+    tracerProvider: tracerProvider,
+    instrumentations: [new OpenAIInstrumentation()], 
+  });
+
+  const OpenAI = (await import('openai')).default;
python/frameworks/langchain/traceai_langchain/_tracer.py (2)

926-951: Comprehensive metadata handling with good backward compatibility.

The enhanced metadata implementation:

  • Maps standard fields to conventional attribute names (e.g., user_id → user.id)
  • Maintains backward compatibility with legacy session ID keys
  • Includes both individual attributes and full JSON

Consider prefixing custom metadata fields to avoid potential attribute name conflicts:

            else:
                # Add other metadata with metadata prefix to avoid conflicts
-               yield f"{key}", str_value
+               yield f"metadata.{key}", str_value

1043-1087: Function works correctly but has high cyclomatic complexity.

The _filter_images function now returns four values and handles various message formats. While functional, the static analysis correctly identifies high branch complexity (15 branches).

Consider extracting helper functions to reduce complexity:

def _extract_message_content(message: dict) -> Tuple[List[dict], List[str], List[str]]:
    """Extract content from a single message."""
    filtered_data = []
    images = []
    eval_input = []
    
    kwargs = message.get("kwargs", {})
    if not isinstance(kwargs, dict):
        return filtered_data, images, eval_input
        
    content = kwargs.get("content")
    if isinstance(content, str):
        filtered_data.append({"type": "text", "text": content})
        eval_input.append(content)
    elif isinstance(content, list):
        for obj in content:
            # Process content items...
            
    return filtered_data, images, eval_input

def _filter_images(messages: Any) -> Tuple[Any, List[str], str, str]:
    """Simplified main function using helper."""
    if not messages or not isinstance(messages, list):
        return messages, [], "", ""
        
    all_filtered_data = []
    all_images = []
    all_eval_input = []
    
    for message_group in messages:
        if isinstance(message_group, list):
            for message in message_group:
                if isinstance(message, dict):
                    data, imgs, ev = _extract_message_content(message)
                    all_filtered_data.extend(data)
                    all_images.extend(imgs)
                    all_eval_input.extend(ev)
                    
    # Rest of the logic...
🧰 Tools
🪛 Pylint (3.3.7)

[refactor] 1052-1082: Too many nested blocks (7/5)

(R1702)


[refactor] 1052-1082: Too many nested blocks (7/5)

(R1702)


[refactor] 1043-1043: Too many branches (15/12)

(R0912)

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3321e33 and baff05c.

📒 Files selected for processing (6)
  • python/frameworks/langchain/traceai_langchain/_tracer.py (10 hunks)
  • typescript/packages/fi-core/package.json (1 hunks)
  • typescript/packages/traceai_openai/examples/basic-otel-test.ts (1 hunks)
  • typescript/packages/traceai_openai/examples/chat.ts (1 hunks)
  • typescript/packages/traceai_openai/package.json (2 hunks)
  • typescript/packages/traceai_openai/src/instrumentation.ts (2 hunks)
🧰 Additional context used
🪛 Pylint (3.3.7)
python/frameworks/langchain/traceai_langchain/_tracer.py

[refactor] 1043-1043: Too many branches (15/12)

(R0912)

🔇 Additional comments (5)
typescript/packages/traceai_openai/examples/basic-otel-test.ts (1)

35-35: LGTM! Project name version update.

The version bump from v3 to v4 aligns with the broader updates in this PR.

typescript/packages/traceai_openai/src/instrumentation.ts (2)

97-97: LGTM! Type signature simplification.

Removing the explicit generic type parameter simplifies the class declaration while maintaining functionality. The TypeScript compiler can still infer the correct types.


129-137: LGTM! Simplified method signature.

The removal of explicit return type annotation and generic type parameter makes the code cleaner while preserving type safety through TypeScript's type inference.

python/frameworks/langchain/traceai_langchain/_tracer.py (2)

87-89: Good improvement to align with standard dictionary behavior.

The addition of the default parameter to the get method makes _DictWithLock more compatible with the standard dictionary interface.


195-208: Excellent implementation of context preservation.

The context capturing mechanism ensures that span attributes reflect the context at span start rather than at span end, which is crucial for accurate tracing in concurrent environments. The thread-safe implementation using _DictWithLock is appropriate.

Also applies to: 249-256

Comment on lines +28 to +30
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/core": "^1.25.1",
"@traceai/fi-semantic-conventions": "0.1.0"

💡 Verification agent

🧩 Analysis chain

Verify OpenTelemetry package version compatibility.

The OpenTelemetry packages have version mismatches that could cause compatibility issues:

  • @opentelemetry/api: ^1.9.0
  • @opentelemetry/core: ^1.25.1
  • Other OpenTelemetry packages in devDependencies: ^1.19.0 and ^1.25.1

Ensure these version differences are intentional and compatible.


🏁 Script executed:

#!/bin/bash
# Check OpenTelemetry package compatibility and latest versions
echo "Checking OpenTelemetry package versions..."

# Check latest versions of each package
packages=("@opentelemetry/api" "@opentelemetry/core" "@opentelemetry/context-async-hooks" "@opentelemetry/resources" "@opentelemetry/sdk-trace-base" "@opentelemetry/sdk-trace-node" "@opentelemetry/semantic-conventions")

for package in "${packages[@]}"; do
  echo "Checking $package..."
  curl -s "https://registry.npmjs.org/$package/latest" | jq -r '.version' | sed "s/^/$package: /"
done

Length of output: 2475


To get an accurate picture of every OpenTelemetry package version in your fi-core module, let’s extract both dependencies and devDependencies directly from package.json:


🏁 Script executed:

#!/bin/bash
set -e

FILE="typescript/packages/fi-core/package.json"
echo "OpenTelemetry versions in $FILE:"
jq -r '
  (.dependencies // {}) + (.devDependencies // {})
  | to_entries[]
  | select(.key | test("^@opentelemetry/"))
  | "\(.key): \(.value)"
' "$FILE"

Length of output: 648


Align OpenTelemetry dependencies to a single minor version

The current fi-core package.json mixes several OpenTelemetry minor versions, which can lead to internal API mismatches at runtime:

  • @opentelemetry/api: ^1.9.0
  • @opentelemetry/core & @opentelemetry/context-async-hooks: ^1.25.1
  • @opentelemetry/resources, @opentelemetry/sdk-trace-base, @opentelemetry/sdk-trace-node, @opentelemetry/semantic-conventions: ^1.19.0

Action items:

  • Consolidate all @opentelemetry/* packages to the same minor version (ideally the latest stable, e.g. ^1.25.x or newer).
  • Ensure your test/integration suite covers end-to-end tracing to catch any breaking changes.
  • Confirm removal of @opentelemetry/sdk-node is intentional—if you rely on automatic Node SDK initialization, add it back or initialize the tracer manually.
🤖 Prompt for AI Agents
In typescript/packages/fi-core/package.json around lines 28 to 30, the
OpenTelemetry package versions are inconsistent across dependencies and
devDependencies, which can cause runtime API mismatches. To fix this, update all
@opentelemetry/* package versions to the same minor version, preferably the
latest stable such as ^1.25.x, ensuring uniformity. Also verify if the removal
of @opentelemetry/sdk-node is intentional; if automatic Node SDK initialization
is needed, add it back or implement manual tracer initialization. Finally, run
your test and integration suites to confirm tracing functionality remains
intact.

// 2. Register OpenAI Instrumentation *BEFORE* importing/using OpenAI client
console.log("Registering OpenAI Instrumentation...");
registerInstrumentations({
  tracerProvider: tracerProvider as any,

🛠️ Refactor suggestion

Remove unnecessary type assertion.

The type casting tracerProvider as any suggests a type compatibility issue. Consider fixing the underlying type issue instead of using type assertion.

-  tracerProvider: tracerProvider as any,
+  tracerProvider: tracerProvider,
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
tracerProvider: tracerProvider as any,
tracerProvider: tracerProvider,
🤖 Prompt for AI Agents
In typescript/packages/traceai_openai/examples/chat.ts at line 23, remove the
unnecessary type assertion 'as any' on tracerProvider. Instead, investigate and
resolve the underlying type compatibility issue by ensuring that tracerProvider
matches the expected type without casting. Adjust the type definitions or the
tracerProvider assignment as needed to align types correctly.

    apiKey: process.env.OPENAI_API_KEY,
});
const response = await openai.chat.completions.create({
    model: "gpt-4.1-mini",

⚠️ Potential issue

Fix the invalid OpenAI model name.

The model "gpt-4.1-mini" doesn't exist. This should likely be "gpt-4o-mini" or another valid OpenAI model.

-        model: "gpt-4.1-mini",
+        model: "gpt-4o-mini",
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
model: "gpt-4.1-mini",
model: "gpt-4o-mini",
🤖 Prompt for AI Agents
In typescript/packages/traceai_openai/examples/chat.ts at line 34, the model
name "gpt-4.1-mini" is invalid. Replace it with a valid OpenAI model name such
as "gpt-4o-mini" or another correct model identifier to ensure the API call
works properly.

Comment on lines +29 to +32
async function main() {
    const openai = new OpenAI({
        apiKey: process.env.OPENAI_API_KEY,
    });

🛠️ Refactor suggestion

Add error handling for missing environment variables.

The code doesn't handle the case where OPENAI_API_KEY is missing, which would cause a runtime error.

 async function main() {
+    if (!process.env.OPENAI_API_KEY) {
+        console.error("OPENAI_API_KEY environment variable must be set.");
+        process.exit(1);
+    }
+    
     const openai = new OpenAI({
         apiKey: process.env.OPENAI_API_KEY,
     });
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
async function main() {
    const openai = new OpenAI({
        apiKey: process.env.OPENAI_API_KEY,
    });
async function main() {
    if (!process.env.OPENAI_API_KEY) {
        console.error("OPENAI_API_KEY environment variable must be set.");
        process.exit(1);
    }
    const openai = new OpenAI({
        apiKey: process.env.OPENAI_API_KEY,
    });
    // ...
}
🤖 Prompt for AI Agents
In typescript/packages/traceai_openai/examples/chat.ts around lines 29 to 32,
the code initializes the OpenAI client using the OPENAI_API_KEY environment
variable without checking if it is set. Add a check before creating the OpenAI
instance to verify that process.env.OPENAI_API_KEY is defined. If it is missing,
throw an informative error or log a clear message and exit the process to
prevent runtime errors caused by an undefined API key.

Comment on lines +39 to +43
"@opentelemetry/exporter-trace-otlp-proto": "^0.50.0",
"@opentelemetry/resources": "^1.25.1",
"@opentelemetry/sdk-trace-base": "^1.25.1",
"@opentelemetry/sdk-trace-node": "^1.25.1",
"@opentelemetry/semantic-conventions": "^1.25.1"

🛠️ Refactor suggestion

Ensure devDependencies match runtime dependency versions.

The devDependencies use different version patterns than the main dependencies:

  • Main dependencies use older versions (e.g., @opentelemetry/api@^1.9.0)
  • DevDependencies use newer versions (e.g., @opentelemetry/resources@^1.25.1)

This version mismatch between dependencies and devDependencies could cause issues during development vs. production.

Consider aligning the OpenTelemetry package versions for consistency:

  "dependencies": {
-   "@opentelemetry/api": "^1.9.0",
-   "@opentelemetry/instrumentation": "^0.46.0",
+   "@opentelemetry/api": "^1.9.0",
+   "@opentelemetry/instrumentation": "^0.52.0",
    "@opentelemetry/core": "^1.25.1",
    "@traceai/fi-core": "0.1.0",
    "@traceai/fi-semantic-conventions": "0.1.0"
  },

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In typescript/packages/traceai_openai/package.json around lines 39 to 43, the
versions of OpenTelemetry packages in devDependencies do not match those in the
main dependencies, causing potential inconsistencies. Update the devDependencies
to use the same version numbers as the main dependencies for all OpenTelemetry
packages to ensure consistency between development and production environments.

Comment on lines +27 to +31
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/instrumentation": "^0.46.0",
"@opentelemetry/core": "^1.25.1",
"@traceai/fi-core": "0.1.0",
"@traceai/fi-semantic-conventions": "0.1.0"

💡 Verification agent

🧩 Analysis chain

Verify OpenTelemetry version compatibility across packages.

The dependency updates show mixed version patterns:

  • @opentelemetry/api is at ^1.9.0
  • @opentelemetry/instrumentation is at ^0.46.0 (downgraded from ^0.48.0)
  • @opentelemetry/core is at ^1.25.1

This version mismatch could lead to compatibility issues. Additionally, using exact versions (0.1.0) for internal packages removes flexibility for patch updates.

Run the following script to check for OpenTelemetry version consistency recommendations:


🌐 Web query:

What are the recommended version combinations for @opentelemetry/api, @opentelemetry/instrumentation, and @opentelemetry/core packages to ensure compatibility?

💡 Result:

To ensure compatibility among the @opentelemetry/api, @opentelemetry/core, and @opentelemetry/instrumentation packages, it's important to align their versions appropriately. The OpenTelemetry JavaScript project maintains a compatibility matrix to guide users in selecting compatible versions.

As of June 2025, the latest stable versions are:

  • @opentelemetry/api: 1.9.0
  • @opentelemetry/core: 1.28.0
  • @opentelemetry/instrumentation: 0.55.0

These versions are compatible and should work seamlessly together. It's advisable to use matching versions across these packages to avoid potential issues.

For detailed information on version compatibility and to stay updated with the latest releases, refer to the OpenTelemetry JavaScript GitHub repository. (github.com)

Additionally, the OpenTelemetry JavaScript documentation provides guidance on supported versions and best practices. (opentelemetry.io)

By adhering to these recommendations, you can ensure a stable and compatible setup for your OpenTelemetry instrumentation in JavaScript.

Citations:


Align OpenTelemetry package versions and relax internal package pins

To ensure compatibility, update your OpenTelemetry dependencies to the latest stable, matching versions, and allow patch releases for internal packages:

• Upgrade OpenTelemetry packages:

  • Keep "@opentelemetry/api": "^1.9.0"
  • Change "@opentelemetry/instrumentation" to ^0.55.0
  • Change "@opentelemetry/core" to ^1.28.0

• Relax internal package pins to allow patch updates:

  • Change "@traceai/fi-core" to ^0.1.0
  • Change "@traceai/fi-semantic-conventions" to ^0.1.0

Suggested diff:

     "@opentelemetry/api": "^1.9.0",
-    "@opentelemetry/instrumentation": "^0.46.0",
-    "@opentelemetry/core": "^1.25.1",
-    "@traceai/fi-core": "0.1.0",
-    "@traceai/fi-semantic-conventions": "0.1.0"
+    "@opentelemetry/instrumentation": "^0.55.0",
+    "@opentelemetry/core": "^1.28.0",
+    "@traceai/fi-core": "^0.1.0",
+    "@traceai/fi-semantic-conventions": "^0.1.0"
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/instrumentation": "^0.46.0",
"@opentelemetry/core": "^1.25.1",
"@traceai/fi-core": "0.1.0",
"@traceai/fi-semantic-conventions": "0.1.0"
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/instrumentation": "^0.55.0",
"@opentelemetry/core": "^1.28.0",
"@traceai/fi-core": "^0.1.0",
"@traceai/fi-semantic-conventions": "^0.1.0"
🤖 Prompt for AI Agents
In typescript/packages/traceai_openai/package.json around lines 27 to 31, the
OpenTelemetry package versions are inconsistent and internal package versions
are pinned exactly, which can cause compatibility and update issues. Update
"@opentelemetry/instrumentation" to "^0.55.0" and "@opentelemetry/core" to
"^1.28.0" to align with "@opentelemetry/api" at "^1.9.0". Also, change
"@traceai/fi-core" and "@traceai/fi-semantic-conventions" versions from "0.1.0"
to "^0.1.0" to allow patch updates and improve flexibility.

@NVJKKartik NVJKKartik merged commit 15a7319 into main Jun 5, 2025
1 check was pending


5 participants