21 commits
daf60db
WIP
tconley1428 Jan 15, 2026
2785bd3
Add OpenTelemetry integration for OpenAI Agents
tconley1428 Jan 20, 2026
9a767c4
Remove opentelemetryv2 additions - moved to separate branch
tconley1428 Jan 20, 2026
f79e320
Revert opentelemetry.py changes - no longer needed
tconley1428 Jan 20, 2026
a455df8
Remove debug prints from OpenAI agents OTEL integration
tconley1428 Jan 20, 2026
dde9237
Linting fixes
tconley1428 Jan 20, 2026
b247152
Fix linting
tconley1428 Jan 20, 2026
242ddce
Fix OpenAI Agents tracing to require explicit trace context for custo…
tconley1428 Jan 21, 2026
3e2a94f
Merge branch 'main' into openai/otel
tconley1428 Jan 21, 2026
9a91ddd
Cleanup
tconley1428 Jan 21, 2026
a087b2d
Ensure telemetry interceptor is added to replayer
tconley1428 Jan 21, 2026
54ead57
Merge branch 'main' into openai/otel
tconley1428 Jan 21, 2026
63911f9
Update character to work on windows
tconley1428 Jan 22, 2026
9d8f9c3
Merge branch 'main' into openai/otel
tconley1428 Jan 22, 2026
a7238dc
Add support for direct OpenTelemetry API calls in workflows
tconley1428 Jan 23, 2026
a9ac2ab
Merge branch 'main' into openai/otel
tconley1428 Jan 23, 2026
c5dfee1
Fix linting errors in OpenAI agents OpenTelemetry integration
tconley1428 Jan 23, 2026
251a0e4
Fix test_sdk_trace_to_otel_span_parenting flaky test failure
tconley1428 Jan 23, 2026
0fdb86f
Fix issues when the client trace is started outside of worker context.
tconley1428 Feb 4, 2026
789ae1f
Merge remote-tracking branch 'origin/main' into openai/otel
tconley1428 Feb 13, 2026
a0fe50a
:boom: Update to be in line with new changes. Breaking change because…
tconley1428 Feb 13, 2026
1 change: 1 addition & 0 deletions pyproject.toml
@@ -54,6 +54,7 @@ dev = [
"toml>=0.10.2,<0.11",
"twine>=4.0.1,<5",
"maturin>=1.8.2",
"openinference-instrumentation-openai-agents>=0.1.0",
"pytest-cov>=6.1.1",
"httpx>=0.28.1",
"pytest-pretty>=1.3.0",
194 changes: 194 additions & 0 deletions temporalio/contrib/openai_agents/README.md
@@ -536,6 +536,200 @@ SQLite storage is not suited to a distributed environment.
| :--------------- | :-------: |
| OpenAI platform | Yes |

## OpenTelemetry Integration

This integration exports OpenAI agent telemetry to OpenTelemetry (OTEL) endpoints for observability and monitoring. It handles workflow replay semantics automatically, ensuring spans are exported only when workflows actually complete.

### Quick Start

To enable OTEL telemetry export, provide exporters to the `OpenAIAgentsPlugin` or `AgentEnvironment`:

```python
from datetime import timedelta

from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

from temporalio.client import Client
from temporalio.contrib.openai_agents import (
    ModelActivityParameters,
    OpenAIAgentsPlugin,
)

# Your OTEL endpoint configuration
exporters = [
    OTLPSpanExporter(endpoint="http://localhost:4317"),
    # Add multiple exporters for different endpoints as needed
]

# For production applications
client = await Client.connect(
    "localhost:7233",
    plugins=[
        OpenAIAgentsPlugin(
            otel_exporters=exporters,  # Enable OTEL integration
            model_params=ModelActivityParameters(
                start_to_close_timeout=timedelta(seconds=30)
            ),
        ),
    ],
)

# For testing
from temporalio.contrib.openai_agents.testing import AgentEnvironment

async with AgentEnvironment(
    model=my_test_model,
    otel_exporters=exporters,  # Enable OTEL integration for tests
) as env:
    client = env.applied_on_client(base_client)
```

### Features

- **Multiple Exporters**: Send telemetry to multiple OTEL endpoints simultaneously
- **Replay-Safe**: Spans are only exported when workflows actually complete, not during replays
- **Deterministic IDs**: Consistent span IDs across workflow replays for reliable correlation (see the sketch after this list)
- **Automatic Setup**: No manual instrumentation required - just provide exporters
- **Graceful Degradation**: Works seamlessly whether OTEL dependencies are installed or not
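
In tests, one way to observe what the integration exports (and to check that span IDs stay stable across replays) is to plug in the OpenTelemetry SDK's in-memory exporter. The following is a minimal sketch; the worker and workflow wiring is elided and the variable names are placeholders:

```python
from opentelemetry.sdk.trace.export.in_memory_span_exporter import InMemorySpanExporter

from temporalio.contrib.openai_agents import OpenAIAgentsPlugin

# Collect spans in memory instead of sending them to a real endpoint
memory_exporter = InMemorySpanExporter()
plugin = OpenAIAgentsPlugin(otel_exporters=[memory_exporter])

# ... run a workflow through a worker configured with this plugin ...

# Spans only show up after the workflow completes, not during replay
for span in memory_exporter.get_finished_spans():
    print(span.name, format(span.context.span_id, "016x"))
```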

### Dependencies

OTEL integration requires additional dependencies:

```bash
pip install openinference-instrumentation-openai-agents opentelemetry-sdk
```

Choose the appropriate OTEL exporter for your monitoring system:

```bash
# For OTLP (works with most OTEL collectors and monitoring systems)
pip install opentelemetry-exporter-otlp

# For console output (development/debugging), ConsoleSpanExporter is
# included in opentelemetry-sdk, so no extra package is needed

# Other exporters available for specific systems
pip install opentelemetry-exporter-<your-system>
```

### Example: Multiple Exporters

```python
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import ConsoleSpanExporter

exporters = [
    # Production monitoring system
    OTLPSpanExporter(
        endpoint="https://your-monitoring-system:4317",
        headers={"api-key": "your-api-key"},
    ),

    # Secondary monitoring endpoint
    OTLPSpanExporter(endpoint="https://backup-collector:4317"),

    # Development debugging
    ConsoleSpanExporter(),
]

plugin = OpenAIAgentsPlugin(otel_exporters=exporters)
```

### Error Handling

If you provide OTEL exporters but the required dependencies are not installed, you'll receive a clear error message:

```
ImportError: OTEL dependencies not available. Install with: pip install openinference-instrumentation-openai-agents opentelemetry-sdk
```
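
If you would rather fall back to running without OTEL export than fail, you can gate the exporters on the optional dependencies yourself. The sketch below assumes the module paths implied by the package names above and that omitting `otel_exporters` leaves the integration OTEL-free; `exporters` is the list from the earlier examples:

```python
# Enable OTEL export only when the optional dependencies import cleanly
try:
    import openinference.instrumentation.openai_agents  # noqa: F401
    import opentelemetry.sdk  # noqa: F401

    plugin = OpenAIAgentsPlugin(otel_exporters=exporters)
except ImportError:
    # Assumption: omitting otel_exporters disables the OTEL setup entirely
    plugin = OpenAIAgentsPlugin()
```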

### Direct OpenTelemetry API Calls in Workflows

When using direct OpenTelemetry API calls within workflows (e.g., `opentelemetry.trace.get_tracer(__name__).start_as_current_span()`), you need to ensure proper context bridging and sandbox configuration.

#### Sandbox Configuration

Workflows run in a sandbox that restricts module access. To use direct OTEL API calls, you must explicitly allow OpenTelemetry passthrough:

```python
from temporalio.worker import Worker
from temporalio.worker.workflow_sandbox import SandboxedWorkflowRunner, SandboxRestrictions

# Configure worker with OpenTelemetry passthrough
worker = Worker(
    client,
    task_queue="my-task-queue",
    workflows=[MyWorkflow],
    workflow_runner=SandboxedWorkflowRunner(
        SandboxRestrictions.default.with_passthrough_modules("opentelemetry")
    ),
)
```

#### Context Bridging Pattern

Direct OTEL spans must be created within an active OpenAI Agents SDK span to ensure proper parenting:

```python
import opentelemetry.trace
from agents import custom_span
from temporalio import workflow

@workflow.defn
class MyWorkflow:
    @workflow.run
    async def run(self) -> str:
        # Start an SDK span first to establish OTEL context bridge
        with custom_span("Workflow coordination"):
            # Now direct OTEL spans will be properly parented
            tracer = opentelemetry.trace.get_tracer(__name__)
            with tracer.start_as_current_span("Custom workflow span"):
                # Your workflow logic here
                result = await self.do_work()
                return result
```

#### Why This Pattern is Required

- **OpenInference instrumentation** bridges OpenAI Agents SDK spans to OpenTelemetry context
- **Direct OTEL API calls** without an active SDK span become root spans with no parent
- **SDK spans** (`custom_span()`) establish the context bridge that allows subsequent direct OTEL spans to inherit proper trace parenting

#### Complete Example

```python
import opentelemetry.trace
from agents import custom_span
from temporalio import workflow
from temporalio.worker import Worker
from temporalio.worker.workflow_sandbox import SandboxedWorkflowRunner, SandboxRestrictions

@workflow.defn
class TracedWorkflow:
    @workflow.run
    async def run(self) -> str:
        # Establish OTEL context with SDK span
        with custom_span("Main workflow"):
            # Create direct OTEL spans for fine-grained tracing
            tracer = opentelemetry.trace.get_tracer(__name__)

            with tracer.start_as_current_span("Data processing"):
                data = await self.process_data()

            with tracer.start_as_current_span("Business logic"):
                result = await self.execute_business_logic(data)

            return result

# Worker configuration
worker = Worker(
    client,
    task_queue="traced-workflows",
    workflows=[TracedWorkflow],
    workflow_runner=SandboxedWorkflowRunner(
        SandboxRestrictions.default.with_passthrough_modules("opentelemetry")
    ),
)
```

This ensures your direct OTEL spans are properly parented within the trace hierarchy initiated by your client SDK traces.
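
For example, a caller can open an OpenAI Agents SDK trace around the workflow invocation so that the workflow's spans attach to it. This is a sketch, not part of the integration's API surface; the workflow id and task queue names are placeholders:

```python
from agents import trace

from temporalio.client import Client


async def run_traced_workflow(client: Client) -> str:
    # Open a client-side SDK trace; the workflow's spans attach to it
    # through the integration's context propagation
    with trace("Traced workflow run"):
        return await client.execute_workflow(
            TracedWorkflow.run,
            id="traced-workflow-id",  # placeholder workflow id
            task_queue="traced-workflows",
        )
```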

If no OTEL exporters are provided, the integration works normally without any OTEL setup.
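
For example, constructing the plugin without `otel_exporters` (mirroring the Quick Start above) changes nothing else about its behavior:

```python
from datetime import timedelta

from temporalio.contrib.openai_agents import ModelActivityParameters, OpenAIAgentsPlugin

# No otel_exporters argument: no OTEL pipeline is created
plugin = OpenAIAgentsPlugin(
    model_params=ModelActivityParameters(
        start_to_close_timeout=timedelta(seconds=30)
    )
)
```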

### Voice

| Mode | Supported |
86 changes: 86 additions & 0 deletions temporalio/contrib/openai_agents/_otel_trace_interceptor.py
@@ -0,0 +1,86 @@
"""OTEL-aware variant of OpenAI Agents trace interceptor."""

from __future__ import annotations

from collections.abc import Mapping
from typing import Any, Protocol

import opentelemetry.trace

import temporalio.api.common.v1
import temporalio.converter

from ..opentelemetry._id_generator import TemporalIdGenerator
from ._trace_interceptor import (
OpenAIAgentsContextPropagationInterceptor,
)


class _InputWithHeaders(Protocol):
headers: Mapping[str, temporalio.api.common.v1.Payload]


class OTelOpenAIAgentsContextPropagationInterceptor(
OpenAIAgentsContextPropagationInterceptor
):
"""OTEL-aware variant that enhances headers with OpenTelemetry span context."""

def __init__(
self,
otel_id_generator: TemporalIdGenerator,
payload_converter: temporalio.converter.PayloadConverter = temporalio.converter.default().payload_converter,
add_temporal_spans: bool = True,
) -> None:
"""Initialize OTEL-aware context propagation interceptor.

Args:
otel_id_generator: Generator for OTEL-compatible IDs.
payload_converter: Converter for serializing trace context.
add_temporal_spans: Whether to add Temporal-specific spans.
"""
super().__init__(payload_converter, add_temporal_spans, start_traces=True)
self._otel_id_generator = otel_id_generator

def header_contents(self) -> dict[str, Any]:
"""Get header contents enhanced with OpenTelemetry span context.

Returns:
Dictionary containing trace context with OTEL span information.
"""
otel_span = opentelemetry.trace.get_current_span()

if otel_span and otel_span.get_span_context().is_valid:
otel_span_id = otel_span.get_span_context().span_id
return {
**super().header_contents(),
"otelSpanId": otel_span_id,
}
else:
return super().header_contents()

def context_from_header(
self,
input: _InputWithHeaders,
):
"""Extracts and initializes trace information the input header."""
span_info = self.get_header_contents(input)

if span_info is None:
return
otel_span_id = span_info.get("otelSpanId")

# If only a trace was propagated from the caller, we need to seed for trace context
if otel_span_id and self._otel_id_generator and span_info.get("spanId") is None:
self._otel_id_generator.seed_span_id(otel_span_id)

super().trace_context_from_header_contents(span_info)

# If a span was propagated from the caller, we need to seed for span context
if (
otel_span_id
and self._otel_id_generator
and span_info.get("spanId") is not None
):
self._otel_id_generator.seed_span_id(otel_span_id)

super().span_context_from_header_contents(span_info)