The openai SDK ships no first-party OpenTelemetry instrumentation, so use OpenInference's auto-instrumentation instead. The examples below are in Python; see the TypeScript version compatibility note at the end of the page.
1. Install
Already have OpenInference and an OTLP exporter wired up? Skip this step. Otherwise:
```shell
pip install openai openinference-instrumentation-openai \
  opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
```
2. Wire OpenInference + OTLP exporter pointing at Agnost
Already have OpenInference (or any OTel TracerProvider) running? Append Agnost as an additional span processor on the existing provider:
```python
import os

from opentelemetry import trace
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint="https://otel.agnost.ai/v1/traces",
            headers={"X-Agnost-Org-ID": os.environ["AGNOST_ORG_ID"]},
        )
    )
)
```
No OTel yet? Full setup:
```python
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from openinference.instrumentation.openai import OpenAIInstrumentor

provider = TracerProvider(resource=Resource.create({"service.name": "openai-py"}))
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint="https://otel.agnost.ai/v1/traces",
            headers={"X-Agnost-Org-ID": os.environ["AGNOST_ORG_ID"]},
        )
    )
)
trace.set_tracer_provider(provider)
OpenAIInstrumentor().instrument(tracer_provider=provider)
```
3. Pass userId / sessionId per call
```python
from openai import OpenAI
from openinference.instrumentation import using_attributes

client = OpenAI()

with using_attributes(user_id="user-42", session_id="conv-abc123"):
    client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello"}],
    )
```
using_attributes (Python) and setUser / setSession (TS) propagate these IDs through OTel context, so they land as user.id / session.id on every span the instrumented OpenAI SDK emits inside the block.
TypeScript version compatibility
Pin matching majors; a version mismatch throws `does not provide an export named 'APIPromise'` at import time:
| openai | @arizeai/openinference-instrumentation-openai |
|---|---|
| ^6.7.0 | ^4.0.0 |
| ^4.95.0 – ^5.x | ~2.3.1 |
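For example, pinning the current majors from the table above (assuming npm):

```shell
npm install openai@^6.7.0 @arizeai/openinference-instrumentation-openai@^4.0.0
```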
References