diff --git a/docs/intelligentapps/images/tracing/trace_details_azure_monocle.png b/docs/intelligentapps/images/tracing/trace_details_azure_monocle.png
new file mode 100644
index 0000000000..56dab442ec
--- /dev/null
+++ b/docs/intelligentapps/images/tracing/trace_details_azure_monocle.png
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4f6c130a06d60e0373a6123813943ae66fbc8c1d94682c39e9901a2b5ae97f6a
+size 266271
diff --git a/docs/intelligentapps/images/tracing/trace_details_openai_agents_monocle.png b/docs/intelligentapps/images/tracing/trace_details_openai_agents_monocle.png
new file mode 100644
index 0000000000..53a8d0bcd9
--- /dev/null
+++ b/docs/intelligentapps/images/tracing/trace_details_openai_agents_monocle.png
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1078171ae03274746daef75d861da1d45571c50d1de4d197e56e4ba3fb4e09fc
+size 386062
diff --git a/docs/intelligentapps/images/tracing/trace_list_azure_monocle.png b/docs/intelligentapps/images/tracing/trace_list_azure_monocle.png
new file mode 100644
index 0000000000..849fdbd33a
--- /dev/null
+++ b/docs/intelligentapps/images/tracing/trace_list_azure_monocle.png
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e86332447f39c2d46efe317de3d5a3d9c8d0f05b71fc5df5755609fe511e08c7
+size 105215
diff --git a/docs/intelligentapps/images/tracing/trace_list_openai_agents_monocle.png b/docs/intelligentapps/images/tracing/trace_list_openai_agents_monocle.png
new file mode 100644
index 0000000000..3fe84f0302
--- /dev/null
+++ b/docs/intelligentapps/images/tracing/trace_list_openai_agents_monocle.png
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:14e96c6d30ecaa8446ffb40ddcddf8d7b9fa98b5dd472424eca4dce0dd74a173
+size 110034
diff --git a/docs/intelligentapps/tracing.md b/docs/intelligentapps/tracing.md
index bd53541d99..9b4250fd43 100644
--- a/docs/intelligentapps/tracing.md
+++ b/docs/intelligentapps/tracing.md
@@ -14,7 +14,7 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge
| | Azure AI Inference | Foundry Agent Service | Anthropic | Gemini | LangChain | OpenAI SDK 3 | OpenAI Agents SDK |
|---|---|---|---|---|---|---|---|
-| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 | ✅ | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib))1 | ✅ ([Logfire](https://github.com/pydantic/logfire))1,2 |
+| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([monocle](https://github.com/monocle2ai/monocle)) | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib), [monocle](https://github.com/monocle2ai/monocle))1 | ✅ ([Logfire](https://github.com/pydantic/logfire), [monocle](https://github.com/monocle2ai/monocle))1,2 |
| **TS/JS** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2| ❌ |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2|❌|
> 1. The SDKs in brackets are non-Microsoft tools that add OTLP support because the official SDKs do not support OTLP.
@@ -89,6 +89,7 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider))
from azure.ai.inference.tracing import AIInferenceInstrumentor
AIInferenceInstrumentor().instrument(True)
```
+
@@ -221,6 +222,8 @@ registerInstrumentations({
Anthropic - Python
+### OpenTelemetry
+
**Installation:**
```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic
@@ -258,6 +261,32 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider))
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
AnthropicInstrumentor().instrument()
```
+
+### Monocle
+
+**Installation:**
+```bash
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace
+```
+
+**Setup:**
+```python
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+# Import monocle_apptrace
+from monocle_apptrace import setup_monocle_telemetry
+
+# Setup Monocle telemetry with OTLP span exporter for traces
+setup_monocle_telemetry(
+ workflow_name="opentelemetry-instrumentation-anthropic",
+ span_processors=[
+ BatchSpanProcessor(
+ OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
+ )
+ ]
+)
+```
@@ -284,6 +313,8 @@ initialize({
Google Gemini - Python
+### OpenTelemetry
+
**Installation:**
```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai
@@ -321,11 +352,39 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider))
from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor
GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)
```
+
+### Monocle
+
+**Installation:**
+```bash
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace
+```
+
+**Setup:**
+```python
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+# Import monocle_apptrace
+from monocle_apptrace import setup_monocle_telemetry
+
+# Setup Monocle telemetry with OTLP span exporter for traces
+setup_monocle_telemetry(
+ workflow_name="opentelemetry-instrumentation-google-genai",
+ span_processors=[
+ BatchSpanProcessor(
+ OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
+ )
+ ]
+)
+```
LangChain - Python
+### LangSmith
+
**Installation:**
```bash
pip install langsmith[otel]
@@ -338,6 +397,32 @@ os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"
```
+
+### Monocle
+
+**Installation:**
+```bash
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace
+```
+
+**Setup:**
+```python
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+# Import monocle_apptrace
+from monocle_apptrace import setup_monocle_telemetry
+
+# Setup Monocle telemetry with OTLP span exporter for traces
+setup_monocle_telemetry(
+ workflow_name="opentelemetry-instrumentation-langchain",
+ span_processors=[
+ BatchSpanProcessor(
+ OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
+ )
+ ]
+)
+```
@@ -362,6 +447,8 @@ initialize({
OpenAI - Python
+### OpenTelemetry
+
**Installation:**
```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-openai-v2
@@ -411,6 +498,32 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider))
# Enable OpenAI instrumentation
OpenAIInstrumentor().instrument()
```
+
+### Monocle
+
+**Installation:**
+```bash
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace
+```
+
+**Setup:**
+```python
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+# Import monocle_apptrace
+from monocle_apptrace import setup_monocle_telemetry
+
+# Setup Monocle telemetry with OTLP span exporter for traces
+setup_monocle_telemetry(
+ workflow_name="opentelemetry-instrumentation-openai",
+ span_processors=[
+ BatchSpanProcessor(
+ OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
+ )
+ ]
+)
+```
@@ -435,6 +548,8 @@ initialize({
OpenAI Agents SDK - Python
+### Logfire
+
**Installation:**
```bash
pip install logfire
@@ -453,9 +568,35 @@ logfire.configure(
)
logfire.instrument_openai_agents()
```
+
+### Monocle
+
+**Installation:**
+```bash
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace
+```
+
+**Setup:**
+```python
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+# Import monocle_apptrace
+from monocle_apptrace import setup_monocle_telemetry
+
+# Setup Monocle telemetry with OTLP span exporter for traces
+setup_monocle_telemetry(
+ workflow_name="opentelemetry-instrumentation-openai-agents",
+ span_processors=[
+ BatchSpanProcessor(
+ OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
+ )
+ ]
+)
+```
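Every Monocle snippet above points the exporter at `http://localhost:4318/v1/traces`, the OTLP/HTTP traces endpoint exposed by the AI Toolkit collector. To see what such an endpoint looks like at the protocol level, a throwaway standard-library receiver can stand in for the collector (a sketch for illustration only — the real collector is started by the AI Toolkit tracing view and listens on port 4318):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubOTLPHandler(BaseHTTPRequestHandler):
    """Accepts any POST to /v1/traces and replies 200, like a trace collector."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print(f"received {len(body)} bytes at {self.path}")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"{}")

    def log_message(self, *args):
        # Silence default per-request logging
        pass

# Port 0 picks a free port so this stub never clashes with a real collector on 4318
server = HTTPServer(("127.0.0.1", 0), StubOTLPHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Post a dummy payload the way an OTLP/HTTP exporter would
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/v1/traces",
    data=json.dumps({"resourceSpans": []}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    status = resp.status
print(status)  # 200
server.shutdown()
```

If an exporter gets connection errors against `localhost:4318`, the usual cause is that the tracing webview (which starts the collector) isn't open.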
-## Example: set up tracing with the Azure AI Inference SDK
+## Example 1: set up tracing with the Azure AI Inference SDK using OpenTelemetry
The following end-to-end example uses the Azure AI Inference SDK in Python and shows how to set up the tracing provider and instrumentation.
@@ -464,7 +605,7 @@ The following end-to-end example uses the Azure AI Inference SDK in Python and s
To run this example, you need the following prerequisites:
- [Visual Studio Code](https://code.visualstudio.com/)
-- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-ai-toolkit.vscode-ai-toolkit)
+- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-windows-ai-studio.windows-ai-studio)
- [Azure AI Inference SDK](https://pypi.org/project/azure-ai-inference/)
- [OpenTelemetry](https://opentelemetry.io/)
- [Python latest version](https://www.python.org/downloads)
@@ -630,6 +771,217 @@ Use the following instructions to deploy a preconfigured development environment

+## Example 2: set up tracing with the OpenAI Agents SDK using Monocle
+
+The following end-to-end example uses the OpenAI Agents SDK in Python with Monocle and shows how to set up tracing for a multi-agent travel booking system.
+
+### Prerequisites
+
+To run this example, you need the following prerequisites:
+
+- [Visual Studio Code](https://code.visualstudio.com/)
+- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-windows-ai-studio.windows-ai-studio)
+- [Okahu Trace Visualizer](https://marketplace.visualstudio.com/items?itemName=OkahuAI.okahu-ai-observability)
+- [OpenAI Agents SDK](https://github.com/openai/openai-agents-python)
+- [OpenTelemetry](https://opentelemetry.io/)
+- [Monocle](https://github.com/monocle2ai/monocle)
+- [Python latest version](https://www.python.org/downloads)
+- OpenAI API key
+
+### Set up your development environment
+
+Use the following instructions to set up a development environment with all the dependencies required to run this example.
+
+1. Create an environment variable
+
+    Create an environment variable for your OpenAI API key using one of the following code snippets. Replace `<YOUR_OPENAI_API_KEY>` with your actual OpenAI API key.
+
+    Bash:
+
+ ```bash
+    export OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>"
+ ```
+
+    PowerShell:
+
+ ```powershell
+    $Env:OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>"
+ ```
+
+ Windows command prompt:
+
+ ```cmd
+    set OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
+ ```
+
+ Alternatively, create a `.env` file in your project directory:
+
+ ```
+    OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
+ ```
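For illustration, the `KEY=VALUE` format that `python-dotenv` reads can be sketched with a standard-library-only loader (a hypothetical helper, not part of this example — `python-dotenv` handles quoting, comments, and override behavior far more robustly):

```python
import os

def load_env_file(path: str) -> None:
    """Minimal .env loader: one KEY=VALUE per line; '#' lines are comments."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Overwrite for simplicity; python-dotenv skips existing keys by default
            os.environ[key.strip()] = value.strip().strip('"')

# Demo with a throwaway file standing in for the project's .env
with open("example.env", "w") as f:
    f.write('MY_EXAMPLE_KEY="sk-placeholder"\n')
load_env_file("example.env")
print(os.environ["MY_EXAMPLE_KEY"])  # sk-placeholder
```
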
+
+1. Install Python packages
+
+ Create a `requirements.txt` file with the following content:
+
+ ```
+ opentelemetry-sdk
+ opentelemetry-exporter-otlp-proto-http
+ monocle_apptrace
+ openai-agents
+ python-dotenv
+ ```
+
+ Install the packages using:
+
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+1. Create the project directory
+
+ 1. Create a new local directory on your computer for the project.
+
+ ```shell
+ mkdir my-agents-tracing-app
+ ```
+
+ 1. Navigate to the directory you created.
+
+ ```shell
+ cd my-agents-tracing-app
+ ```
+
+ 1. Open Visual Studio Code in that directory:
+
+ ```shell
+ code .
+ ```
+
+1. Create the Python file
+
+ 1. In the `my-agents-tracing-app` directory, create a Python file named `main.py`.
+
+ You'll add the code to set up tracing with Monocle and interact with the OpenAI Agents SDK.
+
+ 1. Add the following code to `main.py` and save the file:
+
+ ```python
+ import os
+
+ from dotenv import load_dotenv
+
+ # Load environment variables from .env file
+ load_dotenv()
+
+ from opentelemetry.sdk.trace.export import BatchSpanProcessor
+ from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+ # Import monocle_apptrace
+ from monocle_apptrace import setup_monocle_telemetry
+
+ # Setup Monocle telemetry with OTLP span exporter for traces
+ setup_monocle_telemetry(
+ workflow_name="opentelemetry-instrumentation-openai-agents",
+ span_processors=[
+ BatchSpanProcessor(
+ OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
+ )
+ ]
+ )
+
+ from agents import Agent, Runner, function_tool
+
+ # Define tool functions
+ @function_tool
+ def book_flight(from_airport: str, to_airport: str) -> str:
+ """Book a flight between airports."""
+ return f"Successfully booked a flight from {from_airport} to {to_airport} for 100 USD."
+
+ @function_tool
+ def book_hotel(hotel_name: str, city: str) -> str:
+ """Book a hotel reservation."""
+ return f"Successfully booked a stay at {hotel_name} in {city} for 50 USD."
+
+ @function_tool
+ def get_weather(city: str) -> str:
+ """Get weather information for a city."""
+ return f"The weather in {city} is sunny and 75°F."
+
+ # Create specialized agents
+ flight_agent = Agent(
+ name="Flight Agent",
+ instructions="You are a flight booking specialist. Use the book_flight tool to book flights.",
+ tools=[book_flight],
+ )
+
+ hotel_agent = Agent(
+ name="Hotel Agent",
+ instructions="You are a hotel booking specialist. Use the book_hotel tool to book hotels.",
+ tools=[book_hotel],
+ )
+
+ weather_agent = Agent(
+ name="Weather Agent",
+ instructions="You are a weather information specialist. Use the get_weather tool to provide weather information.",
+ tools=[get_weather],
+ )
+
+ # Create a coordinator agent with tools
+ coordinator = Agent(
+ name="Travel Coordinator",
+ instructions="You are a travel coordinator. Delegate flight bookings to the Flight Agent, hotel bookings to the Hotel Agent, and weather queries to the Weather Agent.",
+ tools=[
+ flight_agent.as_tool(
+ tool_name="flight_expert",
+ tool_description="Handles flight booking questions and requests.",
+ ),
+ hotel_agent.as_tool(
+ tool_name="hotel_expert",
+ tool_description="Handles hotel booking questions and requests.",
+ ),
+ weather_agent.as_tool(
+ tool_name="weather_expert",
+ tool_description="Handles weather information questions and requests.",
+ ),
+ ],
+ )
+
+ # Run the multi-agent workflow
+ if __name__ == "__main__":
+ import asyncio
+
+ result = asyncio.run(
+ Runner.run(
+ coordinator,
+ "Book me a flight today from SEA to SFO, then book the best hotel there and tell me the weather.",
+ )
+ )
+ print(result.final_output)
+ ```
+
+1. Run the code
+
+ 1. Open a new terminal in Visual Studio Code.
+
+ 1. In the terminal, run the code using the command `python main.py`.
+
+1. Check the trace data in AI Toolkit
+
+ After you run the code and refresh the tracing webview, there's a new trace in the list.
+
+ Select the trace to open the trace details webview.
+
+ 
+
+ Check the complete execution flow of your app in the left span tree view, including agent invocations, tool calls, and agent delegations.
+
+ Select a span in the right span details view to see generative AI messages in the **Input + Output** tab.
+
+ Select the **Metadata** tab to view the raw metadata.
+
+ 
+
## What you learned
In this article, you learned how to: