From 0243c4a1df13ba7f446a9be834102fa2a22dd87e Mon Sep 17 00:00:00 2001 From: hocokahu Date: Tue, 25 Nov 2025 19:36:41 -0800 Subject: [PATCH 1/2] Adding monocle to traces doc --- .../tracing/trace_details_azure_monocle.png | 3 + .../trace_details_openai_agents_monocle.png | 3 + .../tracing/trace_list_azure_monocle.png | 3 + .../trace_list_openai_agents_monocle.png | 3 + docs/intelligentapps/tracing.md | 541 +++++++++++++++++- 5 files changed, 551 insertions(+), 2 deletions(-) create mode 100644 docs/intelligentapps/images/tracing/trace_details_azure_monocle.png create mode 100644 docs/intelligentapps/images/tracing/trace_details_openai_agents_monocle.png create mode 100644 docs/intelligentapps/images/tracing/trace_list_azure_monocle.png create mode 100644 docs/intelligentapps/images/tracing/trace_list_openai_agents_monocle.png diff --git a/docs/intelligentapps/images/tracing/trace_details_azure_monocle.png b/docs/intelligentapps/images/tracing/trace_details_azure_monocle.png new file mode 100644 index 0000000000..56dab442ec --- /dev/null +++ b/docs/intelligentapps/images/tracing/trace_details_azure_monocle.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:4f6c130a06d60e0373a6123813943ae66fbc8c1d94682c39e9901a2b5ae97f6a +size 266271 diff --git a/docs/intelligentapps/images/tracing/trace_details_openai_agents_monocle.png b/docs/intelligentapps/images/tracing/trace_details_openai_agents_monocle.png new file mode 100644 index 0000000000..53a8d0bcd9 --- /dev/null +++ b/docs/intelligentapps/images/tracing/trace_details_openai_agents_monocle.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:1078171ae03274746daef75d861da1d45571c50d1de4d197e56e4ba3fb4e09fc +size 386062 diff --git a/docs/intelligentapps/images/tracing/trace_list_azure_monocle.png b/docs/intelligentapps/images/tracing/trace_list_azure_monocle.png new file mode 100644 index 0000000000..849fdbd33a --- /dev/null +++ 
b/docs/intelligentapps/images/tracing/trace_list_azure_monocle.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:e86332447f39c2d46efe317de3d5a3d9c8d0f05b71fc5df5755609fe511e08c7 +size 105215 diff --git a/docs/intelligentapps/images/tracing/trace_list_openai_agents_monocle.png b/docs/intelligentapps/images/tracing/trace_list_openai_agents_monocle.png new file mode 100644 index 0000000000..3fe84f0302 --- /dev/null +++ b/docs/intelligentapps/images/tracing/trace_list_openai_agents_monocle.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:14e96c6d30ecaa8446ffb40ddcddf8d7b9fa98b5dd472424eca4dce0dd74a173 +size 110034 diff --git a/docs/intelligentapps/tracing.md b/docs/intelligentapps/tracing.md index bd53541d99..bb7de490a1 100644 --- a/docs/intelligentapps/tracing.md +++ b/docs/intelligentapps/tracing.md @@ -14,7 +14,7 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge | | Azure AI Inference | Foundry Agent Service | Anthropic | Gemini | LangChain | OpenAI SDK 3 | OpenAI Agents SDK | |---|---|---|---|---|---|---|---| -| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 | ✅ | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib))1 | ✅ ([Logfire](https://github.com/pydantic/logfire))1,2 | +| **Python** | ✅ ([monocle](https://github.com/monocle2ai/monocle)) | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([monocle](https://github.com/monocle2ai/monocle)) | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib), [monocle](https://github.com/monocle2ai/monocle))1 | ✅ ([Logfire](https://github.com/pydantic/logfire), 
[monocle](https://github.com/monocle2ai/monocle))1,2 | | **TS/JS** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2| ❌ |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2|❌| > 1. The SDKs in brackets are non-Microsoft tools that add OTLP support because the official SDKs do not support OTLP. @@ -48,6 +48,8 @@ The process is similar for all SDKs:
Azure AI Inference SDK - Python +### OpenTelemetry + **Installation:** ```bash pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry] @@ -89,6 +91,32 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider)) from azure.ai.inference.tracing import AIInferenceInstrumentor AIInferenceInstrumentor().instrument(True) ``` + +### Monocle + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace +``` + +**Setup:** +```python +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +# Import monocle_apptrace +from monocle_apptrace import setup_monocle_telemetry + +# Setup Monocle telemetry with OTLP span exporter for traces +setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-azure-ai-inference", + span_processors=[ + BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] +) +```
@@ -221,6 +249,8 @@ registerInstrumentations({
Anthropic - Python +### OpenTelemetry + **Installation:** ```bash pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic @@ -258,6 +288,32 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider)) from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor AnthropicInstrumentor().instrument() ``` + +### Monocle + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace +``` + +**Setup:** +```python +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +# Import monocle_apptrace +from monocle_apptrace import setup_monocle_telemetry + +# Setup Monocle telemetry with OTLP span exporter for traces +setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-anthropic", + span_processors=[ + BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] +) +```
@@ -284,6 +340,8 @@ initialize({
Google Gemini - Python +### OpenTelemetry + **Installation:** ```bash pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai @@ -321,11 +379,39 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider)) from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True) ``` + +### Monocle + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace +``` + +**Setup:** +```python +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +# Import monocle_apptrace +from monocle_apptrace import setup_monocle_telemetry + +# Setup Monocle telemetry with OTLP span exporter for traces +setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-google-genai", + span_processors=[ + BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] +) +```
LangChain - Python +### LangSmith + **Installation:** ```bash pip install langsmith[otel] @@ -338,6 +424,32 @@ os.environ["LANGSMITH_OTEL_ENABLED"] = "true" os.environ["LANGSMITH_TRACING"] = "true" os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318" ``` + +### Monocle + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace +``` + +**Setup:** +```python +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +# Import monocle_apptrace +from monocle_apptrace import setup_monocle_telemetry + +# Setup Monocle telemetry with OTLP span exporter for traces +setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-langchain", + span_processors=[ + BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] +) +```
@@ -362,6 +474,8 @@ initialize({
OpenAI - Python +### OpenTelemetry + **Installation:** ```bash pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-openai-v2 @@ -411,6 +525,32 @@ _events.set_event_logger_provider(EventLoggerProvider(logger_provider)) # Enable OpenAI instrumentation OpenAIInstrumentor().instrument() ``` + +### Monocle + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace +``` + +**Setup:** +```python +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +# Import monocle_apptrace +from monocle_apptrace import setup_monocle_telemetry + +# Setup Monocle telemetry with OTLP span exporter for traces +setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-openai", + span_processors=[ + BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] +) +```
@@ -435,6 +575,8 @@ initialize({
OpenAI Agents SDK - Python +### Logfire + **Installation:** ```bash pip install logfire @@ -453,9 +595,35 @@ logfire.configure( ) logfire.instrument_openai_agents() ``` + +### Monocle + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace +``` + +**Setup:** +```python +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +# Import monocle_apptrace +from monocle_apptrace import setup_monocle_telemetry + +# Setup Monocle telemetry with OTLP span exporter for traces +setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-openai-agents", + span_processors=[ + BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] +) +```
-## Example: set up tracing with the Azure AI Inference SDK +## Example 1: set up tracing with the Azure AI Inference SDK using OpenTelemetry The following end-to-end example uses the Azure AI Inference SDK in Python and shows how to set up the tracing provider and instrumentation. @@ -630,6 +798,375 @@ Use the following instructions to deploy a preconfigured development environment ![Screenshot showing the Trace Details view in the Tracing webview.](./images/tracing/trace_details.png) +## Example 2: set up tracing with the Azure AI Inference SDK using Monocle + +The following end-to-end example uses the Azure AI Inference SDK in Python with Monocle and shows how to set up the tracing provider and instrumentation. + +### Prerequisites + +To run this example, you need the following prerequisites: + +- [Visual Studio Code](https://code.visualstudio.com/) +- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-ai-toolkit.vscode-ai-toolkit) +- [Azure AI Inference SDK](https://pypi.org/project/azure-ai-inference/) +- [Monocle](https://github.com/monocle2ai/monocle) +- [Python latest version](https://www.python.org/downloads) +- Azure OpenAI endpoint and API key + +### Set up your development environment + +Use the following instructions to deploy a preconfigured development environment containing all required dependencies to run this example. + +1. Create environment variables + + Create a `.env` file in your project directory with the following variables: + + ``` + AZURE_OPENAI_ENDPOINT= + AZURE_OPENAI_API_KEY= + AZURE_OPENAI_API_DEPLOYMENT=gpt-4o-2024-08-06 + AZURE_OPENAI_API_VERSION=2025-02-01-preview + ``` + + Replace the values with your actual Azure OpenAI credentials. + +1. 
Install Python packages + + Create a `requirements.txt` file with the following content: + + ``` + opentelemetry-sdk + opentelemetry-exporter-otlp-proto-http + monocle_apptrace + azure-ai-inference[opentelemetry] + python-dotenv + ``` + + Install the packages using: + + ```bash + pip install -r requirements.txt + ``` + +1. Set up tracing + + 1. Create a new local directory on your computer for the project. + + ```shell + mkdir my-monocle-tracing-app + ``` + + 1. Navigate to the directory you created. + + ```shell + cd my-monocle-tracing-app + ``` + + 1. Open Visual Studio Code in that directory: + + ```shell + code . + ``` + +1. Create the Python file + + 1. In the `my-monocle-tracing-app` directory, create a Python file named `main.py`. + + You'll add the code to set up tracing with Monocle and interact with the Azure AI Inference SDK. + + 1. Add the following code to `main.py` and save the file: + + ```python + import os + from dotenv import load_dotenv + + # Load environment variables from .env file + load_dotenv() + + ### Set up for OpenTelemetry tracing ### + os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true" + os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry" + + from opentelemetry.sdk.trace.export import BatchSpanProcessor + from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + + # Import monocle_apptrace + from monocle_apptrace import setup_monocle_telemetry + + azure_openai_endpoint = os.environ["AZURE_OPENAI_ENDPOINT"] + azure_openai_api_key = os.environ["AZURE_OPENAI_API_KEY"] + azure_openai_api_deployment = os.environ["AZURE_OPENAI_API_DEPLOYMENT"] + azure_openai_api_version = os.environ["AZURE_OPENAI_API_VERSION"] + endpoint = f"{azure_openai_endpoint}/openai/deployments/{azure_openai_api_deployment}" + + # Setup Monocle telemetry with OTLP span exporter for traces + setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-azure-ai-inference", + span_processors=[ + 
BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] + ) + + from azure.ai.inference import ChatCompletionsClient + from azure.ai.inference.models import UserMessage + from azure.ai.inference.models import TextContentItem + from azure.core.credentials import AzureKeyCredential + + client = ChatCompletionsClient( + endpoint = endpoint, + credential = AzureKeyCredential(azure_openai_api_key), + api_version = azure_openai_api_version, + ) + + response = client.complete( + messages = [ + UserMessage(content = [ + TextContentItem(text = "hello from Monocle!"), + ]), + ], + model = azure_openai_api_deployment, + tools = [], + response_format = "text", + temperature = 1, + top_p = 1, + ) + + print(response.choices[0].message.content) + ``` + +1. Run the code + + 1. Open a new terminal in Visual Studio Code. + + 1. In the terminal, run the code using the command `python main.py`. + +1. Check the trace data in AI Toolkit + + After you run the code and refresh the tracing webview, there's a new trace in the list. + + Select the trace to open the trace details webview. + + ![Screenshot showing selecting a trace from the Trace List in the Tracing webview.](./images/tracing/trace_list_azure_monocle.png) + + Check the complete execution flow of your app in the left span tree view. + + Select a span in the right span details view to see generative AI messages in the **Input + Output** tab. + + Select the **Metadata** tab to view the raw metadata. + + ![Screenshot showing the Trace Details view in the Tracing webview.](./images/tracing/trace_details_azure_monocle.png) + +## Example 3: set up tracing with the OpenAI Agents SDK using Monocle + +The following end-to-end example uses the OpenAI Agents SDK in Python with Monocle and shows how to set up tracing for a multi-agent travel booking system. 
+ +### Prerequisites + +To run this example, you need the following prerequisites: + +- [Visual Studio Code](https://code.visualstudio.com/) +- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-ai-toolkit.vscode-ai-toolkit) +- [OpenAI Agents SDK](https://github.com/openai/agents) +- [Monocle](https://github.com/monocle2ai/monocle) +- [Python latest version](https://www.python.org/downloads) +- OpenAI API key + +### Set up your development environment + +Use the following instructions to deploy a preconfigured development environment containing all required dependencies to run this example. + +1. Create environment variable + + Create an environment variable for your OpenAI API key using one of the following code snippets. Replace `` with your actual OpenAI API key. + + Bash: + + ```bash + export OPENAI_API_KEY="" + ``` + + PowerShell: + + ```powershell + $Env:OPENAI_API_KEY="" + ``` + + Windows command prompt: + + ```cmd + set OPENAI_API_KEY= + ``` + + Alternatively, create a `.env` file in your project directory: + + ``` + OPENAI_API_KEY= + ``` + +1. Install Python packages + + Create a `requirements.txt` file with the following content: + + ``` + opentelemetry-sdk + opentelemetry-exporter-otlp-proto-http + monocle_apptrace + openai-agents + python-dotenv + ``` + + Install the packages using: + + ```bash + pip install -r requirements.txt + ``` + +1. Set up tracing + + 1. Create a new local directory on your computer for the project. + + ```shell + mkdir my-agents-tracing-app + ``` + + 1. Navigate to the directory you created. + + ```shell + cd my-agents-tracing-app + ``` + + 1. Open Visual Studio Code in that directory: + + ```shell + code . + ``` + +1. Create the Python file + + 1. In the `my-agents-tracing-app` directory, create a Python file named `main.py`. + + You'll add the code to set up tracing with Monocle and interact with the OpenAI Agents SDK. + + 1. 
Add the following code to `main.py` and save the file: + + ```python + import os + + from dotenv import load_dotenv + + # Load environment variables from .env file + load_dotenv() + + from opentelemetry.sdk.trace.export import BatchSpanProcessor + from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + + # Import monocle_apptrace + from monocle_apptrace import setup_monocle_telemetry + + # Setup Monocle telemetry with OTLP span exporter for traces + setup_monocle_telemetry( + workflow_name="opentelemetry-instrumentation-openai-agents", + span_processors=[ + BatchSpanProcessor( + OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") + ) + ] + ) + + from agents import Agent, Runner, function_tool + + # Define tool functions + @function_tool + def book_flight(from_airport: str, to_airport: str) -> str: + """Book a flight between airports.""" + return f"Successfully booked a flight from {from_airport} to {to_airport} for 100 USD." + + @function_tool + def book_hotel(hotel_name: str, city: str) -> str: + """Book a hotel reservation.""" + return f"Successfully booked a stay at {hotel_name} in {city} for 50 USD." + + @function_tool + def get_weather(city: str) -> str: + """Get weather information for a city.""" + return f"The weather in {city} is sunny and 75°F." + + # Create specialized agents + flight_agent = Agent( + name="Flight Agent", + instructions="You are a flight booking specialist. Use the book_flight tool to book flights.", + tools=[book_flight], + ) + + hotel_agent = Agent( + name="Hotel Agent", + instructions="You are a hotel booking specialist. Use the book_hotel tool to book hotels.", + tools=[book_hotel], + ) + + weather_agent = Agent( + name="Weather Agent", + instructions="You are a weather information specialist. 
Use the get_weather tool to provide weather information.", + tools=[get_weather], + ) + + # Create a coordinator agent with tools + coordinator = Agent( + name="Travel Coordinator", + instructions="You are a travel coordinator. Delegate flight bookings to the Flight Agent, hotel bookings to the Hotel Agent, and weather queries to the Weather Agent.", + tools=[ + flight_agent.as_tool( + tool_name="flight_expert", + tool_description="Handles flight booking questions and requests.", + ), + hotel_agent.as_tool( + tool_name="hotel_expert", + tool_description="Handles hotel booking questions and requests.", + ), + weather_agent.as_tool( + tool_name="weather_expert", + tool_description="Handles weather information questions and requests.", + ), + ], + ) + + # Run the multi-agent workflow + if __name__ == "__main__": + import asyncio + + result = asyncio.run( + Runner.run( + coordinator, + "Book me a flight today from SEA to SFO, then book the best hotel there and tell me the weather.", + ) + ) + print(result.final_output) + ``` + +1. Run the code + + 1. Open a new terminal in Visual Studio Code. + + 1. In the terminal, run the code using the command `python main.py`. + +1. Check the trace data in AI Toolkit + + After you run the code and refresh the tracing webview, there's a new trace in the list. + + Select the trace to open the trace details webview. + + ![Screenshot showing selecting a trace from the Trace List in the Tracing webview.](./images/tracing/trace_list_openai_agents_monocle.png) + + Check the complete execution flow of your app in the left span tree view, including agent invocations, tool calls, and agent delegations. + + Select a span in the right span details view to see generative AI messages in the **Input + Output** tab. + + Select the **Metadata** tab to view the raw metadata. 
+ + ![Screenshot showing the Trace Details view in the Tracing webview.](./images/tracing/trace_details_openai_agents_monocle.png) + ## What you learned In this article, you learned how to: From b73227f9f40e3b85ec6a25cec564bf9ca0d8cb9b Mon Sep 17 00:00:00 2001 From: hocokahu Date: Fri, 5 Dec 2025 16:09:59 -0800 Subject: [PATCH 2/2] Update Azure AI Inference and remove previous Example 2 --- docs/intelligentapps/tracing.md | 197 +------------------------------- 1 file changed, 6 insertions(+), 191 deletions(-) diff --git a/docs/intelligentapps/tracing.md b/docs/intelligentapps/tracing.md index bb7de490a1..9b4250fd43 100644 --- a/docs/intelligentapps/tracing.md +++ b/docs/intelligentapps/tracing.md @@ -14,7 +14,7 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge | | Azure AI Inference | Foundry Agent Service | Anthropic | Gemini | LangChain | OpenAI SDK 3 | OpenAI Agents SDK | |---|---|---|---|---|---|---|---| -| **Python** | ✅ ([monocle](https://github.com/monocle2ai/monocle)) | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([monocle](https://github.com/monocle2ai/monocle)) | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib), [monocle](https://github.com/monocle2ai/monocle))1 | ✅ ([Logfire](https://github.com/pydantic/logfire), [monocle](https://github.com/monocle2ai/monocle))1,2 | +| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([monocle](https://github.com/monocle2ai/monocle)) | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk), [monocle](https://github.com/monocle2ai/monocle))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib), 
[monocle](https://github.com/monocle2ai/monocle))1 | ✅ ([Logfire](https://github.com/pydantic/logfire), [monocle](https://github.com/monocle2ai/monocle))1,2 | | **TS/JS** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2| ❌ |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2|❌| > 1. The SDKs in brackets are non-Microsoft tools that add OTLP support because the official SDKs do not support OTLP. @@ -48,8 +48,6 @@ The process is similar for all SDKs:
Azure AI Inference SDK - Python -### OpenTelemetry - **Installation:** ```bash pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry] @@ -92,31 +90,6 @@ from azure.ai.inference.tracing import AIInferenceInstrumentor AIInferenceInstrumentor().instrument(True) ``` -### Monocle - -**Installation:** -```bash -pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace -``` - -**Setup:** -```python -from opentelemetry.sdk.trace.export import BatchSpanProcessor -from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter - -# Import monocle_apptrace -from monocle_apptrace import setup_monocle_telemetry - -# Setup Monocle telemetry with OTLP span exporter for traces -setup_monocle_telemetry( - workflow_name="opentelemetry-instrumentation-azure-ai-inference", - span_processors=[ - BatchSpanProcessor( - OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") - ) - ] -) -```
@@ -632,7 +605,7 @@ The following end-to-end example uses the Azure AI Inference SDK in Python and s To run this example, you need the following prerequisites: - [Visual Studio Code](https://code.visualstudio.com/) -- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-ai-toolkit.vscode-ai-toolkit) +- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-windows-ai-studio.windows-ai-studio) - [Azure AI Inference SDK](https://pypi.org/project/azure-ai-inference/) - [OpenTelemetry](https://opentelemetry.io/) - [Python latest version](https://www.python.org/downloads) @@ -798,167 +771,7 @@ Use the following instructions to deploy a preconfigured development environment ![Screenshot showing the Trace Details view in the Tracing webview.](./images/tracing/trace_details.png) -## Example 2: set up tracing with the Azure AI Inference SDK using Monocle - -The following end-to-end example uses the Azure AI Inference SDK in Python with Monocle and shows how to set up the tracing provider and instrumentation. - -### Prerequisites - -To run this example, you need the following prerequisites: - -- [Visual Studio Code](https://code.visualstudio.com/) -- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-ai-toolkit.vscode-ai-toolkit) -- [Azure AI Inference SDK](https://pypi.org/project/azure-ai-inference/) -- [Monocle](https://github.com/monocle2ai/monocle) -- [Python latest version](https://www.python.org/downloads) -- Azure OpenAI endpoint and API key - -### Set up your development environment - -Use the following instructions to deploy a preconfigured development environment containing all required dependencies to run this example. - -1. 
Create environment variables - - Create a `.env` file in your project directory with the following variables: - - ``` - AZURE_OPENAI_ENDPOINT= - AZURE_OPENAI_API_KEY= - AZURE_OPENAI_API_DEPLOYMENT=gpt-4o-2024-08-06 - AZURE_OPENAI_API_VERSION=2025-02-01-preview - ``` - - Replace the values with your actual Azure OpenAI credentials. - -1. Install Python packages - - Create a `requirements.txt` file with the following content: - - ``` - opentelemetry-sdk - opentelemetry-exporter-otlp-proto-http - monocle_apptrace - azure-ai-inference[opentelemetry] - python-dotenv - ``` - - Install the packages using: - - ```bash - pip install -r requirements.txt - ``` - -1. Set up tracing - - 1. Create a new local directory on your computer for the project. - - ```shell - mkdir my-monocle-tracing-app - ``` - - 1. Navigate to the directory you created. - - ```shell - cd my-monocle-tracing-app - ``` - - 1. Open Visual Studio Code in that directory: - - ```shell - code . - ``` - -1. Create the Python file - - 1. In the `my-monocle-tracing-app` directory, create a Python file named `main.py`. - - You'll add the code to set up tracing with Monocle and interact with the Azure AI Inference SDK. - - 1. 
Add the following code to `main.py` and save the file: - - ```python - import os - from dotenv import load_dotenv - - # Load environment variables from .env file - load_dotenv() - - ### Set up for OpenTelemetry tracing ### - os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true" - os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry" - - from opentelemetry.sdk.trace.export import BatchSpanProcessor - from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter - - # Import monocle_apptrace - from monocle_apptrace import setup_monocle_telemetry - - azure_openai_endpoint = os.environ["AZURE_OPENAI_ENDPOINT"] - azure_openai_api_key = os.environ["AZURE_OPENAI_API_KEY"] - azure_openai_api_deployment = os.environ["AZURE_OPENAI_API_DEPLOYMENT"] - azure_openai_api_version = os.environ["AZURE_OPENAI_API_VERSION"] - endpoint = f"{azure_openai_endpoint}/openai/deployments/{azure_openai_api_deployment}" - - # Setup Monocle telemetry with OTLP span exporter for traces - setup_monocle_telemetry( - workflow_name="opentelemetry-instrumentation-azure-ai-inference", - span_processors=[ - BatchSpanProcessor( - OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces") - ) - ] - ) - - from azure.ai.inference import ChatCompletionsClient - from azure.ai.inference.models import UserMessage - from azure.ai.inference.models import TextContentItem - from azure.core.credentials import AzureKeyCredential - - client = ChatCompletionsClient( - endpoint = endpoint, - credential = AzureKeyCredential(azure_openai_api_key), - api_version = azure_openai_api_version, - ) - - response = client.complete( - messages = [ - UserMessage(content = [ - TextContentItem(text = "hello from Monocle!"), - ]), - ], - model = azure_openai_api_deployment, - tools = [], - response_format = "text", - temperature = 1, - top_p = 1, - ) - - print(response.choices[0].message.content) - ``` - -1. Run the code - - 1. Open a new terminal in Visual Studio Code. - - 1. 
In the terminal, run the code using the command `python main.py`. - -1. Check the trace data in AI Toolkit - - After you run the code and refresh the tracing webview, there's a new trace in the list. - - Select the trace to open the trace details webview. - - ![Screenshot showing selecting a trace from the Trace List in the Tracing webview.](./images/tracing/trace_list_azure_monocle.png) - - Check the complete execution flow of your app in the left span tree view. - - Select a span in the right span details view to see generative AI messages in the **Input + Output** tab. - - Select the **Metadata** tab to view the raw metadata. - - ![Screenshot showing the Trace Details view in the Tracing webview.](./images/tracing/trace_details_azure_monocle.png) - -## Example 3: set up tracing with the OpenAI Agents SDK using Monocle +## Example 2: set up tracing with the OpenAI Agents SDK using Monocle The following end-to-end example uses the OpenAI Agents SDK in Python with Monocle and shows how to set up tracing for a multi-agent travel booking system. @@ -967,8 +780,10 @@ The following end-to-end example uses the OpenAI Agents SDK in Python with Monoc To run this example, you need the following prerequisites: - [Visual Studio Code](https://code.visualstudio.com/) -- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-ai-toolkit.vscode-ai-toolkit) +- [AI Toolkit extension](https://marketplace.visualstudio.com/items?itemName=ms-windows-ai-studio.windows-ai-studio) +- [Okahu Trace Visualizer](https://marketplace.visualstudio.com/items?itemName=OkahuAI.okahu-ai-observability) - [OpenAI Agents SDK](https://github.com/openai/agents) +- [OpenTelemetry](https://opentelemetry.io/) - [Monocle](https://github.com/monocle2ai/monocle) - [Python latest version](https://www.python.org/downloads) - OpenAI API key