This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
This is the Agent365-Samples repository containing sample agents demonstrating the Microsoft Agent 365 SDK across three languages (C#/.NET, Python, Node.js/TypeScript) and multiple AI orchestrators (OpenAI, Claude, Semantic Kernel, Agent Framework, LangChain, CrewAI, etc.).
The Microsoft Agent 365 SDK extends the Microsoft 365 Agents SDK with enterprise-grade capabilities for observability, notifications, MCP tooling, and runtime utilities.
```
Agent365-Samples/
├── dotnet/                  # C#/.NET samples
│   ├── agent-framework/     # Agent Framework orchestrator
│   └── semantic-kernel/     # Semantic Kernel orchestrator
├── python/                  # Python samples
│   ├── agent-framework/
│   ├── openai/
│   ├── claude/
│   ├── crewai/
│   └── google-adk/
├── nodejs/                  # Node.js/TypeScript samples
│   ├── openai/
│   ├── claude/
│   ├── langchain/
│   ├── devin/
│   ├── n8n/
│   ├── perplexity/
│   └── vercel-sdk/
├── docs/                    # Repository-wide documentation
│   └── design.md            # Architectural patterns
├── prompts/                 # AI development prompts
└── scripts/                 # Utility scripts
```
Each language directory also contains a `docs/design.md` with language-specific design patterns.
**C#/.NET**

Build:

```shell
dotnet build <path-to-sln-or-csproj>
```

Run:

```shell
dotnet run --project <path-to-csproj>
```

Solutions:

- `dotnet/agent-framework/AgentFrameworkSample.sln`
- `dotnet/semantic-kernel/SemanticKernelSampleAgent.sln`
**Python**

Setup:

```shell
cd <sample-directory>
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e .
```

Run:

```shell
python host_agent_server.py
# or
python start_with_generic_host.py
```

Python samples use `pyproject.toml` for dependency management. Most samples support `uv` for faster dependency resolution.
**Node.js/TypeScript**

Setup:

```shell
cd <sample-directory>
npm install
```

Build:

```shell
npm run build
```

Run:

```shell
npm start        # Production mode
npm run dev      # Development mode with hot reload
```

All sample agents follow a consistent initialization and message processing flow:
- Load configuration (environment variables, config files)
- Configure observability (Agent 365 tracing/telemetry)
- Initialize LLM client (orchestrator-specific)
- Register tools (local tools + MCP servers)
- Configure authentication (bearer token or auth handlers)
- Start HTTP server (listen on `/api/messages`)
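The initialization steps above can be sketched end to end. Everything here is illustrative: `create_agent`, `Agent`, and `EchoClient` are hypothetical stand-ins, not Agent 365 SDK APIs, and observability/auth are stubbed out.

```python
import os

class EchoClient:
    """Hypothetical stand-in for an orchestrator-specific LLM client."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

class Agent:
    def __init__(self, config, client, tools):
        self.config = config
        self.client = client
        self.tools = tools

    def process(self, text: str) -> str:
        return self.client.complete(text)

def create_agent() -> Agent:
    # 1. Load configuration (environment variables, config files)
    config = {"bearer_token": os.getenv("BEARER_TOKEN")}
    # 2. Configure observability (stubbed; samples wire up Agent 365 tracing)
    # 3. Initialize LLM client (orchestrator-specific)
    client = EchoClient()
    # 4. Register tools (local tools + MCP servers; empty here)
    tools = {}
    # 5. Authentication is enforced per request, not at startup
    # 6. An HTTP server would now start listening on /api/messages
    return Agent(config, client, tools)
```

In the real samples each numbered step is delegated to the SDK or the orchestrator; the ordering is what matters.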
- Authentication: JWT validation / auth handlers
- Observability Context Setup: Start trace span, set baggage (tenant, agent, conversation)
- Tool Registration: Load MCP servers and local tools
- LLM Invocation: Process with AI orchestrator
- Response: Stream or send response
- Cleanup: Close connections, end spans
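The per-request flow can be mimicked with stdlib `contextvars` standing in for OpenTelemetry baggage (the samples use the real OpenTelemetry SDK; every name below is hypothetical):

```python
import contextvars
from contextlib import contextmanager

# Stand-in for OpenTelemetry baggage: request-scoped key/value context
baggage = contextvars.ContextVar("baggage", default={})

@contextmanager
def trace_span(name: str, tenant: str, agent: str, conversation: str):
    """Start a 'span' and set baggage for the request; clean up on exit."""
    token = baggage.set({"tenant": tenant, "agent": agent,
                         "conversation": conversation})
    try:
        yield
    finally:
        baggage.reset(token)  # Cleanup: end span, restore prior context

def handle_message(text: str, headers: dict) -> str:
    # 1. Authentication: JWT validation / auth handlers (stubbed)
    if not headers.get("Authorization"):
        raise PermissionError("missing bearer token")
    # 2. Observability context setup; 3-5. tools, LLM, response (stubbed)
    with trace_span("process_message", tenant="t1", agent="a1",
                    conversation="c1"):
        ctx = baggage.get()
        return f"[{ctx['conversation']}] processed: {text}"
```

The `finally` block is the "Cleanup" step: baggage never leaks across requests even when processing raises.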
Authentication is configured in this priority order:
- **Bearer Token** (development): Set the `BEARER_TOKEN` environment variable
- **Auth Handlers** (production): Configured in `appsettings.json` (C#) or `.env` (Python/Node.js)
- **No Auth** (fallback): Bare LLM mode with graceful degradation
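The precedence can be expressed as a small resolver. This sketches the ordering only; `resolve_auth` and the returned mode strings are invented for illustration, not SDK names:

```python
def resolve_auth(env, handler_config):
    """Pick an auth mode in the documented priority order."""
    # 1. Bearer Token (development)
    token = env.get("BEARER_TOKEN")
    if token:
        return ("bearer", token)
    # 2. Auth Handlers (production), e.g. loaded from appsettings.json or .env
    if handler_config:
        return ("auth_handlers", None)
    # 3. No Auth (fallback): bare LLM mode with graceful degradation
    return ("none", None)
```

A sample would call it as `resolve_auth(dict(os.environ), handlers)` at startup.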
Configuration files by language:

- C#: `appsettings.json`, `appsettings.Development.json`
- Python: `.env`, `pyproject.toml`
- Node.js: `.env`, `package.json`
Configuration typically includes:
- LLM client settings (OpenAI/Azure OpenAI endpoints and API keys)
- Observability settings (Agent 365 tracing)
- Authentication settings (bearer token or auth handlers)
- MCP tooling configuration
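Exact keys differ per sample (check each sample's README), but a hypothetical `.env` fragment using the repo's placeholder convention looks like:

```ini
# Illustrative only -- variable names vary by sample
OPENAI_API_KEY=<<YOUR_API_KEY>>        # LLM client settings
AZURE_OPENAI_ENDPOINT=<<PLACEHOLDER>>
BEARER_TOKEN=<<PLACEHOLDER>>           # Development-only authentication
```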
All samples integrate Microsoft Agent 365 observability:
- Token caching for agentic auth tokens
- Baggage propagation (tenant, agent, conversation IDs)
- Framework-specific instrumentation (OpenTelemetry)
- Custom span attributes for tracing
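The token-caching idea is simple: reuse a token until shortly before it expires. The SDK ships its own cache (e.g. `src/token-cache.ts` in the Node.js samples); this Python version is an illustrative sketch only.

```python
import time

class TokenCache:
    """Cache agentic auth tokens until shortly before they expire."""
    def __init__(self, skew_seconds=300.0):
        self._skew = skew_seconds      # refresh this long before expiry
        self._entries = {}             # key -> (token, expires_at)

    def get(self, key, fetch, now=time.time):
        """Return a cached token, or call fetch() -> (token, ttl_seconds)."""
        token, expires_at = self._entries.get(key, (None, 0.0))
        if now() >= expires_at - self._skew:
            token, ttl = fetch()
            expires_at = now() + ttl
            self._entries[key] = (token, expires_at)
        return token
```

The skew ensures a token is never handed out in its final minutes, avoiding mid-request expiry.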
Agents can dynamically load tools from MCP servers:
- Tools are configured per agent identity
- Authentication flows through the same mechanism as the agent
- Samples demonstrate both local tools (weather, datetime) and remote MCP servers (Graph API)
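Local tool registration boils down to mapping tool names to callables that the orchestrator can invoke. A stdlib-only sketch with stubbed weather/datetime tools; the registry shape and names are illustrative, not an SDK API:

```python
from datetime import datetime, timezone

TOOLS = {}

def tool(name):
    """Register a function as a locally-available tool."""
    def decorator(fn):
        TOOLS[name] = fn
        return fn
    return decorator

@tool("get_weather")
def get_weather(city):
    # A real sample would call a weather service here
    return f"Weather in {city}: sunny (stubbed)"

@tool("get_datetime")
def get_datetime():
    return datetime.now(timezone.utc).isoformat()

def invoke_tool(name, **kwargs):
    return TOOLS[name](**kwargs)
```

Remote MCP servers plug into the same registry at runtime, with auth flowing through the agent's own credentials.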
**C#/.NET**

- Entry point: `Program.cs`
- Agent implementation: `Agent/MyAgent.cs` (or similar)
- Tools: `Tools/` directory
- Configuration: `appsettings.json`

**Python**

- Entry point: `host_agent_server.py` or `start_with_generic_host.py`
- Agent implementation: `agent.py`
- Configuration: `.env` or embedded in `pyproject.toml`

**Node.js/TypeScript**

- Entry point: `src/index.ts`
- Agent implementation: `src/agent.ts`
- LLM client: `src/client.ts`
- Token cache: `src/token-cache.ts`
- Configuration: `.env`
All source files MUST have Microsoft copyright headers.

C# (`.cs`):

```csharp
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT License.
```

Python (`.py`):

```python
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
```

JavaScript/TypeScript (`.js`, `.ts`):

```javascript
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT License.
```

Exclusions: auto-generated files, test files, configuration files (`.json`, `.yaml`, `.md`), and third-party code.
Never use "Kairo" in code - this is a legacy reference that should be replaced with appropriate Agent 365 terminology.
- Never commit API keys, tokens, or secrets
- Use placeholders like `<<YOUR_API_KEY>>` or `<<PLACEHOLDER>>` in config examples
- Use environment variables, user secrets, or key vaults for sensitive data
Each sample SHOULD have:

- `README.md` with:
  - What the sample demonstrates
  - Prerequisites (runtime version, API keys, tools)
  - Configuration section with example config snippets
  - How-to-run instructions
  - Testing options (Playground, WebChat, Teams/M365)
  - Troubleshooting section
  - Agent Code Walkthrough (optional but recommended)
- `manifest/` folder with:
  - `manifest.json` (Teams app manifest)
  - `color.png` (192x192 icon)
  - `outline.png` (32x32 icon)
| Component | C#/.NET | Python | Node.js/TypeScript |
|---|---|---|---|
| HTTP Server | ASP.NET Core | aiohttp/FastAPI | Express.js |
| LLM Client | Azure.AI.OpenAI | openai/anthropic | @openai/agents |
| Observability | OpenTelemetry | OpenTelemetry | OpenTelemetry |
| Agent 365 SDK | Microsoft.Agents.* | microsoft_agents.* | @microsoft/agents-* |
- Main README: Overview and links to SDK documentation
- Architecture Design: Cross-cutting architectural patterns
- Contributing Guide: How to contribute
- Copilot Instructions: Automated code review rules
For language-specific design patterns, see the `docs/design.md` in each language directory.