Azure-Samples/az-ai-devcontainer

Azure AI DevContainer Template

This repository is a template for working in Development Containers or GitHub Codespaces with Python, Azure AI Foundry, and Jupyter notebooks.

Feedback and bug reports are welcome. Please open a GitHub issue if you find something that needs fixing or improvement.

[DevContainer screenshot]

Getting Started

Open in GitHub Codespaces · Open in Dev Containers

Warning

Do NOT git clone the repository under Windows and then open it in a Dev Container: this causes line-ending (CRLF/LF) issues inside the container. Instead, click the button above and let Visual Studio Code download the repository for you. Alternatively, git clone under Windows Subsystem for Linux (WSL) and ask Visual Studio Code to Reopen in Container.

Provision Azure Resources

Login with AZD:

azd auth login

To provision your Azure resources, run:

azd up

If you also want to deploy Azure AI Search, run:

azd env set USE_AI_SEARCH true
azd up

Note

Azure AI Search is not provisioned by default due to the increased cost and provisioning time.

Start working

🚀 You can start working straight away by modifying notebooks/SampleNotebook.ipynb!

Document Intelligence And Content Understanding

After azd up, the AZD environment includes SDK-friendly endpoint variables that the sample notebook already loads via azd env get-values:

  • CONTENTUNDERSTANDING_ENDPOINT and AZURE_CONTENT_UNDERSTANDING_ENDPOINT
  • DOCUMENTINTELLIGENCE_ENDPOINT and AZURE_DOCUMENT_INTELLIGENCE_ENDPOINT
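The loading step the notebook performs can be sketched like this; it is only an illustration of the azd env get-values pattern, and the helper names (parse_azd_env, load_azd_env) are ours, not the notebook's actual code:

```python
import os
import subprocess

def parse_azd_env(text: str) -> dict[str, str]:
    """Parse `azd env get-values` output (KEY="value" lines) into a dict."""
    values = {}
    for line in text.splitlines():
        if "=" not in line:
            continue  # skip blank or malformed lines
        key, _, raw = line.partition("=")
        values[key.strip()] = raw.strip().strip('"')
    return values

def load_azd_env() -> None:
    """Run azd and export its environment values into os.environ."""
    output = subprocess.run(
        ["azd", "env", "get-values"], capture_output=True, text=True, check=True
    ).stdout
    os.environ.update(parse_azd_env(output))
```

After load_azd_env() runs, the endpoint variables above are available through os.environ like any other environment variable.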

This template does not deploy any extra resources for these variables.

  • For Azure Content Understanding, the documented SDK endpoint is your Microsoft Foundry resource endpoint, so this template reuses AI_FOUNDRY_ENDPOINT directly.
  • For Azure Document Intelligence, this template also defaults DOCUMENTINTELLIGENCE_ENDPOINT to the same Foundry endpoint so shared-resource access is available immediately.
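The defaulting described above amounts to a simple fallback. A minimal sketch of the idea, assuming the helper name resolve_endpoint (ours, not the template's):

```python
def resolve_endpoint(env: dict[str, str]) -> str:
    """Fall back to the shared Foundry endpoint when no dedicated
    Document Intelligence resource has been configured."""
    return env.get("DOCUMENTINTELLIGENCE_ENDPOINT") or env["AI_FOUNDRY_ENDPOINT"]
```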

Use them from Python like this:

import os

# Endpoints exported by the AZD environment (see `azd env get-values`)
content_understanding_endpoint = os.environ["CONTENTUNDERSTANDING_ENDPOINT"]
document_intelligence_endpoint = os.environ["DOCUMENTINTELLIGENCE_ENDPOINT"]

If you want to use Microsoft Entra ID with the Azure Document Intelligence SDK, Microsoft's documentation requires a dedicated single-service Document Intelligence resource. In that case, set the endpoint you want to expose before provisioning:

azd env set DOCUMENTINTELLIGENCE_ENDPOINT https://<your-document-intelligence-resource>.cognitiveservices.azure.com/
azd up

Pre-configured AI Models

This template declares its default Azure AI Foundry model catalog in infra/deployments.yaml.

The Foundry account and project are provisioned by Bicep first. Model deployments are then reconciled in a separate post-provision step by infra/scripts/deploy_models.py, which is triggered automatically by AZD and can also be run manually.
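Conceptually, the reconciliation step diffs the desired catalog against the deployments that already exist on the account. A hedged sketch of that idea only; the function name and data shapes are assumptions, not the script's real API:

```python
def plan_deployments(desired: set[str], deployed: set[str]) -> tuple[set[str], set[str]]:
    """Return (to_create, already_present) by diffing the desired model
    catalog against the deployments already on the Foundry account."""
    return desired - deployed, desired & deployed
```

Models already present are left alone, so re-running the post-provision step is idempotent for unchanged entries.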

You can refresh API-backed model metadata in infra/deployments.yaml from the live Foundry account with infra/scripts/sync_deployments_catalog.py. The sync script uses the Azure management models endpoint behind az cognitiveservices account list-models and updates the current catalog entries in place.

To run the model deployment stage manually:

uv run python infra/scripts/deploy_models.py --mode manual

To refresh the catalog from Azure before deploying:

uv run python infra/scripts/sync_deployments_catalog.py --dry-run
uv run python infra/scripts/sync_deployments_catalog.py

The catalog sync preserves local curation fields such as runModes, allowedRegions, requiresRegistration, registrationUrl, and notes. Existing sku.capacity values are also preserved by default so the sync does not overwrite your chosen quota allocations. Use --sync-capacity only if you explicitly want to reset them to Azure's current default capacity. Use --sync-available-capacity to pull the currently available deployment capacity for each model in your target region from the live Azure capacity API.
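The preserve-local-fields behavior can be pictured as a guarded dictionary merge. This is a hypothetical sketch of the policy, not the sync script's implementation; merge_entry and its argument shapes are ours:

```python
# Locally curated fields that the sync never overwrites (per the list above)
PRESERVED = {"runModes", "allowedRegions", "requiresRegistration", "registrationUrl", "notes"}

def merge_entry(local: dict, remote: dict, sync_capacity: bool = False) -> dict:
    """Overlay remote (API) metadata onto a local catalog entry while
    keeping curated fields and, by default, the existing sku.capacity."""
    merged = {**local, **{k: v for k, v in remote.items() if k not in PRESERVED}}
    if not sync_capacity and "capacity" in local.get("sku", {}):
        # Mirrors the default: keep the locally chosen quota allocation
        merged["sku"] = {**merged.get("sku", {}), "capacity": local["sku"]["capacity"]}
    return merged
```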

infra/scripts/deploy_models.py now runs in best-effort mode for expected Azure-side deployment blockers. Models that are currently blocked by quota, deprecation, marketplace restrictions, missing provider metadata, or unsupported SKU/model combinations are reported as blocked and do not fail the whole reconciliation run. Unexpected errors still fail the command.
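The best-effort behavior boils down to classifying each failure as an expected blocker or a genuine error. A hedged sketch of that split; the marker strings are illustrative, not the script's actual matching logic:

```python
# Substrings that indicate an expected Azure-side blocker (illustrative only)
BLOCKED_MARKERS = ("quota", "deprecated", "marketplace", "provider metadata", "sku")

def classify_failure(message: str) -> str:
    """Label expected Azure-side blockers as 'blocked'; anything else is
    'fatal' and should fail the reconciliation run."""
    lower = message.lower()
    return "blocked" if any(m in lower for m in BLOCKED_MARKERS) else "fatal"
```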

--append-new is intentionally not part of the normal workflow right now. Keep new model additions curated manually so each new entry can be reviewed for region limits, registration requirements, and deployment intent before it is committed to the catalog.

To skip the automatic post-provision rollout for an environment:

azd env set DEPLOY_AI_FOUNDRY_MODELS false

The active catalog includes a mix of OpenAI, Microsoft, and partner models. Some entries remain commented out because they are region-limited, require registration, or need extra access configuration.

Commented Out (available in other regions or requiring registration)

Additional models are available but commented out in deployments.yaml. Uncomment to enable:

| Provider | Models | Notes |
| --- | --- | --- |
| DeepSeek | DeepSeek-R1, DeepSeek-V3.1, DeepSeek-V3.2, DeepSeek-V3.2-Speciale | May require different region |
| Meta | Llama-4-Maverick, Llama-3.2 Vision, Meta-Llama-3.x series | May require different region |
| Mistral AI | Mistral-Large-2411/3, Mistral-Nemo, Ministral-3B, Codestral-2501 | May require different region |
| Cohere | Cohere-command-a/r, embed-v-4-0 | May require different region |
| xAI | grok-3, grok-3-mini, grok-4 series | May require different region |
| Microsoft | Phi-4, Phi-3.5 series, MAI-DS-R1, model-router | May require different region |
| Moonshot AI | Kimi-K2-Thinking | May require different region |
| OpenAI (Image/Video) | gpt-image-1 series, sora, sora-2 | Require registration |
| Black Forest Labs | FLUX.2-pro, FLUX.1-Kontext-pro, FLUX-1.1-pro | Image generation |

Note

Model availability varies by Azure region. This template is tested in Sweden Central. Some models (o3-pro, codex-mini, gpt-5.2-codex) are only available in East US 2 and Sweden Central.

For the latest model availability, see Azure AI Foundry Models Documentation.
