You can run this solution using GitHub Codespaces. The button will open a web-based VS Code instance in your browser:
- Open the solution accelerator (this may take several minutes).
- Accept the default values on the Create Codespaces page.
- Open a terminal window if one is not already open.
- Continue with the deployment steps below.
- Log in to your Azure subscription:

      azd auth login
- Return to the Codespaces window and run the command below (the Azure CLI is used to validate available AI model quota):

      az login
- Return to the Codespaces terminal and run the command below to initialize the environment:

      azd init
- Enter the environment name.

  Note: The environment name must be 12 characters or fewer.
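As a quick pre-flight check, you can confirm that a candidate name fits the limit before running azd init. This is only a sanity-check sketch; ENV_NAME is an illustrative shell variable, not something azd reads:

```shell
# Sanity-check a candidate azd environment name against the 12-character limit.
# ENV_NAME is illustrative only; azd does not read this variable.
ENV_NAME="aifoundrydev"

if [ "${#ENV_NAME}" -le 12 ]; then
  echo "OK: '${ENV_NAME}' is ${#ENV_NAME} characters"
else
  echo "Too long: '${ENV_NAME}' is ${#ENV_NAME} characters (max 12)"
fi
```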
- Now start the deployment of the infrastructure by running:

      azd up
⚠️ Note: The latest version of the Azure Developer CLI (AZD) is currently limited in prompting for missing parameters. The feature flag parameters in this solution have been temporarily defaulted to 'disabled'; prompting will resume when this limitation is lifted.
Log in to Azure for authentication. This step lets you choose from the subscriptions available to the account you used in the login step. Next it will prompt you for the region to deploy the resources into, as well as any additional Azure resources to be provisioned and configured.

Important: Be sure to remember the VM password; it will be used in a later step. You are still required to log into Azure once you connect through the virtual machine.
⚠️ Note: The automated model quota check will run and verify that the selected location has the necessary quota for the AI models listed in the parameters file before any resources are deployed.

- If the selected location has sufficient quota for the models you plan to deploy, provisioning will begin without notification.
- If the selected location does not have the available quota for the models in your parameters, a message is returned to the user before any resources are provisioned, allowing the developer to change the provisioning location and try again. In our example, Italy North had capacity for gpt-4o but not for text-embedding-ada-002; this terminated the entire provisioning run, because both models could not be deployed due to the quota issue.
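The all-or-nothing behavior above can be sketched as a simple check. The quota numbers below are hypothetical stand-ins; the actual validation reads regional quota for each model (for example via `az cognitiveservices usage list`) before any resources are provisioned:

```shell
# Sketch of the all-or-nothing quota check; values are hypothetical.
# Mirrors the Italy North example: capacity for gpt-4o, none for
# text-embedding-ada-002, so the whole provisioning is stopped.
GPT4O_AVAILABLE=50
GPT4O_REQUIRED=10
EMBEDDING_AVAILABLE=0
EMBEDDING_REQUIRED=10

if [ "$GPT4O_AVAILABLE" -ge "$GPT4O_REQUIRED" ] && \
   [ "$EMBEDDING_AVAILABLE" -ge "$EMBEDDING_REQUIRED" ]; then
  echo "Quota OK: provisioning can begin"
else
  echo "Insufficient quota: change the location and try again"
fi
```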
After completing the required parameters you were prompted for, and a successful model quota validation, the provisioning of resources will run and deploy the Network Isolated AI Foundry development portal and dependent resources in about 20 minutes.
The following steps help you check that the isolated environment was set up correctly, including the creation of the required private endpoints (when networkIsolation = true).
One way to verify that access to the foundry is private is to launch Azure AI Foundry from the portal. A user who is not connected through the virtual network via an approved RDP connection will see the following screen in their browser. This is the intended behavior!
A more thorough check is to open the networking settings and look for the private endpoints.
- Go to the Azure Portal and select the Azure AI hub that was just created.
- Click Resource Management, then Networking.
Here you will find the private endpoints connected to the resources within the foundry managed virtual network. Ensure that these private endpoints are active; the foundry should show that Public access is ‘disabled’.
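If you prefer to script this verification, the idea reduces to confirming that every private-endpoint connection is Approved. The states below are hard-coded placeholders; in practice you would retrieve them from your resource group with the Azure CLI (for example `az network private-endpoint list`):

```shell
# Placeholder check: confirm all private-endpoint connection states are
# "Approved". STATES is hard-coded here; in practice it would come from
# the Azure CLI.
STATES="Approved Approved Approved"

ALL_OK=yes
for state in $STATES; do
  [ "$state" = "Approved" ] || ALL_OK=no
done
echo "All private endpoints approved: $ALL_OK"
```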
- Navigate to the resource group where the isolated AI Foundry was deployed and select the virtual machine.
- Make sure the virtual machine is running; if not, start the VM.
- Select “Bastion” under the ‘Connect’ heading in the VM resource.
- Supply the username and the password you created as environment variables, then press the Connect button.
- Your virtual machine will launch and you will see a different screen.
- Launch the Edge browser, navigate to Azure AI Foundry at https://ai.azure.com, and sign in with your credentials.
- You will be challenged by MFA to connect.
- You will now be able to view the Azure AI Foundry, which is contained in an isolated network.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.



















