
GitHub Codespaces

You can run this solution using GitHub Codespaces. The button will open a web-based VS Code instance in your browser:

  1. Open the solution accelerator (this may take several minutes):

    Open in GitHub Codespaces

  2. Accept the default values on the create Codespaces page

  3. Open a terminal window if it is not already open

  4. Continue with the deploying steps
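Once the Codespace is open, the provisioning walkthrough below reduces to four CLI commands. The sketch wraps them in a dry-run helper so the flow can be read (or printed) without touching a subscription:

```shell
# Condensed view of the provisioning flow detailed in the next section.
# DRY_RUN=1 (the default here) prints each command instead of executing it.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "$@"; else "$@"; fi; }

run azd auth login   # authenticate the Azure Developer CLI
run az login         # authenticate the Azure CLI (used for the quota check)
run azd init         # initialize the environment (prompts for a name)
run azd up           # provision and deploy the infrastructure
```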

Steps to provision a network-isolated environment from GitHub Codespaces using the AZD CLI

  1. Log into your Azure subscription:

     azd auth login

    Image showing the command 'azd auth login' entered in the VS Code terminal

    Image showing the authorization window opening in the browser

    Image showing the Azure password prompt

  2. Return to the Codespaces window and enter the command below:

    az login

    The Azure CLI is used to validate available AI model quota.

    Image showing 'az login' in the VS Code terminal

  3. Return to the Codespaces terminal and enter the command below to initialize the environment:

    azd init

    Image showing the azd init prompt in the VS Code terminal

  4. Enter the environment name.

    Note: The environment name must be 12 characters or fewer.

    Image showing entering a new environment name
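The 12-character limit can be checked before running azd init. A minimal helper (illustrative, not part of the accelerator) and a way to pass a pre-validated name via azd's `-e` flag:

```shell
# Validate an azd environment name against the 12-character limit noted above.
validate_env_name() {
  name=$1
  if [ "${#name}" -ge 1 ] && [ "${#name}" -le 12 ]; then
    return 0
  else
    echo "environment name '$name' must be 1-12 characters" >&2
    return 1
  fi
}

# Example usage (hypothetical name):
#   validate_env_name mydevenv && azd init -e mydevenv
```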

  5. Start the deployment of the infrastructure with the command below:

    azd up

    ⚠️ Note: The latest version of the Azure Developer CLI (AZD) currently has limited support for prompting for missing parameters. The feature-flag parameters in this solution are temporarily defaulted to 'disabled'; prompting will resume once this limitation is lifted.

    Image showing the terminal in VS Code

    Log in to Azure for authentication. This step lets you choose from the subscriptions available to the account you logged in with in the login step. You will then be prompted for the region to deploy the resources into, as well as for any additional Azure resources to be provisioned and configured.

    Important: Be sure to remember the VM password; it is used in a later step. You will still need to sign in to Azure after connecting through the virtual machine.

    ⚠️ Note:

    1. For a WAF deployment, set Network Isolation to 'True'.
    2. For the sample app deployment, set appSampleEnabled to 'True'.
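Because prompting for these flags is temporarily disabled, they can be set ahead of `azd up` with `azd env set`. The variable names below are illustrative only; check the solution's parameters file for the actual keys. A dry-run wrapper makes the commands printable without an azd environment:

```shell
# Set feature flags before `azd up`. Variable names are hypothetical --
# consult infra/main.parameters.json (or equivalent) for the real keys.
set_flag() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "azd env set $1 $2"   # dry run: print instead of executing
  else
    azd env set "$1" "$2"
  fi
}

set_flag NETWORK_ISOLATION true      # WAF deployment
set_flag APP_SAMPLE_ENABLED true     # sample app deployment
```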
  6. The automated model quota check runs before any resources are deployed, verifying that the selected location has the necessary quota for the AI models listed in the parameters file. Image showing model quota pre-provision code executing

    If the location selected has sufficient quota for the models you plan to deploy, the provisioning will begin without notification.

    image showing model quota pre-provision pass

    If the location selected does not have the available quota for the models selected in your parameters, a message is returned before any resources are provisioned, allowing you to change the deployment location and try again. Note that in our example, Italy North had capacity for gpt-4o but not for text-embedding-ada-002. This terminated the entire provisioning, because both models could not be deployed due to the quota issue.

    image showing model quota pre-provision fail
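You can also inspect quota manually before retrying in another region. The Azure CLI command in the comment reports current usage and limits per model; the small helper below just formalizes the comparison the pre-provision check performs (numbers here are made up for illustration):

```shell
# Manual quota inspection for a candidate region, e.g.:
#   az cognitiveservices usage list --location italynorth -o table
# quota_ok compares current usage + requested capacity against the limit.
quota_ok() {
  limit=$1 current=$2 requested=$3
  [ $((current + requested)) -le "$limit" ]
}

# Illustrative figures: limit 100, 60 in use, 20 requested.
if quota_ok 100 60 20; then echo "sufficient quota"; else echo "insufficient quota"; fi
```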

  7. After you complete the required parameters and the model quota validation succeeds, provisioning runs and deploys the network-isolated AI Foundry development portal and its dependent resources in about 20 minutes.

Post Deployment Steps:

These steps verify that the isolated environment was set up correctly, including the creation of the required private endpoints (when networkIsolation = true).

One way to verify that access to the foundry is private is to launch Azure AI Foundry from the portal.

Image showing if network isolation is checked

A user who is not connected through the virtual network via an approved RDP connection will see the following screen in their browser. This is the intended behavior!

Image showing the virtual machine in the browser

A more thorough check is to open the networking settings and look for the private endpoints.

  1. Go to the Azure Portal and select your Azure AI hub that was just created.

  2. Click on Resource Management and then Networking.

    Image showing the Azure Portal for AI Foundry Hub and the settings blade

    Here, you will find the private endpoints that are connected to the resources within the foundry managed virtual network. Ensure that these private endpoints are active. The foundry should show that Public access is ‘disabled’.
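The same check can be scripted with the Azure CLI: `az network private-endpoint list` returns the endpoints in a resource group. The snippet below parses a sample of that output; the JSON shape shown is abbreviated and illustrative:

```shell
# List private endpoints in the deployment's resource group, e.g.:
#   az network private-endpoint list -g <resource-group> -o json
# Sample (abbreviated, illustrative) output:
sample='[{"name":"pe-aifoundry","provisioningState":"Succeeded"}]'

if printf '%s' "$sample" | grep -q '"provisioningState":"Succeeded"'; then
  echo "private endpoint active"
else
  echo "no active private endpoint found"
fi
```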

Connecting to the isolated network via RDP

  1. Navigate to the resource group where the isolated AI Foundry was deployed to and select the virtual machine.

    Image showing the Azure Portal for the virtual machine

  2. Be sure that the Virtual Machine is running. If not, start the VM.

    Image showing the Azure Portal VM and the start/stop button

  3. Select “Bastion” under the ‘Connect’ heading in the VM resource.

    Image showing the bastion blade selected

  4. Supply the username and the password you created as environment variables and press the connect button.

    Image showing the screen to enter the VM Admin info and the connect to bastion button

  5. Your virtual machine will launch and you will see a different screen.

    Image showing the opening of the Virtual machine in another browser tab

  6. Launch the Edge browser and navigate to Azure AI Foundry at https://ai.azure.com. Sign in using your credentials.

  7. You will be challenged by MFA to connect.

    Image showing the Multi Factor Authentication popup

  8. You will now be able to view the Azure AI Foundry which is contained in an isolated network.

    Image showing the Azure Foundry AI Hub with a private bubble icon

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party's policies.