Connect Grok from Azure AI Foundry to GitHub Copilot Chat

Now that Grok-3 from xAI is available in Azure AI Foundry, I wanted to try it out with GitHub Copilot Chat. This is possible through the Bring Your Own Key feature released in Visual Studio Code, which allows you to connect other AI platforms, such as Anthropic, Ollama (local), and Azure, to GitHub Copilot Chat.

Grok is a model that excels in different technical and scientific areas, such as math, code generation and debugging, technical documentation, and much more.

In this blog, you will learn how to deploy Azure AI Foundry using Azure Bicep and how to connect and configure Grok-3 from Azure AI Foundry to GitHub Copilot Chat.

Deploying Azure AI Foundry with Bicep

Below is a Bicep template that deploys an Azure AI Foundry instance using minimal configuration, along with a model deployment of Grok-3 using the GlobalStandard SKU.

param parName string = 'aifoundry-${uniqueString(resourceGroup().id)}'
param parLocation string = resourceGroup().location

resource resAzureAIFoundry 'Microsoft.CognitiveServices/accounts@2025-04-01-preview' = {
  name: parName
  location: parLocation
  sku: {
    name: 'S0'
  }
  kind: 'AIServices'
  properties: {
    allowProjectManagement: false
    publicNetworkAccess: 'Enabled'
    customSubDomainName: parName
  }
}

resource resGrokDeployment 'Microsoft.CognitiveServices/accounts/deployments@2025-04-01-preview' = {
  name: 'grok-3'
  parent: resAzureAIFoundry
  sku: {
    name: 'GlobalStandard'
    capacity: 1
  }
  properties: {
    model: {
      name: 'grok-3'
      format: 'xAI'
      version: '1'
    }
    versionUpgradeOption: 'OnceNewDefaultVersionAvailable'
  }
}
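The template can be deployed with the Azure CLI (`az deployment group create`). As a sketch, here is a small Python helper that assembles that command; the resource group name is a placeholder, and actually running it requires the Azure CLI and an active login:

```python
import subprocess

def deploy_foundry(resource_group: str, template: str = "main.bicep") -> list[str]:
    """Build the az CLI command that deploys the Bicep template to a resource group."""
    cmd = [
        "az", "deployment", "group", "create",
        "--resource-group", resource_group,
        "--template-file", template,
    ]
    # Uncomment to actually deploy (requires the Azure CLI and `az login`):
    # subprocess.run(cmd, check=True)
    return cmd

print(" ".join(deploy_foundry("rg-grok-demo")))
```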

In the Azure AI Foundry portal, check whether Grok-3 has been deployed correctly. You can access Azure AI Foundry at https://ai.azure.com. Then follow these steps:

  1. Make sure you are in the instance you deployed via the Bicep template. Select your instance.
  2. Navigate to Playgrounds.
  3. Choose the Chat playground and verify that the Grok-3 deployment is available.
Azure AI Foundry portal model playground showing grok-3 deployment

Change the URL of the deployment model

At the time of writing, GitHub Copilot Chat only supports models served from OpenAI URLs in the following format: https://<foundry-instance-name>.openai.azure.com. However, the Grok model is served from: https://<foundry-instance-name>.services.ai.azure.com/models.

To make it work with GitHub Copilot Chat, you need to update the URL so it’s served from https://*.openai.azure.com instead:

https://<foundry-instance-name>.openai.azure.com/openai/deployments/<grok-deployment-name>/chat/completions?api-version=2025-01-01-preview

In my case, the foundry-instance-name would be aifoundry-msdctmffwp5fs and grok-deployment-name would be grok-3.
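As a minimal sketch, the URL rewrite can be expressed in a few lines of Python; the instance and deployment names below are the ones from my environment, so substitute your own:

```python
def copilot_chat_url(instance: str, deployment: str,
                     api_version: str = "2025-01-01-preview") -> str:
    # GitHub Copilot Chat expects the *.openai.azure.com host,
    # not the *.services.ai.azure.com/models host that Foundry shows.
    return (f"https://{instance}.openai.azure.com/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")

print(copilot_chat_url("aifoundry-msdctmffwp5fs", "grok-3"))
# → https://aifoundry-msdctmffwp5fs.openai.azure.com/openai/deployments/grok-3/chat/completions?api-version=2025-01-01-preview
```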

Configuring GitHub Copilot Chat to use Grok-3

Now that the model is deployed in your own Azure environment, you can configure it to be used with GitHub Copilot Chat. Make sure you have copied the endpoint API key:

Azure AI Foundry endpoint API key location

Set up GitHub Copilot Chat

To connect to the Grok-3 model, a one-time configuration is required.

1. Open GitHub Copilot Chat

First, open the GitHub Copilot pane via the GitHub Copilot icon in Visual Studio Code. Alternatively, use the keyboard shortcut Ctrl + Cmd + I (Mac) or Ctrl + Alt + I (Windows).

Open GitHub Copilot pane

2. Manage models

Click on the model selector. This opens a menu listing the available models. In this menu, click on the option Manage Models….

Managing models in GitHub Copilot Chat

3. Configure AI Provider and Model

After you click on Manage Models..., a modal opens showing a list of providers, including Anthropic, Azure, Ollama, and more. Select Azure.

AI provider selection pane

After selecting Azure, you will be prompted to enter a custom Azure model ID. I used grok-23052025 as the ID.

GitHub Copilot custom model ID configuration

Next, you will be asked to provide the deployment URL. This is the URL of your model deployment in Azure AI Foundry. Make sure to use the *.openai.azure.com format.

GitHub Copilot custom model deployment URL configuration

Finally, you need to configure the API key:

GitHub Copilot custom model API key configuration
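If you want to sanity-check the endpoint and key outside of VS Code, a request can be assembled with the standard library as sketched below. This assumes the Azure OpenAI-style `api-key` header; the endpoint, deployment name, and key are placeholders, and nothing is sent until you call `urlopen`:

```python
import json
import urllib.request

# Placeholders: substitute your own instance name, deployment name, and key.
ENDPOINT = ("https://aifoundry-msdctmffwp5fs.openai.azure.com/openai/deployments/"
            "grok-3/chat/completions?api-version=2025-01-01-preview")
API_KEY = "<your-api-key>"

def build_chat_request(endpoint: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Assemble a chat-completions request; nothing is sent yet."""
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]}).encode()
    return urllib.request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )

req = build_chat_request(ENDPOINT, API_KEY, "Who is the creator of Grok?")
# To actually send it: urllib.request.urlopen(req)
```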

4. Token configuration

In this step, you can configure token limits, a friendly name, and whether the model supports tool calling and/or vision. For most users, the default settings will work fine.

GitHub Copilot custom model token configuration

After this step, a pop-up appears confirming that the registration was successful:

Successfully added grok-3 model to GitHub Copilot Chat

5. Select the model

In the model selector, you will now see that grok-3 is available from Azure. The model can now be used in GitHub Copilot Chat.

GitHub Copilot model selection pop-up showing grok-3 from Azure AI Foundry
“Who is the creator of Grok?”

Using grok-3 with GitHub Copilot Chat

Below are a few examples with grok-3 in action:

grok-3 replacing the contents of a div with a contact form

Benefits

Bring Your Own Key gives you flexibility over which AI providers power GitHub Copilot Chat. It also allows you to swap out generic models for fine-tuned or tailored ones.

If you connect GitHub Copilot Chat with Azure AI Foundry:

  • Enhanced security and data privacy: By using your own instance, data generated in GitHub Copilot Chat stays within your organisation. Perfect for sensitive data or private codebases.
  • Customisation: You get the ability to use fine-tuned or tailored models that fit your business needs.
  • Monitoring and observability: You get an enterprise-grade GenAI Ops monitoring toolkit for things like token consumption, request rates, and latency. You can monitor how Grok-3 is performing in real time, ensuring model usage is reliable, safe, and high quality.

Conclusion

This is how you can connect almost any AI platform to GitHub Copilot Chat with your own API keys. You can also run frontier models without any data leaving your Azure tenant, making GitHub Copilot Chat more secure and strengthening data privacy. Note that this does not replace the code completion model GitHub Copilot uses; it only replaces the model in the chat functionality.

Learn more about Grok-3 on Azure AI Foundry here: https://devblogs.microsoft.com/foundry/announcing-grok-3-and-grok-3-mini-on-azure-ai-foundry/?WT.mc_id=MVP_323261
