Hello oya,
Thank you for reaching out to Microsoft Q&A.
Even though:
- Foundry hub and Databricks are in the same VNet
- Private Link is configured
- Public access to Databricks is disabled
The failure happens because Azure AI Foundry agent tool execution does NOT run inside your VNet by default.
So when the Genie tool runs:
- It tries to call the Databricks APIs over the public internet
- Databricks blocks the call (since only Private Endpoint access is allowed)
- Result: "Error occurred while processing your request"
Clarification
You mentioned “Deploy Foundry agents into VNet”.
That would solve the problem if fully supported, but today:
- Foundry workspace/hub can be VNet integrated
- But agent runtime is still Microsoft-managed
- It does not automatically inherit your VNet or Private Endpoints
So even with Private Link configured correctly, the agent itself cannot directly use it.
What a fully private setup would require
If this were fully supported, your setup would look like:
- VNet-injected runtime
  - Agent runs inside your VNet
  - All outbound traffic goes through your network
- Databricks Private Link
  - Private endpoints for:
    - api.azuredatabricks.net (control plane)
    - {workspace-id}.privatelink.azuredatabricks.net (data plane)
  - Private DNS zone: privatelink.azuredatabricks.net linked to the VNet
- Network rules
  - NSG allows Agent subnet --> Databricks private endpoint (port 443)
  - Databricks allows that subnet/IP range
- Proper auth: Managed Identity / Service Principal
This architecture is correct in principle, but the missing piece is that the agent runtime is not inside that subnet today.
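If you do stand up Private Link, one quick way to sanity-check the Private DNS zone from a VM inside the VNet is to resolve the workspace hostname and confirm it maps to a private address. A minimal sketch (the hostname shown is hypothetical; substitute your own workspace's privatelink name):

```python
import ipaddress
import socket

def is_private_ip(ip: str) -> bool:
    """True if the address falls in a private (RFC 1918 / link-local) range."""
    return ipaddress.ip_address(ip).is_private

def check_private_resolution(hostname: str) -> bool:
    """Resolve hostname and report whether it maps to a private IP.
    If the Private DNS zone is linked correctly, the privatelink name
    should resolve to the private endpoint's VNet address (e.g. 10.x.x.x),
    not a public Databricks IP."""
    ip = socket.gethostbyname(hostname)
    print(f"{hostname} -> {ip} (private: {is_private_ip(ip)})")
    return is_private_ip(ip)

# Example (run from inside the VNet; hostname is hypothetical):
# check_private_resolution("adb-1234567890123456.7.privatelink.azuredatabricks.net")
```

If the name resolves to a public IP, the Private DNS zone link is the first thing to check.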
Supported solution
Since direct Private Link access from Foundry agents isn’t supported yet, you need a proxy layer inside your VNet.
Use an intermediate service
Deploy one of these inside your VNet:
- Azure Function (best choice)
- Azure Container Apps
- AKS / App Service
Flow:
Foundry Agent → HTTPS call → Your API (public endpoint) → Private Endpoint → Databricks
Why this works
The agent can call a public HTTPS endpoint.
Your service:
- Lives inside VNet
- Uses Private Endpoint to reach Databricks
- Fully respects your network restrictions
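As a sketch of what that proxy does internally, assuming a workspace host and Genie space ID (both hypothetical here), the service just forwards the question over its VNet-integrated connection. The REST path below follows the shape of the Databricks Genie Conversation API, but verify it against the current Databricks documentation:

```python
import json
import urllib.request

def genie_start_url(workspace_host: str, space_id: str) -> str:
    """Build the Genie start-conversation URL for a workspace.
    From inside the VNet, this hostname resolves (via Private DNS)
    to the private endpoint rather than a public IP."""
    return f"https://{workspace_host}/api/2.0/genie/spaces/{space_id}/start-conversation"

def ask_genie(workspace_host: str, space_id: str, token: str, question: str) -> dict:
    """Forward a question to Genie over the private endpoint.
    Intended to run inside the VNet-integrated Azure Function that the
    Foundry agent calls; 'token' is a PAT or Entra ID bearer token."""
    req = urllib.request.Request(
        genie_start_url(workspace_host, space_id),
        data=json.dumps({"content": question}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The Foundry agent only ever sees your function's public HTTPS endpoint; the Databricks call above happens entirely inside the VNet.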
What to configure in this model
- Databricks Private Link
  - Private endpoints for the control plane (api.azuredatabricks.net) and the workspace endpoint
  - Private DNS zone configured and linked
- Proxy service
  - Deploy an Azure Function in the same VNet
  - Enable VNet integration
  - Use a Managed Identity or a PAT token
  - Call the Databricks Genie API internally
- Networking
  - NSG: allow Function subnet --> Databricks private endpoint (443)
  - No need to expose Databricks publicly
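For the Managed Identity option, the Function can fetch an Entra ID token for Databricks from the Azure Instance Metadata Service. A minimal sketch; the resource GUID below is the well-known Azure Databricks application ID, but confirm it for your cloud before relying on it:

```python
import json
import urllib.parse
import urllib.request

# Well-known Entra application ID for Azure Databricks (verify for your cloud).
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def imds_token_url(resource: str) -> str:
    """Build the IMDS request URL for a managed identity token."""
    query = urllib.parse.urlencode({
        "api-version": "2018-02-01",
        "resource": resource,
    })
    return f"http://169.254.169.254/metadata/identity/oauth2/token?{query}"

def get_managed_identity_token(resource: str = DATABRICKS_RESOURCE_ID) -> str:
    """Fetch a bearer token from IMDS.
    Only works on Azure compute (Function, VM, etc.) with a
    system- or user-assigned managed identity enabled."""
    req = urllib.request.Request(imds_token_url(resource), headers={"Metadata": "true"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

The returned token is then passed as the Bearer token on the Genie API calls, so no PAT has to be stored anywhere.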
Alternative
- Enable Databricks public access
- Restrict access using an IP allowlist and authentication
- Not recommended for strict private environments
Even though your VNet + Private Link setup is correct, Azure AI Foundry agents cannot directly use private endpoints today because their execution environment is not VNet-injected, so you must introduce a VNet-based proxy to bridge the connection securely.
Please refer to these:
Configure Private Link for Foundry: https://dori-uw-1.kuma-moon.com/en-us/azure/foundry/how-to/configure-private-link
Stuck or failed web service call (common REST API errors): https://docs.microsoft.com/azure/machine-learning/studio/web-service-error-codes
Unable to Login to Azure AI Foundry (network connectivity section): https://dori-uw-1.kuma-moon.com/azure/ai-studio/concepts/ai-resources
HTTP 500 / 503 / Service Unavailable (troubleshooting service outages): https://azure.microsoft.com/updates?id=492574
I hope this helps. Do let me know if you have any further queries.
Thank you!