Azure Functions MCP Support Goes GA: Secure Agent Integration for .NET Developers

TL;DR Microsoft has moved its Model Context Protocol (MCP) support for Azure Functions to General Availability, signaling a shift toward standardized, identity-secure agentic workflows. The MCP extension now supports .NET, Java, JavaScript, Python, and TypeScript, while a new self-hosted option lets developers deploy existing MCP SDK-based servers without code changes. If you’re building AI agents on Azure, this removes a major friction point around secure tool access.


Why This Matters Right Now

Anthropic's Model Context Protocol (MCP), often described as a "USB-C for AI" because it gives agents a standard way to talk to external tools like databases, search engines, and APIs, has proved to be the missing connective tissue and is quickly becoming the standard. OpenAI and Microsoft have publicly embraced MCP, and Anthropic recently donated it to the Linux Foundation's new Agentic AI Foundation. Translation: MCP is becoming the lingua franca for agent-to-tool communication. But it had a critical gap: security. By integrating native on-behalf-of (OBO) authentication and streamable HTTP transport, the GA release aims to solve the security pain point that has historically kept AI agents away from sensitive downstream enterprise data. For .NET teams, this is huge. You can now wire your Azure Functions directly into your AI agents without building custom authentication plumbing.


What Changed: The Developer Experience

Before (Preview)

  • MCP was available but required manual setup and didn’t natively handle Azure identity.
  • You’d need workarounds to pass credentials securely to agents (see the OBO sketch after this list).
  • Cold starts and latency were unpredictable.
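
To appreciate what those workarounds involved, here is a minimal sketch of the manual on-behalf-of (OBO) token exchange you previously had to hand-roll with Azure.Identity before calling a downstream API (tenant, client, secret, and scope values are placeholders):

// Manual OBO plumbing the preview left to you: exchange the caller's token
// for a downstream token yourself via Azure.Identity's OnBehalfOfCredential
using System.Threading.Tasks;
using Azure.Core;
using Azure.Identity;

public class DownstreamTokenExchanger
{
    public async Task<string> ExchangeAsync(string incomingUserAssertion)
    {
        var credential = new OnBehalfOfCredential(
            tenantId: "<tenant-id>",
            clientId: "<client-id>",
            clientSecret: "<client-secret>",
            userAssertion: incomingUserAssertion);

        // Request a token for the downstream API (Microsoft Graph as an example scope)
        var token = await credential.GetTokenAsync(
            new TokenRequestContext(new[] { "https://graph.microsoft.com/.default" }));

        return token.Token;
    }
}

With GA, the extension's built-in OBO support is meant to take this plumbing off your plate.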

Now (GA)

  • Built-in OBO authentication and streamable HTTP transport address the key security concerns out of the box.
  • Language support now spans .NET, Java, JavaScript, Python, and TypeScript.
  • A new self-hosted option lets you deploy existing MCP SDK-based servers without code changes, while still safeguarding sensitive data.

Practical Integration Path for .NET

Here’s how to wire an Azure Function as an MCP server for your agents:

// Azure Function with MCP support (isolated worker model)
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;

public class AgentToolFunction
{
    private readonly ILogger<AgentToolFunction> _logger;

    public AgentToolFunction(ILogger<AgentToolFunction> logger) => _logger = logger;

    [Function("QueryDatabase")]
    public async Task<object> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]
        HttpRequestData req,
        FunctionContext context)
    {
        _logger.LogInformation("QueryDatabase invoked as an agent tool");

        // Your tool logic here (QueryYourDatabase stands in for your own data-access code)
        // MCP handles auth via the OBO token automatically
        var result = await QueryYourDatabase(req);
        return result;
    }
}
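
If you adopt the MCP extension itself rather than a plain HTTP trigger, tools are typically declared through an MCP tool trigger. The sketch below follows the pattern of the preview-era extension samples; the McpToolTrigger and McpToolProperty attributes and the ToolInvocationContext type are assumptions to verify against the GA docs for Microsoft.Azure.Functions.Worker.Extensions.Mcp:

// Sketch only: assumes the MCP extension's McpToolTrigger / McpToolProperty
// attributes and ToolInvocationContext type; verify names against the GA docs
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Extensions.Mcp;
using Microsoft.Extensions.Logging;

public class SnippetTools
{
    private readonly ILogger<SnippetTools> _logger;

    public SnippetTools(ILogger<SnippetTools> logger) => _logger = logger;

    [Function(nameof(GetSnippet))]
    public string GetSnippet(
        // Exposes this function to agents as the MCP tool "get_snippet"
        [McpToolTrigger("get_snippet", "Returns a stored code snippet by name.")]
        ToolInvocationContext context,
        // Tool input surfaced to the agent as a string property
        [McpToolProperty("name", "string", "Name of the snippet to fetch.")]
        string name)
    {
        _logger.LogInformation("get_snippet called for {Name}", name);
        return $"// contents of snippet '{name}' would go here";
    }
}

The idea is that the Functions host then surfaces these as tools on its MCP endpoint, so agents discover and call them like any other MCP server's tools.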

The key: Azure Functions offers multiple hosting plans tailored to different MCP server requirements. The Flex Consumption plan scales automatically with demand, bills per execution, and scales to zero, so idle MCP tools cost nothing while still waking up quickly. For mission-critical workflows, the Premium plan supports “always-ready” instances that stay pre-initialized, eliminating the cold-start latency that can otherwise cause SSE connection timeouts and sluggish agent response times.

Watch Out For

Current limitations practitioners should be aware of: nested arrays and complex types must be serialized as comma-separated strings when integrating with Azure AI Foundry, and programmatic tool approval via require_approval="never" is necessary for production workflows because UI-based approvals don't persist in automated deployments. Translation: if your tool returns deeply nested JSON, flatten it. And automate approvals from day one rather than relying on manual clicks in production.
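
To illustrate the flattening constraint, here is a minimal sketch; the helper and data shape are hypothetical, not part of the extension:

// Hypothetical helper: collapse a collection into the comma-separated string
// shape that Azure AI Foundry currently expects from tool results
using System.Collections.Generic;
using System.Linq;

public static class ToolResultFlattener
{
    // Join values into a single comma-separated string instead of returning string[]
    public static string Flatten(IEnumerable<string> values) =>
        string.Join(",", values.Select(v => v.Trim()));
}

// Usage inside a tool function:
// var tags = new[] { "azure", "functions", "mcp" };
// return ToolResultFlattener.Flatten(tags); // "azure,functions,mcp"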


Broader Context: The Agentic Shift

This GA release isn't isolated. With MCP reducing the friction of connecting agents to real systems, 2026 is likely to be the year agentic workflows finally move from demos into day-to-day practice. Meanwhile, the Azure SDK's latest major update adds support for the Microsoft Foundry Agents Service, integration with the new Azure.AI.Projects.OpenAI package, and expanded evaluation, insights, red teaming, and scheduling capabilities: a significant expansion of AI capabilities for .NET developers working with Azure AI services. The pieces are clicking into place. Secure tools, standardized protocols, and .NET-first tooling mean you can ship production agents without reinventing the wheel.


Further reading

  • https://www.infoq.com/news/2026/01/azure-functions-mcp-support/
  • https://learn.microsoft.com/en-us/azure/databricks/release-notes/product/2026/january
  • https://devblogs.microsoft.com/azure-sdk/azure-sdk-release-january-2026/
  • https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/
  • https://devblogs.microsoft.com/dotnet/dotnet-ai-essentials-the-core-building-blocks-explained/
  • https://learn.microsoft.com/en-us/dotnet/ai/dotnet-ai-ecosystem