Agents are only as capable as the tools you give them, and only as trustworthy as the governance behind those tools.
This blog post is the second in a six-part series called Agent Factory, which shares best practices, design patterns, and tools to help guide you through adopting and building agentic AI.
In the previous post, we explored five common design patterns of agentic AI, from tool use and reflection to planning, multi-agent collaboration, and adaptive reasoning. These patterns show how agents can be structured to achieve reliable, scalable automation in real-world environments.
Across the industry, we're seeing a clear shift. Early experiments centered on single-model prompts and static workflows. Now the conversation is about extensibility: how to give agents a broad, evolving set of capabilities without locking into one vendor or rewriting integrations for each new need. Platforms are competing on how quickly developers can:
- Integrate with hundreds of APIs, services, data sources, and workflows.
- Reuse those integrations across different teams and runtime environments.
- Maintain enterprise-grade control over who can call what, when, and with what data.
The lesson from the past year of agentic AI evolution is simple: agents are only as capable as the tools you give them, and only as trustworthy as the governance behind those tools.
Extensibility through open standards
In the early phases of agent development, integrating tools was often a bespoke, platform-specific effort. Each framework had its own conventions for defining tools, passing data, and handling authentication. This created several consistent blockers:
- Duplication of effort: the same internal API had to be wrapped differently for each runtime.
- Brittle integrations: small changes to schemas or endpoints could break multiple agents at once.
- Limited reusability: tools built for one team or environment were hard to share across projects or clouds.
- Fragmented governance: different runtimes enforced different security and policy models.
As organizations began deploying agents across hybrid and multi-cloud environments, these inefficiencies became major obstacles. Teams needed a way to standardize how tools are described, discovered, and invoked, regardless of the hosting environment.
That's where open protocols entered the conversation. Just as HTTP transformed the web by creating a common language for clients and servers, open protocols for agents aim to make tools portable, interoperable, and easier to govern.
One of the most promising examples is the Model Context Protocol (MCP), a standard for defining tool capabilities and I/O schemas so that any MCP-compliant agent can dynamically discover and invoke them. With MCP:
- Tools are self-describing, making discovery and integration faster.
- Agents can find and use tools at runtime without manual wiring.
- Tools can be hosted anywhere (on-premises, in a partner cloud, or in another business unit) without losing governance.
Azure AI Foundry supports MCP, enabling you to bring existing MCP servers directly into your agents. This gives you the benefits of open interoperability plus enterprise-grade security, observability, and management. Learn more about MCP at MCP Dev Days.
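To make the self-description idea concrete, here is a minimal Python sketch of an MCP-style tool registry. This is not the actual MCP SDK or wire format; the tool name, schema, and handler are invented to illustrate how a tool can advertise its capabilities so an agent discovers and invokes it at runtime without hand-written wiring:

```python
import json

# Illustrative MCP-style registry: each tool publishes a name, a
# description, and a JSON Schema for its inputs, so an agent can
# discover and call it by metadata alone.
TOOLS = {}

def register_tool(name, description, input_schema):
    """Register a handler along with its self-describing metadata."""
    def decorator(fn):
        TOOLS[name] = {
            "description": description,
            "inputSchema": input_schema,
            "handler": fn,
        }
        return fn
    return decorator

@register_tool(
    name="get_order_status",
    description="Look up the status of a customer order by ID.",
    input_schema={
        "type": "object",
        "properties": {"order_id": {"type": "string"}},
        "required": ["order_id"],
    },
)
def get_order_status(order_id: str) -> dict:
    # Stand-in for a real backend call.
    return {"order_id": order_id, "status": "shipped"}

def list_tools() -> str:
    """What an agent sees at discovery time: metadata only, no code."""
    return json.dumps(
        {name: {k: v for k, v in t.items() if k != "handler"}
         for name, t in TOOLS.items()},
        indent=2,
    )

def invoke(name: str, arguments: dict) -> dict:
    """Dispatch a call by name, as a runtime would after discovery."""
    return TOOLS[name]["handler"](**arguments)

print(invoke("get_order_status", {"order_id": "A-1001"}))
```

The key property is that `list_tools` exposes everything an agent needs to decide whether and how to call a tool, which is what makes hosting location irrelevant to the caller.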

Once you have a standard for portability through open protocols like MCP, the next question becomes: what kinds of tools should your agents have, and how do you organize them so they can deliver value quickly while staying adaptable?
In Azure AI Foundry, we think of this as building an enterprise toolchain: a layered set of capabilities that balances speed (getting something valuable working immediately), differentiation (capturing what makes your business unique), and reach (connecting across all the systems where work actually happens).
1. Built-in tools for immediate value: Azure AI Foundry includes ready-to-use tools for common enterprise needs: searching across SharePoint and data lakes, executing Python for data analysis, performing multi-step web research with Bing, and triggering browser automation tasks. These aren't just conveniences; they let teams stand up functional, high-value agents in days instead of weeks, without the friction of early integration work.

2. Custom tools for your competitive edge: Every organization has proprietary systems and processes that can't be replicated by off-the-shelf tools. Azure AI Foundry makes it easy to wrap these as agentic AI tools, whether they're APIs from your ERP, a manufacturing quality control system, or a partner's service. By invoking them through OpenAPI or MCP, these tools become portable and discoverable across teams, projects, and even clouds, while still benefiting from Foundry's identity, policy, and observability layers.
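As a hedged sketch of what wrapping a proprietary API looks like, the snippet below pairs a minimal OpenAPI-style operation description with a thin caller that validates arguments before dispatch. The endpoint, operation, and parameter names are all invented for illustration; a real integration would register the full OpenAPI spec with Foundry and make an authenticated HTTP call:

```python
# Hypothetical internal "quality control" API exposed as a tool.
# The operation description doubles as the tool's contract.
OPERATION = {
    "operationId": "getDefectRate",
    "summary": "Return the defect rate for a production line.",
    "parameters": [
        {"name": "line_id", "in": "query", "required": True,
         "schema": {"type": "string"}},
        {"name": "shift", "in": "query", "required": False,
         "schema": {"type": "string"}},
    ],
}

def _backend_call(params: dict) -> dict:
    # Stand-in for an authenticated HTTP request to the real service.
    return {"line_id": params["line_id"], "defect_rate": 0.013}

def call_operation(op: dict, params: dict) -> dict:
    """Validate arguments against the operation description, then call."""
    for p in op["parameters"]:
        if p["required"] and p["name"] not in params:
            raise ValueError(f"missing required parameter: {p['name']}")
    return _backend_call(params)

print(call_operation(OPERATION, {"line_id": "L7"}))
```

Because the contract travels with the tool, the same wrapper can be registered in any runtime that understands OpenAPI, which is what makes these tools portable across teams and clouds.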

3. Connectors for maximum reach: Through Azure Logic Apps, Foundry can connect agents to over 1,400 SaaS and on-premises systems: CRM, ERP, ITSM, data warehouses, and more. This dramatically reduces integration lift, allowing you to plug into existing business processes without building every connector from scratch.

One example of this toolchain in action comes from NTT DATA, which built agents in Azure AI Foundry that integrate Microsoft Fabric Data Agent alongside other enterprise tools. These agents allow employees across HR, operations, and other functions to interact naturally with data, surfacing real-time insights and enabling actions, reducing time-to-market by 50% and giving non-technical users intuitive, self-service access to business intelligence.
Extensibility must be paired with governance to move from prototype to enterprise-ready automation. Azure AI Foundry addresses this with a secure-by-default approach to tool management:
- Authentication and identity in built-in connectors: Enterprise-grade connectors, such as SharePoint and Microsoft Fabric, already use on-behalf-of (OBO) authentication. When an agent invokes these tools, Foundry ensures that the call respects the end user's permissions via managed Entra IDs, preserving existing authorization rules. With Microsoft Entra Agent ID, every agentic project created in Azure AI Foundry automatically appears in an agent-specific application view within the Microsoft Entra admin center. This gives security teams a unified directory view of all the agents and agent applications they need to manage across Microsoft. This integration marks the first step toward standardizing governance for AI agents company-wide. While Entra ID is native, Azure AI Foundry also supports integrations with external identity systems. Through federation, customers who use providers such as Okta or Google Identity can still authenticate agents and users to call tools securely.
- Custom tools with OpenAPI and MCP: OpenAPI-specified tools enable seamless connectivity using managed identities, API keys, or unauthenticated access. These tools can be registered directly in Foundry and align with standard API design best practices. Foundry is also expanding MCP security to include stored credentials, project-level managed identities, and third-party OAuth flows, along with secure private networking, advancing toward a fully enterprise-grade, end-to-end MCP integration model.
- API governance with Azure API Management (APIM): APIM provides a powerful control plane for managing tool calls: it enables centralized publishing, policy enforcement (authentication, rate limits, payload validation), and monitoring. Additionally, you can deploy self-hosted gateways inside VNets or on-premises environments to enforce enterprise policies close to backend systems. Complementing this, Azure API Center acts as a centralized, design-time API inventory and discovery hub, allowing teams to register, catalog, and manage private MCP servers alongside other APIs. These capabilities provide the same governance you expect for your APIs, extended to agentic AI tools without extra engineering.
- Observability and auditability: Every tool invocation in Foundry, whether internal or external, is traced with step-level logging. This includes identity, tool name, inputs, outputs, and outcomes, enabling continuous reliability monitoring and simplified auditing.
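To show the shape of that step-level audit trail, here is a small Python sketch of a decorator that records identity, tool name, inputs, outputs, and outcome for every invocation. The tool, caller, and log structure are illustrative; in Foundry this telemetry is captured by the platform rather than written by hand:

```python
import functools
import time

AUDIT_LOG = []  # In production this would flow to a tracing backend.

def audited(tool_name):
    """Record identity, tool name, inputs, outputs, and outcome per call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(caller_id, **kwargs):
            entry = {"caller": caller_id, "tool": tool_name,
                     "inputs": kwargs, "ts": time.time()}
            try:
                result = fn(**kwargs)
                entry.update(outcome="success", output=result)
                return result
            except Exception as exc:
                entry.update(outcome="error", error=str(exc))
                raise
            finally:
                # The entry is appended whether the call succeeded or failed,
                # so the audit trail never has gaps.
                AUDIT_LOG.append(entry)
        return wrapper
    return decorator

@audited("lookup_invoice")
def lookup_invoice(invoice_id):
    # Stand-in for a real tool backend.
    return {"invoice_id": invoice_id, "amount": 420.0}

lookup_invoice("user:alice@example.com", invoice_id="INV-9")
print(AUDIT_LOG[-1]["outcome"])  # prints: success
```

Capturing failures in the same structure as successes is what makes the log usable for both reliability monitoring and after-the-fact auditing.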
Enterprise-grade management ensures tools are secure and observable, but success also depends on how you design and operate them from day one. Drawing on Azure AI Foundry guidance and customer experience, several principles stand out:
- Start with the contract. Treat every tool like an API product. Define clear inputs, outputs, and error behaviors, and keep schemas consistent across teams. Avoid overloading a single tool with multiple unrelated actions; smaller, single-purpose tools are easier to test, monitor, and reuse.
- Choose the right packaging. For proprietary APIs, decide early whether OpenAPI or MCP best fits your needs. OpenAPI tools are straightforward for well-documented REST APIs, while MCP tools excel when portability and cross-environment reuse are priorities.
- Centralize governance. Publish custom tools behind Azure API Management or a self-hosted gateway so that authentication, throttling, and payload inspection are enforced consistently. This keeps policy logic out of tool code and makes changes easier to roll out.
- Bind every action to identity. Always know which user or agent is invoking the tool. For built-in connectors, leverage identity passthrough or OBO. For custom tools, use Entra ID or the appropriate API key/credential model, and apply least-privilege access.
- Instrument early. Add tracing, logging, and evaluation hooks before moving to production. Early observability lets you track performance trends, detect regressions, and tune tools without downtime.
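As a small illustration of binding every action to identity with least-privilege access, the sketch below gates each tool behind a set of required scopes. The scope names and token shape are invented; in practice they would come from Entra ID (or a federated provider's) claims:

```python
# Hypothetical mapping from tool name to the scopes it requires.
TOOL_SCOPES = {
    "read_report": {"reports.read"},
    "delete_report": {"reports.read", "reports.delete"},
}

def authorize(caller_scopes: set, tool: str) -> bool:
    """Allow a call only if the caller holds every scope the tool needs."""
    return TOOL_SCOPES[tool] <= caller_scopes

# Two illustrative callers with different levels of access.
analyst = {"reports.read"}
admin = {"reports.read", "reports.delete"}

print(authorize(analyst, "read_report"))    # True
print(authorize(analyst, "delete_report"))  # False
print(authorize(admin, "delete_report"))    # True
```

Keeping the scope check outside the tool body means the same least-privilege policy can be enforced by a gateway rather than re-implemented in every tool.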
Following these practices ensures that the tools you integrate today remain secure, portable, and maintainable as your agent ecosystem grows.
What's next
In part three of the Agent Factory series, we'll focus on observability for AI agents: how to trace every step, evaluate tool performance, and monitor agent behavior in real time. We'll cover the built-in capabilities in Azure AI Foundry, integration patterns with Azure Monitor, and best practices for turning telemetry into continuous improvement.
Did you miss the first post in the series? Check it out: The new era of agentic AI: common use cases and design patterns.