
Built for Agentic Scale and Cloud-Native Apps



2025 was a pivotal year for Azure Storage, and we're heading into 2026 with a clear focus on helping customers turn AI into real impact. As outlined in last December's Azure Storage innovations: Unlocking the future of data, Azure Storage is evolving into a unified intelligent platform that supports the full AI lifecycle at enterprise scale, with the performance modern workloads demand.

Looking ahead to 2026, our investments span the full breadth of that lifecycle as AI becomes foundational across every industry. We're advancing storage performance for frontier model training, delivering purpose-built solutions for large-scale AI inferencing and emerging agentic applications, and empowering cloud-native applications to operate at agentic scale. In parallel, we're simplifying adoption for mission-critical workloads, lowering TCO, and deepening partnerships to co-engineer AI-optimized solutions with our customers.

We're grateful to our customers and partners for their trust and collaboration, and excited to shape the next chapter of Azure Storage together in the year ahead.

Extending from training to inference

AI workloads extend from large, centralized model training to inference at scale, where models are applied continuously across products, workflows, and real-world decision making. LLM training continues to run on Azure, and we're investing to stay ahead by expanding scale, improving throughput, and optimizing how model files, checkpoints, and training datasets flow through storage.

Innovations that helped OpenAI operate at unprecedented scale are now available to all enterprises. Blob scaled accounts allow storage to scale across hundreds of scale units within a region, handling the millions of objects required to put enterprise data to work as training and tuning datasets for applied AI. Our partnership with NVIDIA DGX on Azure shows how that scale translates into real-world inference. DGX Cloud was co-engineered to run on Azure, pairing accelerated compute with high-performance storage, Azure Managed Lustre (AMLFS), to support LLM research, automotive, and robotics applications. AMLFS offers the best price-performance for keeping GPU fleets continuously fed. We recently launched preview support for 25 PiB namespaces and up to 512 GBps of throughput, making AMLFS the best-in-class managed Lustre deployment in the cloud.
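Spreading millions of objects across hundreds of scale units implies a placement function that maps each object name onto a unit. Azure's actual placement logic is not public; the following is a minimal hash-partitioning sketch (the unit count, blob naming, and helper are invented for illustration) showing why such a scheme keeps the distribution even:

```python
import hashlib

def scale_unit_for(blob_name: str, num_scale_units: int = 300) -> int:
    """Map a blob name to one of the account's scale units.

    Hypothetical sketch, not Azure's real algorithm: hashing the name
    gives a uniform spread, so no single unit becomes a hot spot.
    """
    digest = hashlib.sha256(blob_name.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_scale_units

# Distribute a synthetic training dataset and check the spread.
counts: dict[int, int] = {}
for i in range(100_000):
    unit = scale_unit_for(f"datasets/train/shard-{i:06d}.parquet")
    counts[unit] = counts.get(unit, 0) + 1

print(len(counts))           # every scale unit receives objects
print(max(counts.values()))  # close to the mean of ~333 per unit
```

Real systems typically add an indirection layer on top of the hash so units can be added without remapping every object, but the load-spreading intuition is the same.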

Looking ahead, we're deepening integration across popular first- and third-party AI frameworks such as Microsoft Foundry, Ray, Anyscale, and LangChain, enabling seamless connections to Azure Storage out of the box. Our native Azure Blob Storage integration within Foundry enables enterprise data consolidation into Foundry IQ, making blob storage the foundational layer for grounding enterprise knowledge, fine-tuning models, and serving low-latency context to inference, all under the tenant's security and governance controls.

From training through full-scale inferencing, Azure Storage supports the entire agent lifecycle: distributing large model files efficiently, storing and retrieving long-lived context, and serving data from RAG vector stores. By optimizing for each pattern end to end, Azure Storage offers performant solutions for every stage of AI inference.
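Serving data from a RAG vector store ultimately comes down to nearest-neighbor search over embeddings. As a toy illustration (three-dimensional hard-coded vectors stand in for real embeddings and a managed index; the document names are made up), cosine-similarity retrieval looks like:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "vector store": in practice these embeddings would come from an
# embedding model and live in a real index, not an in-memory dict.
store = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "warranty terms": [0.7, 0.2, 0.1],
}

def retrieve(query_vec: list[float], k: int = 2) -> list[str]:
    """Return the k document keys most similar to the query vector."""
    ranked = sorted(store.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve([1.0, 0.0, 0.1]))  # ['refund policy', 'warranty terms']
```

At production scale the linear scan is replaced by an approximate index, but the similarity ranking that feeds retrieved context into inference is the same idea.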

Evolving cloud-native applications for agentic scale

As inference becomes the dominant AI workload, autonomous agents are reshaping how cloud-native applications interact with data. Unlike human-driven systems with predictable query patterns, agents operate continuously, issuing an order of magnitude more queries than traditional users ever did. This surge in concurrency stresses databases and storage layers, pushing enterprises to rethink how they architect new cloud-native applications.
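A common defense against agent burst traffic is admission control in front of the data layer. A minimal token-bucket sketch (the class, rates, and burst size are illustrative assumptions, not an Azure API) shows how bursts get smoothed before they hit storage:

```python
class TokenBucket:
    """Minimal token-bucket throttle (illustrative, not an Azure API)."""

    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec        # steady-state refill rate
        self.capacity = burst           # maximum burst size
        self.tokens = float(burst)
        self.last = 0.0

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, then spend one token.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate_per_sec=100, burst=10)

# An agent fires 50 requests in the same instant: only the burst passes,
# the rest must back off and retry.
admitted = sum(bucket.allow(now=0.0) for _ in range(50))
print(admitted)  # 10
```

Per-tenant buckets like this are one way a multi-tenant data platform can keep one agent's burst from starving its neighbors.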

Azure Storage is building with SaaS leaders like ServiceNow, Databricks, and Elastic to optimize for agentic scale, leveraging our block storage portfolio. Looking forward, Elastic SAN becomes a core building block for these cloud-native workloads, starting with transforming Microsoft's own database offerings. It provides fully managed block storage pools that let diverse workloads share provisioned resources, with guardrails for hosting multi-tenant data. We're pushing the boundaries on maximum scale units to enable denser packing, and building capabilities for SaaS providers to manage agentic traffic patterns.

As cloud-native workloads adopt Kubernetes to scale rapidly, we're simplifying the development of stateful applications through our Kubernetes-native storage orchestrator, Azure Container Storage (ACStor), alongside CSI drivers. Our latest ACStor release signals two directional changes that will guide upcoming investments: adopting the Kubernetes operator model to perform more complex orchestration, and open sourcing the code base to collaborate and innovate with the broader Kubernetes community.

Together, these investments establish a strong foundation for the next generation of cloud-native applications, where storage must scale seamlessly and deliver high efficiency to serve as the data platform for agentic-scale systems.

Breaking price-performance barriers for mission-critical workloads

Beyond evolving AI workloads, enterprises continue to grow their mission-critical workloads on Azure.

SAP and Microsoft are partnering to extend core SAP performance while introducing AI-driven agents like Joule that enrich Microsoft 365 Copilot with enterprise context. Azure's latest M-series advancements add substantial scale-up headroom for SAP HANA, pushing disk storage performance to ~780K IOPS and 16 GB/s throughput. For shared storage, Azure NetApp Files (ANF) and Azure Premium Files deliver the high-throughput NFS/SMB foundations SAP landscapes rely on, while optimizing TCO with the ANF Flexible service level and Azure Files Provisioned v2. Coming soon, we'll introduce an Elastic ZRS storage service level in ANF, bringing zone-redundant high availability and consistent performance through synchronous replication across availability zones, leveraging Azure's ZRS architecture without added operational complexity.

Similarly, Ultra Disks have become foundational to platforms like BlackRock's Aladdin, which must react instantly to market shifts and sustain high performance under heavy load. With average latency well below 500 microseconds, support for 400K IOPS, and 10 GB/s throughput, Ultra Disks enable faster risk calculation, more agile portfolio management, and resilient performance on BlackRock's highest-volume trading days. When paired with Ebsv6 VMs, Ultra Disks can reach 800K IOPS and 14 GB/s for the most demanding mission-critical workloads. And with flexible provisioning, customers can tune performance precisely to their needs while optimizing TCO.
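Flexible provisioning means choosing IOPS and throughput targets to match the workload rather than overbuying capacity. A back-of-the-envelope sizing helper, reusing the per-disk figures quoted above and an assumed 20% headroom factor (the helper itself is hypothetical, not part of any Azure SDK):

```python
def size_ultra_disk(need_iops: int, need_mbps: int,
                    max_iops: int = 400_000, max_mbps: int = 10_000,
                    headroom: float = 1.2) -> tuple[int, int]:
    """Pick provisioned performance targets for a single disk.

    Illustrative arithmetic only: 400K IOPS / 10 GB/s (10,000 MB/s) are
    the per-disk figures mentioned above; the 20% headroom factor is an
    assumption, and real deployments would size from measured load.
    """
    iops = min(int(need_iops * headroom), max_iops)
    mbps = min(int(need_mbps * headroom), max_mbps)
    return iops, mbps

# A workload measured at 150K IOPS / 4 GB/s, with headroom applied:
print(size_ultra_disk(150_000, 4_000))  # (180000, 4800)

# Demands near the ceiling are capped at the per-disk limits:
print(size_ultra_disk(380_000, 9_500))  # (400000, 10000)
```

Because provisioned IOPS and throughput are billed independently, sizing each to observed need plus headroom, instead of maxing both, is where the TCO savings come from.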

These combined investments give enterprises a more resilient, scalable, and cost-efficient platform for their most critical workloads.

Designing for new realities of power and supply

The global AI surge is straining power grids and hardware supply chains. Rising energy costs, tight datacenter budgets, and industry-wide HDD/SSD shortages mean organizations can't scale infrastructure simply by adding more hardware. Storage must become more efficient and intelligent by design.

We're streamlining the entire stack to maximize hardware performance with minimal overhead. Combined with intelligent load balancing and cost-effective tiering, we're uniquely positioned to help customers scale storage sustainably even as power and hardware availability become strategic constraints. With continued innovation on Azure Boost Data Processing Units (DPUs), we anticipate step-function gains in storage speeds and feeds at even lower per-unit energy consumption.
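Cost-effective tiering typically keys off access recency. A simplified policy sketch using Azure Blob Storage's hot/cool/archive tier names, with made-up day thresholds (real rules are configured declaratively through lifecycle management policies on the storage account, not code like this):

```python
from datetime import datetime, timedelta

def pick_tier(last_access: datetime, now: datetime,
              hot_days: int = 30, cool_days: int = 90) -> str:
    """Choose a storage tier from access recency.

    Hypothetical policy: the 30/90-day thresholds are invented for
    illustration; only the tier names correspond to real blob tiers.
    """
    age = now - last_access
    if age <= timedelta(days=hot_days):
        return "hot"
    if age <= timedelta(days=cool_days):
        return "cool"
    return "archive"

now = datetime(2026, 1, 1)
print(pick_tier(datetime(2025, 12, 20), now))  # hot
print(pick_tier(datetime(2025, 11, 1), now))   # cool
print(pick_tier(datetime(2025, 6, 1), now))    # archive
```

Colder tiers trade retrieval latency for lower cost per GB, which is exactly the lever that matters when hardware supply, rather than demand, is the constraint.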

AI pipelines can span on-premises estates, neocloud GPU clusters, and the cloud, yet many of these environments are limited by power capacity or storage supply. When these limits become a bottleneck, we make it easy to shift workloads to Azure. We're investing in integrations that make external datasets first-class citizens in Azure, enabling seamless access to training, fine-tuning, and inference data wherever it lives. As cloud storage evolves into AI-ready datasets, Azure Storage is introducing curated, pipeline-optimized experiences to simplify how customers feed data into downstream AI services.

Accelerating innovation through the storage partner ecosystem

We can't do this alone. Azure Storage works closely with strategic partners to push inference performance to the next level. In addition to the self-publishing capabilities available in Azure Marketplace, we go a step further by devoting expert resources to co-engineer solutions with partners, building highly optimized and deeply integrated services.

In 2026, you will see more co-engineered solutions like Commvault Cloud for Azure, Dell PowerScale, Azure Native Qumulo, Pure Storage Cloud, Rubrik Cloud Vault, and Veeam Data Cloud. We'll focus on hybrid solutions with partners like VAST Data and Komprise to enable data movement that unlocks the power of Azure AI services and infrastructure, fueling impactful customer AI agent and application initiatives.

To an exciting new year with Azure Storage

As we move into 2026, our vision remains simple: help every customer unlock more value from their data with storage that's faster, smarter, and built for the future. Whether powering AI, scaling cloud-native applications, or supporting mission-critical workloads, Azure Storage is here to help you innovate with confidence in the year ahead.


