From Linux kernel code to AI at scale, discover Microsoft’s open source evolution and impact.
Microsoft’s engagement with the open source community has transformed the company from a one-time skeptic into one of the world’s leading open source contributors. In fact, over the past three years, Microsoft Azure has been the largest public cloud contributor (and the second largest overall contributor) to the Cloud Native Computing Foundation (CNCF). So, how did we get here? Let’s look at some milestones in our journey and explore how open-source technologies are at the heart of the platforms powering many of Microsoft’s biggest products, like Microsoft 365, and massive-scale AI workloads, including OpenAI’s ChatGPT. Along the way, we have also launched and contributed to several open-source projects inspired by our own experiences, giving back to the community and accelerating innovation across the ecosystem.

Embracing open source: Key milestones in Microsoft’s journey
2009—A new leaf: 20,000 lines to Linux. In 2009, Microsoft contributed more than 20,000 lines of code to the Linux kernel, initially Hyper‑V drivers, under the General Public License, version 2 (GPLv2). It wasn’t our first open source contribution, but it was a visible moment that signaled a change in how we build and collaborate. In 2011, Microsoft was among the top five companies contributing to Linux. Today, 66% of customer cores in Azure run Linux.
2015—Visual Studio Code: An open source hit. In 2015, Microsoft launched Visual Studio Code (VS Code), a lightweight, open-source, cross-platform code editor. Today, Visual Studio and VS Code together have more than 50 million monthly active developers, with VS Code itself widely regarded as the most popular development environment. We believe AI experiences can thrive by leveraging the open-source community, just as VS Code has successfully done over the past decade. With AI becoming an integral part of the modern coding experience, we’ve released the GitHub Copilot Chat extension as open source on GitHub.
2018—GitHub and the “all-in” commitment. In 2018, Microsoft acquired GitHub, the world’s largest developer community platform, which was already home to 28 million developers and 85 million code repositories. This acquisition underscored Microsoft’s transformation. As CEO Satya Nadella said in the announcement, “Microsoft is all-in on open source… When it comes to our commitment to open source, judge us by the actions we have taken in the recent past, our actions today, and in the future.” In the 2024 Octoverse report, GitHub counted 518 million public or open-source projects, over 1 billion contributions in 2024, about 70,000 new public or open-source generative AI projects, and a roughly 59% year-over-year surge in contributions to generative AI projects.
Open source at enterprise scale: Powering the world’s most demanding workloads
Open-source technologies like Kubernetes and PostgreSQL have become foundational pillars of modern cloud-native infrastructure. Kubernetes is the second largest open-source project after Linux and now powers millions of containerized workloads globally, while PostgreSQL is one of the most widely adopted relational databases. Azure Kubernetes Service (AKS) and Azure’s managed PostgreSQL take the best of these open-source innovations and elevate them into robust, enterprise-ready managed services. By abstracting away the operational complexity of provisioning, scaling, and securing these platforms, AKS and managed PostgreSQL let organizations focus on building and innovating. The combination of open source flexibility and cloud-scale reliability allows services like Microsoft 365 and OpenAI’s ChatGPT to operate at massive scale while staying highly performant.
COSMIC: Microsoft’s geo-scale, managed container platform powers Microsoft 365’s transition to containers on AKS. It runs millions of cores and is one of the largest AKS deployments in the world. COSMIC bakes in security, compliance, and resilience while embedding architectural and operational best practices into our internal services. The result: drastically reduced engineering effort, faster time-to-market, and improved cost management, even while scaling to millions of monthly users around the globe. COSMIC uses Azure and open-source technologies to operate at planet-wide scale: Kubernetes Event-driven Autoscaling (KEDA) for autoscaling, and Prometheus and Grafana for real-time telemetry and dashboards, to name a few.
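To make the autoscaling piece concrete, here is a minimal sketch (not COSMIC’s actual configuration) of how a team might register a KEDA ScaledObject from Python using the official Kubernetes client, scaling a deployment on a Prometheus metric. The deployment name, namespace, Prometheus address, and query are all hypothetical placeholders.

```python
# Minimal sketch: create a KEDA ScaledObject via the Kubernetes API.
# Names, namespace, and the Prometheus query are illustrative placeholders,
# not COSMIC's real configuration.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

scaled_object = {
    "apiVersion": "keda.sh/v1alpha1",
    "kind": "ScaledObject",
    "metadata": {"name": "frontend-scaler", "namespace": "demo"},
    "spec": {
        # The workload KEDA should scale.
        "scaleTargetRef": {"name": "frontend"},
        "minReplicaCount": 2,
        "maxReplicaCount": 50,
        "triggers": [
            {
                # Scale on a Prometheus metric, e.g. requests per second.
                "type": "prometheus",
                "metadata": {
                    "serverAddress": "http://prometheus.monitoring:9090",
                    "query": "sum(rate(http_requests_total{app='frontend'}[2m]))",
                    "threshold": "100",
                },
            }
        ],
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="keda.sh",
    version="v1alpha1",
    namespace="demo",
    plural="scaledobjects",
    body=scaled_object,
)
```

With a definition like this in place, KEDA watches the metric and adjusts replica counts automatically, which is the kind of operational toil the platform takes off service teams.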
OpenAI’s ChatGPT: ChatGPT is built on Azure using AKS for container orchestration, Azure Blob Storage for user and AI-generated content, and Azure Cosmos DB for globally distributed data. The scale is staggering: ChatGPT has grown to almost 700 million weekly active users, making it the fastest-growing consumer app in history.1 And yet, OpenAI operates this service with a surprisingly small engineering team. As Microsoft’s Cloud and AI Group Executive Vice President Scott Guthrie highlighted at Microsoft Build in May, ChatGPT “needs to scale … across more than 10 million compute cores around the world,” …with roughly 12 engineers to manage all that infrastructure. How? By relying on managed platforms like AKS that combine enterprise capabilities with the best of open source innovation to do the heavy lifting of provisioning, scaling, and healing Kubernetes clusters across the globe.
Consider what happens when you chat with ChatGPT: Your prompt and conversation state are stored in an open-source database (Azure Database for PostgreSQL) so the AI can remember context. The model runs in containers across thousands of AKS nodes. Azure Cosmos DB then replicates data in milliseconds to the datacenter closest to the user, ensuring low latency. All of this is powered by open-source technologies under the hood and delivered as cloud services on Azure. The result: ChatGPT can handle “unprecedented” load of over one billion queries per day without a hitch, and without needing a huge operations team.
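The “remember context” step of that flow is easy to picture in a few lines of code. The sketch below is purely illustrative (it is not OpenAI’s implementation) and assumes a hypothetical `conversations` table in an Azure Database for PostgreSQL server: the app saves each turn and reads the recent history back before prompting the model.

```python
# Illustrative sketch: persist each turn of a conversation in PostgreSQL and
# read it back before calling the model. The server name, credentials, and
# the `conversations` table are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="mydemo.postgres.database.azure.com",  # placeholder server name
    dbname="chat",
    user="appuser",
    password="***",
    sslmode="require",
)

def save_turn(conversation_id: str, role: str, content: str) -> None:
    """Append one message (user prompt or model reply) to the conversation."""
    with conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO conversations (conversation_id, role, content, created_at) "
            "VALUES (%s, %s, %s, now())",
            (conversation_id, role, content),
        )

def load_context(conversation_id: str, limit: int = 20) -> list:
    """Fetch the most recent turns so the model can be prompted with context."""
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT role, content FROM conversations "
            "WHERE conversation_id = %s ORDER BY created_at DESC LIMIT %s",
            (conversation_id, limit),
        )
        return list(reversed(cur.fetchall()))

# Usage: save the user's prompt, rebuild context, then call the model (not shown).
save_turn("conv-123", "user", "What is Kubernetes?")
history = load_context("conv-123")
```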
What Azure teams are building in the open
At Microsoft, our commitment to building in the open runs deep, driven by engineers across Azure who actively shape the future of open-source infrastructure. Our teams don’t just use open-source technologies; they help build and evolve them.
Our open-source philosophy is straightforward: we contribute upstream first and then integrate those innovations into our downstream products. To support this, we play a pivotal role in upstream open-source projects, collaborating across the industry with partners, customers, and even competitors. Examples of projects we have built or contributed to include:
- Dapr (Distributed Application Runtime): A CNCF-graduated project launched by Microsoft in 2019, Dapr simplifies cloud-agnostic app development with modular building blocks for service invocation, state, messaging, and secrets (see the state API sketch after this list).
- Radius: A CNCF Sandbox project that lets developers define application services and dependencies, while operators map them to resources across Azure, AWS, or private clouds, treating the app, not the cluster, as the unit of intent.
- Copacetic: A CNCF Sandbox tool that patches container images without full rebuilds, speeding up security fixes; originally built to secure Microsoft’s cloud images.
- Dalec: A declarative tool for building secure OS packages and containers, producing software bills of materials (SBOMs) and provenance attestations to deliver minimal, reproducible base images.
- SBOM Tool: A command line interface (CLI) for generating SPDX-compliant SBOMs from source or builds, open-sourced by Microsoft to boost transparency and compliance.
- Drasi: A CNCF Sandbox project launched in 2024, Drasi reacts to data changes in real time, using a Cypher-like query language for change-driven workflows.
- Semantic Kernel and AutoGen: Open-source frameworks for building collaborative AI apps; Semantic Kernel orchestrates large language models (LLMs) and memory, while AutoGen enables multi-agent workflows.
- Phi-4 Mini: A compact 3.8 billion-parameter AI model released in 2025, optimized for reasoning and mathematics on edge devices; available on Hugging Face.
- Kubernetes AI Toolchain Operator (KAITO): A CNCF Sandbox Kubernetes operator that automates AI workload deployment, supporting LLMs, fine-tuning, and retrieval-augmented generation (RAG) across cloud and edge with AKS integration.
- KubeFleet: A CNCF Sandbox project for managing applications across multiple Kubernetes clusters. It provides smart scheduling, progressive deployments, and cloud-agnostic orchestration.
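As a flavor of what Dapr’s building blocks look like in practice, here is a minimal sketch of its state management API, called over the local HTTP sidecar. It assumes a Dapr sidecar listening on the default port 3500 and a state store component named `statestore` (as in Dapr’s getting-started samples); the key and value are placeholders.

```python
# Minimal sketch: save and read a value through the Dapr state API.
# Assumes a Dapr sidecar on localhost:3500 and a state store component
# named "statestore"; the key and value are placeholders.
import requests

DAPR_STATE_URL = "http://localhost:3500/v1.0/state/statestore"

# Save state: the API accepts a list of key/value pairs.
requests.post(
    DAPR_STATE_URL,
    json=[{"key": "order-42", "value": {"status": "shipped"}}],
).raise_for_status()

# Read it back by key.
resp = requests.get(f"{DAPR_STATE_URL}/order-42")
resp.raise_for_status()
print(resp.json())  # {'status': 'shipped'}
```

Because the app only talks to the sidecar, the same code runs unchanged whether the state store behind `statestore` is Redis, Azure Cosmos DB, or another supported backend.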
This is just a small sampling of the open-source projects Microsoft is involved in, each one sharing, in code, the lessons we’ve learned from running systems at global scale and inviting the community to build alongside us.
Open Source + Azure = Empowering the next generation of innovation
Microsoft’s journey with open source has come a long way from that 20,000-line Linux patch in 2009. Today, open-source technologies are at the heart of many Azure solutions. And conversely, Microsoft’s contributions are helping drive many open-source projects forward, whether through commits to Kubernetes; new tools like KAITO, Dapr, and Radius; or research advancements like Semantic Kernel and Phi-4. Our engineers understand that the success of end-user solutions like Microsoft 365 and ChatGPT depends on scalable, resilient platforms like AKS, which in turn are built on and sustained by strong, vibrant open source communities.
Join us at Open Source Summit Europe 2025
As we continue to contribute to the open source community, we’re excited to be part of Open Source Summit Europe 2025, taking place August 25–27. You’ll find us at booth D3 with live demos, in-booth sessions covering a wide range of topics, and plenty of opportunities to connect with our Open Source team. Be sure to catch our conference sessions as well, where Microsoft experts will share insights, updates, and stories from our work across the open source ecosystem.
1 TechRepublic, ChatGPT’s On Track For 700M Weekly Users Milestone: OpenAI Goes Mainstream, August 5, 2025.