KDDI said the new Osaka Sakai data center will be powered by a rack-scale system featuring the Nvidia GB200 NVL72 platform
In sum, what to know:
New AI hub in Osaka by 2026 – KDDI and HPE are building an advanced AI data center powered by Nvidia Blackwell-based infrastructure and liquid cooling to serve Japan and global AI markets.
Focus on performance and sustainability – HPE’s rack-scale system brings energy-efficient high-performance computing, combining Nvidia hardware and advanced cooling to reduce environmental impact.
AI services for startups and enterprises – KDDI plans to deliver cloud-based AI compute via its WAKONX platform, enabling customers to build LLMs and scale AI apps with low latency.
Japanese operator KDDI Corporation and Hewlett Packard Enterprise (HPE) announced a strategic collaboration aimed at launching a next-generation AI data center in Sakai City, Osaka Prefecture, with operations scheduled to begin in early 2026.
In a release, the Japanese company noted that the new AI data center will support startups, enterprises and research institutions in developing AI-powered applications and training large language models (LLMs), leveraging Nvidia’s Blackwell architecture and HPE’s infrastructure and cooling expertise.
The company noted that the new Osaka Sakai data center will be powered by a rack-scale system featuring the Nvidia GB200 NVL72 platform, developed and integrated by HPE. The system is optimized for high-performance computing and incorporates advanced direct liquid cooling to significantly reduce its environmental footprint, KDDI said.
As AI workloads grow in scale and complexity, demand for low-latency inferencing and energy-efficient infrastructure is rising. KDDI’s new AI data center in Osaka aims to meet this challenge by offering cloud-based AI compute services via its WAKONX platform, which is designed for Japan’s AI-driven digital economy.
The Nvidia GB200 NVL72 by HPE is a rack-scale system designed to enable large, complex AI clusters that are optimized for energy efficiency and performance through advanced direct liquid cooling.
Equipped with Nvidia-accelerated networking, including Nvidia Quantum-2 InfiniBand, Nvidia Spectrum-X Ethernet and Nvidia BlueField-3 DPUs, the system delivers high-performance network connectivity for diverse AI workloads. Customers will also be able to run the Nvidia AI Enterprise platform on the KDDI infrastructure to accelerate development and deployment, the company said.
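For a sense of what using such a service could look like in practice: Nvidia AI Enterprise includes NIM inference microservices, which expose an OpenAI-compatible API, so a customer could call a hosted LLM with a standard client library. The sketch below is illustrative only; the endpoint URL, model name and credential are placeholders, not actual KDDI or WAKONX services.

```python
# Minimal sketch: calling a hypothetical NIM-based LLM endpoint hosted on
# cloud AI infrastructure. Base URL, API key and model name are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://nim.example-ai-cloud.jp/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",                         # placeholder credential
)

# Send a simple chat completion request to the hosted model.
response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",             # example model name
    messages=[{"role": "user", "content": "Summarize today's network alerts."}],
    max_tokens=200,
)

print(response.choices[0].message.content)
```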
Antonio Neri, president and CEO of HPE, said: “Our collaboration with KDDI marks a pivotal milestone in supporting Japan’s AI innovation, delivering powerful computing capabilities that will enable smarter solutions.”
Looking ahead, the two companies will continue to strengthen their collaboration to advance AI infrastructure and deliver innovative services while improving energy efficiency.
HPE and Nvidia recently unveiled a new suite of AI factory offerings aimed at accelerating enterprise adoption of artificial intelligence across industries.
The expanded portfolio, announced at HPE Discover 2025 in Las Vegas, introduces a range of modular infrastructure and turnkey platforms, including HPE’s new AI-ready RTX PRO Servers and the next generation of the company’s AI platform, HPE Private Cloud AI. These offerings are designed to provide enterprises with the building blocks to develop, deploy and scale generative, agentic and industrial AI workloads.
Branded as Nvidia AI Computing by HPE, the integrated suite combines the chipmaker’s latest technologies, including Blackwell accelerated computing, Spectrum-X Ethernet and BlueField-3 networking, with HPE’s server, storage, software and services ecosystem.
The key component of the launch is the revamped HPE Private Cloud AI, co-developed with the chipmaker and fully validated under the Nvidia Enterprise AI Factory framework. The platform delivers a full-stack solution for enterprises seeking to harness the power of generative and agentic AI.