SAN JOSE, Calif., March 17 — Kioxia America, Inc. today announced the development of its Super High IOPS SSD, a new type of SSD that enables the GPU to directly access high-speed flash memory as an expansion to High Bandwidth Memory (HBM) in AI systems. The new Super High IOPS SSD, the KIOXIA GP Series, is purpose-built to meet the growing performance demands of AI and high-performance computing, providing larger GPU-accessible memory capacity for faster data access for AI workloads. Evaluation samples of the KIOXIA GP Series will be available to select customers by the end of 2026.

The NVIDIA Storage-Next™ initiative addresses the anticipated shift from compute-intensive to data-intensive workloads and the expanded need for GPU-accessible memory space, currently limited by HBM size. Expanding the GPU’s usable memory space allows access to larger data sets and improves GPU utilization by moving more data closer to compute resources.

The NVIDIA Storage-Next initiative calls on SSD vendors to design drives optimized for GPU-initiated AI workloads. The initiative effectively expands HBM capacity by enabling GPUs to access flash-based memory. Kioxia is supporting NVIDIA’s initiative with the KIOXIA GP Series SSDs, which utilize low-latency, high-performance KIOXIA XL-FLASH™ Storage Class Memory. This design is well suited to the architecture, delivering higher IOPS, finer-grained data access (512 bytes), and lower power consumption per IO compared with conventional Kioxia TLC SSDs.
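The benefit of 512-byte access granularity can be illustrated with a simple read-amplification calculation. The sketch below is not Kioxia's methodology; it assumes a hypothetical workload fetching small records (e.g., embeddings) and compares a 512-byte granularity against a typical 4 KiB granularity used by many conventional SSDs:

```python
import math

def payload_fraction(record_bytes: int, io_granularity_bytes: int) -> float:
    """Fraction of transferred bytes that are useful when fetching one
    record at the drive's minimum access granularity (hypothetical model)."""
    ios = math.ceil(record_bytes / io_granularity_bytes)
    transferred = ios * io_granularity_bytes
    return record_bytes / transferred

# A 512-byte record read through a 4 KiB-granularity drive wastes 7/8 of
# the transfer; at 512-byte granularity nothing is wasted.
print(payload_fraction(512, 4096))  # 0.125 -> 8x read amplification
print(payload_fraction(512, 512))   # 1.0   -> no amplification
```

Under this simplified model, finer granularity means each IO moves mostly useful bytes, which also reduces energy spent per useful byte transferred.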

“Kioxia fully supports the NVIDIA Storage-Next initiative and will deliver purpose-built SSDs to effectively address the need for GPU-accessible memory,” said Makoto Hamada, Senior Director of the SSD Division, Kioxia Corporation. “This collaboration is instrumental in shaping the future of AI storage architecture.”

Kioxia reaffirms its commitment to driving technological advancements in AI and high-performance computing through ongoing innovation and strategic collaborations. The KIOXIA GP Series SSD family is designed to address the evolving needs of AI workloads.

Additionally, AI models are rapidly scaling toward trillions of parameters while context windows expand to millions of tokens, driving unprecedented growth in KV (Key Value) cache requirements. Architectures such as NVIDIA’s Context Memory Storage (CMX) recognize the need to extend the memory hierarchy beyond GPU memory using high-performance storage. The KIOXIA CM9 Series PCIe® 5.0 E3.S SSD, offering 25.6 TB TLC capacity with 3 DWPD endurance, provides the performance, capacity, and endurance needed to support these large-scale inference environments. Kioxia believes this class of storage will play a critical role in scaling efficient, cost-optimized AI inference infrastructure. Samples will begin shipping in Q3 2026.
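The scale of the KV-cache problem can be sketched with back-of-the-envelope arithmetic. The figures below are illustrative assumptions (a hypothetical 80-layer model with grouped-query attention), not specifications from Kioxia or NVIDIA:

```python
def kv_cache_bytes(num_layers: int, num_kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per_elem: int = 2) -> int:
    """Bytes needed to hold the K and V tensors for one sequence.
    The factor of 2 covers the separate key and value caches."""
    return 2 * num_layers * num_kv_heads * head_dim * seq_len * bytes_per_elem

# Hypothetical 70B-class config: 80 layers, 8 KV heads, head_dim 128, FP16.
per_token = kv_cache_bytes(80, 8, 128, seq_len=1)
print(per_token)  # 327680 bytes, i.e. 320 KiB of cache per token

# A one-million-token context for a single sequence:
print(kv_cache_bytes(80, 8, 128, seq_len=1_000_000) / 2**30)  # ~305 GiB
```

Even under these modest assumptions, a single million-token context exceeds the HBM capacity of current GPUs, which is why tiering the KV cache onto high-performance SSDs is attractive.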

Kioxia will be demonstrating the Super High IOPS SSD emulator and other technology innovations at NVIDIA GTC, booth 3522.
