Today at Memcon 2024, Samsung Electronics, a world leader in advanced semiconductor technology, unveiled the expansion of its Compute Express Link (CXL) memory module portfolio and showcased its latest HBM3E technology, reinforcing its leadership in high-performance, high-capacity memory solutions for AI applications. In a keynote address to a packed crowd at Santa Clara’s Computer History Museum, Jin-Hyeok Choi, Corporate Executive Vice President, Device Solutions Research America – Memory at Samsung Electronics, and SangJoon Hwang, Corporate Executive Vice President, Head of DRAM Product and Technology at Samsung Electronics, took the stage to introduce new memory solutions and discuss how Samsung is leading HBM and CXL innovation in the AI era. Joining Samsung on stage were Paul Turner, Vice President, Product Team, VCF Division at VMware by Broadcom, and Gunnar Hellekson, Vice President and General Manager at Red Hat, who discussed how their software solutions, combined with Samsung’s hardware technology, are pushing the boundaries of memory innovation.

Jin-Hyeok Choi, Corporate EVP, Head of R&D Center, Samsung Semiconductor US

“AI innovation cannot continue without memory technology innovation,” said Choi. “As the market leader in memory, Samsung is proud to continue advancing innovation – from the industry’s most advanced CMM-B technology, to powerful memory solutions like HBM3E for high-performance computing and demanding AI applications. We are committed to collaborating with our partners and serving our customers to unlock the full potential of the AI era together.”

CXL Memory Module - Box (CMM-B)

Highlighting growing momentum in the CXL ecosystem, Samsung introduced its CXL Memory Module – Box (CMM-B), a cutting-edge CXL DRAM memory pooling product. The CMM-B can accommodate eight CMM-D devices in the E3.S form factor and provide up to two terabytes (TB) of capacity. This large capacity, backed by high performance of up to 60 gigabytes per second (GB/s) of bandwidth at 596 nanoseconds (ns) of latency, can serve applications that need high-capacity memory, such as AI, in-memory databases (IMDB) and data analytics.
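
For a sense of how host software might consume this kind of pooled capacity, the sketch below assumes the appliance’s memory is exposed to a Linux host as a CPU-less NUMA node, which is how CXL-attached memory commonly appears today, and uses the standard libnuma API to place a large allocation on it. The node ID and allocation size are illustrative assumptions, not details from the announcement.

```c
/*
 * Minimal sketch: placing an allocation on CXL-attached memory that the
 * OS exposes as a separate, CPU-less NUMA node. The node ID below is an
 * assumption for illustration; query the real topology on your system.
 * Build with: gcc cxl_alloc.c -lnuma
 */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return 1;
    }

    int cxl_node = 1;                      /* assumed ID of the CXL memory node */
    size_t sz = 4UL * 1024 * 1024 * 1024;  /* 4 GiB working set, for example    */

    /* Allocate pages bound to the CXL-backed node. */
    void *buf = numa_alloc_onnode(sz, cxl_node);
    if (!buf) {
        perror("numa_alloc_onnode");
        return 1;
    }

    memset(buf, 0, sz);   /* touch the pages so they are actually committed */
    printf("Placed %zu bytes on NUMA node %d\n", sz, cxl_node);

    numa_free(buf, sz);
    return 0;
}
```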

Rack Scale Composable Memory Bank

Samsung also partnered with Supermicro, a global leader in Plug and Play Rack-Scale IT solutions, to demonstrate the industry’s first rack-level memory solution for highly scalable and composable disaggregated infrastructure. The solution leverages Samsung’s CMM-B to increase memory capacity and bandwidth, enabling data centers to handle demanding workloads that standard architectures, which lack the necessary flexibility and efficiency for modern applications, struggle to serve. The increased memory capacity and high performance of up to 60 GB/s of bandwidth per server can enhance applications that require high-capacity memory, such as AI, IMDB and data analytics.

CXL Memory Module - Hybrid (CMM-H)

On stage, Samsung and VMware by Broadcom also introduced Project Peaberry, the world’s first FPGA (field-programmable gate array)-based tiered memory solution for hypervisors, called the CXL Memory Module – Hybrid for Tiered Memory (CMM-H TM). This hybrid solution combines DRAM and NAND media in an add-in card (AIC) form factor to tackle memory management challenges, reduce downtime, optimize scheduling for tiered memory and maximize performance, all while significantly reducing total cost of ownership (TCO).
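
To make the idea of memory tiering concrete, the following is a deliberately simplified sketch of the general technique: pages with high recent access counts stay in a fast DRAM tier, while cold pages are demoted to a slower, larger tier. It is a generic illustration, not the actual CMM-H TM or VMware placement algorithm; the threshold, sampling interval and tier names are invented for the example.

```c
/*
 * Toy illustration of two-tier memory placement: pages with high recent
 * access counts live in the fast (DRAM) tier, cold pages are demoted to
 * the slower, larger tier. Purely conceptual; not the CMM-H TM design.
 */
#include <stdio.h>

#define NPAGES        8
#define HOT_THRESHOLD 4   /* accesses per interval needed to stay in the fast tier */

enum tier { TIER_DRAM, TIER_SLOW };

struct page {
    int accesses;     /* accesses observed in the last sampling interval */
    enum tier where;  /* current placement */
};

/* Re-evaluate placement after each sampling interval. */
static void rebalance(struct page *pages, int n) {
    for (int i = 0; i < n; i++) {
        if (pages[i].accesses >= HOT_THRESHOLD)
            pages[i].where = TIER_DRAM;   /* promote hot page */
        else
            pages[i].where = TIER_SLOW;   /* demote cold page  */
        pages[i].accesses = 0;            /* start a new interval */
    }
}

int main(void) {
    struct page pages[NPAGES] = {
        {9, TIER_SLOW}, {1, TIER_DRAM}, {6, TIER_SLOW}, {0, TIER_DRAM},
        {3, TIER_DRAM}, {7, TIER_SLOW}, {2, TIER_SLOW}, {5, TIER_DRAM},
    };

    rebalance(pages, NPAGES);

    for (int i = 0; i < NPAGES; i++)
        printf("page %d -> %s\n", i,
               pages[i].where == TIER_DRAM ? "DRAM tier" : "slow tier");
    return 0;
}
```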

Paul Turner, VP of Product Team at VCF Division, Broadcom VMware

“VMware by Broadcom is pleased to collaborate with Samsung to bring new innovations in memory,” said Paul Turner. “Samsung's leadership in memory technologies and VMware's leadership in software memory tiering enables a new innovation in CXL and offers a compelling value-proposition with significant TCO benefits, better utilization of expensive DRAM resources, and improved consolidation of server resources while delivering the same great performance.”

Gunnar Hellekson, VP & GM, Red Hat Enterprise Linux, Red Hat

In addition, Samsung showcased its CXL Memory Module – DRAM (CMM-D) technology, which integrates Samsung’s DRAM with the CXL open standard interface, facilitating efficient, low-latency connectivity between the CPU and memory expansion devices. Red Hat, a global leader in open source software solutions, successfully validated Samsung’s CMM-D devices with its enterprise software last year, the first such validation in the industry. The two companies will continue their collaboration through the Samsung Memory Research Center (SMRC), developing CXL open-source software and reference models, as well as partnering on a range of other storage and memory products.
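
As a rough illustration of how an operating system such as Red Hat Enterprise Linux typically surfaces a CXL memory expander, the hedged sketch below walks the NUMA topology with libnuma and flags memory-only nodes, which is how a CMM-D-style device would often appear alongside the regular DRAM nodes. This is generic Linux code, not software from Samsung or Red Hat.

```c
/*
 * Minimal sketch: enumerating NUMA nodes to spot memory-only nodes, which
 * is how CXL memory expanders commonly appear under Linux. Generic libnuma
 * usage, not specific to any vendor's device. Build with: gcc -lnuma
 */
#include <numa.h>
#include <stdio.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return 1;
    }

    int max_node = numa_max_node();
    for (int node = 0; node <= max_node; node++) {
        long long free_bytes = 0;
        long long total = numa_node_size64(node, &free_bytes);
        if (total < 0)
            continue;   /* node not present */

        /* A node with memory but no CPUs is a likely CXL expander. */
        struct bitmask *cpus = numa_allocate_cpumask();
        numa_node_to_cpus(node, cpus);
        int has_cpus = numa_bitmask_weight(cpus) > 0;
        numa_free_cpumask(cpus);

        printf("node %d: %lld MiB total, %lld MiB free, %s\n",
               node, total >> 20, free_bytes >> 20,
               has_cpus ? "has CPUs" : "memory-only (possible CXL)");
    }
    return 0;
}
```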

CXL Memory Module - DRAM (CMM-D)

Samsung also gave Memcon 2024 attendees an opportunity to demo its latest HBM3E 12H chip – the world’s first 12-stack HBM3E DRAM and a breakthrough with the highest capacity yet achieved in HBM technology. The HBM3E 12H uses the company’s advanced thermal compression non-conductive film (TC NCF) technology, which increases the vertical density of the chip by over 20% compared to its predecessor while also improving product yield. Samsung is currently sampling the HBM3E 12H to customers and plans to begin mass production within the first half of this year.

To learn more about Samsung Semiconductor’s advanced memory technologies and solutions, please visit: https://semiconductor.samsung.com/