AI Revolution Driven by Memory Technology Innovation – Jaihyuk Song (Samsung Electronics)

Jaihyuk Song, Corporate President & CTO, Device Solutions,
Samsung Electronics, Hwaseong, South Korea

The recent AI revolution, spearheaded by Large Language Models (LLMs), demands substantial computing resources and corresponding memory solutions. However, unlike processors, which can leverage advancements in fabrication processes, memory devices are increasingly struggling to meet the high-bandwidth, large-capacity, and power-efficiency requirements of AI systems. This paper analyzes the requirements and limitations of systems in the AI era, categorizing application-specific memory needs in terms of performance, power, and capacity. We introduce performance-centric solutions such as HBM (High Bandwidth Memory) and PIM (Processing-In-Memory) technologies, energy-efficient solutions including custom HBM and LPW (LPDDR Wide-IO) memory, and capacity-focused solutions such as SSDs (Solid-State Drives) and CXL (Compute Express Link) memory. Additionally, we discuss how continued scaling of DRAM and NAND Flash processes, together with 3D-packaging technologies, can address the trade-offs among performance, power, and capacity more effectively. Finally, we emphasize the importance of software technologies in optimizing the utilization of these increasingly specialized memory solutions, along with a discussion of the core technologies enabling each solution. To meet the high demands of AI systems, both the ongoing advancement of existing memory devices and the development of new memory solutions will play crucial roles. These efforts will support the advancement of AI technologies and contribute to human society.