Heike Riel
ISSCC 2026
Large Language Models (LLMs) have gained significant traction in real-world AI applications. However, LLMs demand large weight capacity, efficient compute, and high-throughput data communication. We will discuss how non-volatile memory, analog mixed-signal design, system architecture, and workloads affect the efficiency and performance of Analog In-Memory Computing (AIMC). Through circuit simulation and hardware-aware training, we demonstrate near-software accuracy both in simulation and on hardware.
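The hardware-aware training mentioned above is commonly realized by injecting device-noise models into the forward pass during training, so the learned weights tolerate analog conductance variation. The sketch below is an illustrative assumption, not the speaker's actual method; the function name and noise model are hypothetical.

```python
import numpy as np

def noisy_matvec(W, x, noise_std=0.05, rng=None):
    """Model an analog crossbar matrix-vector product: each weight is
    perturbed by Gaussian noise scaled to the largest |weight|, a simple
    stand-in for programming/read noise on non-volatile memory devices.
    With noise_std=0 this reduces to the exact product W @ x."""
    rng = np.random.default_rng() if rng is None else rng
    w_max = np.max(np.abs(W))
    W_noisy = W + rng.normal(0.0, noise_std * w_max, size=W.shape)
    return W_noisy @ x

# In hardware-aware training, this noisy forward pass would replace the
# exact matmul inside the loss computation, so gradient descent finds
# weights whose accuracy survives the analog perturbation at inference.
```

A noiseless call (`noise_std=0`) recovers the ideal digital result, which makes the perturbation easy to toggle between software-baseline and hardware-simulation runs.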
Katsuyuki Sakuma, Mukta Farooq, et al.
ECTC 2021
Olivier Maher, N. Harnack, et al.
DRC 2023
Divya Taneja, Jonathan Grenier, et al.
ECTC 2024