Yunhua Fang, Rui Xie, et al., "Accelerating LLM Inference via Dynamic KV Cache Placement in Heterogeneous Memory System," IEEE Computer Architecture Letters, 2025.
Rui Xie, Asad Ul Haq, et al., "Breaking the HBM Bit Cost Barrier: Domain-Specific ECC for AI Inference Infrastructure," IEEE Computer Architecture Letters, 2025.
Shayan Srinivasa Garani, Tong Zhang, et al., "Guest Editorial: Channel Modeling, Coding and Signal Processing for Novel Physical Memory Devices and Systems," IEEE Journal on Selected Areas in Communications, 2016.