DLFloat: A 16-b Floating Point Format Designed for Deep Learning Training and Inference. Ankur Agrawal, Bruce Fleischer, et al. ARITH 2019.
Accumulation bit-width scaling for ultra-low precision training of deep networks. Charbel Sakr, Naigang Wang, et al. ICLR 2019.
Innovate Practices on CyberSecurity of Hardware Semiconductor Devices. Alfred L. Crouch, Peter Levin, et al. VTS 2019.
Training deep neural networks with 8-bit floating point numbers. Naigang Wang, Jungwook Choi, et al. NeurIPS 2018.
A Scalable Multi-TeraOPS Core for AI Training and Inference. Sunil Shukla, Bruce Fleischer, et al. IEEE SSC-L, 2018.
A Scalable Multi-TeraOPS Deep Learning Processor Core for AI Training and Inference. Bruce Fleischer, Sunil Shukla, et al. VLSI Circuits 2018.
Novel IC sub-threshold IDDQ signature and its relationship to aging during high voltage stress. Franco Stellari, Naigang Wang, et al. ESSDERC 2018.
High-Q magnetic inductors for high efficiency on-chip power conversion. Naigang Wang, Bruce Doris, et al. IEDM 2016.
An 82%-efficient multiphase voltage-regulator 3D interposer with on-chip magnetic inductors. Kevin Tien, Noah Sturcken, et al. VLSI Circuits 2015.
US12240753 (03 Mar 2025): Micro-electromechanical Device Having a Soft Magnetic Material Electrolessly Deposited on a Palladium Layer Coated Metal Beam
US12175359 (23 Dec 2024): Machine Learning Hardware Having Reduced Precision Parameter Components for Efficient Parameter Update
JP7525237 (21 Jul 2024): Machine Learning Hardware Having Reduced Precision Parameter Components for Efficient Parameter Update
Kaoutar El Maghraoui: Principal Research Scientist and Manager, AIU Spyre Model Enablement, AI Hardware Center
Pin-Yu Chen: Principal Research Scientist and Manager; Chief Scientist, RPI-IBM AI Research Collaboration