Ankur Agrawal, Bruce Fleischer, et al. "DLFloat: A 16-b Floating Point Format Designed for Deep Learning Training and Inference." ARITH 2019.
Charbel Sakr, Naigang Wang, et al. "Accumulation Bit-Width Scaling for Ultra-Low Precision Training of Deep Networks." ICLR 2019.
Alfred L. Crouch, Peter Levin, et al. "Innovate Practices on Cyber Security of Hardware Semiconductor Devices." VTS 2019.
Naigang Wang, Jungwook Choi, et al. "Training Deep Neural Networks with 8-bit Floating Point Numbers." NeurIPS 2018.
Sunil Shukla, Bruce Fleischer, et al. "A Scalable Multi-TeraOPS Core for AI Training and Inference." IEEE SSC-L, 2018.
Bruce Fleischer, Sunil Shukla, et al. "A Scalable Multi-TeraOPS Deep Learning Processor Core for AI Training and Inference." VLSI Circuits 2018.
Franco Stellari, Naigang Wang, et al. "Novel IC Sub-threshold IDDQ Signature and Its Relationship to Aging During High Voltage Stress." ESSDERC 2018.
Naigang Wang, Bruce Doris, et al. "High-Q Magnetic Inductors for High Efficiency On-chip Power Conversion." IEDM 2016.
Kevin Tien, Noah Sturcken, et al. "An 82%-Efficient Multiphase Voltage-Regulator 3D Interposer with On-chip Magnetic Inductors." VLSI Circuits 2015.
US9287780 — "Slab Inductor Device Providing Efficient On-chip Supply Voltage Conversion and Regulation." Granted 14 Mar 2016.
US9124173 — "Slab Inductor Device Providing Efficient On-chip Supply Voltage Conversion and Regulation." Granted 31 Aug 2015.
US9118242 — "Slab Inductor Device Providing Efficient On-chip Supply Voltage Conversion and Regulation." Granted 25 Aug 2015.