Publications

DLFloat: A 16-b Floating Point Format Designed for Deep Learning Training and Inference. Ankur Agrawal, Bruce Fleischer, et al. ARITH 2019.
Accumulation bit-width scaling for ultra-low precision training of deep networks. Charbel Sakr, Naigang Wang, et al. ICLR 2019.
Innovative Practices on CyberSecurity of Hardware Semiconductor Devices. Alfred L. Crouch, Peter Levin, et al. VTS 2019.
Training deep neural networks with 8-bit floating point numbers. Naigang Wang, Jungwook Choi, et al. NeurIPS 2018.
A Scalable Multi-TeraOPS Core for AI Training and Inference. Sunil Shukla, Bruce Fleischer, et al. IEEE SSC-L, 2018.
A Scalable Multi-TeraOPS Deep Learning Processor Core for AI Training and Inference. Bruce Fleischer, Sunil Shukla, et al. VLSI Circuits 2018.
Novel IC sub-threshold IDDQ signature and its relationship to aging during high voltage stress. Franco Stellari, Naigang Wang, et al. ESSDERC 2018.
High-Q magnetic inductors for high efficiency on-chip power conversion. Naigang Wang, Bruce Doris, et al. IEDM 2016.
An 82%-efficient multiphase voltage-regulator 3D interposer with on-chip magnetic inductors. Kevin Tien, Noah Sturcken, et al. VLSI Circuits 2015.
Patents

US9806615: On-Chip DC-DC Power Converters with Fully Integrated GaN Power Switches, Silicon CMOS Transistors and Magnetic Inductors. Granted 30 Oct 2017.
US9793336: High Resistivity Iron-Based, Thermally Stable Magnetic Material for On-Chip Integrated Inductors. Granted 16 Oct 2017.
Collaborators

Kaoutar El Maghraoui, Principal Research Scientist and Manager, AIU Spyre Model Enablement, AI Hardware Center
Pin-Yu Chen, Principal Research Scientist and Manager; Chief Scientist, RPI-IBM AI Research Collaboration