Dynamic matrix factorization: A state space approach
John Z. Sun, Kush R. Varshney, et al.
ICASSP 2012
This paper develops a novel optimization framework for learning accurate and sparse two-level Boolean rules for classification, both in Conjunctive Normal Form (CNF, i.e., AND-of-ORs) and in Disjunctive Normal Form (DNF, i.e., OR-of-ANDs). In contrast to opaque models such as neural networks, sparse two-level Boolean rules offer the crucial benefit of interpretability, which is necessary in a wide range of applications such as law and medicine and is attracting considerable attention in machine learning. The paper introduces two principled objective functions that trade off classification accuracy against sparsity, using 0-1 error and Hamming loss, respectively, to characterize accuracy. It proposes efficient procedures to optimize these objectives based on linear programming (LP) relaxation, block coordinate descent, and alternating minimization, and describes a new approach to rounding fractional values in the optimal solutions of the LP relaxations. Experiments show that the algorithms based on the Hamming loss objective provide excellent tradeoffs between accuracy and sparsity, improving over state-of-the-art methods.
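As a rough illustration of the clause-level LP relaxation and rounding described in the abstract, the sketch below learns a single sparse AND-clause (one DNF clause over positive literals) under a Hamming-style slack objective. It is a minimal sketch under stated assumptions, not the authors' formulation: the sparsity weight `lambda_`, the restriction to un-negated binary features, and the naive 0.5-threshold rounding (in place of the paper's rounding scheme) are all simplifications introduced here.

```python
# Minimal, illustrative sketch (not the paper's exact formulation) of
# learning one sparse AND-clause by LP relaxation with threshold rounding.
import numpy as np
from scipy.optimize import linprog

def learn_and_clause(X, y, lambda_=0.1):
    """LP relaxation for a single conjunctive clause over binary features.

    X : (n, d) 0/1 matrix, y : (n,) 0/1 labels.
    Returns a boolean mask over the d features selected by the clause.
    """
    n, d = X.shape
    pos = np.where(y == 1)[0]
    neg = np.where(y == 0)[0]
    n_vars = d + len(pos) + len(neg)   # [w, xi (pos slacks), zeta (neg slacks)]

    # Objective: Hamming-style error slacks plus an L1 sparsity penalty on w.
    c = np.concatenate([np.full(d, lambda_),
                        np.ones(len(pos)), np.ones(len(neg))])

    A_rows, b = [], []
    # Positive sample i is misclassified if the clause selects any feature
    # that is 0 on i:  w_j <= xi_i for every j with X[i, j] == 0.
    for k, i in enumerate(pos):
        for j in np.where(X[i] == 0)[0]:
            row = np.zeros(n_vars)
            row[j] = 1.0
            row[d + k] = -1.0
            A_rows.append(row); b.append(0.0)
    # Negative sample i is misclassified unless some selected feature is 0
    # on i:  zeta_i + sum_{j: X[i, j] == 0} w_j >= 1.
    for k, i in enumerate(neg):
        row = np.zeros(n_vars)
        row[np.where(X[i] == 0)[0]] = -1.0
        row[d + len(pos) + k] = -1.0
        A_rows.append(row); b.append(-1.0)

    res = linprog(c, A_ub=np.array(A_rows), b_ub=np.array(b),
                  bounds=[(0.0, 1.0)] * n_vars, method="highs")
    # Naive threshold rounding of fractional w; the paper develops a more
    # careful rounding scheme for the LP solutions.
    return res.x[:d] > 0.5

# Toy usage: recover "x0 AND x2" from noiseless random binary data.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 5))
y = (X[:, 0] & X[:, 2]).astype(int)
mask = learn_and_clause(X, y)
pred = X[:, mask].all(axis=1).astype(int)
print("selected:", np.where(mask)[0], "train error:", np.mean(pred != y))
```

In the integer case the slacks reduce to per-sample 0-1 errors, so the LP objective is exactly a Hamming loss plus a sparsity penalty; a per-clause subproblem of this shape is the kind of building block that a block coordinate descent over the clauses of a full two-level rule would cycle through.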