Ziv Bar-Yossef, Ravi Kumar, et al.
SODA 2002
We present a novel technique, based on the Jensen-Shannon divergence from information theory, for proving lower bounds on the query complexity of sampling algorithms that approximate functions over arbitrary domains and ranges. Unlike previous methods, our technique does not use a reduction from a decision promise problem. As a result, it gives stronger bounds for functions that possess a large set of inputs, any two of which exhibit a gap in the function value. We demonstrate the technique with new query complexity lower bounds for three fundamental problems: (1) the "election problem", for which we obtain a quadratic improvement over previous bounds, (2) low-rank matrix approximation, for which we prove the first lower bounds, showing that the known algorithms for this problem are almost optimal, and (3) matrix reconstruction. In addition, we introduce a new method for proving lower bounds on the expected query complexity of functions, using the Kullback-Leibler divergence. We demonstrate its use with a simple query complexity lower bound for the mean.
Ziv Bar-Yossef, T.S. Jayram, et al.
Journal of Computer and System Sciences
T.S. Jayram, Subhash Khot, et al.
STOC 2003
Ziv Bar-Yossef, Ido Guy, et al.
ICDM 2006