DAG-GNN: DAG structure learning with graph neural networks
Yue Yu, Jie Chen, et al.
ICML 2019
The Hutchinson estimator approximates the trace of a matrix M through bilinear forms with independent vectors y whose entries are zero-mean, unit-variance, and uncorrelated. The technique is particularly useful when M is given only implicitly but the matrix-vector product My can be computed efficiently without forming M explicitly; well-known examples in practice are M = A⁻¹ and, more generally, M = f(A). In this paper we study the conditions under which the numerical error incurred in computing My is comparable with the statistical uncertainty caused by the randomness of y. Under these conditions, we also derive a sufficient number of random vectors that guarantees a relative error bound with any desired probability. To obtain easily computable conditions, we focus on random vectors of normal variables, a precursor technique attributed by Hutchinson to Girard. As demonstrated in many practical scenarios, normal variables are as effective as symmetric Bernoulli variables (the more common choice under the name Hutchinson) but have the advantage of permitting a simultaneous estimate of the estimator's variance.
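As a minimal sketch of the idea described above (not the paper's implementation): the estimator averages bilinear forms yᵀMy over Gaussian probe vectors, touching M only through a user-supplied matrix-vector product. The example below uses M = A⁻¹ applied implicitly via a linear solve; the function name `hutchinson_trace` and all parameters are illustrative, and the per-probe samples also yield the simultaneous variance estimate mentioned in the abstract.

```python
import numpy as np

def hutchinson_trace(matvec, n, num_samples=100, rng=None):
    """Estimate tr(M), where M is available only through matvec(y) = M @ y.

    Uses Gaussian probe vectors (Girard's variant), whose per-probe
    samples also give a simultaneous estimate of the estimator's variance.
    """
    rng = np.random.default_rng(rng)
    samples = np.empty(num_samples)
    for k in range(num_samples):
        y = rng.standard_normal(n)   # zero-mean, unit-variance entries
        samples[k] = y @ matvec(y)   # bilinear form y^T M y
    est = samples.mean()
    # Sample variance of the mean of the per-probe estimates.
    var = samples.var(ddof=1) / num_samples
    return est, var

# Example: M = A^{-1}, applied implicitly by solving A x = y.
n = 50
G = np.random.default_rng(0).standard_normal((n, n))
A = (np.eye(n) + 0.1 * G) @ (np.eye(n) + 0.1 * G).T + n * np.eye(n)  # SPD
est, var = hutchinson_trace(lambda y: np.linalg.solve(A, y), n,
                            num_samples=500, rng=1)
exact = np.trace(np.linalg.inv(A))  # for comparison only
```

Note that each probe requires one solve with A rather than an explicit inverse, which is the point of the implicit formulation; the returned `var` quantifies the statistical uncertainty that the paper compares against the numerical error of the solves.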