Wooseok Choi, Tommaso Stecconi, et al.
Advanced Science
The least-squares predictor for a random process generated by linear difference equations is known to obey similar linear difference equations. A stability theory is developed for such equations. The conditions under which the infinite covariance matrix of the process, considered as a bounded operator on l2, has a bounded inverse are shown to be both necessary and sufficient for the stability of the optimum predictor. The same conditions also ensure the convergence of an algorithm that recursively factors the infinite covariance matrix into a product of upper and lower triangular factors. Finally, the stability obtained in this fashion is shown to be equivalent to uniform asymptotic stability. © 1969 American Elsevier Publishing Company, Inc.
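For a finite truncation of the covariance matrix, the triangular factorization described in the abstract reduces to the Cholesky decomposition. A minimal numpy sketch, assuming a hypothetical stationary process whose covariances decay exponentially (the specific covariance model and truncation size are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical finite truncation of a stationary process covariance:
# C[i, j] = rho^|i-j|, an exponentially decaying, positive-definite matrix.
n = 5
rho = 0.5
C = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Factor C = L @ L.T with L lower triangular; the paper's recursive
# algorithm produces analogous upper/lower triangular factors for the
# infinite covariance operator on l2.
L = np.linalg.cholesky(C)

# The factorization reproduces C exactly (up to rounding),
# and L is lower triangular by construction.
assert np.allclose(L @ L.T, C)
assert np.allclose(L, np.tril(L))
```

Boundedness of the inverse of the infinite operator corresponds, in this finite sketch, to the eigenvalues of C staying bounded away from zero as n grows.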
Ryan Johnson, Ippokratis Pandis
CIDR 2013
Tim Erdmann, Stefan Zecevic, et al.
ACS Spring 2024
Victor Akinwande, Megan Macgregor, et al.
IJCAI 2024