Erik Altman, Jovan Blanusa, et al.
NeurIPS 2023
The least-squares predictor for a random process generated by linear difference equations is known to obey similar linear difference equations. A stability theory is developed for such equations. Conditions under which the infinite covariance matrix of the process, considered as a bounded operator from l2 to l2, has a bounded inverse are shown to be both necessary and sufficient for the stability of the optimum predictor. The same conditions also ensure the convergence of an algorithm for recursively factoring the infinite covariance matrix as a product of upper and lower triangular factors. Finally, it is shown that the stability obtained in this fashion is equivalent to uniform asymptotic stability. © 1969 American Elsevier Publishing Company, Inc.
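The finite-dimensional analogue of the abstract's construction can be sketched as follows: a process obeying a linear difference equation (here an AR(1), an illustrative choice not taken from the paper) has a Toeplitz covariance matrix, which factors into lower and upper triangular parts (a Cholesky factorization), and the least-squares one-step predictor is obtained by solving the normal equations against that matrix. This is a sketch of the general idea only, not the paper's infinite-dimensional algorithm.

```python
import numpy as np

# Illustrative AR(1) process x[t] = a*x[t-1] + w[t], Var(w) = sigma2.
# These parameters are assumptions for the example, not from the paper.
a, sigma2, n = 0.5, 1.0, 6

# Autocovariance of a stationary AR(1): r[k] = sigma2 * a**|k| / (1 - a**2).
r = sigma2 * a ** np.arange(n) / (1.0 - a ** 2)
R = np.array([[r[abs(i - j)] for j in range(n)] for i in range(n)])

# Factor R = L @ L.T with L lower triangular. The paper's conditions
# govern when this recursive factorization converges for the infinite
# matrix; for finite n it succeeds whenever R is positive definite.
L = np.linalg.cholesky(R)

# Least-squares one-step predictor of x[n] from x[0..n-1]: solve the
# normal equations R c = r_future, where r_future[i] = Cov(x[i], x[n]).
r_future = sigma2 * a ** np.arange(n, 0, -1) / (1.0 - a ** 2)
c = np.linalg.solve(R, r_future)

# For an AR(1) the optimum predictor is x̂[n] = a * x[n-1], so all
# coefficients vanish except the last, which equals a.
print(np.round(c, 6))
```

For an AR(1) the solved coefficient vector confirms the difference-equation structure of the predictor: only the most recent sample carries weight.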
Fearghal O'Donncha, Albert Akhriev, et al.
Big Data 2021
Arthur Nádas
IEEE Transactions on Neural Networks
Zahra Ashktorab, Djallel Bouneffouf, et al.
IJCAI 2025