Charles H. Bennett, Aram W. Harrow, et al.
IEEE Trans. Inf. Theory
This paper discusses a method for placing confidence limits on the steady-state mean of an output sequence generated by a discrete event simulation. An estimate of the variance is obtained by estimating the spectral density at zero frequency, which is accomplished through a regression analysis of the logarithm of the averaged periodogram. Batching the output sequence keeps the storage and computational requirements of the method low. A run-length control procedure is developed that uses the relative width of the generated confidence interval as a stopping criterion. Experimental results for several queueing models of an interactive computer system are reported.
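The pipeline the abstract describes (batch the output, estimate the spectral density at zero frequency by regressing the log of the averaged periodogram, stop when the interval's relative width is small) can be illustrated in a few lines of NumPy. This is a minimal sketch under assumed defaults; the function names, the pairwise averaging of periodogram ordinates, the polynomial degree, and all parameter values are illustrative choices, not the regression scheme actually used in the paper.

```python
import numpy as np

def batch_means(x, n_batches=256):
    """Collapse the raw output into batch means so that storage and
    computation stay bounded regardless of run length."""
    m = len(x) // n_batches                         # observations per batch
    return x[: m * n_batches].reshape(n_batches, m).mean(axis=1)

def spectral_density_at_zero(b, n_points=25, degree=2):
    """Estimate the spectral density of the batch-means series at frequency
    zero: regress the log of the averaged periodogram on frequency near
    zero, then extrapolate the fitted polynomial to f = 0."""
    n = len(b)
    y = b - b.mean()
    periodogram = np.abs(np.fft.rfft(y)) ** 2 / n   # ordinates at f_j = j/n
    freqs = np.fft.rfftfreq(n)
    # Average adjacent ordinates in pairs to reduce variance before logging
    # (an assumed smoothing choice, not the paper's exact transformation).
    p = periodogram[1 : 2 * n_points + 1].reshape(n_points, 2).mean(axis=1)
    f = freqs[1 : 2 * n_points + 1].reshape(n_points, 2).mean(axis=1)
    coeffs = np.polyfit(f, np.log(p), degree)
    return np.exp(np.polyval(coeffs, 0.0))

def confidence_interval(x, z=1.96):
    """Interval for the steady-state mean; the variance of the grand mean is
    approximated by p(0) / n_batches for a stationary batch-means series."""
    x = np.asarray(x, dtype=float)
    b = batch_means(x)
    half = z * np.sqrt(spectral_density_at_zero(b) / len(b))
    return x.mean() - half, x.mean() + half

def run_until_precise(sample, rel_width=0.05, chunk=50_000, max_obs=5_000_000):
    """Run-length control: extend the simulation until the interval's
    half-width drops below rel_width times the magnitude of the estimate."""
    data = np.empty(0)
    while len(data) < max_obs:
        data = np.concatenate([data, sample(chunk)])
        lo, hi = confidence_interval(data)
        mean = data.mean()
        if mean != 0.0 and (hi - lo) / 2.0 <= rel_width * abs(mean):
            break
    return mean, (lo, hi)
```

As a toy usage example, `run_until_precise(lambda k: rng.exponential(size=k))` with `rng = np.random.default_rng()` extends an i.i.d. "simulation" until the 95% interval is within 5% of the estimated mean; a real discrete event simulation would supply correlated output, which is exactly what the spectral variance estimate is meant to handle.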
Khalid Abdulla, Andrew Wirth, et al.
ICIAfS 2014
Yvonne Anne Pignolet, Stefan Schmid, et al.
Discrete Mathematics and Theoretical Computer Science
S. Sattanathan, N.C. Narendra, et al.
CONTEXT 2005