Liat Ein-Dor, Y. Goldschmidt, et al.
IBM J. Res. Dev
This paper discusses a method for placing confidence limits on the steady-state mean of an output sequence generated by a discrete-event simulation. An estimate of the variance is obtained by estimating the spectral density at zero frequency; this estimate is computed through a regression analysis of the logarithm of the averaged periodogram. Batching the output sequence keeps the method's storage and computational requirements low. A run-length control procedure is developed that uses the relative width of the generated confidence interval as a stopping criterion. Experimental results for several queueing models of an interactive computer system are reported.
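The abstract describes the spectral approach only at a high level; the Python sketch below illustrates one way such a procedure could look, under stated assumptions. The function names (`spectral_ci`, `run_until_precise`), the callable `generate_output`, the choice of 25 frequency pairs, the 0.270 log-bias correction, the quadratic regression, and the residual degrees of freedom are illustrative assumptions, not the cited paper's exact method.

```python
import numpy as np
from scipy import stats


def spectral_ci(x, n_pairs=25, alpha=0.05):
    """Sketch of a spectral confidence interval for a steady-state mean.

    The variance of the sample mean is taken as p(0)/n, where p(0) is the
    spectral density at zero frequency, estimated here by a quadratic
    regression of the log of pair-averaged periodogram values on frequency.
    The constants used here are illustrative choices.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    xbar = x.mean()

    # Periodogram I(f_j) = |DFT_j|^2 / n at the first 2*n_pairs Fourier
    # frequencies f_j = j/n (the mean is removed first).
    j = np.arange(1, 2 * n_pairs + 1)
    dft = np.fft.fft(x - xbar)
    periodogram = np.abs(dft[j]) ** 2 / n

    # Average adjacent periodogram values in pairs, take logs, and add an
    # approximate bias correction for the log of an averaged pair.
    pair_avg = periodogram.reshape(n_pairs, 2).mean(axis=1)
    mid_freq = (j / n).reshape(n_pairs, 2).mean(axis=1)
    log_avg = np.log(pair_avg) + 0.270

    # Quadratic regression of the log averaged periodogram on frequency;
    # the intercept estimates log p(0).
    design = np.vander(mid_freq, N=3, increasing=True)  # columns: 1, f, f^2
    coef, *_ = np.linalg.lstsq(design, log_avg, rcond=None)
    p0 = np.exp(coef[0])

    # Confidence interval for the mean using Var(xbar) ~= p(0)/n.
    dof = n_pairs - 3
    half_width = stats.t.ppf(1 - alpha / 2, dof) * np.sqrt(p0 / n)
    return xbar, half_width


def run_until_precise(generate_output, rel_width=0.05, max_rounds=100):
    """Run-length control: keep simulating until the confidence interval's
    half-width falls below `rel_width` times the estimated mean.
    `generate_output` is a hypothetical callable returning new observations.
    """
    data, mean, half = [], float("nan"), float("inf")
    for _ in range(max_rounds):
        data.extend(generate_output())
        if len(data) < 200:  # need enough points for the periodogram
            continue
        mean, half = spectral_ci(data)
        if half < rel_width * abs(mean):
            break
    return mean, half, len(data)
```

The stopping test in `run_until_precise` mirrors the relative-width criterion mentioned in the abstract, and the pair-averaging step keeps the regression input small, in the spirit of the abstract's remark about low storage and computational requirements.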
Rajeev Gupta, Shourya Roy, et al.
ICAC 2006
Zohar Feldman, Avishai Mandelbaum
WSC 2010
Daniel M. Bikel, Vittorio Castelli
ACL 2008