Shai Fine, Yishay Mansour
Machine Learning
A communication system consisting of a number of buffered input terminals connected to a computer by a single channel is analyzed. The terminals are polled in sequence and the data is removed from the terminal's buffer. When the buffer has been emptied, the channel, for an interval of randomly determined length, is used for system overhead and/or to transmit data to the terminals. The system then continues with a poll of the next terminal. The stationary distributions of the length of the waiting line and the queueing delay are calculated for the case of identically distributed input processes.
Erik Altman, Jovan Blanusa, et al.
NeurIPS 2023
Paula Harder, Venkatesh Ramesh, et al.
EGU 2023
Zhikun Yuen, Paula Branco, et al.
DSAA 2023