Publication Date


Degree Program

Department of Mathematics and Computer Science

Degree Type

Master of Science


A mathematical model for a discrete-time buffer system with both arrival and server interruptions is developed. In this model, fixed-size packets arrive at the buffer according to a Poisson distribution and are stored there until they can be transmitted over the output channel. Service times are constant and the buffer is assumed to be of infinite size. Both the arrival stream and the service of the packets are subject to random interruptions described by Bernoulli processes, where the interruption process of the Poisson input stream is uncorrelated with the interruptions of the output line. Expressions are derived for the mean waiting time, the mean queue length, the average lengths of the idle and busy periods of the server, and for the server utilization. The behavior of the system is demonstrated with a computer simulation; the simulation results are used to indicate optimal buffer sizes.
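The simulated system described above can be sketched in a few lines of Python. This is a minimal illustration, not the thesis's actual simulation code: the parameter names (`lam` for the Poisson arrival intensity per slot, `p_arr` and `p_srv` for the per-slot availability probabilities of the arrival line and the server) and the choice of a one-slot constant service time are assumptions made here for concreteness.

```python
import math
import random

def simulate(slots=100_000, lam=0.3, p_arr=0.9, p_srv=0.8, seed=1):
    """Discrete-time infinite buffer with a Bernoulli-interrupted Poisson
    arrival stream and an independently Bernoulli-interrupted server that
    transmits one fixed-size packet per available slot.
    Returns (time-average queue length, server utilization)."""
    rng = random.Random(seed)

    def poisson(l):
        # Knuth's inversion method; adequate for small lambda.
        L, k, p = math.exp(-l), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    q = 0          # packets currently buffered
    area = 0       # accumulator for the time-average queue length
    busy = 0       # slots in which a packet was actually transmitted
    for _ in range(slots):
        if rng.random() < p_arr:            # arrival line not interrupted
            q += poisson(lam)               # Poisson batch arrives
        if q > 0 and rng.random() < p_srv:  # server not interrupted
            q -= 1                          # constant one-slot service
            busy += 1
        area += q
    return area / slots, busy / slots
```

Under these assumed parameters the effective load is `lam * p_arr = 0.27` packets per slot against a server available in 80% of slots, so the queue is stable and the long-run utilization settles near 0.27. Sweeping the interruption probabilities and recording the queue-length distribution is one way such a simulation can indicate adequate finite buffer sizes.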


Computer Sciences | Mathematics