Dice and Queues
Published on: 2025-05-05 12:38:42
Introduction
One of the key insights from queuing theory is that the average queue size of an unbounded system increases sharply as utilization approaches 100%. Theoretical models place no bound on this increase: at 100% utilization the average queue size grows toward infinity. This matters because queues themselves consume resources, whether memory or disk space on a computer or floor space in a factory. In an ideal world, we wouldn't need queues. Who likes waiting in line, after all? That would only be possible if the arrival rate of items were less than or equal to the departure rate and there were no variability in either, which is not likely to happen in the real world.
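To make this concrete, here is a minimal sketch of a discrete-time, single-server queue simulation (assuming Bernoulli arrivals and departures, with utilization taken as the ratio of arrival rate to service rate; the specific rates and parameters here are illustrative, not taken from the article). It shows the average queue size climbing steeply as utilization nears 1:

```python
import random

def simulate_queue(arrival_prob, service_prob, steps=200_000, seed=1):
    """Discrete-time single-server queue: on each tick an item arrives with
    probability arrival_prob, and the item at the head of the queue (if any)
    departs with probability service_prob. Returns the time-averaged queue size."""
    rng = random.Random(seed)
    queue = 0
    total = 0
    for _ in range(steps):
        if rng.random() < arrival_prob:
            queue += 1
        if queue > 0 and rng.random() < service_prob:
            queue -= 1
        total += queue
    return total / steps

# Hold the service rate fixed and push utilization (arrival rate / service rate)
# toward 1: the time-averaged queue size climbs steeply.
for rho in (0.5, 0.8, 0.9, 0.95, 0.99):
    avg = simulate_queue(arrival_prob=rho * 0.5, service_prob=0.5)
    print(f"utilization {rho:.2f}: average queue size ~ {avg:.1f}")
```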
Queuing models are often expressed using Kendall notation (e.g. M/M/1 or M/D/1). These models specify the arrival and service-time distributions, such as Poisson/Markovian (M) or deterministic (D). They have closed-form expressions that give the average queue size as a function of utilization.
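For example, the classic M/M/1 result gives the average number of items in the system as L = ρ / (1 − ρ), where ρ is utilization; the expression diverges as ρ approaches 1, matching the behavior described above. A quick check of the formula:

```python
# Closed-form M/M/1 average number of items in the system: L = rho / (1 - rho)
for rho in (0.5, 0.8, 0.9, 0.95, 0.99):
    print(f"rho = {rho:.2f}: L = {rho / (1 - rho):.1f}")
```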