An analytical framework for modeling the performance of a single TCP session in the presence of random packet loss is presented. A Markovian approach is developed that allows us to study both memoryless channels (IID packet loss) and channels with memory (correlated packet loss), the latter modeled by a two-state continuous-time Gilbert model. The analytical results are validated against simulation results obtained with the ns simulator. It is shown that the model predicts throughput with good accuracy for both LAN and WAN environments (i.e., low and high bandwidth-delay products). Further, throughput under the IID loss model is found to be relatively insensitive to the probability density function (PDF) of the loss inter-arrival process. For channels with memory, we present an empirically validated rule of thumb for classifying channels by their state transition frequency.
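As a concrete illustration of the channel-with-memory assumption, the following Python sketch generates a per-packet loss trace from a two-state continuous-time Gilbert channel with exponentially distributed state sojourn times. It is a minimal sketch only; the function name, parameters, and numeric values are illustrative assumptions and are not taken from the paper.

```python
import random

def gilbert_loss_trace(num_packets, pkt_interval,
                       rate_gb, rate_bg,
                       p_loss_good=0.0, p_loss_bad=1.0, seed=0):
    """Illustrative two-state continuous-time Gilbert loss trace.

    The channel alternates between a Good and a Bad state with
    exponentially distributed sojourn times (exit rates rate_gb and
    rate_bg). A packet sent every pkt_interval seconds is dropped with
    the per-state loss probability of the state active at its send time.
    """
    rng = random.Random(seed)
    state = "good"
    state_end = rng.expovariate(rate_gb)  # time the current state ends
    losses = []
    for i in range(num_packets):
        t = i * pkt_interval
        # Advance the channel state machine up to the packet's send time.
        while t >= state_end:
            if state == "good":
                state = "bad"
                state_end += rng.expovariate(rate_bg)
            else:
                state = "good"
                state_end += rng.expovariate(rate_gb)
        p = p_loss_good if state == "good" else p_loss_bad
        losses.append(rng.random() < p)
    return losses

# Example with illustrative parameters: mean Good sojourn 2 s,
# mean Bad sojourn 0.2 s, one packet every 10 ms.
trace = gilbert_loss_trace(num_packets=10000, pkt_interval=0.01,
                           rate_gb=0.5, rate_bg=5.0)
print("overall loss rate:", sum(trace) / len(trace))
```

Varying the transition rates relative to the packet sending rate in such a sketch is one way to explore, empirically, the slow- versus fast-varying channel regimes that the rule of thumb above is meant to distinguish.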