In optical burst switching (OBS) networks, a key problem is to schedule as many bursts as possible on wavelength channels so that throughput is maximized and burst loss is minimized. Most current research on OBS has concentrated on reducing burst loss in an "average-case" sense, and little effort has been devoted to understanding worst-case performance. Since OBS is an open-loop control system, it may exhibit worst-case behavior when adversely synchronized. At the same time, most commercial systems require acceptable worst-case performance.