Handling on-demand streaming of multimedia requires extremely high server and network bandwidth to service individual customer requests. This situation has given rise to many protocols aimed at reducing the bandwidth requirements of on-demand streaming services. Despite all their differences, most of these proposals fit into one of three groups.

The first group of proposals follows a proactive approach, anticipating customer demand and distributing the various segments of each video according to a deterministic schedule. These distribution protocols are commonly called broadcasting protocols. All recent broadcasting protocols for video-on-demand derive in some fashion from the pyramid broadcasting protocol. They partition each video into segments that are simultaneously broadcast on different channels. They also require customers to be connected to the service through a smart set-top box capable of receiving data at rates exceeding the video consumption rate and of storing locally the video data that arrive out of sequence. They assume that customers watch videos sequentially, without any fast forwards. This setup allows broadcasting protocols to transmit the successive segments of each video using less and less bandwidth as the video progresses: each segment typically requires less bandwidth than its predecessor, with the initial segment thus requiring the most.

Proposals in the second group take a different approach. They achieve the bandwidth reduction by dynamically aggregating clients whose requests are closely spaced in time, so that these clients eventually share the same streams. These schemes are said to follow a reactive approach.
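The decreasing per-segment bandwidth of broadcasting protocols can be illustrated with the bandwidth arithmetic of harmonic broadcasting, one scheme in the pyramid family: segment i of a video is repeatedly broadcast on its own channel at 1/i of the playback rate. The function below is only a sketch of that arithmetic under assumed parameter names, not a particular implementation.

```python
# Sketch of the per-segment bandwidth arithmetic in harmonic broadcasting,
# one of the pyramid-derived schemes.  Names and parameters are illustrative
# assumptions, not taken from any specific implementation.

def harmonic_server_bandwidth(n_segments: int, playback_rate: float = 1.0) -> float:
    """Total server bandwidth when segment i (i = 1..n_segments) is
    repeatedly broadcast on its own channel at rate playback_rate / i.

    The first segment is broadcast at the full playback rate, since clients
    must begin playing it almost immediately; later segments can trickle in
    more slowly because the client is still consuming earlier segments.
    """
    return playback_rate * sum(1.0 / i for i in range(1, n_segments + 1))

# Doubling the number of segments (which shortens the client start-up delay)
# adds less than one playback rate of server bandwidth each time, because
# the harmonic series grows only logarithmically.
for n in (4, 8, 16, 32):
    print(n, round(harmonic_server_bandwidth(n), 2))
```

The point of the sketch is the scalability argument: server bandwidth grows with the logarithm of the number of segments, independently of how many clients are watching.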
Previously proposed protocols in this group, namely patching, dynamic skyscraper, and hierarchical stream merging, provide scalable on-demand streaming without requiring clients to wait for some fixed period of time, as they would in a broadcasting protocol or a batching system.

Besides these protocol-oriented delivery schemes, the delivery of large and popular video streams can use various caching techniques to improve performance per cost. Caching protocols, such as proxy caching and preloading schemes, can reduce server load, network traffic, and access latency. Video files can be very large, however, so techniques that cache entire video objects are not appropriate; a number of caching strategies have therefore been proposed that cache only a small portion of each video file. Storing an initial prefix of a video file at a client's set-top box or at proxy servers has numerous advantages: it shields clients from the delays and jitter of online video streams, it permits smoothing that reduces the burstiness of video delivery without introducing additional client waiting delays, and it reduces traffic on the server network.

This dissertation describes, proposes, and compares several video streaming delivery protocols and techniques with respect to the required server bandwidth, client waiting time, and client buffer requirement. We introduce the general video streaming delivery schemes, called the proactive, reactive, and caching approaches, and compare video streaming delivery protocols in terms of required server bandwidth, client buffer requirement, client waiting time, and decoding complexity of the incoming video streams.
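The startup-delay advantage of prefix caching described above can be sketched as follows. All names and timing parameters in this fragment are illustrative assumptions; the intent is only to show how a cached prefix hides the origin server's latency from the client.

```python
# Minimal sketch of prefix caching at a proxy or set-top box, under assumed,
# simplified timing parameters (all names here are illustrative).

from dataclasses import dataclass

@dataclass
class Video:
    duration_s: float       # total playback length in seconds
    prefix_cached_s: float  # initial portion stored at the proxy, in seconds

def startup_delay(video: Video,
                  proxy_latency_s: float = 0.05,
                  server_latency_s: float = 2.0) -> float:
    """Time before playback can begin.

    If the proxy holds an initial prefix, playback starts as soon as the
    prefix's first bytes arrive from the nearby proxy, while the remainder
    is fetched from the origin server in the background.  The server
    latency is fully hidden whenever the prefix outlasts it.
    """
    if video.prefix_cached_s <= 0:
        return server_latency_s                # everything from the origin
    if video.prefix_cached_s >= server_latency_s:
        return proxy_latency_s                 # server fetch fully hidden
    # Prefix too short: playback would stall once it is exhausted, so the
    # client must absorb the uncovered part of the server latency up front.
    return proxy_latency_s + (server_latency_s - video.prefix_cached_s)

print(startup_delay(Video(7200, 0)))    # no prefix cached
print(startup_delay(Video(7200, 10)))   # a 10 s prefix hides the server delay
```

Even a prefix of a few seconds, a tiny fraction of a feature-length video, suffices to mask the origin latency, which is why prefix caching is attractive where caching whole videos is not.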