We present an algorithm and its implementation for the interactive view synthesis of 3D video objects. The algorithm is based on the theory of trifocal transfer and uses dense depth information from two cameras. To run the algorithm and its post-processing stages at interactive rates, we employ commodity graphics hardware. We discuss issues that arise when mapping non-graphics algorithms onto the pipeline architecture of the GPU and introduce approaches to solving these problems, with a focus on our application. Finally, we show that employing a GPU speeds up the synthesis significantly.
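To give a rough intuition for depth-based view transfer, the sketch below warps a single pixel with known depth from a reference camera into a virtual view by back-projecting and re-projecting it. This is a simplified illustration, not the paper's trifocal formulation; the intrinsic matrix `K`, rotation `R`, translation `t`, and all numeric values are hypothetical assumptions.

```python
import numpy as np

# Illustrative camera parameters (assumed, not from the paper).
K = np.array([[500.0,   0.0, 320.0],   # intrinsics of the reference camera
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

R = np.eye(3)                          # rotation of the virtual view
t = np.array([0.1, 0.0, 0.0])          # small horizontal baseline

def transfer(u, v, depth):
    """Warp pixel (u, v) with metric depth into the virtual camera."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-project to a ray
    X = depth * ray                                 # 3D point in the reference frame
    x = K @ (R @ X + t)                             # project into the virtual view
    return x[:2] / x[2]                             # dehomogenize

u2, v2 = transfer(320.0, 240.0, 2.0)   # the principal point shifts with the baseline
```

In the paper's setting this per-pixel warp is performed densely for every pixel of the video object, which is precisely the kind of data-parallel workload the GPU pipeline accelerates.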