This paper presents a novel method for texture synthesis that combines simple patch-based texture mapping with a stitching procedure performed by Cellular Neural Networks (CNNs). Texture mapping places same-size blocks, extracted at random from a reference texture image, at regularly spaced locations. The gaps between blocks are then filled with content generated by a CNN, which is expected to spontaneously transform its initial random state into a texture-fitting pattern. The appropriate template is designed by treating the CNN as a linear filter: the template's transfer function is matched to the spectrum of the target texture. The main advantage of the proposed method is its fast rendering speed, combined with the good quality of the generated images.