Most multimedia data today is sampled and quantized from analog sources. Even with sophisticated recognition and indexing techniques, it remains difficult to associate human-centered, semantic content and structural information with such data. This rules out many desirable interaction metaphors more advanced than simple replay. We propose that a human-centered model of multimedia data, together with suitable metaphors for interacting with it directly and instantaneously, is crucial to the design of more interactive and "multimedia-aware" system and application architectures. We show how we implemented such a model for the media type "music" in the WorldBeat system, a highly successful interactive computer-based music exhibit at the Ars Electronica Center in Linz, Austria. The system uses a high-level semantic representation of musical information called "musical design patterns". Interaction with this representation, such as spontaneous, computer-supported improvisation, is supported in a novel way: a single pair of infrared batons controls the entire exhibit. The system and its semantic model offer a technologically and artistically innovative approach that should interest multimedia researchers as well as educators, artists, and performers.