ICoSOLE aims at developing a platform that enables users to experience live events which are spatially spread out, such as festivals (e.g. the Gentse feesten in Belgium or Glastonbury in the UK), parades, marathons or bike races, in an immersive way by combining high-quality spatial video and audio with user-generated content. The project will develop a platform for a context-adapted hybrid broadcast-Internet service, providing efficient tools for the capture, production and distribution of audiovisual content captured by a heterogeneous set of devices spread over the event site.
The approach uses a variety of sensors, ranging from mobile consumer devices through professional broadcast capture equipment to panoramic and/or free-viewpoint video and spatial audio. Methods for streaming live high-quality audiovisual content from mobile capture devices to content acquisition, processing and editing services will be developed.
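The mobile-to-ingest path can be pictured as a chunked upload: the capture device records short media segments, attaches timing and location metadata, and posts them to an acquisition service. The following is a minimal sketch of that idea only; the endpoint URL, metadata fields and segment length are illustrative assumptions, not ICoSOLE's actual interface.

```python
# Illustrative sketch of a mobile capture client streaming recorded segments
# to a content acquisition service. Endpoint, field names and segment length
# are assumptions for illustration, not the project's specification.
import json
import time
import urllib.request

INGEST_URL = "http://ingest.example.org/segments"  # hypothetical endpoint
SEGMENT_SECONDS = 2  # short segments keep end-to-end latency low


def upload_segment(device_id: str, seq: int, media: bytes,
                   lat: float, lon: float) -> None:
    """Send one encoded media segment plus capture metadata to the ingest service."""
    metadata = {
        "device_id": device_id,
        "sequence": seq,
        "capture_time": time.time(),   # wall-clock timestamp for later synchronisation
        "location": {"lat": lat, "lon": lon},
    }
    request = urllib.request.Request(
        INGEST_URL,
        data=media,
        headers={
            "Content-Type": "video/mp4",
            "X-Segment-Metadata": json.dumps(metadata),
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # the service would acknowledge or reject the segment


# Usage: in a capture loop, encode SEGMENT_SECONDS of audio/video, then call
# upload_segment("phone-17", seq=n, media=encoded_bytes, lat=51.05, lon=3.72)
```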
In order to combine the heterogeneous capture sources, ICoSOLE will research and develop approaches for integrating content from professional and consumer capture devices, including mobile (and moving) sensors, based on metadata and content analysis. Methods for fusing visual and audio information into a format-agnostic data representation will be developed, enabling video and audio to be rendered for virtual viewer/listener positions.
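One way to picture such a format-agnostic representation is as a scene of positioned audio and video sources from which a renderer selects the sources relevant to a requested virtual viewer/listener position. The sketch below illustrates that idea only; the class and field names are hypothetical and not taken from the project.

```python
# Conceptual sketch of a format-agnostic scene representation: each captured
# stream is stored with its position on the event site, so a renderer can pick
# sources for an arbitrary virtual viewpoint. Names and structure are assumptions.
from dataclasses import dataclass
from math import hypot


@dataclass
class SceneSource:
    source_id: str
    kind: str          # "video", "audio", "panoramic", ...
    x: float           # position on the event site, in metres
    y: float
    quality: float     # e.g. a score derived from resolution or content analysis


def sources_for_viewpoint(scene: list[SceneSource], vx: float, vy: float,
                          radius: float = 50.0, limit: int = 4) -> list[SceneSource]:
    """Return the best nearby sources for rendering a virtual viewer/listener
    at position (vx, vy): filter by distance, then prefer closer, higher-quality ones."""
    nearby = [s for s in scene if hypot(s.x - vx, s.y - vy) <= radius]
    nearby.sort(key=lambda s: (hypot(s.x - vx, s.y - vy), -s.quality))
    return nearby[:limit]


scene = [
    SceneSource("cam-main", "panoramic", 0.0, 0.0, 0.9),
    SceneSource("phone-17", "video", 30.0, 12.0, 0.4),
    SceneSource("mic-array-2", "audio", 25.0, 10.0, 0.8),
]
print([s.source_id for s in sources_for_viewpoint(scene, 28.0, 11.0)])
```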
ICoSOLE will develop efficient tools for media production professionals to select, configure and review the content sources being used. These tools capture, extract and annotate metadata during the production process and carry this metadata through the entire production chain to the end user. Content will be provided via broadcast, enhanced by additional content delivered via broadband and by novel interaction possibilities for second-screen and web consumption. The content will also be provided in an adapted form to mobile devices, with specific location-based functionalities for users at or near the event site.
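As an illustration of the location-based adaptation mentioned above, a mobile service could compare a user's reported position with the event location and switch between an on-site and a remote presentation. This is a minimal sketch under assumed names and thresholds, not the project's actual logic.

```python
# Illustrative sketch of location-based adaptation for mobile users: users close
# to the event site get on-site functionality (e.g. local streams and stages),
# remote users get the broadband/second-screen presentation. Coordinates,
# threshold and function names are assumptions for illustration only.
from math import radians, sin, cos, asin, sqrt

EVENT_LAT, EVENT_LON = 51.054, 3.725   # hypothetical event centre (Ghent)
ON_SITE_RADIUS_KM = 2.0                # hypothetical "at or near the event" cut-off


def distance_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two WGS84 points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


def presentation_mode(user_lat: float, user_lon: float) -> str:
    """Choose the content variant offered to a mobile user based on proximity."""
    if distance_km(user_lat, user_lon, EVENT_LAT, EVENT_LON) <= ON_SITE_RADIUS_KM:
        return "on-site"   # location-based features for users at the event
    return "remote"        # broadband/second-screen experience


print(presentation_mode(51.056, 3.722))  # -> "on-site"
```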