ICoSOLE will research real-time capable algorithms for the automatic, reference-free assessment of the quality of video streams originating from low-quality devices and from suboptimal capture conditions (hand-held cameras, insufficient lighting). The information extracted from the user-generated content forms the basis for automatic camera stream selection based on content features such as “best lit”, “least blurred”, “lowest noise” and “least shaky”. In order to develop these novel algorithms, we will build on existing ICoSOLE partner know-how in robust and fully automatic video quality assessment.
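To illustrate how such reference-free features can be computed, the following is a minimal sketch (not the project's actual algorithm) of a variance-of-Laplacian sharpness score, a common heuristic for ranking streams by “least blurred”; the image layout and the function name are assumptions made for the example.

```cpp
#include <cstddef>
#include <vector>

// Reference-free sharpness score for one grayscale frame (row-major,
// values in [0,1]). Blur suppresses high spatial frequencies, so a low
// variance of the Laplacian response indicates a blurred frame.
double sharpnessScore(const std::vector<float>& img, int width, int height)
{
    double sum = 0.0, sumSq = 0.0;
    const int n = (width - 2) * (height - 2);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            // 4-neighbour discrete Laplacian.
            const float lap = img[y * width + x - 1] + img[y * width + x + 1]
                            + img[(y - 1) * width + x] + img[(y + 1) * width + x]
                            - 4.0f * img[y * width + x];
            sum   += lap;
            sumSq += lap * lap;
        }
    }
    const double mean = sum / n;
    return sumSq / n - mean * mean;  // variance of the Laplacian response
}
```

Analogous per-frame statistics (e.g. mean luminance for “best lit”, an estimate of noise variance for “lowest noise”) would drive the other selection features.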
The real-time requirement of the ICoSOLE use case is challenging, as many important image processing and computer vision algorithms (e.g. tracking, optical flow, descriptor matching) have a significant runtime. To address this, we will employ high-performance image processing data structures such as integral images and kd-trees. In addition, for sufficiently parallelizable algorithms we plan to research and develop real-time implementations that exploit the massive computational power of GPUs (Graphics Processing Units). We plan to use CUDA, which is currently the most mature environment for general-purpose GPU programming. When comparing algorithm runtimes on GPU and CPU, a speedup of an order of magnitude can usually be achieved with respect to a multi-threaded CPU implementation. In recent projects, ICoSOLE partners have gained profound experience in how to parallelize algorithms optimally and how to develop highly efficient GPU implementations of computer vision algorithms.
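To make the parallelization approach concrete, the sketch below (an illustration, not project code) shows how an embarrassingly parallel per-pixel operation, here the Laplacian response used above, maps onto a CUDA kernel; the kernel name, block size, and the omission of error handling are simplifying assumptions.

```cuda
#include <cuda_runtime.h>

// One thread per interior pixel: each thread computes its 4-neighbour
// Laplacian response independently, so the work spreads trivially across
// the thousands of cores of a GPU.
__global__ void laplacianKernel(const float* img, float* out,
                                int width, int height)
{
    const int x = blockIdx.x * blockDim.x + threadIdx.x;
    const int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < 1 || y < 1 || x >= width - 1 || y >= height - 1) return;

    out[y * width + x] = img[y * width + x - 1] + img[y * width + x + 1]
                       + img[(y - 1) * width + x] + img[(y + 1) * width + x]
                       - 4.0f * img[y * width + x];
}

// Host side: copy the frame to device memory, launch the kernel on a
// 16x16 thread-block grid, and copy the response back.
void laplacianGpu(const float* hostImg, float* hostOut, int width, int height)
{
    const size_t bytes = static_cast<size_t>(width) * height * sizeof(float);
    float *dImg = nullptr, *dOut = nullptr;
    cudaMalloc(&dImg, bytes);
    cudaMalloc(&dOut, bytes);
    cudaMemcpy(dImg, hostImg, bytes, cudaMemcpyHostToDevice);

    const dim3 block(16, 16);
    const dim3 grid((width + block.x - 1) / block.x,
                    (height + block.y - 1) / block.y);
    laplacianKernel<<<grid, block>>>(dImg, dOut, width, height);
    cudaDeviceSynchronize();

    cudaMemcpy(hostOut, dOut, bytes, cudaMemcpyDeviceToHost);
    cudaFree(dImg);
    cudaFree(dOut);
}
```

Because each output pixel depends only on a fixed local neighbourhood, the kernel needs no synchronization between threads, which is exactly the property that makes such filters profitable to move to the GPU.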
Most saliency and unusualness detection methods assume a static camera and learn their models of usual content from a single stream. ICoSOLE will advance these methods by taking camera motion into account and by building models for the different target areas covered in a stream. We will also develop methods that integrate the models of usualness and saliency across different streams captured at nearby locations or by the same user.
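For context, the following is a minimal sketch of the kind of single-stream, static-camera usualness model that such methods start from: a per-pixel running Gaussian model that flags strong deviations from the learned mean as unusual. The camera-motion compensation and cross-stream integration described above go beyond this baseline; the structure name and learning rate are assumptions for the example.

```cpp
#include <cstddef>
#include <vector>

// Per-pixel running Gaussian model of "usual" content for a static camera.
// Each pixel keeps a running mean and variance; observations far from the
// mean (in units of the model variance) score as unusual.
struct UsualnessModel {
    std::vector<float> mean, var;
    float alpha;  // learning rate (assumed value, tuned per use case)

    explicit UsualnessModel(std::size_t nPixels, float alpha = 0.01f)
        : mean(nPixels, 0.0f), var(nPixels, 1.0f), alpha(alpha) {}

    // Update the model with a new frame and return a per-pixel
    // unusualness score (squared normalized deviation).
    std::vector<float> update(const std::vector<float>& frame)
    {
        std::vector<float> score(frame.size());
        for (std::size_t i = 0; i < frame.size(); ++i) {
            const float d = frame[i] - mean[i];
            score[i] = d * d / (var[i] + 1e-6f);
            mean[i] += alpha * d;                 // running mean
            var[i]  += alpha * (d * d - var[i]);  // running variance
        }
        return score;
    }
};
```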