Underwater High Definition Video Camera Platform

This project is funded by CANARIE's Network-Enabled Platforms (NEP) program, as a follow-on to a previous CANARIE Intelligent Infrastructure Program (CIIP) project. The project is coordinated by Benoît Pirenne (UVic).

Overview

Oceans 2.0 is intended to allow the distributed community of ocean scientists to work together on research projects, aided by a dynamic, modern, web-based software system. Motivated by the massive volumes of data, including continuously streaming video, being made available by ocean observatories (e.g., VENUS and NEPTUNE), and by the demands of interoperability with other data centres, there is an urgent need to move beyond the constraints of traditional web-based interaction with these assets. A Web 2.0 approach offers the possibility of more effective teamwork, supporting knowledge exchange between the ocean science communities and other cross-disciplinary groups who must search, process, and visualize data on-line, and capitalizing on the diverse expertise necessary to work successfully on these problems.

As part of this project, we are developing Web services software that matches a common set of underwater video camera control inputs and video stream outputs to the bandwidth available to a particular scientist, and that allows scientists to collaborate by sharing the same underwater view in real time. We will then produce a web-based video camera user interface that makes use of the controls and features available through these web services. In addition, we will test an existing automated event detection algorithm for possible integration into the "live" system; this final task is described below.

Automated Change Detection

The final task listed above relates to earlier work our group conducted on automated change detection. This work was motivated by the large quantities of visual data becoming available to ocean scientists, which overwhelm their capacity for manual processing. To cope with such large data sets, we developed an automated change detection system that helps isolate the time periods in which significant activity occurs in a video sequence.

First, the following example demonstrates why conventional change-detection mechanisms are inadequate for the undersea environment.


A conventional change detection mechanism based on frame differencing. Frame t (left), frame t+50 (center), and difference image, |frame t - frame t+50| (right). The actual area of change is indicated by the small rectangular bounding box in the difference image.
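The frame-differencing approach illustrated above amounts to thresholding the absolute per-pixel difference between two frames. A minimal sketch with NumPy and synthetic frames (an illustration of the general technique, not the actual system code):

```python
import numpy as np

def frame_difference(frame_a, frame_b, threshold=25):
    """Absolute per-pixel difference between two grayscale frames,
    thresholded to a binary change mask."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Synthetic example: a flat background, with one bright square
# appearing in the later frame.
frame_t = np.zeros((64, 64), dtype=np.uint8)
frame_t50 = frame_t.copy()
frame_t50[10:20, 10:20] = 200          # object present in frame t+50

mask = frame_difference(frame_t, frame_t50)
print(mask.sum())                      # → 100 changed pixels
```

In the undersea setting, the weakness of this scheme is evident in the figure above: sea snow, scattering, and lighting variation produce differences everywhere, swamping the small region of genuine change.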

Unlike change detection algorithms used in terrestrial environments, the system must account for the photometric complexity of underwater video, including interference from small floating particles ("sea snow"), the scatter of light as it propagates through water, and the frequency-dependent, non-uniform decay of light intensity with distance. In addition, certain activity, such as the motion of swimming fish attracted by the artificial lighting, is considered a distracter and should, ideally, be ignored. Our system addresses these factors in large part through the use of Mixture-of-Gaussians background models.
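To illustrate the underlying technique (not our actual implementation, whose parameters and extensions differ), the following is a minimal per-pixel Mixture-of-Gaussians background model in the style of Stauffer and Grimson, written for grayscale frames with NumPy; all parameter values here are illustrative assumptions:

```python
import numpy as np

class MOGBackground:
    """Minimal per-pixel Mixture-of-Gaussians background model
    (Stauffer-Grimson style); an illustrative sketch, not the
    system described above."""

    def __init__(self, shape, k=3, alpha=0.05, match_sigma=2.5,
                 bg_ratio=0.7, init_var=900.0):
        h, w = shape
        self.alpha, self.match_sigma = alpha, match_sigma
        self.bg_ratio, self.init_var = bg_ratio, init_var
        self.mu = np.zeros((k, h, w))           # component means
        self.var = np.full((k, h, w), init_var) # component variances
        self.w = np.full((k, h, w), 1.0 / k)    # component weights

    def apply(self, frame):
        x = frame.astype(np.float64)[None]      # shape (1, h, w)
        diff = x - self.mu                      # shape (k, h, w)
        match = diff ** 2 < (self.match_sigma ** 2) * self.var
        matched_any = match.any(axis=0)

        # Pick the single best-matching component per pixel
        # (highest weight/sigma ratio among matches).
        fitness = np.where(match, self.w / np.sqrt(self.var), -1.0)
        best = np.argmax(fitness, axis=0)
        sel = np.zeros(match.shape, dtype=bool)
        np.put_along_axis(sel, best[None], matched_any[None], axis=0)

        # Online updates, applied only to the matched component.
        self.w = (1 - self.alpha) * self.w + self.alpha * sel
        rho = self.alpha * sel
        self.mu += rho * diff
        self.var += rho * (diff ** 2 - self.var)

        # Where no component matched, replace the weakest with a
        # new wide Gaussian centred on the observed intensity.
        weakest = np.argmin(self.w, axis=0)
        repl = np.zeros(match.shape, dtype=bool)
        np.put_along_axis(repl, weakest[None], (~matched_any)[None], axis=0)
        self.mu = np.where(repl, x, self.mu)
        self.var = np.where(repl, self.init_var, self.var)
        self.w = np.where(repl, 0.05, self.w)
        self.w /= self.w.sum(axis=0, keepdims=True)

        # Components covering the first bg_ratio of total weight
        # (in decreasing-weight order) count as "background".
        order = np.argsort(-self.w, axis=0)
        w_sorted = np.take_along_axis(self.w, order, axis=0)
        before = np.cumsum(w_sorted, axis=0) - w_sorted
        is_bg = np.zeros(match.shape, dtype=bool)
        np.put_along_axis(is_bg, order, before < self.bg_ratio, axis=0)

        # Foreground (change) = pixel matched no background component.
        return (~(match & is_bg).any(axis=0)).astype(np.uint8)

# Demo: learn a static scene, then introduce a new object.
model = MOGBackground((8, 8))
scene = np.full((8, 8), 100, dtype=np.uint8)
for _ in range(50):
    fg = model.apply(scene)            # converges to all-background
changed = scene.copy()
changed[2:4, 2:4] = 200                # new object appears
fg2 = model.apply(changed)
print(fg.sum(), fg2.sum())             # → 0 4
```

Because each pixel maintains several Gaussian modes, repetitive intensity variation, such as drifting particles passing over a pixel, can be absorbed into the background model, while genuinely novel intensities are flagged as change.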

The following video demonstrates the detection and classification of "significant" objects, as distinct from the distracters (both the sea snow and the fish that swim through the scene). These are highlighted by the lighter boxes, overlaid for illustrative purposes.


At the same time, we construct a background image, in which all distracters have been removed. For illustrative purposes, this is best seen in the following example, in which the colour histogram was first equalized.
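The histogram equalization applied here is the standard cumulative-histogram remapping. For reference, a self-contained sketch for an 8-bit grayscale image (a generic illustration, not the code used to produce the video above):

```python
import numpy as np

def equalize_histogram(gray):
    """Classic histogram equalization for an 8-bit grayscale image:
    map each intensity through the normalized cumulative histogram."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]          # first non-empty bin
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
    return lut.astype(np.uint8)[gray]

# A dark, low-contrast image: values clustered in [40, 80).
rng = np.random.default_rng(0)
img = rng.integers(40, 80, size=(32, 32), dtype=np.uint8)
out = equalize_histogram(img)
print(out.min(), out.max())            # → 0 255 (full range restored)
```

Stretching the intensities over the full range in this way makes the low-contrast background frame far easier to inspect visually.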

histogram-equalized video

background frame produced by algorithm

It is difficult to quantify the accuracy of our algorithm, given that the only reasonable test video sequence available to us was extremely short and limited in variability. However, as one metric of performance on the test sequence used for this initial study, we find that the system consistently distinguishes between the constantly moving fish (distracters) and both the periodically moving fish and the crab (significant objects). False positives resulted only from unpredictable background motion, as seen in the cloud of dust produced by a fish sweeping its tail along the sea floor. Another possible source of error is that the change detection algorithm may produce false negatives when significant objects are similar in colour to the background, just as camouflaged objects are difficult for humans to detect.

Further information is available in our Oceans paper: Qi, Z. and Cooperstock, J.R., "Automated Change Detection in an Undersea Environment using a Statistical Background Model," Oceans '07, Sept. 29-Oct. 4, 2007, Vancouver.


Last update: 26 October 2009