About CoralNet

CoralNet is a resource for benthic image analysis. The site deploys deep neural networks that allow fully and semi-automated annotation of images. It also serves as a data repository and collaboration platform. CoralNet is open source and free to use thanks to generous support from NSF and NOAA.

The site

The CoralNet website consists of several modules, outlined below. For a (somewhat outdated) demonstration, check out our Vimeo channel.

Source: The main organizational element for a benthic survey or image "source". Here you specify your labelset and privacy settings, and invite collaborators.

Labelset: Specify what labels you want to use in your analysis. Choose from a set of existing labels or create your own.

Import: Upload images, metadata and archived annotations to the site.

Annotation: Annotate your images right in the web browser using a point-count interface. When enough images have been manually annotated, an automated annotator is trained. This automated annotator is integrated directly into the annotation tool and makes the remaining annotation work easier.

Story

Introduction

A catastrophic decline of biodiversity and coral cover is occurring at reefs across the world. To monitor these changes, and to inform action plans, large spatiotemporal surveys are needed. Data collection methods are typically sufficient to meet this need, but the subsequent image analysis requires manual inspection of each photo, creating a manual annotation bottleneck.

CoralNet reduces this bottleneck by deploying state-of-the-art computer vision methods alongside human experts. Often 50-100% automation can be achieved with minimal reduction in the quality of the final data product (see Papers for references). By its nature, CoralNet also provides a platform for collaboration and data sharing.

CoralNet Begins

The idea of CoralNet was conceived by Oscar Beijbom, then a PhD student in the UCSD computer science department. Oscar was developing computer vision methods for coral reef image analysis and wanted to make sure the methods were made available to reef researchers, managers, and enthusiasts. However, deploying such methods is not trivial. They require significant compute resources, integration with the user interface has to be done just right, and they require a lot of training data. In particular, getting hold of large amounts of annotated data was very difficult at the time. For all those reasons, releasing code or developing a desktop application was not going to cut it; it needed to be a fully managed web server.

So CoralNet was born for the dual purpose of organizing the world's coral reef survey data, and using that data to create and deploy automated annotation methods.

CoralNet Alpha

The CoralNet Alpha server was developed by then UCSD undergrads Stephen Chan and Devang Sampat along with Oscar Beijbom. It first came online in the fall of 2011, with funding from NSF through a grant to PIs Greg Mitchell, David Kriegman, and Serge Belongie. The Alpha server was a single desktop computer housed at the UCSD computer science department. It was a scrappy effort, but it worked!

The computer vision backend of CoralNet Alpha deployed an early research method which worked well, but was not on par with human annotators. At the time, most users did not use the automated annotation engine, but were happy to have access to a free, easy-to-use, web-based annotation tool.

CoralNet Beta

A few years down the road there were over 300,000 images on CoralNet! The state of computer vision technology had also leapt forward with the advent of deep learning, and we had outgrown our desktop at UCSD. So the stars were aligned, and with funding from NOAA, we decided to release CoralNet Beta in November 2016.

The beta launch was a significant effort with three main changes: 1) the server and database were moved to the cloud to enable continuous growth; 2) the site UI and features received a large overhaul; and 3) a much more powerful computer vision backend was deployed. The new backend is also hosted in the cloud and can engage 100 workers in parallel. Further, the core algorithm was replaced by a much more powerful deep-learning-based method. The method was presented at ICRS 2016 and performs on par with human expert annotators. Interestingly, super-human accuracy was observed when the human experts collaborated with the autonomous system.

Stephen Chan, CoralNet's lead developer, headed the software updates for the beta release, while Oscar Beijbom developed the computer vision backend. For full release notes, refer to the CoralNet Beta release notes.

CoralNet 1.0

In early 2019, Williams et al. at NOAA conducted a large study and showed that the automated annotations from CoralNet Beta produced benthic cover estimates highly correlated with those derived from human annotators. This study thus suggested that CoralNet was finally accurate enough to fully address the manual annotation bottleneck!

Following these results, a second grant was received from NOAA to develop CoralNet 1.0. During this period we achieved two major goals. First, we developed an API to allow easy and automated deployment of a trained classifier on new images. This also enables the creation of a repository of trained classifiers from which users can choose. Second, we took another look at the core technology behind the computer vision system and managed to push performance even further. See our whitepaper for details.
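
To give a flavor of how the deployment API can be used, here is a minimal sketch in Python using the requests library. The endpoint paths, payload shape, and field names below are assumptions based on the public API documentation rather than a verbatim reference, and the classifier ID, image URL, and credentials are placeholders; please check the current API docs before relying on any of them.

    import requests

    # Placeholder values -- substitute your own (hypothetical examples).
    CORALNET = "https://coralnet.ucsd.edu"
    CLASSIFIER_ID = 123                       # a trained classifier you can access
    IMAGE_URL = "https://example.com/reef_photo.jpg"

    # 1) Obtain an API token (endpoint and response field assumed from the docs).
    token_resp = requests.post(
        f"{CORALNET}/api/token_auth/",
        json={"username": "your_username", "password": "your_password"},
    )
    token = token_resp.json()["token"]

    headers = {
        "Authorization": f"Token {token}",
        "Content-Type": "application/vnd.api+json",
    }

    # 2) Ask the classifier to label specific points on an image hosted at a URL.
    payload = {
        "data": [
            {
                "type": "image",
                "attributes": {
                    "url": IMAGE_URL,
                    "points": [{"row": 100, "column": 150},
                               {"row": 250, "column": 300}],
                },
            }
        ]
    }
    deploy_resp = requests.post(
        f"{CORALNET}/api/classifier/{CLASSIFIER_ID}/deploy/",
        headers=headers,
        json=payload,
    )

    # 3) The job runs asynchronously; the response is expected to point to a
    #    status/result URL that can be polled until predictions are ready.
    print(deploy_resp.status_code, deploy_resp.headers.get("Location"))

The key design point is that classification runs asynchronously on the server: a deploy request only submits the job, and the predicted labels are fetched later from a result endpoint.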

CoralNet is open source. You can visit our GitHub repository to see what we're working on, learn how the website works, or even contribute improvements.

Data Policy

We at CoralNet highly encourage public sharing of research data for the greater good. However, we realize that data privacy is preferred in some cases, especially for newer projects. So, sources have two visibility options: Public and Private.

Public sources

  • All of your source's images and annotation data are available for the public to browse and download (including original images in full resolution).
  • Only members of the source can add or edit content.

Private sources

  • All users can still see your source on the world map, based on the source's latitude and longitude settings. Here, they can also see the name, description, affiliation, and total number of images in your source - but no example images.
  • Label information pages - public to all users - list the names of all sources that use the label, including private sources. Also, these label pages show example image 'patches' (small thumbnail cut-outs of a larger image) of point annotations using that label. These patches can come from any source using the label, but if a particular patch is from a private source, the name of the patch's source will not be given.
  • Only members of the source (invited with View, Edit, or Admin permissions) can browse all of the source's data, including full images and their annotations. Other users will be unable to browse pages and images within the source, even if they know the URLs.

User privacy

  • Your username (handle) is public on CoralNet through the list of profiles.
  • You may choose to make your user profile (including first name, last name, and affiliation) public for everyone, public for registered users only, or private.
  • The email address associated with your account is not publicly viewable.
  • We'll never ask you for your password. If you get an email asking you for your password, it didn't come from us.
  • The CoralNet website uses cookies for login and for analytics (via Google).

Credits

CoralNet has been supported by:

  • NSF: Computer Vision Coral Ecology grant #ATM-0941760, 2012 - 2015.
  • NOAA: Grants NA10OAR4320156, NA15OAR4320071 under the Automated Image Analysis Strategic Initiative, 2014 - 2019.
  • NOAA: CoralNet: Tackling Bottlenecks in Coral Reef Image Analysis with Next Gen Deep Networks for Photographs to Large Mosaics, 2019 - 2020.
  • NOAA: Quantifying Coral Reef Net Calcification Capacity and Vulnerability in the Context of Ocean Acidification, 2019 - 2021.
  • Pacific Blue Foundation.

People

Current

  • Oscar Beijbom [www, Nyckel]
  • Stephen Chan
  • David Kriegman [www]
  • Jessica Bouwmeester [www]

Alumni

  • Qimin Chen - ML researcher
  • Haoming Zhang - ML researcher
  • Serge Belongie - Academic advisor [www]
  • David Kline - Academic advisor [www]
  • Ben Neal - Academic advisor [www]
  • Gregory Mitchell - Academic advisor [www]
  • Devang Sampat - Developer
  • Andrew Hu - Developer
  • Jeff Sandvik - Developer

Papers

The following papers are relevant to the development of CoralNet. We ask that you cite the appropriate papers if you use CoralNet in your work.

Press

  • January 2017: Our beta launch featured in phys.org [www].
  • September 2016: CoralNet featured in the GTC keynote (at 0:00:52) [www].
  • August 2016: CoralNet featured in the Nature Toolbox section [www].
  • June 2016: NVIDIA blog post about Oscar's work on deep learning for coral ecology [www].
  • Nov 2014: Jonathan Cohen at NVIDIA highlights CoralNet in his talk at the SuperComputer conference. CoralNet section starts at 9.30 [www].
  • May 2014: Destin at SmarterEveryDay follows the data collected by the Catlin Seaview Survey all the way to CoralNet [www].
  • September 2013: Greenwire covers CoralNet [www].