Deep Chimpact: Depth Estimation for Wildlife Conservation

As an interesting application of artificial intelligence, Deep Chimpact invites competitors to help tackle a problem in wildlife conservation. The goal of the competition is to use machine learning to automatically estimate the distance between a camera trap and the animals it records, across a large pool of video footage. The researchers aim to estimate how many primates live in a given area using "distance sampling", which relies on the chimpanzees' distances to the camera at different times of day, and to track how the population changes over time. An important aspect of the competition is that all the data has already been collected and labeled by the hosts; competitors only need to predict the distance at a given frame and at the coordinates provided in the training data.
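
To make the setup concrete, here is a minimal sketch of how a single training example could be assembled as a frame-level regression problem. The file name, column names, and units below are illustrative assumptions, not the competition's actual schema.

```python
# A minimal sketch of framing the task as frame-level distance regression.
# Assumed (illustrative) label schema: video_id, time (seconds), x, y
# (the provided coordinates of the animal), and distance (the target).
import cv2
import pandas as pd

labels = pd.read_csv("train_labels.csv")  # hypothetical file name

def extract_frame(video_path, time_sec):
    """Grab the frame closest to the given timestamp from a video file."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_MSEC, time_sec * 1000)
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None

row = labels.iloc[0]
frame = extract_frame(f"videos/{row['video_id']}", row["time"])
coords = (row["x"], row["y"])      # where the animal appears in the frame
target = row["distance"]          # distance to predict for this frame
```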

An example analysis of a frame, taken from the competition website.

The footage for the training set, which consists of 3,900 videos up to one minute long, comes from two parks in West Africa (Côte d'Ivoire and Guinea); among the primates captured on film, the competition focuses only on chimpanzees. The main challenge is that the footage varies in height and length, is shot from a single fixed point of view, and spans widely different brightness and lighting conditions, so normalizing the data is a sizable problem in its own right. Fortunately, the researchers have provided some metadata to help competitors, along with frameworks for motion tracking and distance estimation. Our team's aim is to preprocess the data appropriately, apply state-of-the-art algorithms, and find ways to improve on them.
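
As one illustration of the kind of preprocessing this calls for, the sketch below equalizes per-frame lighting with contrast-limited adaptive histogram equalization (CLAHE). This is only one possible normalization step under our own assumptions, not the hosts' provided framework.

```python
# Reduce brightness/lighting variation between frames by equalizing the
# luminance channel with CLAHE; day and night footage become more comparable.
import cv2

def normalize_frame(frame):
    """Apply CLAHE to the L channel of a BGR frame and return the result."""
    lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
```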
