AI for corals

Technology
by Guido
Published on February 28, 2024

Empowering nature restoration projects with AI

Our mission is clear: to make the impact of your donation to nature restoration projects tangible and engaging. We're enabling you to watch the coral fragment you donated, the tree you planted, or any other impact grow over time. We validate the imagery we receive from partners, so you can be sure your impact is unique and additional!

At the same time, we don't want to burden project partners with new administrative chores; the whole point is to accelerate their work, not slow it down.
That's why we're always on the lookout for more scalable ways to create the 1:1 proofs we're famous for.

Proof of planted corals was particularly tricky, given the limited time divers can spend underwater and the impracticality of photographing there. When more complex structures are involved, as with artificial or regenerated reefs, an additional challenge arises: making sense of the spatial placement of the snapshots we receive.

Our motto is always to tackle the hard problems first, and this was definitely one worth solving, so we set out to leverage Google Vertex AI and Gemini to develop our own coral recognition AI.
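For the technically curious: the post doesn't spell out the exact pipeline, but here's a minimal sketch of what training such a detector could look like with the Vertex AI Python SDK's AutoML object detection flow. The project ID, region, bucket path, display names, and training budget are illustrative placeholders, not our actual setup.

```python
# Hedged sketch: training an AutoML object-detection model on Vertex AI.
# All names and paths below are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="europe-west4")

# Import annotated images (bounding boxes) from a JSONL manifest in Cloud Storage.
dataset = aiplatform.ImageDataset.create(
    display_name="coral-fragments",
    gcs_source="gs://my-bucket/annotations/corals.jsonl",
    import_schema_uri=aiplatform.schema.dataset.ioformat.image.bounding_box,
)

# Define and run the AutoML training job.
job = aiplatform.AutoMLImageTrainingJob(
    display_name="coral-detector",
    prediction_type="object_detection",
    model_type="CLOUD",
)
model = job.run(
    dataset=dataset,
    model_display_name="coral-detector-v1",
    budget_milli_node_hours=20000,  # ~20 node hours; tune to your budget
)
```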

Creating the training set

I started by turning videos into over 3,000 pictures (frames) to create a training set for our AI. I then annotated 100 images to train our model, basically indicating what is a coral and what is not. As I went along, I watched it grow smarter: first guessing right 50% of the time, then 70% after I fine-tuned it with 200 more images.
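If you're wondering how the video-to-frames step works, a small OpenCV script is all it takes. A minimal sketch, assuming the dive footage is an ordinary video file; the paths and sampling rate are illustrative:

```python
# Slice a dive video into still frames for annotation.
import cv2
from pathlib import Path

def extract_frames(video_path: str, out_dir: str, every_n_seconds: float = 2.0) -> int:
    """Save one frame every `every_n_seconds` of video to out_dir; return the count."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if metadata is missing
    step = max(1, int(round(fps * every_n_seconds)))
    saved = frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video
            break
        if frame_idx % step == 0:
            cv2.imwrite(f"{out_dir}/frame_{frame_idx:06d}.jpg", frame)
            saved += 1
        frame_idx += 1
    cap.release()
    return saved

print(extract_frames("dive_video.mp4", "frames"))
```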

Results

The better the training set, the better the model outcomes. So we spent a good old pizza night tagging more than 2,000 images, and our model now operates at higher than 90% accuracy. This way, project partners can create simple overview videos and pictures, and our Operations team can do the rest with some AI-assisted magic. The good news: every new batch of validated images enlarges our dataset, making our model smarter with every impact made!
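That "AI-assisted magic" boils down to sending each new image to the trained model and reading back the detected corals. A hedged sketch, assuming the model has been deployed to a Vertex AI endpoint; the endpoint resource name and confidence threshold are placeholders:

```python
# Hedged sketch: scoring a new survey image against a deployed Vertex AI endpoint.
import base64
from google.cloud import aiplatform

# Placeholder endpoint resource name.
endpoint = aiplatform.Endpoint(
    "projects/my-project/locations/europe-west4/endpoints/1234567890"
)

with open("frames/frame_000120.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = endpoint.predict(instances=[{"content": image_b64}])
for pred in response.predictions:
    # Each prediction carries labels, confidences, and normalized bounding boxes.
    for label, conf, box in zip(pred["displayNames"], pred["confidences"], pred["bboxes"]):
        if conf >= 0.5:  # illustrative threshold
            print(label, round(conf, 2), box)
```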

To share cool biodiversity updates in our News feed, we tagged the fish inhabiting the structures as well.

So what’s next?

We're going to optimize the model's accuracy across different coral projects, as local conditions, species, and other factors vary from one location to another. Ideally we'd use a single model per category (corals) that scales across projects, but time will tell whether that's feasible in the short term. And we're not sticking to 2D snapshots: we'll move into stitched models (orthomosaics and 3D point clouds). And trees? They're next on our list, with plans to leverage drone footage and bodycams to connect you directly to the impact being made in the most tangible way!

Stay tuned as we continue to innovate, making saving nature tangible and downright awesome.
