Exploring 3D technology for nature restoration projects

Technology | by Guido | Published on December 7, 2023

While watching Apple's keynote on the new Vision Pro, my first thought was to question the necessity of introducing more technology into our increasingly rare, non-tech-centric social moments. But it did get me thinking about how we could bring our "follow your donation" concept to the next level.

Imagine diving into a digital twin of a coral reef you helped restore, watching it grow over time, or strolling through a forest you helped plant on the other side of the world, identifying your own tree among the many others contributed to the project.

As this new AI wave brings so much potential to make this vision come to life with relatively low effort, I explored options for an easy-to-build pilot project.


3D model creation

To generate 3D models, you need a lot of footage, preferably images. There are quite a few apps, like Reality Scan or Polycam, where you can easily take some images with your phone and get exciting results. With about 300 photos of an object, you can create a pretty decent 3D model.

For example, below is a 3D scan of a square meter of mangrove forest, provided by the Ryan de Jongh Charity Foundation, based on around 250 images using Polycam.

Polycam 3D model

But taking that many images of a single donation item, like a coral fragment or seedling, isn't doable. However, what if we could generate a 3D model by recording video and extracting imagery from it?
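The idea above can be sketched in a few lines: pick evenly spaced timestamps across the clip so you end up with roughly the number of stills a photogrammetry tool wants. The clip length and target count below are illustrative (based on the ~250-image mangrove scan), and the ffmpeg command in the comment is one common way to grab each frame, not a prescribed pipeline.

```python
# Sketch: choose evenly spaced frame timestamps from a video so that
# roughly `target` still images can be fed to a photogrammetry tool.

def sample_timestamps(duration_s: float, target: int) -> list[float]:
    """Return `target` timestamps (in seconds) spread evenly over the clip."""
    if target < 2:
        return [0.0]
    step = duration_s / (target - 1)
    return [round(i * step, 2) for i in range(target)]

# A 2-minute drone clip, aiming for ~250 stills:
stamps = sample_timestamps(120.0, 250)
print(len(stamps), stamps[0], stamps[-1])  # 250 frames, from 0.0 s to 120.0 s

# Each timestamp could then be extracted with ffmpeg, for example:
#   ffmpeg -ss <t> -i drone.mp4 -frames:v 1 frame_<n>.jpg
```

Sampling by time rather than frame index keeps the spacing consistent even when clips have different frame rates.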

The mangrove project provided some cool drone videos (about six clips of two minutes each) of the project site. I put the drone footage into Drone Deploy and was amazed by the outcome.

Drone Deploy 2D model

Not only did it generate a cool 3D render of the project site, it also provided a very detailed 2D image that we could overlay on our map.
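To place a 2D drone image on a map yourself, you need to know how much ground it covers. Here is a minimal sketch of that calculation for a straight-down (nadir) shot, assuming you know the flight altitude and the camera's field of view; the altitude and FOV values below are illustrative guesses, not the mangrove project's actual flight parameters.

```python
import math

# Sketch: estimate the ground footprint of a nadir drone image from
# flight altitude and the camera's field of view, to size a 2D map
# overlay. Simple pinhole-camera geometry: width = 2 * h * tan(fov/2).

def footprint_m(altitude_m: float, fov_deg: float) -> float:
    """Ground width (in metres) covered along one image axis."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)

width = footprint_m(100, 84)  # e.g. flying 100 m up, 84-degree horizontal FOV
print(round(width, 1))        # ~180.1 m of ground across the frame
```

Tools like Drone Deploy do this georeferencing for you from the drone's GPS metadata, but the same geometry is what makes the overlay line up with the map.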

Drone Deploy 2D overlay


Now, this is all exciting but still not as cool as my ambition to 3D map the coral reefs.

Reefsystems provided a video of just under 2 minutes of one of the underwater reefs our donors helped restore. With this footage, I tried some more professional tools and got some interesting results.

Raw video used for renders

As this whole thing started with Apple's Vision Pro, I checked out the software Apple provides to content creators for creating 3D models. I pushed my M2 MacBook Pro to its limits overnight on this 2-minute video and achieved a pretty cool result.

RealityKit render

Generating these scans is quite energy-intensive, and it's a valid question whether the environmental costs are worth the extra engagement from our donors. On top of that, the result is far from perfect, and you'd need a lot more footage to create a proper model.

Trying not to get too deep into the geeky details (you can read more about NeRF and other methods if you're curious), I stumbled upon a neat 3D tool by Luma Labs while listening to a nerdy podcast (POKI, for those interested). It uses a relatively new method called Gaussian Splatting, which creates a surreal, almost dreamlike version of the video. Considering it's just a 2-minute clip, the results are pretty impressive!

LumaLabs render - navigate around to find some really clear areas; double-clicking changes the center point.


Next steps

So, what now? The first step is to see what happens if I use a very high-quality video of, say, 30 minutes, and load it into more models for a proper comparison. As I am using a Mac, I haven't yet tried some other good tools like Reality Capture.

In order to find your coral in the reef, I will need to build some form of tagging tool where we can link donations to the impact made. We will be doing an exciting hackathon with ML6 during Christmas to discover what can be done with vision AI software to detect objects like coral fragments in a 3D model.

Lastly, this deserves a good interface, where you can actually "dive" into the coral reef and swim to your donation, or navigate to the forest to see the tagged trees. So I guess some crazy evenings of coding will get me there.

If you have any ideas or want to help out, please reach out at info@sumthing.org and don't forget to subscribe to our newsletter if you are interested in learning more.
