We are living in a golden age for Artificial Intelligence (AI), which astounds us with its advances: from curious and surprising applications, such as creating hyper-realistic faces that belong to no one or generating pieces of music and literature, to others of enormous consequence, such as fighting cancer or solving the half-century-old biological problem of protein structure prediction, which promises to revolutionise biomedical research. And at a time when environmental issues have become a global emergency, AI is also opening up a vast new field for conservation biology and the fight against ecological degradation and biodiversity loss.
For centuries, nature has been studied and conserved thanks to the walking boots of seasoned naturalists and explorers. The conservation management of species and their habitats used to depend exclusively on fieldwork: tracking specimens, counting them by eye, ringing or tagging them and taking notes by hand, all through laborious and often risky expeditions. Technology brought new systems into the field, such as camera traps and GPS collars, increasing the volume and resolution of data that still had to be processed by rudimentary methods.
In the 1980s, algorithms began to enter conservation biology with geographic information systems and tools that allowed areas to be prioritised for protection as nature reserves in a more systematic and rigorous way. Since then, this marriage of ecology and technology has proliferated into a multitude of applications that use networks of connected sensors, drone or satellite imaging, machine learning systems, and computer vision or facial recognition software, among many other tools. All this technology makes it possible not only to acquire and process huge volumes of data with ease, but also to make AI-based decisions that integrate the effects of climate change. It is a radical transformation that already involves a community of 6,500 professionals in 120 countries, according to the Wildlabs conservation technology network.
One example of the potential of these applications is being developed on Australia’s Great Barrier Reef, the largest living structure on the planet and a natural wonder severely threatened by climate change. Over the past three decades, bleaching caused by ocean warming has destroyed more than half of the coral. Researchers from Queensland University of Technology and the Australian Institute of Marine Science have developed an algorithm and a range of technologies to map and protect this ecosystem. The team flies drones over the Great Barrier Reef at an altitude of 60 metres, collecting data that machine learning systems then analyse to classify bleaching levels. Measurements obtained from the air are compared with surveys carried out underwater.
The Great Barrier Reef is roughly the size of Japan, with more than 3,000 individual reefs stretching over 2,300 kilometres, but the system is able to monitor it quickly and effectively. “The algorithm allows the comparison of large databases that can be used to identify other areas at risk. And the more data scientists have during a bleaching event, the better they can address it,” aeronautical engineer and aerial robotics expert Felipe Gonzalez, leader of the project, tells OpenMind.
He explains that each coral emits “unique hyperspectral fingerprints” and that each individual colony alters its hyperspectral signature—light at wavelengths beyond the visible spectrum—as its level of bleaching changes, so the system is able to track these changes over time. “Our goal is to mitigate the destruction of the Great Barrier Reef by identifying the regions most at risk, in order to more effectively allocate resources and methods of protection,” says Gonzalez. The project is supported by Microsoft’s AI for Earth programme, one of the most active platforms for applying AI to conservation.
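The details of Gonzalez’s pipeline are not public, but the core idea of tracking a colony’s changing hyperspectral signature can be sketched with the spectral angle measure commonly used in remote sensing: the angle between two reflectance spectra grows as the signature’s shape shifts. All band values and the alert threshold below are hypothetical.

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two reflectance spectra; 0 means identical shape."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Hypothetical reflectance spectra for one coral colony (arbitrary bands/values)
baseline = [0.12, 0.18, 0.25, 0.30, 0.28]   # signature recorded when healthy
current  = [0.30, 0.38, 0.45, 0.50, 0.49]   # brighter reflectance: possible bleaching

THRESHOLD = 0.05  # radians; hypothetical alert level
angle = spectral_angle(baseline, current)
if angle > THRESHOLD:
    print(f"signature shift of {angle:.3f} rad -- flag colony for review")
```

Repeating the comparison on each survey pass gives a time series of signature drift per colony, which is the kind of change the system tracks over time.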
An aerial laboratory flies over the Amazon
Arizona State University ecologist Greg Asner has used a similar approach to help preserve the world’s largest rainforest—the Amazon—along with other ecologically valuable regions. In 2006, Asner and his team launched the Global Airborne Observatory (formerly the Carnegie Airborne Observatory), an airborne laboratory that houses highly advanced Earth-mapping technology. The observatory’s instrumentation and computing package, called the Airborne Taxonomic Mapping System, or AToMS, maps regions in 3D using airborne sensors and advanced algorithms. The current version of the system combines high-resolution digital imaging, spectrometry and lasers.
During their work in the Amazon, the mapping by Asner and his collaborators showed that this region is home to 36 different forest types, and their results have guided the environmental policy of several governments. Asner’s system has identified the spectral signatures of half of the world’s 60,000 tree species, as well as measuring the amounts of carbon stored in forests or even entire countries such as Panama, and discovering millions of hectares threatened by mining or logging.
The researcher has since applied his system to coral reef conservation, and has taken his observatory into space: thanks to a network of 140 small satellites and a partnership with NASA’s Earth observation programme, his work has led to the creation of the Allen Coral Atlas, a comprehensive map of the world’s reefs, compiling millions of satellite images.
Facial recognition at sea
Facial recognition technology has also found applications in conservation. In 2016, the environmental NGO The Nature Conservancy won the Google Impact Challenge award in Australia with its FishFace project, a system that combines artificial intelligence techniques with electronic monitoring systems on fishing vessels to record all on-board activity and identify the number and species of catches, helping to manage fisheries more sustainably.
Since its launch, the project has made progress in creating the hardware for FishFace—the set of sensors that will be responsible for collecting the data—and in fine-tuning the machine learning algorithm that will recognise the different species. A test on a fishing boat in Indonesia has demonstrated the validity of the system, with 90-95% accuracy in identifying species. The NGO is involved in the development of other electronic fisheries management projects aimed at preventing illegal fishing.
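The FishFace model itself has not been published, but an accuracy figure of the “90-95%” kind comes from a held-out test: comparing the classifier’s predictions against observer-verified labels. The species lists below are invented for illustration.

```python
# Hypothetical held-out evaluation of a species classifier:
# observer-verified labels versus the model's predictions for the same catches.
truth = ["tuna", "snapper", "tuna", "grouper", "snapper", "tuna"]
pred  = ["tuna", "snapper", "tuna", "grouper", "tuna",    "tuna"]

correct = sum(t == p for t, p in zip(truth, pred))
accuracy = correct / len(truth)
print(f"species-ID accuracy: {accuracy:.0%}")
```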
The fight against illegal activity, in this case poaching, is also the focus of conservation technology projects. The Connected Conservation Foundation uses images from Airbus satellites with a resolution of 30 centimetres to detect poaching incidents from space. The NGO also uses these observations and wildlife detection algorithms to locate endangered species in hard-to-reach areas and track their movements.
At the University of Southern California, a collaboration with technology companies led to the creation of PAWS, the Protection Assistant for Wildlife Security. This AI system is fed with geographic information and data on poaching activity in a region to build predictive models of poachers’ behaviour, helping rangers plan patrols. PAWS has been integrated into SMART, the Spatial Monitoring and Reporting Tool, a database of poaching activity currently used in 800 protected areas in 60 countries, and is also being adapted to tackle illegal logging and fishing.
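The real PAWS combines machine learning with game theory to anticipate offenders; purely as an illustration of the underlying idea of ranking map grid cells by predicted poaching risk, here is a toy weighted score over invented cell features. The weights, features and values are all hypothetical, not the PAWS model.

```python
# Toy illustration of risk-based patrol prioritisation (not the actual PAWS
# model). Each grid cell of a protected area carries hypothetical features;
# cells are ranked by a simple weighted risk score.
cells = {
    "A1": {"past_incidents": 4, "km_to_road": 2.0, "animal_density": 0.8},
    "B3": {"past_incidents": 0, "km_to_road": 9.5, "animal_density": 0.3},
    "C2": {"past_incidents": 2, "km_to_road": 4.0, "animal_density": 0.9},
}

def risk_score(f):
    # Hypothetical weights: risk rises with past incidents and prey density,
    # and falls with distance from the nearest road.
    return 0.5 * f["past_incidents"] + 0.4 * f["animal_density"] - 0.1 * f["km_to_road"]

# Patrol the highest-risk cells first
patrol_order = sorted(cells, key=lambda c: risk_score(cells[c]), reverse=True)
print(patrol_order)
```

In a system like PAWS the score would come from a model trained on historical incident data, and patrol routes would also be randomised so poachers cannot simply learn and avoid them.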
Social media technology has also been applied to this work. With the collaboration of several US universities, the conservation technology organisation Wild Me launched the Wildbook platform, dubbed the Facebook for animals. By crowdsourcing nature photos taken by scientists, tourists or automatic cameras, and tracking those published on internet sources, the system uses computer vision and machine learning to count animals and even identify them individually by their marks, wrinkles or scars. Wildbooks currently exist for species such as the Iberian lynx and sharks, providing scientists with a wealth of data on the status and evolution of populations.
Technology has even removed the need for direct images of the animals: developed by the WildTrack organisation with the University of California, Berkeley, the FIT system (Footprint Identification Technique) processes images of animal footprints to determine a specimen’s species, age or sex. The algorithm is able to identify specific individuals, making it easier to track them without the need for GPS collars or other systems that interfere with the animals’ natural behaviour.
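FIT derives many landmark-based measurements from each footprint photograph and classifies them statistically; as a much-reduced sketch of the principle, a nearest-centroid rule over just two invented measurements can already separate species. The reference values below are hypothetical, not WildTrack data.

```python
import math

# Hypothetical reference footprint measurements (length, width in cm) per
# species. FIT uses many more landmark-derived measurements; two suffice here.
references = {
    "cheetah": (9.0, 8.0),
    "lion":    (12.0, 11.5),
    "leopard": (8.0, 8.5),
}

def classify(footprint):
    """Assign a print to the nearest reference centroid (Euclidean distance)."""
    return min(references, key=lambda sp: math.dist(footprint, references[sp]))

print(classify((11.5, 11.0)))
```

The same nearest-match idea, applied to per-individual reference prints instead of per-species ones, is how footprints can identify specific animals.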
These are just a few examples among many others of how AI applied to conservation is booming. The possibilities for these technologies are almost endless, limited only by funding, duplication of effort or the need to build more capacity, according to the first State of Conservation Technology report published by Wildlabs in 2021. Yet, as Cambridge University conservation specialist William Adams points out, these technological tools can never eliminate the human factor: “The fieldworker who makes ecological observations, the farmer or hunter who knows how the seasons change, the water collector or firewood gatherer, are all essential to intelligent conservation solutions.”