This article was originally published by Hakai Magazine.
To an eye in the sky, a stranded whale on the shoreline might look like a pink blob, a gray smear, or a long line of bleached and curving white. It may be a curled question mark that ends in flukes, or a long ellipsis of decomposition.
Yet a new study shows that, as satellite imagery improves, it is becoming possible to accurately identify which of these colorful splotches are indeed stranded whales. The scientists behind the paper further argue that spying from space is an effective way to identify these beached behemoths in places where they would otherwise go undiscovered, such as on remote coastlines, in resource-limited nations, or in countries experiencing conflict.
For as long as humans have been monitoring the ocean, the only way we’ve known about stranded whales has been to stumble upon them ourselves. But knowing about stranded whales—including where and when they strand, and how many are ashore—is vitally important. Largely due to human causes such as ship strikes, pollution, and entanglement in fishing gear, whale strandings are on the rise. Their occurrence can signal that something is amiss and hint at a larger ecosystem problem, such as a harmful algal bloom. Yet the ground-based networks used to monitor stranded whales are biased toward wealthy, highly populated regions.
The new paper shows that very-high-resolution (VHR) satellite imagery makes it possible to spot stranded large-bodied whales, such as humpback or sperm whales, in remote areas where they may otherwise take weeks to find, if they’re noticed at all. By that point, animals are usually long decomposed, making it too late to figure out what caused the stranding or take action to fix it.
“Satellites may allow local communities to greater understand the patterns, timing, and location of mass stranding events, to inform when to invest resources for intervention on the ground,” says Penny Clarke, the paper’s lead author and a graduate student with the British Antarctic Survey.
The first satellite with VHR sensors launched in 1999. With the number of them in orbit now slowly climbing, Clarke’s team sees this sort of imagery as a way to help decolonize science: it could give less-wealthy countries, which account for roughly 70 percent of the world’s coastlines, the ability to monitor large areas with just a few people.
As a case study, the team examined a 2015 stranding in Golfo de Penas, an extremely remote area in Chilean Patagonia. That year, at least 343 sei whales washed up dead on the gulf’s wild shorelines. Nobody knew about the stranding for two months, until a research team happened upon the carcasses.
In retrospect, satellites saw them. Analyzing archival VHR images later allowed researchers to estimate the number of dead whales and confirm that the stranding started in early March.
In March 2019, dead sei whales were again discovered in the Golfo de Penas. But this time, Clarke was ready. She examined satellite photos of the region taken between February 2 and February 18 and saw few whale-shaped objects. The lack of whales in these earlier images suggests the stranding began in late February or early March.
Examining the repeated strandings in the Golfo de Penas shows the satellite approach does have some limitations. As Clarke found, images may not be available for the date range researchers want; there are currently only 27 VHR satellites circling Earth, three of which are for military use. Satellites also take photos only when “tasked”—when given orders to open their lenses. Tasking a satellite is expensive, and even accessing archived images can come with a hefty price tag.
Additionally, identifying whales requires manually scanning for the right shapes, frame by frame. In 2019, a team led by Clarke’s co-author, Peter Fretwell, tried to automate this process. They found that because dead whales change so drastically as they decompose, the algorithm’s search wasn’t very accurate. It often confused whales with features such as rocks or washed-up trees.
Clarke and her colleagues say that better automation, improved by machine learning and artificial intelligence, could identify whales in images rapidly and with greater accuracy. They also think satellite companies could cooperate with governments and organizations to provide low-cost access to imagery.
At least one such collaboration is under way. The National Oceanic and Atmospheric Administration (NOAA), Microsoft, the satellite company Maxar, and other public- and private-sector partners are developing a system that automatically identifies marine mammals in satellite images. Called GAIA (Geospatial Artificial Intelligence for Animals), the project aims to create a program that’s completely open-source.
“This has so much potential, especially when we look at where we are right now, in a pandemic,” says Kim Goetz, the project’s co-principal investigator at NOAA’s Marine Mammal Laboratory. Goetz studies the highly endangered Cook Inlet beluga, and she’s been unable to do any fieldwork over the past two years.
“Things are going to happen where we can’t get there to know what’s going on,” she says. “Do we just sit on the couch and hope that the animals are still there by the time we get up there?”
The 2022 launch of Maxar’s Legion constellation, a group of six VHR-equipped satellites, should also “drastically improve the revisit rates in certain areas,” says Goetz.
Even with all of these improvements ahead, Clarke emphasizes that satellites won’t entirely replace old-fashioned monitoring networks. “A satellite can’t look inside the whale and see there’s presence of a virus, or an embolism from being hit by a boat,” she says.
Next up, Clarke hopes to test the robustness of satellite monitoring by working on the ground with experts during stranding events so that she can see for herself what satellite imagery misses. “We don’t know enough about some of these challenges we’re going to be coming up against,” she says. Yet the potential for this technology excites her immensely. “It’s quite literally out of this world.”