The naked mole rat may not be much to look at, but it has much to say. The wrinkled, whiskered rodents, which live, like many ants do, in large, underground colonies, have an elaborate vocal repertoire. They whistle, trill and twitter; grunt, hiccup and hiss.
And when two of the voluble rats meet in a dark tunnel, they exchange a standard salutation. “They’ll make a soft chirp, and then a repeating soft chirp,” said Alison Barker, a neuroscientist at the Max Planck Institute for Brain Research, in Germany. “They have a little conversation.”
Hidden in this everyday exchange is a wealth of social information, Dr. Barker and her colleagues discovered when they used machine-learning algorithms to analyze 36,000 soft chirps recorded in seven mole rat colonies.
Not only did each mole rat have its own vocal signature, but each colony had its own distinct dialect, which was passed down, culturally, over generations. During times of social instability — as in the weeks after a colony’s queen was violently deposed — these cohesive dialects fell apart. When a new queen began her reign, a new dialect appeared to take hold.
“The greeting call, which I thought was going to be pretty basic, turned out to be incredibly complicated,” said Dr. Barker, who is now studying the many other sounds the rodents make. “Machine learning kind of transformed my research.”
Machine-learning systems, which use algorithms to detect patterns in large collections of data, have excelled at analyzing human language, giving rise to voice assistants that recognize speech, transcription software that converts speech to text and digital tools that translate between human languages.
In recent years, scientists have begun deploying this technology to decode animal communication, using machine-learning algorithms to identify when squeaking mice are stressed or why fruit bats are shouting. Even more ambitious projects are underway — to create a comprehensive catalog of crow calls, map the syntax of sperm whales and even to build technologies that allow humans to talk back.
“Let’s try to find a Google Translate for animals,” said Diana Reiss, an expert on dolphin cognition and communication at Hunter College and co-founder of Interspecies Internet, a think tank devoted to facilitating cross-species communication.
The field is young and many projects are still in their infancy; humanity is not on the verge of having a Rosetta Stone for whale songs or the ability to chew the fat with cats. But the work is already revealing that animal communication is far more complex than it sounds to the human ear, and the chatter is providing a richer view of the world beyond our own species.
“I find it really intriguing that machines might help us to feel closer to animate life, that artificial intelligences might help us to notice biological intelligences,” said Tom Mustill, a wildlife and science filmmaker and the author of the forthcoming book, “How to Speak Whale.” “This is like we’ve invented a telescope — a new tool that allows us to perceive what was already there but we couldn’t see before.”
Studies of animal communication are not new, but machine-learning algorithms can spot subtle patterns that might elude human listeners. For instance, scientists have shown that these programs can tell apart the voices of individual animals, distinguish between sounds that animals make in different circumstances and break their vocalizations down into smaller parts, a crucial step in deciphering meaning.
“One of the things that’s really great about animal sound is that there are still so many mysteries and that those mysteries are things which we can apply computation to,” said Dan Stowell, an expert in machine listening at Tilburg University and Naturalis Biodiversity Center in the Netherlands.
Several years ago, researchers at the University of Washington used machine learning to develop software, called DeepSqueak, that can automatically detect, analyze and categorize the ultrasonic vocalizations of rodents.
It can also distinguish between the complex, songlike calls that the animals make when they’re feeling good and the long, flat ones they make when they are not. “You can just get a direct, subjective, from the animal’s mouth how-are-they-feeling,” said Kevin Coffey, a behavioral neuroscientist at the University of Washington, who was part of the team that developed DeepSqueak.
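The core idea behind automated call detection is simpler than the deep-learning tool itself: find stretches of audio where energy rises above the background. Here is a minimal sketch of that idea in Python, not DeepSqueak's actual method; the sample rate, call frequency, threshold and the synthetic recording are all invented for the example.

```python
import math

SR = 250_000  # high sample rate; rodent ultrasonic calls sit roughly at 30-110 kHz

# Synthesize 0.5 s of silence containing one 50 kHz "call" from 0.2 s to 0.3 s.
signal = []
for i in range(int(0.5 * SR)):
    t = i / SR
    call = math.sin(2 * math.pi * 50_000 * t) if 0.2 <= t < 0.3 else 0.0
    signal.append(0.6 * call)

# Energy-threshold detector: flag 10 ms frames whose RMS amplitude exceeds a
# floor, then merge consecutive flagged frames into call intervals.
frame = int(0.01 * SR)

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

flags = [rms(signal[i:i + frame]) > 0.1 for i in range(0, len(signal), frame)]

calls = []
start = None
for k, hot in enumerate(flags):
    if hot and start is None:
        start = k
    elif not hot and start is not None:
        calls.append((start * 0.01, k * 0.01))  # interval in seconds
        start = None
if start is not None:
    calls.append((start * 0.01, len(flags) * 0.01))

print(calls)  # one detected interval, covering the synthetic call
```

A real system would work on spectrograms and learn what calls look like from labeled examples, but the output is the same kind of thing: a list of time intervals, each one a candidate vocalization to analyze or categorize.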
Decoding the meaning of animal calls also requires large amounts of data about the context surrounding each squeak and squawk.
To learn more about the vocalizations of Egyptian fruit bats, researchers used video cameras and microphones to record groups of the animals for 75 days. Then they reviewed the recordings, painstakingly noting several important details, such as which bat was vocalizing and in what context, for each of nearly 15,000 calls.
The bats are pugilistic, frequently quarreling in their crowded colonies, and the vast majority of their vocalizations are aggressive. “Basically, they’re pushing each other,” said Yossi Yovel, a neuroecologist at Tel Aviv University who led the research. “Imagine a big stadium and everybody wants to find a seat.”
But a machine-learning system could distinguish, with 61 percent accuracy, between aggressive calls made in four different contexts, determining whether a particular call had been emitted during a fight related to food, mating, perching position or sleep. That’s not a perfect performance, Dr. Yovel noted, but it is significantly better than the 25 percent accuracy associated with random guessing.
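Comparing accuracy to the chance baseline is the standard sanity check in this kind of work. A minimal sketch of that comparison, using invented two-dimensional "call features" and a simple nearest-centroid classifier rather than the study's actual model or data:

```python
import random

random.seed(0)

# Four contexts, as in the fruit-bat study; the feature values (say, pitch and
# duration of a call) are synthetic and purely illustrative.
CONTEXTS = ["food", "mating", "perching", "sleep"]
CENTERS = {"food": (2.0, 0.5), "mating": (5.0, 1.5),
           "perching": (8.0, 0.8), "sleep": (11.0, 2.0)}

def make_call(context):
    cx, cy = CENTERS[context]
    return (cx + random.gauss(0, 0.5), cy + random.gauss(0, 0.2))

data = [(make_call(c), c) for c in CONTEXTS for _ in range(50)]
random.shuffle(data)
train, test = data[:150], data[150:]

def centroid(points):
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

# Nearest-centroid classifier: label a call by the closest per-context mean.
centroids = {c: centroid([x for x, lab in train if lab == c]) for c in CONTEXTS}

def classify(x):
    return min(CONTEXTS, key=lambda c: (x[0] - centroids[c][0]) ** 2
                                       + (x[1] - centroids[c][1]) ** 2)

accuracy = sum(classify(x) == lab for x, lab in test) / len(test)
chance = 1 / len(CONTEXTS)  # 25 percent for four equally likely contexts
print(f"accuracy {accuracy:.2f} vs. chance {chance:.2f}")
```

On these cleanly separated synthetic clusters the classifier scores far above chance; real call features overlap much more, which is why a figure like 61 percent against a 25 percent baseline is meaningful.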
Dr. Yovel was surprised to discover that the software could also identify, at levels greater than chance guessing, which bat was on the receiving end of the scolding.
“This implies that an eavesdropping bat is theoretically able, to some extent at least, to identify if individual A is addressing individual B or individual C,” the researchers wrote in their 2016 paper.
Although the idea remains unproven, the bats may vary their vocalizations depending on their relationship to and knowledge of the offender, the same way people might use different tones when addressing different audiences.
“It’s a colony, they’re very social, they know each other,” Dr. Yovel said. “Perhaps when I shout at you for food, it’s different from when I shout at somebody else for food. So the same call will have slightly different nuances, which we were able to detect using machine learning.”
Still, detecting patterns is only the beginning. Scientists then need to determine whether the algorithms have uncovered something meaningful about real-world animal behavior.
“You have to be very careful to avoid spotting patterns that aren’t real,” Dr. Stowell said.
After the algorithms suggested that naked mole rat colonies all had distinct dialects, Dr. Barker and her colleagues confirmed that the rodents were far more likely to respond to soft chirps from members of their own colonies than those from foreign ones. To rule out the possibility that the naked mole rats were simply responding to individual voices they recognized, the researchers repeated the experiment with artificial soft chirps they generated to match the dialect of a rat’s home colony. The results held.
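A playback result like this is typically confirmed by showing that the difference in response rates is too large to be chance. A minimal sketch using a two-proportion z-test on invented counts (not the study's data):

```python
import math

# Hypothetical playback tallies, for illustration only: how often animals
# responded to chirps from their own colony vs. a foreign colony.
own_responses, own_trials = 42, 50
foreign_responses, foreign_trials = 11, 50

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test: are the two response rates different?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(own_responses, own_trials,
                        foreign_responses, foreign_trials)
print(f"z = {z:.2f}, p = {p:.2g}")
```

With counts this lopsided the test rejects chance decisively; the synthetic-chirp control described above then rules out the competing explanation that the animals are merely recognizing familiar individual voices.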
In the wild, colony-specific dialects might help naked mole rats ensure that they are not sharing scarce resources with strangers, and may be a way of enforcing social conformity. “In these large underground tunnels, you want to make sure that everyone’s following the rules,” Dr. Barker said. “And one very quick way to test that is to make sure everyone is speaking very similarly.”
Other major projects are underway. Project CETI — short for the Cetacean Translation Initiative — is bringing together machine-learning experts, marine biologists, roboticists, linguists and cryptographers, among others, at more than a dozen institutions to decode the communication of sperm whales, which emit bursts of clicks that are organized into Morse code-like sequences called codas.
The team is planning to install its “core whale-listening stations,” each of which includes 28 underwater microphones, off the coast of Dominica this fall. It plans to use robotic fish to record audio and video of the whales, as well as small acoustic tags to record the vocalizations and movements of individual animals.
Then, the researchers will try to decipher the syntax and semantics of whale communication and probe bigger scientific questions about sperm whale behavior and cognition, such as how large groups coordinate their actions and how whale calves learn to communicate.
“Every which way we turn there’s another question,” said David Gruber, a marine biologist at Baruch College who leads Project CETI. “If there was a big event that happened a week ago, how would we know that they’re still communicating about it? Do whales do mathematics?”
The Earth Species Project, a California-based nonprofit, is also partnering with biologists to pilot an assortment of machine-learning approaches with whales and other species.
For instance, it is working with marine biologists to determine whether machine-learning algorithms can automatically identify what behaviors baleen whales are engaging in, based on movement data collected by tracking tags.
“Is there a specific signature in the data for when an animal takes a breath or when an animal is feeding?” said Ari Friedlaender, a marine ecologist at the University of California, Santa Cruz, who is collaborating on the project.
The researchers hope to overlay that behavioral data with audio recordings to determine whether there are certain sounds that whales consistently make in certain contexts.
“Now you can do really interesting things, like, ‘Let’s take orcas, look at their motion, translate the motion into the sound that goes with it,’” said Aza Raskin, the president and co-founder of the Earth Species Project. “Or you can start with the audio and say, ‘What behavior goes with what they’re saying?’”
In another line of research, Earth Species experts are using machine-learning algorithms to create an inventory of all the call types made by captive Hawaiian crows, which became extinct in the wild two decades ago.
They will then compare the results to historical recordings of wild Hawaiian crows to identify specific call types the birds might have lost over their years in captivity.
“Their vocal repertoire may have eroded over time, which is a real conservation concern,” said Christian Rutz, a behavioral ecologist at the University of St. Andrews in Scotland who is working with Earth Species on the project. “They keep them in these aviaries to breed birds for future releases. But what if these crows no longer know how to speak crow?”
Scientists can then study the function of any lost calls — and perhaps even reintroduce the most critical ones to captive colonies.
The Earth Species Project has also partnered with Michelle Fournet, a marine acoustic ecologist at the University of New Hampshire, who has been trying to decipher humpback whale communication by playing prerecorded whale calls through underwater speakers and observing how the whales respond.
Now, Earth Species scientists are using algorithms to generate novel humpback whale vocalizations — that is, “new calls that don’t exist but sound like they could,” Dr. Fournet said. “I can’t say how cool it is to imagine something from nature that isn’t there and then to listen to it.”
Playing these new calls to wild whales could help scientists test hypotheses about the function of certain vocalizations, she said.
Given enough data about how whales converse with each other, machine-learning systems should be able to generate plausible responses to specific whale calls and play them back in real time, experts said. That means that scientists could, in essence, use whale chatbots to “converse” with the marine mammals even before they fully understand what the whales are saying.
These machine-mediated conversations could help researchers refine their models, and improve their understanding of whale communication. “At some point, it might be a real dialogue,” said Michael Bronstein, a machine-learning expert at Oxford and part of Project CETI.
He added, “As a scientist, this is probably the craziest project I have ever participated in.”
Learning to listen
Whether ongoing, two-way dialogue with other species will ever be possible remains an open question. True conversation will require a number of “prerequisites,” including matching intelligence types, compatible sensory systems and, crucially, a shared desire to chat, said Natalie Uomini, an expert on cognitive evolution at the Max Planck Institute for Evolutionary Anthropology.
“There has to be the motivation on both sides to want to communicate,” she said.
Even then, some animals may have experiences that are so different from our own that some ideas simply get lost in translation. “For example, we have a concept of ‘getting wet,’” Dr. Bronstein said. “I think whales would not even be able ever to understand what it means.”
These experiments may also raise ethical issues, experts acknowledge. “If you find patterns in animals that allow you to understand their communication, that opens the door to manipulating their communications,” Mr. Mustill said.
But the technology could also be deployed for the benefit of animals, helping experts monitor the welfare of both wild and domestic fauna. Scientists also said that they hoped that by providing new insight into animal lives, this research might prompt a broader societal shift. Many pointed to the galvanizing effect of the 1970 album “Songs of the Humpback Whale,” which featured recordings of otherworldly whale calls and has been widely credited with helping to spark the global Save the Whales movement.
The biologist Roger Payne, who produced that album, is now part of Project CETI. And many scientists said they hoped these new, high-tech efforts to understand the vocalizations of whales — and crows and bats and even naked mole rats — will be similarly transformative, providing new ways to connect with and understand the creatures with whom we share the planet.
“It’s not what the whales are saying that matters to me,” Dr. Gruber said. “It’s the fact that we’re listening.”