The Edge

#33 Talking to Whales with Gašper Beguš

California magazine


For decades, researchers believed that whales communicated using a sort of underwater Morse code—simple, repeated patterns of clicks. But recent advances in technology have revealed a much more complex, even human-like language. Now, with artificial intelligence, scientists are getting closer than ever to translating their vocalizations. UC Berkeley’s Gašper Beguš is one such researcher building advanced AI to learn and decipher sperm whale communication. But what happens when we crack the code? What might whales tell us—and what would be the practical, ethical, and legal implications of what we learn? 

Further reading:

  • Berkeley News - “UC Berkeley and Project CETI study shows sperm whales communicate in ways similar to humans”
  • National Geographic - “Whales could one day be heard in court—and in their own words”
  • National Geographic - “What are animals saying? AI may help decode their languages”
  • Legal Paper - “What if We Understood What Animals Are Saying?: The Legal Impact of AI-Assisted Studies of Animal Communication”
  • NYU More-Than-Human Rights report - “Listening to Our Animal Kin: Legal and Ethical Principles for Nonhuman Animal Communication Technologies”

This episode was written and hosted by Leah Worthington and Nat Alcantara and produced by Coby McDonald. 

Special thanks to Gašper Beguš, Pat Joseph, and Laura Smith. Art by Michiko Toki and original music by Mogli Maureal. Additional music from Blue Dot Sessions.



Transcript: 


LEAH WORTHINGTON: Hey Nat.


NAT ALCANTARA: Hey Leah.


LEAH: I want you to listen to this and tell me what you think it is.


[CLIP OF SPERM WHALE CLICKS]


NAT: Uhhh… I don’t know… a typewriter? One of those wheels that you spin for a prize?


LEAH: Yep, I hear that. Not even close. It’s a sperm whale talking.


NAT: Woah, I never would have guessed.


LEAH: I know, I kind of set you up to fail. Ok, and here’s something else you would never have guessed. Did you know that sperm whales are one of the loudest animals on earth, if not the loudest?


NAT: Really? That doesn’t sound all THAT loud.

LEAH: Yeah. They can reach up to 230 decibels, which, for comparison, is louder than a rocket launch. By nearly 100 decibels. 
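[A quick aside on the numbers: decibels are logarithmic, so a gap of "nearly 100 decibels" is far bigger than it sounds. A minimal sketch of the conversion (note that underwater decibels use a different reference pressure than in-air decibels, so cross-medium comparisons are only approximate):]

```python
import math

def db_to_power_ratio(db_difference: float) -> float:
    """Convert a decibel difference to a ratio of acoustic power."""
    # Decibels are logarithmic: every 10 dB is a 10x increase in power.
    return 10 ** (db_difference / 10)

# A 100 dB gap corresponds to:
print(db_to_power_ratio(100))  # 10,000,000,000x more acoustic power
```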


NAT: Louder than a rocket ship??? That’s crazy. I’m guessing it’s because they need to talk to each other from far away? 


LEAH: Yep. And, like other whales, sperm whales use these really complex patterns of clicks for all sorts of things. They use them to hunt. They use them to navigate. And—this is where it gets interesting—they also use them to communicate.


NAT: But when you say “communicate”... are we talking, like, “hey, come over here” or, like, “so, I’ve been thinking about existence”?


LEAH: Well, we don’t know…yet. But there’s a group of researchers, including one right here at Berkeley, working really hard to decipher sperm whale language. And they’re making some pretty good progress.


NAT: But how? Like, where do you even start?


LEAH: Right. So, humans are terrible at finding patterns in a bunch of clicks and whistles and random sounds. But computers are not. 


NAT: Let me guess—AI?


LEAH: Nailed it. Researchers are training AI language models to listen to sperm whale recordings and detect familiar sequences, kind of like conversational patterns in human language. 


NAT: We’re teaching machines to eavesdrop on whales. Got it. 


LEAH: Yes. And maybe, some day, to help us translate what they’re saying. 


NAT: This sounds like sci-fi. You know, the kind of story that starts cute and ends with some person using Whale Translate to build a whale army and turn them against us. 


LEAH: [gasp] Whales would NEVER. But seriously, more likely, we learn something about sperm whales that completely changes the way we see and treat them. What if we tune into whale radio and overhear them complaining about all the noise and trash pollution we’re dumping in the ocean? Not to mention the legal implications. What if a whale could, you know, testify in court?


NAT: Ok, now you got my attention. 


LEAH: Well, today we’re talking to one of the people at the center of this whale translation work. So let’s dive in. 


[MUSIC IN]


LEAH: This is The Edge, produced by California magazine and the Cal Alumni Association. I’m your host, Leah Worthington. 


NAT: And I’m your co-host Nathalia Alcantara.


LEAH: Our guest today is Berkeley Linguistics Professor Gašper Beguš, who’s also part of an international team of researchers using advanced AI and robotics to listen to and translate sperm whale communication. 


[MUSIC OUT]


LEAH: So, so great to connect with you. Thank you so much for joining me. I've been really looking forward to this.


GASPER: Yeah, for sure. Thank you so much for the conversation.


LEAH: Gasper is an associate professor of linguistics at Berkeley and a linguistics lead at Project CETI, the Cetacean Translation Initiative. So, can you start by telling us what that is? 


GASPER: Project CETI is this wonderful multi-disciplinary project that aims to listen to sperm whales and try to understand what they're saying and how their lives work.


LEAH: Amazing. What a cool way to get to introduce yourself. 


GASPER: Well, it wasn't always like that. I started out as a historical Indo-Europeanist, a historical linguist, which is basically, you learn as many dead languages as possible, and you try to reverse engineer what the proto-language of Latin and Greek and Sanskrit and Hittite and Tocharian and all these beautiful languages sounded like.


Linguistics as a field does not have a, you know, great history with animals. And the reason for that is, for the longest, I think we had a very exclusivistic view of humanity, and, you know, how we’re very special and how the other animals are, you know, either just performing tricks or have nothing even remotely similar to our language. 


And I think, I hope that's changing. I think if there is any entity, if there's a group of people, that can change it, I hope it's CETI because I think we can really shift the perspective and show how animals are important and that they have more human-like properties than we used to think. And just yeah, that, you know, if nothing else, linguists have amazing tools to analyze other communication systems. 


I would say the interesting part about sperm whales specifically is that the, you know, the communication systems seem to have some semantic content because they vocalize before they hunt and while socializing. And not, you know, not exclusively for mating purposes. They have already evolved a very complex communication system, even if it's not human language.


LEAH: What do we know at this point about how sperm whales communicate? I know that bees waggle in a certain orientation to signify that a food source is, you know, in a certain direction, a certain distance away. But whale communication is more complex. I think I've read you compare it a little more to a sort of alphabet?


GASPER: Well, they engage in really structured and interesting conversations. So they exchange this series of clicks that are called codas. They're not really clicks, they're pulses. They would be better analyzed as pulses. And, you know, similar to pulses of our vocal folds.


For the longest, people thought that was Morse code, but the more you look at it, the more it's actually similar to human vowels, or like human communications. And it appears to be the case that they do that before they hunt, when they socialize, when they give birth. So it's really structured and interactional. Like, there's this one whale and another whale, and they exchange codas, and sometimes they say a coda synchronously—so at the same time—and sometimes not. And it appears to be really, really interesting: not just random vocalizations, but very structured. 


NAT: Please tell me we’ll soon have Google translate for whales...


LEAH: Well, that would require knowledge of what they’re saying in the first place. But really, they’re starting from scratch. So more like programs that are first trying to deconstruct whale communication to understand how it functions—and then maybe someday make a translator that could interpret it. Which is a really difficult problem when you’re working with a species that lives in the deep ocean and has a completely different experience of the world from us. 


NAT: Lucky them... ok, so how are we doing that—deconstructing whale communication? 


LEAH: So, right now, they’re working off the coast of Dominica, an island country in the Eastern Caribbean. And basically they have drones that very gently attach mics to the whales so they can record their vocalizations even when they’re diving thousands of feet underwater. And then they take these recordings and feed them into specially trained, bespoke AI. I’ll let him explain from there.


GASPER: So we're using AI techniques that mimic the way our brain works, and we're creating an artificial language learner that learns language as a baby. And that language learner, our model, needs to learn to speak like humans and also transmit information like humans, right? So when I say a “red apple,” those vocalizations have some meaning. We do the same with whales, right? The model needs to learn to speak like a whale but also transmit information through those codas. 


LEAH: So you're taking this sort of like, unformed, unshaped blob sponge and allowing it to just absorb information and language in the way theoretically any organism would learn from exposure to family, I guess, and then asking it, what'd you learn from that experience?


GASPER: That's exactly what we do, yes. So we have this artificial baby language learner that has been trained on whales instead of human vocalizations. And we can look inside the network, the AI model, and say, “what did you learn about the whales?”


LEAH: I’m curious in the case of whales, what you were able to glean from that probing?


GASPER: Well, for example, it suggested that spectral properties are meaningful. We have shown that the spectrum is important to look into. A few years ago, nobody was looking at the spectrum, at the way the frequencies of their sounds work. And now we think that that's really controlled and interesting.


NAT: Wow. Do they have actual evidence to back that up? 


LEAH: Yeah. So, to analyze these vocalizations, the researchers used something called generative adversarial networks, or GANs, a machine learning model that is basically listening to the whales and tries to imitate what it hears. That’s why he calls it an “artificial baby language learner”—because in a way it learns just like human babies do, by listening and imitating. And this can help point researchers in a new direction by uncovering new language patterns that hadn’t been found yet.
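[The adversarial idea behind a GAN: a generator produces candidate signals, a discriminator tries to tell them apart from real recordings, and each improves against the other. This is not Project CETI's actual model—just a minimal toy sketch of that training loop, using a two-parameter generator and a logistic discriminator on made-up scalar "click features":]

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# "Real" data: scalars drawn around 4.0 (a stand-in for a measured click feature).
def real_batch(n):
    return rng.normal(4.0, 0.5, n)

# Generator: turns noise z into samples via two learnable parameters (mu, sigma).
mu, sigma = 0.0, 1.0
# Discriminator: logistic classifier D(x) = sigmoid(w*x + b).
w, b = 0.0, 0.0
lr, batch = 0.05, 64

for step in range(2000):
    # --- Discriminator update: push D(real) toward 1, D(fake) toward 0 ---
    x_real = real_batch(batch)
    x_fake = mu + sigma * rng.normal(0.0, 1.0, batch)
    s_real = sigmoid(w * x_real + b)
    s_fake = sigmoid(w * x_fake + b)
    w += lr * np.mean((1 - s_real) * x_real - s_fake * x_fake)
    b += lr * np.mean((1 - s_real) - s_fake)

    # --- Generator update: make fakes that D scores as real ---
    z = rng.normal(0.0, 1.0, batch)
    x_fake = mu + sigma * z
    grad = (1 - sigmoid(w * x_fake + b)) * w  # d log D(fake) / d x_fake
    mu += lr * np.mean(grad)
    sigma += lr * np.mean(grad * z)

# After training, the generator's samples drift toward the real distribution.
print(round(mu, 2))
```

[In the real project the "samples" are whale-like audio rather than scalars, but the back-and-forth between imitator and critic is the same design.]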


NAT: So what did they find out?


LEAH: Well, just last November, Gasper published a paper that described vowel and diphthong-like patterns in whale communication.


NAT: Come again?


LEAH: Yeah, so like Gasper said, scientists used to compare sperm whale clicks to Morse code—it kind of sounds like that, right—which is a pretty simple communication system that only uses two dimensions: frequency and timing of sounds. But he and his team discovered a whole new dimension, these so-called “spectral properties,” which include things like resonant frequencies. So, rather than just making monotonous clicks over and over and over again, the researchers believe that sperm whales are varying the quality of the clicks in a way that seems comparable to human vowels. 
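["Spectral properties" are the kind of thing a Fourier transform exposes. A hypothetical sketch (a synthetic click, not real whale data): build a short decaying pulse with a dominant resonance, then read that resonance back off its spectrum:]

```python
import numpy as np

sr = 48_000                       # assumed sample rate (Hz)
t = np.arange(0, 0.01, 1 / sr)    # a 10 ms "click"

# Synthetic click: a damped oscillation with a 9 kHz resonance.
resonance = 9_000
click = np.exp(-t * 800) * np.sin(2 * np.pi * resonance * t)

# Spectrum via the real-input FFT; the peak bin gives the dominant frequency.
spectrum = np.abs(np.fft.rfft(click))
freqs = np.fft.rfftfreq(len(click), 1 / sr)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # close to 9000 Hz
```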


NAT: And how many vowels do they have?


LEAH: Well, so far they’ve only been able to identify a- and i-type coda vowels. But they also found that these codas can have a rising or falling trajectory—also known as a diphthong, like “oi” in coin or “ea” in fear. And they believe that these patterns are completely controlled and detectable by the sperm whales.
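[A diphthong-like coda would show up as a drift in the spectral peak over the course of the coda. One hypothetical way to label a trajectory as rising or falling is the sign of a least-squares slope fit to the peak-frequency track (the function name and the example numbers are illustrative, not from the study):]

```python
import numpy as np

def trajectory_direction(freq_track):
    """Classify a sequence of per-click peak frequencies (Hz)
    as 'rising' or 'falling' from its least-squares slope."""
    x = np.arange(len(freq_track))
    slope = np.polyfit(x, freq_track, 1)[0]  # Hz per click
    return "rising" if slope > 0 else "falling"

# Hypothetical peak-frequency tracks across the clicks of two codas:
print(trajectory_direction([800, 850, 930, 1010]))   # rising
print(trajectory_direction([1200, 1100, 990, 900]))  # falling
```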


NAT: So, they’re intentionally putting together strings of vowel-like sounds to communicate with each other.


LEAH: Exactly. Here are those whale noises again, the same ones that we heard at the beginning. This is what it sounds like when it’s sped up and the pauses are removed.


[CLIP FROM THIS VIDEO 2:06-2:08]


NAT: Woah. Sounds so alien. But also really precise.


LEAH: Totally. And finding these vowel-like components has huge potential for accelerating our understanding of their language. Or I should say languages… because different pods have distinct dialects. 


GASPER: So they communicate with codas and the dialects that they have are very salient. Like I can hear a few codas, and by now I know which clan they'll belong to, those codas. Families form higher units that are called vocal clans and they use distinct patterns to express their social identity. And that's what we do with our speech as well a lot, right? So the way we say “hello” or the way I say “y'all” expresses a lot about my background. And so they do that as well. 


LEAH: Do we know enough to be able to say that their communication resembles putting together words to form a sentence?


GASPER: We don't know what the word is. We don't yet know what the sentence is. We have to start small and you know, we started at the smallest unit and now we're building our world of representations further. 


LEAH: So if the spectrum of what we understand is from “we know that there are sounds that they make and we have no idea what they mean” to “we know that there are sounds that they make and we can actually decipher them and interpret what they are saying more or less,” where are we in that spectrum? 


GASPER: We know what the units are, we know how to analyze them. We know that they repeat things. We know that many whales use the same patterns. So we basically have already the ability to, in a sense, transcribe their communication system. What we don't have is just like what each of those elements means specifically. 


[MUSIC]


LEAH: I don’t know if this is actually true, but I remember learning this at some point that it took us a really long time to identify echolocation because we just couldn't even conceive of it. Like, what questions do you ask to arrive at the solution, bats use echolocation? And I wonder if that could apply in situations like this where we don't even have the capacity—or if our questions are so limited by our own human experience that we could miss something major or completely misinterpret.


GASPER: Absolutely. Yeah, for sure. I mean, the patterns that are very difficult for us to spot can be relatively easy for AI because they're less human-centric. So what the AI can do for you is it can tell you where to look. It can be a hint. It's not giving you final answers. It kind of shrinks your hypothesis space. So, in a sense, it's a similar problem as if, you know, aliens contacted us, right, in a sci-fi scenario. Like, how do you approach this? How do you go about it? 


You know, a whale communication system could have a million shapes or forms. They could express things in various different ways. You can’t just come and know nothing about a communication system. Imagine like we were contacted by aliens. Until you understand what is even meaningful in their communication system, you're not going to be able to crack it. 


So, there are a lot of different things. Like they don't have a concept of a tree. They don't have a concept of drinking. They don't have a concept of floor in the same way as we do, or of air, or smell. Even conceptualizing, like, how do they see the world, right? The ocean is not the best place for light. So they basically see with their ears.


But they have mothers, they have grandmothers, they have children, they babysit each other's children, they breastfeed, they help each other give birth. It's one of the few species that have collaborative births like humans do. And they talk to their calves, and so on. And, you know, the mothers babysit each other's children, even though they're not their own, potentially. Other cetaceans have been shown to carry their dead, mourning them for extended periods of time, especially their babies. 


So we're deeply connected to them on the social level, even though our worlds are vastly different. And they exchange those vocalizations in those very relatable situations, right? And we're, you know, we're making some progress and trying to understand where and how they're using those vocalizations. I think that is a really difficult step and one that we are proud to have made.


LEAH: Well, it seems like AI is really accelerating our ability to understand language, human and non-human communication. At least that's the sense I'm getting from you, that this is sort of opening up a world of possibilities and increasing the speed at which we can do all of this enormous analysis. So if that's true, and we're going to—and the speed at which we can kind of decode and decipher language and communication across the animal kingdom is increasing, what are the implications of that for society, and also, you know, ethics and law?


GASPER: Well, I think I'm very excited about this change of perspective, or paradigm shift, if you want, where, you know, our work is suggestive of the fact that we might not be as exclusive. There are complex intelligences with complex communication systems that are very similar to us in many respects, and there is very little, if anything, that makes us special.


NAT: Ah, that reminds me of what you said earlier about using whale testimony in court.


LEAH: Yeah, in fact, just last March, Gasper published a legal paper on this very question. It was him—a linguist—along with a biologist and two lawyers—


NAT: They walk into a bar.


LEAH: They walk into a bar and write a paper called: “What If We Understood What Animals Are Saying? The Legal Impact Of AI-assisted Studies Of Animal Communication.” That paper lays out some pretty fascinating hypotheticals in a world where, as they write, “today’s cutting-edge technology draws us nearer to interspecies comprehension.”


NAT: Meaning, what if a whale could serve as an expert witness in court?


LEAH: Exactly.


GASPER: Since language is at the forefront of basically everything in our society, how can we expand the rights and expand the protection, legal protections from humans, beyond the human?


We need to rethink how we view them and their voices in the legal system. And I think we're arguing for more legal protection and potentially, legal personhood for the whales because their communication system is so complex. And you know, our communication system is pretty complex as well, but we are realizing that we're not so high above everyone else as we thought. And if you think of language as more of a continuum, then there is no line that you can draw and say, “Okay, from here on this species has language, and therefore they're worthy of legal protection and have a voice in the legal system, whereas lower than that, that's not true.” And in a sense, language is this last frontier for rights. 


LEAH: Yeah, I saw this article in Nat Geo the other day that was referencing your work, and the headline of it is, “whales could one day be heard in court and in their own words,” which to me reads a tiny bit sensationalist, definitely eye-grabbing. But I'm curious to what degree there's truth in that. What are the implications of potentially being able to decipher whale communication?


GASPER: We don't have direct evidence yet of a whale expressing pain or annoyance, you know, expressing inner states that are suggestive of harm. So that's one very direct use case of understanding their communication system better, and I think we're making real, tangible progress there. I don't think, you know, they're gonna read legal transcripts of courts, stuff like that. But the fact that they have a complex communication system is suggestive of high intelligence, and that might, at some point, lead us to the ability to look inside their internal states and inside their complex thought, right? That channel that we initially talked about might give us, you know, an unprecedented view into their internal worlds that would have, I think, huge legal implications.


LEAH: Of course, Nat, as you might have guessed, the researchers are really concerned about the ethical risks of this kind of work. 


NAT: Like, do whales deserve human-like protections under the law?


LEAH: Yeah, and also, what if someone tries to communicate with sperm whales for nefarious purposes? You mentioned the whale army earlier. Or, more likely, what if the translator isn’t perfect and results in gross miscommunications between humans and whales?


NAT: Yeah, and maybe they don’t even want to talk to us. I mean, I wouldn't blame them. And also, if we’re giving whales some legal personhood we might also need to think about consent?


LEAH: Totally. Which is why the researchers are partnering with NYU Law’s More-Than-Human Rights project to work on some ethical and legal guardrails as the research continues. These include things like the so-called “precautionary principle,” which means that scientists should act with care to prevent harm and keep in mind the best interest of the animal that they’re trying to communicate with. The regulations are kinda vague right now, but these are questions that lawyers and ethicists are starting to discuss as they address the emergent field of “nonhuman animal communication technologies,” or NACTs. 


NAT: So this really is bigger than just the whales?


LEAH: Oh yeah. As some lawyers at NYU Law wrote in a recent report: “While these technologies hold extraordinary promise for conservation, wildlife protection, and deepening our understanding of the more-than-human world, they also pose serious risks to animal welfare, autonomy, and ecological integrity.”


NAT: Ok, but don’t we already know that non-human animals experience pain and suffering? And what about animals that maybe can’t express their suffering in some kind of decipherable language? 


LEAH: Great questions. Yeah, I asked Gasper about that… 


LEAH: I do worry a tiny bit about a world where we preferentially protect animals with more complex inner worlds and intelligence.


GASPER: And you know, it's a really tough question, philosophical, right? I mean, why not fruit flies? They're also intelligent in their own right. Or chicken—chicken is very intelligent in, you know, finding a grain. And when we talk about intelligence, it is extremely human-centric. And I agree in many ways, but I think, you know, we're pushing boundaries. 


There's this whale, we named her Pinchy, and she's a grandmother. She's very chatty. And so you get attached to that. And it's, you know, it's easier to make societal moves when people have emotions involved, and you see that, you know, Pinchy is around, and she's like the grandmother who's chatty, and you grow a connection to these wonderful creatures. And I think that's also what's needed. Acknowledging, you know, that it is very anthropocentric, in a sense, to say that only whales should have rights and nobody else. Or whales and primates. But I think it's an important direction that we're taking, which is expanding rights and protections from harm to non-humans. And by no means should this end at whales, but it is part of a larger conversation that we should have as a society.


LEAH: Do you think that it's possible that as we get further into understanding what their vocalizations mean and even potentially kind of interpreting greater meaning from them, that it will not just help us understand what's being communicated but also give us a window into their unique experience of being whales?


GASPER: Absolutely. Yeah. We used to think that language is a kind of a synonym for complex stuff—that you cannot have a complex inner life without language. There's another perspective that I think has more evidence that we have complex inner worlds, complex thoughts, and language in a sense is just a channel to our internal states, right? And understanding that channel, understanding those vocalizations, might be a window into their inner worlds and into what's meaningful to them and how do they transmit inner worlds among each other. 


LEAH: Wow, that would be amazing to be able to enter the world of a whale through their language and just be exposed to maybe even concepts that we hadn't considered about just like the physicality of being a whale. I mean, this sounds so, like, as I'm saying it, I feel like I'm writing a science fiction novel, but theoretically.


GASPER: Well, some philosophers think that we will never be able to understand what it's like to be a whale. But I think we can learn even by trying. We can learn a lot, and maybe we can learn that we're not as special as we have for so long thought to be.


We define ourselves as, you know, the species that has language. We build our lives around language. Language research on animals has the ability to teach us that we're not so special, and to teach us to appreciate these intelligences and, you know, their societies, their cultures. So we have to rethink our entire position in the legal world, in society, the way we treat nature and the way we look at nature, right? I think sperm whales have a really great potential to teach us that.


[MUSIC IN]


LEAH: This is The Edge, brought to you by California magazine and the Cal Alumni Association. I’m Leah Worthington. This episode was produced by Coby McDonald, with support from Nat Alcantara, Laura Smith, and Pat Joseph. Special thanks to Gašper Beguš. Original music by Mogli Maureal. Additional music from Blue Dot Sessions.


[MUSIC OUT]