Can Chatbot GPT Communicate with Whales? – Video

The concept of communicating with animals has long been a fascination for humans, and now, with advancements in technology, we may be closer than ever to breaking the language barrier with sperm whales. These magnificent creatures, known for their complex communication system, could hold the key to unlocking interspecies communication like never before.

Sperm whales have been studied for decades, and researchers have discovered that their communication is not just random noise, but a structured form of language. With distinct patterns and sequences of clicks called codas, sperm whales seem to have a sophisticated way of conveying messages to one another. Calves even go through a babbling stage before mastering recognizable codas, similar to human infants.

Thanks to advancements in machine learning and artificial intelligence, scientists are now able to analyze and understand these whale vocalizations in a new way. Projects like the Cetacean Translation Initiative (CETI) aim to collect data from sperm whales and use machine learning models to decode their language. By utilizing techniques developed for human language translation, researchers hope to bridge the communication gap between humans and sperm whales.

The potential for translating whale language opens up a world of possibilities for our understanding of animal communication. The ability to converse with sperm whales could revolutionize our knowledge of the animal kingdom, and deciphering the language of one of the ocean’s most enigmatic creatures could lead to groundbreaking discoveries in the field of interspecies communication.

Watch the video by Real Science

Video Transcript

This is the sound of the biggest toothed predator on the planet: a song being broadcast from the deepest reaches of the sea. A call that sounds less like the harmonious melodies we hear from other whales and more like a digital data transfer. The sound of sperm whales talking.

It’s a sound that haunted sailors for centuries, one they took for the cries of the ghosts of drowned sailors calling out to them. Amazingly, it wasn’t until 1957 that scientists even realized that these sounds were, in fact, coming from whales.

And it wasn’t until the 1970s that they realized that these undulating, clicking, bellowing noises were a form of communication. And now, researchers think sperm whales could be our best chance at breaking a barrier we’ve never broken. That they might hold the key to unlocking the first ever case of interspecies communication.

Because if you were to bet on one animal that had something that was even close to human language, it would be the sperm whale. Over the years, scientists have learned that their brains are extremely developed, that they have rich social lives, family lives, culture, and an intricate communication system to support it all.

Their lives, in many ways, are similar to ours, and their language could have developed for similar reasons. Plus, they’ve had their massive brains for tens of millions of years longer than we’ve had our current-size brains. And with a vast repertoire of different sounds arranged in intricate patterns, it seems sperm whale language could be as complex as our own.

Speaking with other animals has long been a part of the collective human imagination, part of folklore, fables, and fantasy in many different cultures. But now, for the first time, we have technology that might actually be able to break the code.

Recently, we’ve all seen the huge strides in the field of natural language processing with ChatGPT, giving a clear idea of what is possible when it comes to computers and human language. As technology like this has advanced dramatically over the last ten years, scientists now think they can apply these techniques to the field of interspecies communication, and specifically to sperm whales.

Because sperm whales and other species aren’t just singing to one another; they seem to be communicating specific messages, maybe even using language in a way that we would understand.

For millennia, humans have considered language to be one of the defining characteristics of our species, and something that other animals couldn’t possibly approximate, so we hardly bothered to investigate all the ways animals actually communicate. Today, with better technology, we’re starting to realize that many species have complex forms of conveying information.

Prairie dogs have distinctive alarm calls for different predators, and have even used different calls to distinguish the size, shape, color, and speed of those predators. Male and female putty-nosed monkeys each produce their own alarm calls when leopards approach: the females calling to get the males to go on the defense, and the males calling to signal they’re ready for a fight.

Bat squeaks aren’t just echolocation to help them hunt; they sometimes contain information about the speaker and which other bat is being addressed. Even tiny jumping spiders have a form of communication through vibrations.

The animal kingdom is clearly full of species communicating with one another, but whether or not they’re using language is much harder to determine. When linguistics developed as an area of study, specifically interested in analyzing the differences between human languages and the mechanics of how language works and is acquired, it seemed clear that nothing else on Earth was communicating the way we are.

Human language has grammar, the overarching structure of each language, and syntax, the order in which words are arranged to convey meaning. We have the ability to create new words and use them, and to communicate information about things that happened elsewhere, or in the past or future. Our languages are complex and nuanced. What we don’t know is whether the same is true of any other animal. But there’s one species that might be able to help us solve that problem.

As we’ve learned more and more about sperm whales, they seem to be the perfect candidate to put our non-human translation abilities to the test. Sperm whales make the loudest noises of any living creature, up to 230 decibels. For reference, when sounds are made above water, human eardrums rupture at noises louder than 150 decibels.

And most importantly, sperm whale communities use discrete sequences of clicks that get repeated in specific patterns, called codas. Researchers believe the coda is the basic communication unit of their language. And codas differ between whales in different regions.

For the whales off the coast of Dominica, a typical coda has 5 clicks, and lasts for around 4 seconds. They have recognizable patterns, measured by interclick intervals. Codas are mostly produced during periods of socialization—not when the whales are hunting or engaged in other activities.
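Since a coda’s recognizable pattern lives in those interclick intervals, a minimal sketch of how one coda might be represented is just the gaps between its clicks. The click timestamps below are made up for illustration:

```python
import numpy as np

# Hypothetical click timestamps (seconds) for one 5-click, ~4-second coda
click_times = np.array([0.0, 1.1, 2.0, 2.9, 4.0])

# Interclick intervals (ICIs) capture the coda's rhythm
icis = np.diff(click_times)        # [1.1, 0.9, 0.9, 1.1]

# Normalizing by total duration makes codas of different tempos comparable
print(icis / icis.sum())
```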

And calves can take up to two years to produce recognizable codas, and before that they babble, just like human babies. All of these elements seem like the ingredients to a language that we might be able to make sense of, thanks to machine learning and artificial intelligence.

But how exactly can computers help us with this never-before-solved puzzle? Thanks to CETI, we may be about to find out. The world’s largest ever interspecies communication effort was launched in March 2020 with a project known as the Cetacean Translation Initiative, or CETI.

If that sounds familiar, it might be because there’s another moonshot project known as SETI—the Search for Extraterrestrial Intelligence. In the case of this new cetacean project, scientists will be collecting data from a group of sperm whales around the Caribbean island of Dominica.

The whales of that area have already been studied for more than fifteen years, thanks to the Dominica Sperm Whale Project. But before we can understand how computers might translate whale language, let’s first dive into how they have done so for human languages.

Since 2014, machine translation has largely relied on something called an encoder-decoder deep learning model, which is a system that uses two separate neural networks. The encoder is the first neural network, and is where you input a sentence in, say, English.

The encoder takes each word in the sentence and turns it into a sequence of numbers, a multidimensional vector representation called a word embedding. The second neural network, the decoder, takes this sequence of embeddings and outputs a sentence in a different language.
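As a concrete illustration, here is a minimal encoder-decoder sketch in PyTorch. The toy vocabulary sizes, GRU layers, and random inputs are assumptions for demonstration; production translation systems add attention mechanisms and train on millions of sentence pairs.

```python
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1200, 64, 128   # toy sizes

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, EMB)       # word IDs -> embeddings
        self.rnn = nn.GRU(EMB, HID, batch_first=True)

    def forward(self, src_ids):
        _, state = self.rnn(self.embed(src_ids))        # compress the sentence
        return state

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)            # scores over target words

    def forward(self, tgt_ids, state):
        h, _ = self.rnn(self.embed(tgt_ids), state)
        return self.out(h)

enc, dec = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (1, 7))               # a 7-token source sentence
logits = dec(torch.randint(0, TGT_VOCAB, (1, 7)), enc(src))
print(logits.shape)                                     # (1, 7, TGT_VOCAB)
```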

But for this to work, it needs human supervision. It relies on known pairs of inputs and outputs. A system like this wouldn’t work to translate any language that is new, or unknown. And for several years, this was the best machine learning could do. That is, until 2018.

A group at Facebook AI Research found it is possible to do completely unsupervised translation. You do this by taking all the words from one language and calculating their statistical properties: more or less, how often each word is used, and with which other words it tends to appear.

You can then assign all the words to a point cloud, kind of like a 3D galaxy, based on these statistical properties. Then, if you do the same for a totally different language, and compare the point clouds, they have nearly identical structures.

A word located in a particular position in the galaxy of one language will be in the same location in the galaxy of any other language. You can take languages as dissimilar as English and Chinese, and the structures of the point clouds will still line up.
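One ingredient of that alignment can be sketched with orthogonal Procrustes: given two embedding clouds, find the rotation that best maps one onto the other. In the fully unsupervised method, a rough mapping is first found without any word pairs and then refined this way; the sketch below cheats by generating the second “language” as a known rotation of the first, so the recoverable mapping exists by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))              # 500 "words", 50-dim embeddings
Q_true, _ = np.linalg.qr(rng.normal(size=(50, 50)))
Y = X @ Q_true                              # the other language's point cloud

# Orthogonal Procrustes: find rotation W minimizing ||X @ W - Y||
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

print(np.allclose(X @ W, Y))                # True: the clouds line up
```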

And this means you can translate languages without needing a Rosetta stone. Tom Mustill, a documentary filmmaker and the author of How to Speak Whale who has followed the work of researchers at CETI for years, is particularly thrilled about how this development could apply to sperm whales.

And that is what got all the scientists excited, because they thought, “if we can then get a big enough corpus, a big enough amount of recordings…” And these techniques work with sound as well; they don’t just work with words, they work with audio. That was like a discovery just a couple of years ago.

That gives you a way in with no bilingual dictionaries, which is the situation we have with other species. And it doesn’t mean that we’re going to make a cloud in sperm whale and that it’s going to match the human one, but maybe there’ll be some shapes that do. Maybe some parts of the universe that they represent in their communication do match ours, but those differences and similarities are going to give us an idea of what the different language galaxies that exist in life are.

The AI will be trained not just to predict what a single whale might say next, but also to predict what a second whale might say in reply, like in a conversation. The result will be something like a whale chatbot.
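To make that training objective concrete, here is a toy “predict the next coda” model. The coda labels and the simple bigram counting are illustrative assumptions, not CETI’s actual method; real systems would use neural language models over far more data.

```python
from collections import Counter, defaultdict

# A hypothetical conversation as a sequence of coda types (labels made up)
conversation = ["5R1", "5R2", "5R1", "4D", "5R1", "5R2", "5R1", "5R2"]

# Count which coda tends to follow which (bigram statistics)
counts: dict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(conversation, conversation[1:]):
    counts[prev][nxt] += 1

def predict_next(coda: str) -> str:
    """Return the coda type most often seen after `coda`."""
    return counts[coda].most_common(1)[0][0]

print(predict_next("5R1"))   # -> "5R2"
```

But there is one caveat to all of this.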

For it to work, you need a ton of data. Like, so much data. For reference, GPT-3 needed to be trained on over 500 billion words to be able to do what it does. At the time, it was the largest neural network that had ever been trained.

Maybe we don’t need GPT-3 levels of data, but can we even get close to enough data from sperm whales in the ocean for translation to be possible? To estimate how much sperm whale communication the CETI project might be able to gather, we can do a simple back-of-the-envelope calculation.

Scientists are studying a 20 square kilometer area off the coast of Dominica, where 50 to 400 whales are observed, depending on the season. Sperm whales vocalize almost constantly. However, 75% of the clicks are used for echolocation.

The remaining 25% of the clicks are used for communication: the coda clicks. A typical coda from the whales in this region runs about one click per second. So dividing the number of seconds in a year by 4, and multiplying by the number of whales, gives us 400 million to 4 billion coda clicks per year.
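That estimate is small enough to check in a few lines; the numbers are the video’s own assumptions, including the simplification that whales vocalize around the clock:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600          # about 31.5 million seconds

# Roughly one click per second, but only ~1 in 4 clicks is a coda click
coda_clicks_per_whale = SECONDS_PER_YEAR / 4

for whales in (50, 400):                    # seasonal low and high off Dominica
    total = coda_clicks_per_whale * whales
    print(f"{whales} whales -> {total:.1e} coda clicks per year")
# 50 whales -> 3.9e+08, 400 whales -> 3.2e+09: the 400 million to 4 billion range
```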

So not nearly GPT-3 levels, but still enough to work with. At the high end, it’s between the scale of Google’s BERT and GPT-2. But how exactly will researchers collect this data? It’s a task that will require a host of autonomous and semi-autonomous devices that continuously record the whales, from above, below, and all around them.

One source of data will be tethered buoy arrays. These devices will collect massive amounts of background whale bioacoustic data. They are big devices, and work in a similar way to military equipment that detects submarines over large areas and distances.

Each array will consist of audio recording devices positioned at different depths, in 200-300 meter increments, down to 1,200 meters, the depth at which sperm whales are known to hunt. And attaching certain codas to certain depths could be a particularly good place to start looking for patterns in their language.

A study done in 2016 reported that sperm whales seemed to use certain codas at certain depths. Some were used on the ascent from a dive, some on the descent, and some at the surface immediately after returning. With many hydrophone arrays in the water, scientists will also be able to track the movements of specific whales over space and time.

In addition to the static hydrophones, scientists are attaching recording devices to the whales themselves using suction cups. These tags provide the most detailed recordings of the sounds they make, and also record signals from inertial and pressure sensors that will allow scientists to reconstruct each whale’s motion and diving patterns. The tags are especially critical for uniquely identifying whales in conversations and for associating behavioral patterns with the recordings.

And last are the autonomous sound-recording robots. Free-swimming and passively floating aquatic drones will record audio and video from multiple animals simultaneously to observe behaviors and conversations within a group of whales. This will hopefully cover any blind spots from the first two methods. All of this will be combined into one complex dataset, which can then be parsed by different algorithms.

So what they’re really doing is they’re working with a whole suite of machine learning tools and other software analysis tools. Some of them will be looking through all the audio, and their job will be to figure out when a sperm whale is making a sound or not.

Is that the sound of a landslide? Is that the sound of a sperm whale? Then they’ll have to filter it: is the sperm whale echolocating, or is it making a communication vocalization like a coda? And then they’ll have to separate those out.
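A rough skeleton of that triage might look like the following; the stages mirror the description above, but the detection rules are placeholders, not CETI’s actual tools.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    start: float            # seconds into the recording
    clicks: list[float]     # click timestamps within this detection

def is_sperm_whale(d: Detection) -> bool:
    """Stage 1: sperm whale clicks vs. other noise (ships, landslides...)."""
    return len(d.clicks) >= 3                  # placeholder rule

def is_coda(d: Detection) -> bool:
    """Stage 2: communication (coda) vs. echolocation clicks."""
    duration = d.clicks[-1] - d.clicks[0]
    return duration < 5.0                      # placeholder: codas are short bursts

def extract_codas(detections: list[Detection]) -> list[Detection]:
    whale_sounds = [d for d in detections if is_sperm_whale(d)]
    return [d for d in whale_sounds if is_coda(d)]

# One short clicking burst passes both placeholder filters
print(extract_codas([Detection(12.0, [0.0, 1.1, 2.0, 2.9, 4.0])]))
```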

Then they have to figure out who’s talking, and where they were talking. And what else was going on? Because that sort of metadata matters. You know, was it the one that then fought with the other one? Was it the one that was hunting? Was it the one that was nursing its young?

And what relationship does it have to the other sperm whales who were involved in that vocalization exchange? Who could have heard it? What was the sea doing? And then you can start to build your… kind of like, I do this with a lot of people who come into it, because they’ll be trying to figure out: okay, if this, then that, then what’s likely? What do you think they might be communicating about? And that’s when you start to have hypotheses about meaning to overlay onto your chart of all the different vocalizations.

Then, when they’ve figured out who’s talking (if there is sperm whale sound, and if that sound is communication, who is it?), they’ll try to figure out: okay, how different is it from the kinds of things that have been said before, and how many different kinds of sounds do they make?

Researchers there have already collected around 100,000 sperm whale codas and tied each call to specific whales. An algorithm was created to analyze these codas and categorize them by which whale made which call, and it was accurate more than 94 percent of the time.
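To illustrate the kind of attribution task behind that 94 percent figure, here is a toy classifier over synthetic interclick-interval features. The data generation and the random-forest model are assumptions for demonstration; the actual study’s features and algorithm may differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_whales, codas_each = 5, 200

# Give each synthetic whale its own interclick-interval "style"
X, y = [], []
for whale in range(n_whales):
    signature = rng.uniform(0.2, 1.2, size=4)
    for _ in range(codas_each):
        X.append(signature + rng.normal(0, 0.05, size=4))   # noisy 4-ICI coda
        y.append(whale)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"attribution accuracy: {clf.score(X_test, y_test):.2f}")
```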

But this experiment will not be without its challenges. There are so many variables in any given coda that it will be extremely difficult to parse what is meaningful and what is not. For example, many of the clicks have almost the same structure, but with slight variations, like codas with a different amplitude or a different frequency.

Are these variations akin to differences in pronunciation, or are they fundamental variations that carry different meanings? And does the unsupervised machine translation model rely on something innately human for it to work?

We might use the words in a sentence to describe how we are feeling, for example: “I feel happy today.” But translating this into whale assumes they have the same sense of self that we do, that they have similar emotions to ours, and that they perceive time the same way we do.

There are some words that are more likely to be common between us, words like mother, baby, friend, or foe. But just as a whale might not understand the concept behind a word like “today,” what might a whale have a word for that we can’t even conceptualize?

So even if the CETI researchers are able to collect all of this information about sperm whales, it’s impossible to say whether they’ll be able to translate the codas in a way that would make sense to us as language.

But even if we can’t directly translate everything they are saying, researchers hope that there will still be some patterns we can recognize. Some fragments of language that can tell us more about their lives, their families, even their intentions or problems. Because the attempt to understand whales, whether successful or not, might also be a way for us to care for and help them.

A thing that’s really struck me through my whole career is that the more you can relate to other species, the more you care about them. All ocean ecosystems are under threat from climate change, and cetaceans are among the groups at risk.

The marine heatwave known as “The Blob” that formed over the Pacific Ocean from 2014 to 2016 resulted in crashing fish and whale populations. Even without extreme heatwaves, the oceans are still getting warmer, which means cold-water species like the white-beaked dolphin are having to find new habitats.

The availability of food is changing, and runoff from the shoreline could eventually lead to cetaceans being poisoned by different chemicals. And on top of this, the shipping industry has regular impacts on cetacean populations. Shipping lanes that overlap with important whale habitats can lead to whales being struck, and the noise of vessels can disrupt hunting and vocalizations.

Whales and dolphins are facing a difficult future, in large part due to human activity. If we can find a way to understand them, we might be able to do more to help.

How different might the future be if we could simply ask whales what they need? What is hurting them? What is causing them to change their behavior, breeding habits, or hunting grounds? How might we even work together with whales, for a better future for both of our species?

A shared language with whales could one day change the fate of the entire planet. Language is, after all, the tool that propelled humans to innovate and solve so many of the problems that once afflicted us. It’s an evolutionary advantage that likely allowed us to dominate the globe.

And its origin in our species is one of the greatest mysteries of modern science. Did language suddenly develop in humans, or did we earn our voices through the slow, incremental process of natural selection that marks all other aspects of evolution?

Did language only ever develop in our species, Homo sapiens, or did other early humans use language too? This is a hotly debated, extremely exciting corner of science, linguistics, and archeology. And to understand the great debate, and hear the most compelling theories, you should watch our brand new episode of Becoming Human on Nebula, How Humans Started Speaking.

This is the fourth episode in our flagship series about human evolution, which takes you through the most important steps in our evolution that made us the incredible, strange apes we are today.

For this series we’ve created our own world in 3D to show you the artifacts, fossils, and archeological digs all in one place, like a magical museum of all the most incredible aspects of early human archeology. This is a series that wouldn’t have really made sense to put here on YouTube.

It’s a series we wanted to take time with, and a series that is more archeological and more philosophical than what we normally do here. But it’s a subject so important to the innate desire we all have of trying to understand our own humanity. And that’s why we created this series for Nebula.

Nebula is a streaming platform that was created by us, the educational YouTube content creators who wanted to experiment, play, and CREATE more. Sometimes the content we want to make is experimental, too long, too short, or contains themes that would get it demonetized instantly on YouTube.

Sometimes we want to make content that doesn’t exactly “fit” on our regular YouTube channels. Or sometimes, like in the case of Becoming Human, we want to make a beautiful standalone piece. Signing up to Nebula is also the single best thing you can do to support this channel.

By signing up, you are directly supporting us, getting to watch all your favorite content ad free, and getting access to tons of exclusive content. Nebula is now also offering Nebula First content, where you can get some of your favorite creators’ shows weeks or months before they are on YouTube, like Jet Lag: The Game, or episodes from Johnny Harris, Legal Eagle, and many more.

And on top of original content and Nebula First videos, if you sign up with my link, you get free access to Nebula Classes, where our creators host classes on how to make the content you know and love. All of the classes are taught by the creators on Nebula, so you can learn how Alex the LowSpec Gamer makes videos on a budget, how Foreign tells stories, or how Patrick Willems makes movies. Your favorite creators, teaching you how they create.

If you’ve ever wanted to become a content creator or video maker, or are just curious how it all gets done, this is a great place to start. So if you sign up for Nebula using the link below, you can get Nebula for a little over $2.50 a month, or $30 for the entire year. A portion of this subscription fee goes directly to us, and the rest goes towards building this platform to become the best streaming platform out there for viewers and creators.

It is quickly becoming THE hub for online creators, and now is the best time to join.

Video “Could Chat GPT Talk to Whales?” was uploaded on 03/18/2023 to the YouTube channel Real Science