Artificial intelligence is taking us one step closer to having real conversations with dolphins. For decades, scientists have been fascinated by these intelligent marine mammals, known for their complex vocalizations and social behaviors. Now, thanks to cutting-edge AI models and ambitious research initiatives, we’re beginning to decode their language in ways that once seemed like science fiction.
Google’s DolphinGemma: translating whistles into data
One of the most exciting developments is Google’s DolphinGemma, an AI model designed specifically to analyze and generate dolphin-like sounds. Built in collaboration with Georgia Tech and the Wild Dolphin Project, DolphinGemma uses Google’s SoundStream tokenizer to translate dolphin vocalizations into machine-readable sequences. This allows the AI to spot patterns, predict the next “word,” and even produce realistic dolphin calls.
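To make that idea concrete, here is a toy sketch of the general approach: discretize audio into tokens, learn which tokens tend to follow which, then predict what comes next. DolphinGemma's actual tokenizer (SoundStream) and model are not shown here; the frequency-bucket tokenizer and bigram counter below are invented stand-ins for illustration only.

```python
from collections import Counter, defaultdict

def tokenize(freqs):
    """Stand-in for an audio tokenizer: bucket each measured
    peak frequency (Hz) into one of ten discrete token IDs."""
    return [min(int(f // 5000), 9) for f in freqs]

def train_bigram(sequences):
    """Count token-to-token transitions across recorded sequences."""
    model = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            model[a][b] += 1
    return model

def predict_next(model, seq):
    """Predict the most likely next token given the last one seen."""
    counts = model[seq[-1]]
    return counts.most_common(1)[0][0] if counts else None

# Three (made-up) recordings, each a list of peak frequencies in Hz
recordings = [[12000, 15000, 31000], [12000, 16000, 31000], [12000, 14000, 8000]]
model = train_bigram(tokenize(r) for r in recordings)
print(predict_next(model, tokenize([12000, 15500])))  # prints 6
```

A real system replaces the frequency buckets with a learned codec vocabulary and the bigram counts with a large transformer, but the prediction loop is the same shape.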
Despite its complexity — with around 400 million parameters — DolphinGemma has been optimized to run on mobile devices like the Google Pixel series, making it practical for field researchers who study wild dolphin populations.
Building a shared dolphin-human vocabulary
Complementing DolphinGemma is the CHAT system (Cetacean Hearing Augmentation Telemetry), which takes a more interactive approach. CHAT links specific dolphin sounds to objects and actions, creating a kind of shared vocabulary. The system synthesizes dolphin-like sounds and listens for responses, accelerating the feedback loop between human researchers and dolphins.
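The CHAT concept can be sketched as a lookup in both directions: play a distinct synthetic whistle for each object, and match any heard sound back to the nearest whistle in the vocabulary. The frequency signatures and matching rule below are invented for illustration; only the object names (sargassum, scarf, rope) come from the Wild Dolphin Project's published CHAT work.

```python
# Shared vocabulary: synthetic whistle signature (peak frequency, Hz) -> object
VOCABULARY = {
    9000: "sargassum",   # seaweed the dolphins play with
    12000: "scarf",
    15000: "rope",
}

def emit_whistle(obj):
    """Look up and 'play' the synthetic whistle tied to an object (stubbed)."""
    freq = next(f for f, name in VOCABULARY.items() if name == obj)
    print(f"playing {freq} Hz whistle for '{obj}'")
    return freq

def recognize(heard_freq, tolerance=1000):
    """Match a heard frequency to the nearest known whistle, if close enough."""
    nearest = min(VOCABULARY, key=lambda f: abs(f - heard_freq))
    return VOCABULARY[nearest] if abs(nearest - heard_freq) <= tolerance else None

emit_whistle("scarf")     # researcher requests the scarf toy
print(recognize(14600))   # a mimic near 15 kHz -> "rope"
print(recognize(20000))   # too far from any known whistle -> None
```

The tolerance check matters: a mimic that lands close enough to a known whistle triggers a response, while unfamiliar sounds are left unmatched rather than guessed at.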
This synergy between AI analysis and sound synthesis is paving the way for real-time, two-way communication — not just decoding dolphin chatter but actually conversing with them.
Groundbreaking research into dolphin social calls
Beyond AI, traditional marine biology continues to play a key role. The Sarasota Dolphin Research Program, led by Laela Sayigh and Peter Tyack of the Woods Hole Oceanographic Institution, has made major strides in decoding bottlenose dolphin whistles. Using hydrophones and non-invasive acoustic tags, they’ve cataloged at least 20 types of non-signature whistles shared among different dolphins.
Playback experiments suggest some of these sounds are context-specific, potentially serving as alarm calls or signals for unexpected encounters. This research recently earned them a $100,000 prize for advancements in interspecies communication.
A $10 million challenge to talk to animals
Pushing the field further is the Coller Dolittle Challenge, an ambitious competition backed by British billionaire Jeremy Coller. With a grand prize of $10 million, the challenge seeks to develop a system capable of enabling meaningful conversations with animals, starting with species like dolphins.
The goal is clear: achieve a breakthrough comparable to the Rosetta Stone’s role in deciphering ancient Egyptian hieroglyphs, but this time for cross-species communication.
What’s next?
While we’re not quite ready for full conversations with dolphins, the gap between human and animal communication is narrowing. The combination of AI models like DolphinGemma, interactive systems like CHAT, and rigorous field research is creating a foundation for real breakthroughs.
If these efforts continue to progress, the dream of understanding what dolphins are really saying — and responding in kind — might soon become a reality.