Google has launched DolphinGemma, a new AI model that listens to dolphin sounds and tries to make sense of them. This joint effort by Google, the Wild Dolphin Project, and Georgia Tech might finally help humans understand what dolphins are saying to each other.
“We’re not just listening anymore. We’re beginning to understand the patterns within the sounds,” says Dr. Denise Herzing, who founded the Wild Dolphin Project in 1985 to study Atlantic spotted dolphins in the Bahamas.
The project analyzes three main types of dolphin sounds: whistles (tonal calls), clicks (short sounds used for navigation), and burst pulses (rapid click sequences used in social situations). Researchers have observed correlations between specific sounds and certain behaviors – mother dolphins use unique “signature whistles” to call their calves, “squawks” often occur during fights, and “buzzes” appear during courtship.
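For illustration only, the sound categories above could be organized as a simple annotation table. This Python sketch is a hypothetical way a researcher might label recordings; the category names follow the article, but the structure itself is not part of the project's tooling.

```python
# Illustrative catalogue of the dolphin sound categories described above.
# The labels mirror the article; the dictionary layout is just one
# convenient (hypothetical) way to annotate field recordings.
DOLPHIN_SOUND_TYPES = {
    "whistle": {
        "description": "tonal call",
        "associated_behavior": "signature whistles used by mothers to call calves",
    },
    "click": {
        "description": "short sound used for navigation",
        "associated_behavior": "echolocation and object detection",
    },
    "burst_pulse": {
        "description": "rapid click sequence",
        "associated_behavior": "social situations: squawks in fights, buzzes in courtship",
    },
}

print(DOLPHIN_SOUND_TYPES["burst_pulse"]["associated_behavior"])
```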
DolphinGemma works similarly to AI models that predict the next word in a human sentence, but instead predicts the likely subsequent sounds in a dolphin sequence. This helps identify patterns that could reveal the structure of dolphin communication.
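As a rough intuition for what "predicting the likely subsequent sounds" means, here is a minimal Python sketch that counts which sound token tends to follow which in recorded sequences, assuming the audio has already been converted into symbolic tokens. The token names and the simple bigram counter are invented for illustration; DolphinGemma itself is a much larger neural audio model, and this toy only demonstrates the general idea of sequence prediction.

```python
# Minimal sketch of next-token prediction over dolphin sound sequences.
# Assumes recordings are already discretized into symbolic tokens
# (e.g. "signature_whistle", "click_train"); these labels and the bigram
# approach are illustrative stand-ins, not the DolphinGemma model.
from collections import Counter, defaultdict

def train_bigram_model(sequences):
    """Count which token tends to follow which in the recorded sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for current, following in zip(seq, seq[1:]):
            counts[current][following] += 1
    return counts

def predict_next(model, current_token):
    """Return the most frequently observed follower of `current_token`, if any."""
    followers = model.get(current_token)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Toy usage with invented token names:
recordings = [
    ["signature_whistle", "click_train", "burst_pulse"],
    ["signature_whistle", "click_train", "click_train"],
    ["burst_pulse", "signature_whistle", "click_train"],
]
model = train_bigram_model(recordings)
print(predict_next(model, "signature_whistle"))  # -> "click_train"
```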
What makes this technology particularly useful is its portability. The AI model runs on Google Pixel phones, allowing scientists to analyze dolphin sounds while swimming with them. For the summer 2025 research season, the team plans to use Google Pixel 9 phones in waterproof rigs, adding built-in speaker and microphone capabilities for fieldwork.
The team has also developed the Cetacean Hearing Augmentation Telemetry (CHAT) system, which creates artificial whistles and associates them with objects like seagrass or toys. Researchers observe whether dolphins mimic these sounds in what appear to be requests for those items – potentially creating a simple shared vocabulary.
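To make the idea concrete, a CHAT-style lookup might resemble the hedged sketch below: a small table linking synthetic whistle IDs to objects, plus a confidence check before treating a detected whistle as a possible request. The IDs, object names, and threshold are invented for illustration; the real system works on acoustic signals rather than string labels.

```python
# Hedged sketch of the CHAT concept: a small shared vocabulary mapping
# synthetic whistle IDs to objects, plus a check for possible mimicry.
# Whistle IDs, object names, and the threshold are hypothetical.

WHISTLE_VOCAB = {
    "whistle_01": "seagrass",
    "whistle_02": "scarf",
    "whistle_03": "rope_toy",
}

def match_request(detected_whistle_id, confidence, threshold=0.8):
    """Interpret a detected whistle as an object request if it matches
    the artificial vocabulary with sufficient confidence."""
    if confidence < threshold:
        return None  # too uncertain to count as mimicry
    return WHISTLE_VOCAB.get(detected_whistle_id)

print(match_request("whistle_01", confidence=0.92))  # -> "seagrass"
```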
Google plans to make DolphinGemma freely available to researchers worldwide in summer 2025. Scientists studying bottlenose dolphins, spinner dolphins, and other cetaceans will be able to adapt the model for their work.
Not everyone is convinced that dolphin sounds qualify as true language. Zoologist Arik Kershenbaum questions whether dolphin communication has the complexity needed for language. Thea Taylor from the Sussex Dolphin Project warns about accidentally training dolphins to respond to artificial sounds rather than truly understanding their natural communication.
Understanding dolphin language could have far-reaching benefits beyond simply talking to them. It might reveal how dolphins respond to environmental threats, inform conservation efforts, and provide insights into their social lives.

The project faces significant challenges. Dolphin communication involves more than just sounds – body language and social context play important roles that audio recordings alone can't capture. With no "Rosetta Stone" to map dolphin sounds to specific meanings, researchers must interpret patterns carefully while guarding against human bias.
DolphinGemma represents the latest step in our evolving relationship with these intelligent marine mammals, using modern technology to bridge a communication gap that has existed for thousands of years.