BRAVE NEW WORLD: We Are Close To Deciphering Animal Language


We truly are living in the most fascinating time to be alive!

Yes, I realize AI and Artificial General Intelligence (AGI) are going to be extremely disruptive and could lead to some very bad outcomes, but it’s going to be one heck of a trip along the way, with many things previously thought to only be science fiction suddenly coming to life.

We are literally entering the Star Trek universe.

And in Star Trek they have the Universal Translator.  A computer so powerful it can map and translate almost any language in the galaxy except of course “Darmok and Jalad at Tanagra”:

Anyone else love that episode?

A classic!

But anyway, we are edging ever closer to the Universal Translator here in Stardate 2026.

And you have several options.

You can get live translation on Apple AirPods:

Or if you trust that dork Mark Zuckerberg, you can get it with your Meta Ray-Ban glasses too:

And that’s all super cool.

But what about not just other languages but other SPECIES?

Could we soon have tech so powerful it could understand what animals are saying?

It’s really not all that far-fetched to think about.

Animals clearly communicate with each other.

They clearly understand what the other animals are saying.

It almost certainly isn’t in formal words and sentences the way we talk, but could a computer eventually map the language and understand what concepts are being communicated?

Almost certainly, and this video shows the progress already being made:

At this point, most things, this one included, really just seem to be a matter of time before the computing power is strong enough to crack the code.

In other words, I think it’s not a question of “if” we can do it, but just a matter of “when” it happens.

But that’s not just my opinion.

Grok gave me an estimated timeline of what kind of progress we can expect to see and when it will likely happen:

Animal Communication and AI: Progress and Outlook

We’re already making rapid progress in decoding animal communication using AI, with significant breakthroughs in the last few years (2024–2025). However, “decoding what animals are saying” varies by species and depth: we’re good at identifying species, emotions, and basic patterns now, but full semantic understanding (like translating complex “sentences” or thoughts) is likely 5–15 years away for the most advanced cases, such as whales or dolphins.


Current Achievements (as of early 2026)

  • Species identification and basic classification: Highly accurate today. Apps like BirdNET and Merlin identify thousands of bird species from songs with high precision. AI classifies dog barks (playful vs. aggressive), pig emotions, and ungulate (e.g., cow, boar) positive/negative states at ~89% accuracy across species.

  • Structural decoding: We’ve mapped “phonetic alphabets” in sperm whales (codas like building blocks), vowel-like patterns in their calls, and complex click structures. Dolphins show syntax-like whistle patterns, with AI models like Google’s DolphinGemma predicting sequences and generating dolphin-like sounds.

  • Emotional and contextual insights: AI detects distress, pain, or positivity in vocalizations (e.g., elephants, whales, birds). Projects like Earth Species Project’s NatureLM-audio transfer human speech models to animals, identifying shared structures across species.

  • Generative AI: Models create synthetic animal sounds (e.g., whale codas, bird songs) for testing responses, a step toward two-way communication.
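At their core, the bark and emotion classifiers described above all do the same thing: extract acoustic features from a recording and map them to labels. Here is a deliberately tiny, pure-Python sketch of that idea. Everything in it is made up for illustration (the function names, the 400 Hz threshold, the "alarm"/"contact" labels, the synthetic sine-wave "calls"); real systems like BirdNET train deep neural networks on mel spectrograms of actual recordings.

```python
import math

def dominant_frequency(samples, sample_rate):
    """Return the strongest frequency in a signal via a naive DFT.
    (Real systems use FFTs over mel spectrograms; this is a toy.)"""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip DC, stop at the Nyquist bin
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n

def classify_call(samples, sample_rate, threshold_hz=400.0):
    """Label a vocalization by pitch: a crude stand-in for the
    'playful vs. aggressive' classifiers mentioned above."""
    if dominant_frequency(samples, sample_rate) > threshold_hz:
        return "alarm"
    return "contact"

# Synthetic stand-ins for recorded calls: an 800 Hz "alarm" chirp
# and a 200 Hz "contact" rumble, both sampled at 8 kHz.
rate = 8000
alarm_call = [math.sin(2 * math.pi * 800 * t / rate) for t in range(512)]
contact_call = [math.sin(2 * math.pi * 200 * t / rate) for t in range(512)]

print(classify_call(alarm_call, rate))    # alarm
print(classify_call(contact_call, rate))  # contact
```

The hard part in the real projects isn't the classifier, it's the labels: you need thousands of recordings paired with observed behavior before a model can learn what any sound means.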


Key Ongoing Projects

  • Project CETI (sperm whales): Identified a “phonetic alphabet,” vowel/diphthong analogs, and dialects. They’re building massive datasets with robotics and AI for contextual translation.

  • Earth Species Project: Foundation models like NatureLM-audio analyze bioacoustics across species (birds, whales, crows). They emphasize it’s “when, not if” for decoding.

  • Dolphin research: Google/DeepMind’s DolphinGemma and Wild Dolphin Project tools process decades of data, spotting patterns humans miss. Prizes like the Coller Dolittle Challenge award advances in dolphin two-way communication.

  • Birds and others: AI tracks song evolution (e.g., great tits influenced by migration/age) and monitors ecosystems via passive recordings.


Timeline Estimates

  • Short-term (now to 2030): Robust decoding of basic meanings (e.g., warnings, greetings, emotions) for well-studied species like whales, dolphins, elephants, and birds. Improved conservation tools (e.g., detecting stress in livestock or wild populations) and initial two-way tests (playing AI-generated sounds and observing responses).

  • Medium-term (2030–2040): Deeper semantic understanding for complex communicators (e.g., cetaceans), potentially revealing social structures, references to absent individuals, or abstract concepts. Full “conversations” remain speculative and ethically complex.


Challenges Slowing Progress

  • Limited data (compared to human language corpora)

  • Non-vocal communication (body language, scents)

  • Anthropomorphism risks

  • Ethical issues (consent, interference with animal cultures)


Conclusion

Experts from Earth Species Project, Project CETI, and others agree that exponential AI advances will make this happen sooner than expected; major leaps have occurred since 2024. It’s transforming conservation, welfare, and our view of animal intelligence, even if a universal “animal translator” app is still sci-fi for now.

Which only leaves the one big question:  Do you REALLY want to know what your cat is thinking and saying about you?

My dog, sure.

But I think it’s going to get pretty dark pretty fast once we learn what the cats are plotting against us….
