Real-life Dr. Dolittle? Scientists on verge of cracking code for talking to animals
Talking with dogs, decoding whale sounds and interpreting bird calls could all be possible in the coming years as artificial intelligence applications learn to translate different creatures' communications, animal researchers said.
Scientists have started using AI tools to analyze vast quantities of data on various species' communications, including sounds, postures and expressions, to determine whether they can understand and talk to animals in human terms.
"The door has been opened to using machine learning to decode languages that we don't already know how to decode," said Aza Raskin, who co-founded the Earth Species Project, a nonprofit aiming to develop AI models that let humans have "conversations" with animals. He predicts this will be possible within the next two years.
"The plot twist is that we will be able to communicate [with animals] before we understand" them, Raskin told Scientific American. "It wouldn't surprise me if we discovered [expressions for] ‘grief’ or ‘mother’ or ‘hungry’ across species."
Christian Rutz, a behavioral ecologist at the University of St Andrews, agreed.
With new AI developments, "people realize that we are on the brink of fairly major advances in regard to understanding animals' communicative behavior," he said.
The research and possible breakthroughs go well beyond just translating animals' sounds. Con Slobodchikoff, an animal language researcher, is aiming to develop an AI model that interprets dogs' barks as well as their facial expressions for owners.
"We are so fixated on sound being the only valid element of communication, that we miss many of the other cues," he said. Despite this added complexity, Slobodchikoff is confident that machine learning will soon reveal more about what pets are trying to communicate.
AI advancements are also helping researchers interpret the communications of animals beyond traditional pets.
Shane Gero, the lead biologist for Project CETI, for example, is using the technology to decode sperm whale sounds. His team uses underwater microphones to record codas, specific patterns of whale sounds, and plans to use AI to translate them.
Gero started by feeding an algorithm codas his team had already decoded by hand; the model then correctly identified a subset of individual whales 99% of the time. CETI hopes to eventually create a "whale chatbot."
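The article does not detail CETI's pipeline, but the step Gero describes, training on hand-decoded codas so a model can attribute new codas to individual whales, is essentially a supervised classification task. The sketch below illustrates that idea only; the inter-click-interval features, the random-forest model and the toy data are all illustrative assumptions, not CETI's actual method.

```python
# Illustrative sketch: attribute codas (patterned whale sound sequences)
# to individual whales with a supervised classifier. Features, model and
# data below are assumptions for demonstration, not Project CETI's method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def coda_features(click_times, n_intervals=8):
    """Convert a coda (click timestamps in seconds) into a fixed-length
    vector of inter-click intervals, zero-padded or truncated."""
    intervals = np.diff(np.asarray(click_times, dtype=float))
    vec = np.zeros(n_intervals)
    n = min(n_intervals, len(intervals))
    vec[:n] = intervals[:n]
    return vec

# Hypothetical hand-labeled data: (click timestamps, whale ID) pairs.
labeled_codas = [
    ([0.00, 0.18, 0.36, 0.52, 0.70], "whale_A"),
    ([0.00, 0.17, 0.35, 0.53, 0.71], "whale_A"),
    ([0.00, 0.19, 0.37, 0.51, 0.69], "whale_A"),
    ([0.00, 0.18, 0.34, 0.54, 0.72], "whale_A"),
    ([0.00, 0.10, 0.20, 0.55, 0.90], "whale_B"),
    ([0.00, 0.11, 0.21, 0.54, 0.88], "whale_B"),
    ([0.00, 0.09, 0.19, 0.56, 0.91], "whale_B"),
    ([0.00, 0.10, 0.22, 0.53, 0.89], "whale_B"),
]

X = np.array([coda_features(times) for times, _ in labeled_codas])
y = np.array([whale for _, whale in labeled_codas])

# Hold out a quarter of the codas to estimate how often the classifier
# attributes an unseen coda to the right whale.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

A real system would start from raw hydrophone recordings and vastly more labeled codas, but the basic loop of training on hand-labeled examples and testing on held-out ones is the same.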
The Cornell Lab of Ornithology, meanwhile, has developed a tool that can accurately identify and differentiate sounds from over 1,000 bird species. The Earth Species Project plans to test how zebra finches respond to AI-generated bird calls.
"We'll be able to pass the finch, crow or whale Turing test," Raskin said, referring to the ability to trick animals into believing they are communicating with their own species.