The University of Haifa has embarked on a global research project to learn how to decipher sperm whale communication, which may enable humans to understand the dialogue between these large sea creatures.

To that end, it has joined forces with educational institutions such as the City University of New York, Harvard University, the Massachusetts Institute of Technology, Imperial College London and the University of California, Berkeley. At a press conference held this week in Dominica in the Caribbean, where the project will take place, researchers revealed plans for what they are calling the Cetacean Translation Initiative (CETI).

The project, estimated to last at least five years, will combine expertise in marine biology, marine acoustics, artificial intelligence and linguistics. Researchers will use machine learning and non-invasive robotics to listen to and translate the whales’ communication, and to attempt to communicate with them.

Why this particular species?

Project CETI ecosystem illustration. Credit: Alex Boersma.

As Project CETI explains, “The sperm whale is the animal with the largest brain, and like humans, it has a complex communication system and lives in tightly knit family groups. These whales also help keep carbon out of the atmosphere, support our oxygen supply and increase marine life. We now have the tools to identify and translate the deep structure of their communicative patterns and to kick-start the path towards meaningful dialogue with another species. By illustrating whales’ incredible intelligence and advocating for legislation, we can accelerate conservation efforts.”

One of the project’s leaders, Professor Dan Tchernov of the University of Haifa’s Leon H. Charney School of Marine Sciences and scientific director of the Morris Kahn Marine Research Station, elaborated further, saying: “These cetaceans make a clicking sound at varying frequencies when they are in the company of other whales. The question is: Is this just a simple code or a true language?”

“Right now, our database is not comprehensive enough to know the answer to this question,” he explained. “However, with the advancement of machine learning and advanced linguistics, we realize that if we gather enough data about their voices, the context in which these sounds are employed and understood, and the behavior and motivation behind these sounds, then we can develop an algorithm to determine whether they have an authentic language.”

“Of course,” he said, “the dream would be if we are able to communicate with them on their terms.”

JNS
