
Artificial intelligence learns to "read" animal emotions


A cat's hiss serves as a warning, and a dog usually wags its tail from joy. In most cases, however, it is difficult for a human to understand what an animal feels. Researchers in Milan have found a way to approach the problem: an artificial intelligence model has learned to determine the emotional coloring of the sound signals of seven species of hoofed animals, including pigs. The work was published in Scientific Reports.

It turned out that negative emotions are more often expressed in the middle and high frequency ranges, while positive signals are distributed more evenly across the spectrum. At the same time, the acoustic "key" to emotions differs between species: high frequencies matter most for pigs, while middle frequencies are more informative for sheep and horses.
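The study's actual pipeline is not described in the article; as a purely hypothetical illustration of the idea of frequency-band features, the sketch below (the band edges and the function name are assumptions, not taken from the paper) computes what fraction of a recording's spectral energy falls into low, middle, and high bands:

```python
import numpy as np

def band_energy_features(signal, sample_rate,
                         bands=((0, 500), (500, 2000), (2000, 8000))):
    """Fraction of spectral energy in each frequency band.

    Illustrative only: band edges are arbitrary assumptions, not the
    ranges used in the Scientific Reports study.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    total = spectrum.sum()
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() / total
            for lo, hi in bands]

# Example: a pure 3 kHz tone concentrates its energy in the high band
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 3000 * t)
features = band_energy_features(tone, sr)
```

Features like these could then feed a classifier that separates positive from negative vocalizations, which is the general shape of the approach the article describes.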

According to Stavros Ntalampiras, the algorithm is able to pick out subtle acoustic features that the human ear cannot detect. This opens the way to new tools for monitoring animal health and well-being.

The technology has many potential applications. Farmers will be able to receive early warning signals, zoologists to monitor the state of wild populations, and zookeepers to respond to changes in animal behavior before visible signs appear. At the same time, an ethical question arises: if a machine registers an animal's suffering, are people obliged to intervene?

Similar studies are being conducted on other species as well. Scientists from the United States are analyzing sperm whale click signals in the CETI project, trying to identify social codes in their "speech." Dogs have become the subject of projects in which artificial intelligence links facial expressions, barking, and body movements to emotions, and even helps detect the signs of an approaching epileptic seizure in the owner. Algorithms have also successfully deciphered bee dances that indicate the direction of a food source.

However, the researchers emphasize that this is not about "translation" into human language, but about identifying patterns. Any attempt to reduce a rich behavioral palette to binary "good-bad" categories carries a risk of error. More reliable models should combine acoustic data with visual and physiological indicators such as posture, facial expression, or heart rate.

"The question is not how accurately we will learn to listen to animals, but how we will use this information," the scientists said.

Translated from: Euromedia24.com