If you’ve ever had a pet cat, you know they’re masters at communicating. Since we domesticated felines more than 10,000 years ago, they’ve learned to combine body language with an array of meows, purrs, and chirps to convey their needs to humans. Cat colonies also have complex social relationships based on their rank, age, sexual status, and genetics. So, how do they “talk” to each other?
Researchers have shown that primates and other mammals use facial signals to communicate. For example, gorillas often mirror each other's faces while playing, a phenomenon called rapid facial mimicry. Scientists have correlated rapid facial mimicry with emotional recognition and think it may even have developed as a precursor to empathy in humans. Veterinarians are keenly aware that cats make different faces when they're afraid, frustrated, relaxed, or in pain. They've found cats often blink when they're afraid, and lick their nose or hiss when frustrated. But scientists don't know whether felines in colonies or multi-cat households rely on facial signals like mimicry to get along.
Researchers from Israel and the USA recently developed new automated methods to test whether domestic cats use facial mimicry. In the past, scientists have compared animal facial expressions by recording discrete movements manually using a standardized Facial Action Coding System, or FACS. Other researchers have tracked facial shape changes by following specific points of reference, or landmarks, on an animal’s face. Both methods are time-consuming and subjective, so this team proposed that machine learning could speed up the process and minimize bias.
The team analyzed 186 videos of 53 adult short-haired cats socializing at the CatCafe Lounge in Los Angeles, CA, between August 2021 and June 2022. They categorized friendly interactions between cats, like grooming or snuggling, as affiliative social interactions. In contrast, they categorized neutral or antagonistic interactions, like staring or hissing, as non-affiliative interactions. Since other mammals use facial mimicry to bond, they hypothesized the cats would mirror each other’s faces more often during affiliative interactions than non-affiliative ones.
First, the researchers tested whether a machine learning model could correctly classify how cats interacted in the videos. They used a model called the Tree-based Pipeline Optimization Tool (TPOT), which had previously been used to analyze genetic data. They started with a manually assembled CatFACS dataset that tracked 48 different lip, ear, and eye movements in the videos. Then they used the CatFACS dataset to train TPOT on 147 videos and test how well it performed on another 37. They found the model correctly identified interactions from feline facial movements in the videos 74% of the time, and up to 77% of the time when they included the order in which the movements occurred.
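The core of this step is a standard train/test evaluation: fit a classifier on one set of labeled videos, then measure accuracy on held-out ones. The sketch below mirrors that workflow with the study's split sizes (147 train, 37 test), but everything else is an assumption for illustration: the feature vectors are randomly generated stand-ins for CatFACS action-unit data, and a simple nearest-centroid classifier stands in for TPOT, which in reality searches automatically over scikit-learn pipelines.

```python
import random

random.seed(0)

# Hypothetical stand-in data: each "video" is a vector of 48 CatFACS
# action-unit measurements plus a label (1 = affiliative, 0 = non-affiliative).
# The 147/37 split matches the study; the feature values are invented.
def fake_video():
    label = random.randint(0, 1)
    features = [random.random() + 0.5 * label for _ in range(48)]
    return features, label

videos = [fake_video() for _ in range(184)]
train, test = videos[:147], videos[147:]

# Trivial nearest-centroid classifier as a stand-in for TPOT's
# automatically optimized pipeline.
def centroid(rows):
    return [sum(col) / len(col) for col in zip(*rows)]

c0 = centroid([f for f, y in train if y == 0])
c1 = centroid([f for f, y in train if y == 1])

def predict(f):
    d0 = sum((a - b) ** 2 for a, b in zip(f, c0))
    d1 = sum((a - b) ** 2 for a, b in zip(f, c1))
    return 0 if d0 < d1 else 1

# Accuracy on the held-out videos, analogous to the study's 74% figure.
accuracy = sum(predict(f) == y for f, y in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

The key design point is that accuracy is only measured on videos the model never saw during training, which is what makes the 74% and 77% figures meaningful.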
Next, they examined how well TPOT characterized cat interactions based on facial landmarks, which included 48 reference points spanning their eyes, ears, nose, and mouth. They started with an automated landmarking system that quantified the cats' facial signals from raw video footage. The advantage of the automated approach is that it detects quick, subtle movements humans could miss. They used the automated landmark data to train TPOT on more than 87,000 video frames and test it on 22,000 others. They found that TPOT categorized landmark-based facial signals less accurately than the CatFACS-based signals: it was right 64% of the time, or up to 68% of the time when it considered the sequence of facial changes. They suggested this discrepancy could mean the fully automated landmarking was actually more error-prone than manual tracking.
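Landmark tracking turns each video frame into a set of (x, y) coordinates for named facial points, and movement shows up as frame-to-frame displacement of those points. The sketch below illustrates that idea only; the landmark names, coordinates, and the displacement feature are invented for this example and are not the study's actual pipeline (the study tracked 48 points per cat).

```python
import math

# Hypothetical per-frame landmark coordinates (pixels). In the study, an
# automated system tracked 48 reference points across eyes, ears, nose,
# and mouth; here we invent two points over three frames.
frames = [
    {"left_ear_tip": (112.0, 40.5), "nose": (140.2, 95.7)},
    {"left_ear_tip": (110.4, 43.1), "nose": (140.3, 95.6)},
    {"left_ear_tip": (107.9, 46.0), "nose": (140.1, 95.8)},
]

# Per-landmark displacement between consecutive frames. Large displacements
# over a few frames capture the quick, subtle movements (like an ear flick)
# that a human coder might miss.
features = []
for prev, cur in zip(frames, frames[1:]):
    features.append({name: math.dist(prev[name], cur[name]) for name in prev})

for f in features:
    print({name: round(d, 2) for name, d in f.items()})
```

Here the ear tip moves several pixels per frame while the nose barely moves, so the ear movement, not the (stable) nose, would dominate the feature vector fed to the classifier.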
Finally, the researchers analyzed their full CatFACS dataset with TPOT to see when one cat mimicked some or all of another cat's facial expression. In line with their hypothesis, they found that cats mimicked each other significantly more often during affiliative interactions than non-affiliative ones. They also found that cats most commonly mimicked each other's ear movements. They explained this result supported past researchers' assertions that domestic cats use their ears to communicate, but acknowledged the cats could be responding to external sounds rather than each other.
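The logic of "rapid" facial mimicry is that one cat reproduces another's facial movement within a short time window. The sketch below shows that windowed-matching idea in miniature; the 1-second window, the action-unit names, and the event timings are illustrative assumptions, not the study's actual detection criterion.

```python
# Hypothetical sketch of rapid-facial-mimicry detection: cat B "mimics"
# cat A if B shows the same facial action unit within a short window after
# A does. The window length and all event data are invented assumptions.
WINDOW = 1.0  # seconds (assumed for illustration)

# (seconds into clip, action unit) for each cat
events_a = [(0.5, "ear_flattener"), (2.0, "blink")]
events_b = [(0.9, "ear_flattener"), (4.0, "blink")]

def count_mimicry(trigger_events, response_events, window=WINDOW):
    """Count trigger events that the other cat mirrors within the window."""
    hits = 0
    for t_a, unit_a in trigger_events:
        if any(unit_b == unit_a and t_a < t_b <= t_a + window
               for t_b, unit_b in response_events):
            hits += 1
    return hits

print(count_mimicry(events_a, events_b))  # prints 1
```

In this toy example the ear flattener at 0.5 s is mirrored at 0.9 s (within the window), but the blink at 2.0 s is not echoed until 4.0 s, so only one mimicry event is counted. Comparing such counts between affiliative and non-affiliative interactions is the shape of the study's main comparison.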
The team concluded that cats use rapid facial mimicry to bond, like other mammals. They proposed that these facial cues help colonial cats navigate their complex social lives and coexist in harmony. However, they conceded that the automated landmark-based facial tracking could be improved by including more cameras or testing cats under controlled situations to reduce external influences. Regardless, they suggested that automated tracking of feline facial signals could someday help vets and shelters determine which cats get along well enough to live together.