Understanding Pain Through Cats’ Facial Expressions
When a baby is born, they can’t tell you that they’re happy or sad, so they communicate through their expressions, and we do our best to decipher them. As humans, we need to understand what these expressions mean so that we can take care of our young properly. The same process plays out in non-human species, too. So far, researchers have been able to detect pain through facial expressions in rats, rabbits, horses, cats, pigs, sheep, and ferrets.
The most common way we capture and interpret facial expressions is to compare two still pictures and play “spot the difference.” Unfortunately, this relies on the subjective interpretation of a human observer, which can introduce a lot of bias. This approach is typically based on the Facial Action Coding System, which describes the human facial muscles and how they move. Because it is an anthropomorphic tool, it does not account for the fact that different species have different facial muscles that move in different ways. It also ignores the variation in face shape and underlying musculature between individuals of the same species. Obviously, a Chihuahua and a German Shepherd don’t have the same facial shape.
Alternatively, there has been progress in automatically locating facial “landmarks” and detecting facial action units (specific muscle movements) likely associated with pain. However, these evaluations still use the human facial structure as a base, and a sheep’s face clearly differs from a human’s in ways that have to be accounted for. Over time, researchers have compiled a collection of facial action units seen across species, but that catalogue is likely incomplete and may miss expressions that only occur in particular species or situations. Comparing still pictures of two expressions also loses a lot of information, such as how the face changed shape over time and how quickly that change happened. The same problem shows up in evaluations of human emotion: people struggle to tell fear from surprise when they only have still images to compare.
Another approach is to measure the distances between certain points on a face when a subject is in pain and when they are not. With this information, a “cartoon-like” drawing can be created to show how the face changes when expressing an emotion, and you would then compare that drawing to the animal in front of you. The accuracy of this method still needs more research, and it is very difficult to control for variation between individuals and for differences in the scale and orientation of the animal in each image.
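As a rough illustration of the distance-based idea, here is a minimal sketch in Python; the point names and coordinates are invented for the example and are not the points used in any particular study.

```python
import math

# Hypothetical 2D coordinates (in pixels) for a few facial points,
# labelled for illustration only -- not the actual points from any study.
baseline = {"ear_tip": (120, 40), "eye_outer": (95, 110), "muzzle": (80, 200)}
in_pain  = {"ear_tip": (135, 55), "eye_outer": (97, 118), "muzzle": (78, 210)}

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Compare the same pairwise distance (e.g. ear tip to outer eye corner)
# across the two images; a change may hint at an altered expression.
for pair in [("ear_tip", "eye_outer"), ("eye_outer", "muzzle")]:
    d_baseline = dist(baseline[pair[0]], baseline[pair[1]])
    d_pain = dist(in_pain[pair[0]], in_pain[pair[1]])
    print(pair, round(d_baseline, 1), "->", round(d_pain, 1))
```

Raw distances like these depend on how far away the camera is and how the head is turned, which is exactly the limitation the method described next tries to remove.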
In the present study, the researchers tried a new method: Landmark-Based Geometric Morphometric Analysis. This approach places landmarks at points where changes occur and uses them to build either a 2D or a 3D map of an animal’s face, allowing much better observation of the overall face shape and how it changes. The researchers did this for each individual by superimposing a type of lattice on the face and placing landmarks at key locations. This helps to mitigate issues arising from individual variation in face shape and structure, as well as differences in scale and orientation. It also makes it easier to look at several variables at once and to measure the direction in which landmarks move. Because the landmarks are aligned with the muscles of the species in question, the method gives better insight into how those muscles move and react, and it provides a more holistic measure by taking the ears, muzzle, and eyes into account.
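The superimposition step can be pictured with a small sketch. Assuming two sets of (x, y) landmark coordinates for the same face photographed at different times (the numbers below are invented), a Procrustes superimposition removes differences in position, scale, and rotation so that only shape change remains. The actual study used 48 landmarks and a fuller analysis, so this is just an illustration of the principle:

```python
import numpy as np
from scipy.spatial import procrustes

# Two hypothetical landmark configurations (x, y) for the same cat face,
# e.g. before surgery and one hour after; values are invented.
face_baseline = np.array([[0.0, 0.0], [1.0, 0.1], [2.1, 0.0],
                          [0.5, 1.0], [1.5, 1.1], [1.0, 2.0]])
face_postop   = np.array([[0.1, 0.0], [1.1, 0.3], [2.0, 0.1],
                          [0.4, 1.2], [1.6, 1.3], [1.0, 2.2]]) * 1.4  # different scale

# Procrustes superimposition removes differences in position, scale and
# rotation, so what remains is pure shape change between the two faces.
aligned_a, aligned_b, disparity = procrustes(face_baseline, face_postop)

print("shape disparity after alignment:", round(disparity, 4))
```

The disparity value summarizes how much the two shapes still differ once camera distance and head orientation have been factored out, which is what makes configurations from different photos comparable.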
To test this new system, the researchers used cats, as their anatomy and facial expressions are well documented. They looked at 48 specific landmarks on the face and observed how they changed with and without pain present. The subjects were 29 healthy domestic shorthair female cats undergoing a spay. Facial expressions were observed at four stages: before surgery, one hour after surgery before additional pain medication was given, four hours after the pain medication was given, and 24 hours after the surgery. Alongside the Geometric Morphometric Analysis, the researchers used a multidimensional composite pain scale, which scores things like posture, comfort, and activity, to judge whether pain was present. This served as the baseline comparison for the study.
Across the 29 cats and four stages, 932 images were captured. From these images, eight facial actions accounted for 87% of the variation in face shape across all the cats. Two landmarks showed significant changes, but not between all four stages, meaning they were likely related not to pain but to other negative emotions or to noise. Only one landmark appeared to be closely tied to variation in pain. It had a high principal component score during the second stage, one hour after surgery and before pain medication was given. During this stage, the ears were angled more outward, the muzzle moved down and outward, the eyes were slightly more squinted, and the nose moved upward and to the left; the cheeks, mouth, and nose were all scrunched. In the other three stages the score stayed roughly the same, and the cats had a much more neutral expression.
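The “principal component score” mentioned above comes from principal component analysis, which finds the main axes along which face shape varies across all the images; each image then gets a score on every axis that can be compared between stages. Below is a minimal sketch of that step. The landmark data here is random stand-in data, so the variance figure it prints will not match the 87% reported in the study.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Pretend we have 932 aligned faces, each described by 48 landmarks with
# (x, y) coordinates, flattened to 96 numbers per image. Random numbers
# stand in for real aligned landmark coordinates.
n_images, n_landmarks = 932, 48
faces = rng.normal(size=(n_images, n_landmarks * 2))

# PCA finds the main axes of shape variation; each image gets a score on
# every component, and those scores can be compared across the four stages.
pca = PCA(n_components=8)
scores = pca.fit_transform(faces)

print("variance explained by 8 components:",
      round(pca.explained_variance_ratio_.sum(), 3))
print("scores shape:", scores.shape)  # (932, 8)
```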
With this data in hand, the researchers wanted to make sure the changes were tied directly to pain. To do this, they compared their results against an established measure, the BOTUCATU scale, which assesses pain by observing the cat’s entire body rather than just its face. The BOTUCATU results showed pain present in the second stage but much less in the other three, matching the Geometric Morphometric Analysis results.
This research, while preliminary, is a step in the right direction. The technique could be applied in other settings to analyze and understand other species’ facial expressions, and further research could make the tests more accurate and efficient. With advances in facial recognition, including its integration into our mobile devices, it’s feasible to imagine a day when we could simply point a phone camera at an animal’s face and know whether it is in pain. The implications for advocacy and animal welfare are huge.