The Effectiveness Of Videos As An Advocacy Tool: An MFA Review
One of the most common tools in the animal advocate’s toolbox is video that shows people cruelty to farmed (and other) animals. Whether at informational booths in public settings or shared via social media, videos depicting farmed animal cruelty have become increasingly prevalent. Most animal advocacy groups use such videos in one form or another, or share videos that other groups have produced. Mercy For Animals (MFA) in particular has used farmed animal cruelty videos extensively; recently, the group set out to understand the effectiveness of the technique as an outreach strategy.
Guided by a central question—“Does watching video of farmed animal cruelty actually change people’s diets and attitudes?”—MFA contracted Edge Research to assess the effectiveness of videos on Facebook, targeting females ages 13–25. People in the “experimental group” were sent to a “regular landing page with video of farm animal cruelty,” while the control group was “randomly sent to a different landing page that looked similar but focused on an unrelated social issue, fighting tropical diseases.” MFA posited that, “by comparing the experimental and control groups over the next few months, we could see whether the farmed animal cruelty video impacted diets and attitudes.” This hypothesis rested on the assumption that any differences between the two groups would “likely be due to the video, as the people in each group were similar in almost all respects except for which video they viewed.” Two to four months later, participants were surveyed about their diets and their attitudes toward meat and farmed animals. Approximately 800 people from each group responded to this survey.
What they found was largely inconclusive. MFA notes that “the experimental group actually reported eating slightly more animal products, but the difference was not significant,” and that because of the study’s “low power,” “the difference may well have been due to chance.” What does “low power” mean in this context? It is worth describing in more detail:
Our study was powered to detect a 10 percent difference between the groups, and since the differences between the groups were much smaller than that, we can’t be confident about whether the differences between the groups were due to chance or were the true group means. So unfortunately, our pool of participants wasn’t large enough to answer our key question. Based on our study design, it appears we would have needed tens to hundreds of thousands of participants to properly answer this question.
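MFA’s point about power can be sketched with back-of-the-envelope arithmetic. The snippet below uses the standard two-proportion sample-size approximation; the 50% baseline rate, 5% significance level, and 80% power are illustrative assumptions, not figures from the study, and `n_per_group` is a hypothetical helper:

```python
# Sketch of the sample-size arithmetic behind "low power."
# Assumptions (not from the study): 50% baseline rate, alpha = 0.05, 80% power.
from math import ceil
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate participants needed per group to detect a shift
    from proportion p1 to p2 with a two-sided z-test."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(z ** 2 * variance / (p1 - p2) ** 2)

# A 10-percentage-point diet change is detectable with a few hundred per group...
print(n_per_group(0.50, 0.40))  # → 385
# ...but a 1-point change needs tens of thousands, matching MFA's estimate.
print(n_per_group(0.50, 0.49))  # → 39237
```

With roughly 800 respondents per group, only effects on the order of 10 percentage points were detectable, which is why smaller observed differences cannot be separated from noise.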
The results of the study thus raise more questions than they answer. At most, they suggest that the impact of farmed animal cruelty videos may be smaller than a 10% change in diet in either direction. MFA did note that viewing a farmed animal cruelty video could make people more likely to self-identify as vegetarian, but this finding also wasn’t statistically significant, and they are cautious about what it could mean in terms of actual behavior change. That caution seems sensible given current research around self-identification and vegetarianism. In addition, they could only classify vegetarians based on two days of diet data, a further limitation.
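To see why roughly 800 respondents per group cannot distinguish a small diet shift from chance, consider a two-sided two-proportion z-test. The 52% vs. 50% rates below are hypothetical numbers chosen for illustration, not data from the study:

```python
# Two-sided two-proportion z-test on hypothetical survey results.
# The rates (52% vs. 50%) and n = 800 per group are illustrative assumptions.
from math import sqrt
from statistics import NormalDist

def two_prop_p_value(p1: float, p2: float, n1: int, n2: int) -> float:
    """P-value of a two-sided z-test for a difference in proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return 2 * (1 - NormalDist().cdf(z))

p = two_prop_p_value(0.52, 0.50, 800, 800)
print(round(p, 2))  # → 0.42, far above the usual 0.05 threshold
```

A 2-point difference at this sample size yields a p-value around 0.4, so even a real effect of that size would look like chance.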
What does this study mean for animal advocates? Apart from the disturbing (though hopefully unlikely) implication that farmed animal cruelty videos could actually increase the consumption of animal products, the main takeaway is methodological. As animal advocacy grows and we try different types of interventions, we need to find ways to measure their effectiveness, and it’s difficult to measure any one form of advocacy in a vacuum. While this study assumed that any difference in attitudes or diets between the two groups would stem from the single video each had seen, there is no way to control for the various messages that respondents could have received in the two to four months after viewing. MFA notes possible avenues for future research on this topic, and one major takeaway is clear: if we want to truly understand the effectiveness of video as an outreach technique, we’ll need to conduct much larger studies to get conclusive results.