In September 2021, the Wall Street Journal published a series of articles under the umbrella of The Facebook Files, detailing a number of explosive revelations about how the tech company has affected our lives by manipulating the information we see in our social media feeds. Scientist and noted science fiction author David Brin has written that our great societal addiction is to outrage; the Facebook Files offer compelling evidence that Facebook stoked that outrage in order to boost its own fortunes.
Internal Facebook documents made public by a former Facebook employee, Frances Haugen, showed that Facebook actively courted outrage by, among other things, making an “Angry” emoji available as a possible reaction to a post on the social media service, and by then prioritizing posts that had been assigned this emoji so that they would be seen more often. Subsequent investigations revealed that these kinds of posts were often toxic, polarizing, and more likely to be misleading or false.
Americans have been increasingly wary of Facebook and the sway it holds over how we think, how we feel, and the news we see. For some, Haugen’s revelations were likely not all that surprising. But the real import of the documents she made public was that Facebook was not only aware of its damaging role in society but was actively studying it, and (according to Haugen) flouting the findings by continuing to prioritize page views over societal good.
One of the most visible manifestations of Facebook’s algorithmic manipulation of our lives is in the realm of politics. The New York Times reported in October that the company was well aware that misinformation about the 2020 Presidential Election was being spread on the platform, and that many employees felt that the company could be doing more to combat the spread of false information from QAnon and other sources. According to the NYT article, some Facebook employees raised alarms that the platform’s tolerance of various “Stop The Steal” groups may have played a role in facilitating the January 6th Capitol riot, with an internal document noting:
…the on-platform experiences on this narrative may have had substantial negative impacts including contributing materially to the Capitol riot and potentially reducing collective civic engagement and social cohesion in the years to come.
Politics has always been divisive. The disagreements between Democrats and Republicans are legion, from abortion to taxation, and as disagreeable as many find the ad hominem assaults of modern politics, the election between Jefferson and Adams in 1800 was also riddled with personal attacks, ranging from “fake news” (Jefferson supporters claiming Adams planned to marry off his son to the daughter of King George III) to overtly racist attacks against Jefferson. Politics has always been brutal theater. What has changed in recent elections, however, is the extent to which American citizens have been pitted against each other in the service of the play. The revelations of the Facebook Papers make it clear that Facebook, too, has had a part in that play, and Americans (including voters) are beginning to take notice.
My company, Edison Research, has been the sole provider of exit polling data to the National Election Pool since 2003, and we have been studying shifting patterns in the electorate and the issues that drive voters for nearly three decades. We have also been studying the usage of social media platforms since the dawn of MySpace, and the confluence of these two datasets has produced some fascinating analysis over the years. In 2020, we began a weekly tracking service called The Social Habit, which offers an ongoing measure of US social media users, their habits, and their perceptions of the various platforms in use today. Because we are continually in the field with this project, we have the opportunity to take quick snapshots of emerging trends (e.g., Clubhouse Users in America) and also the impact of important political events on social media behaviors (Twitter Before And After Trump).
We were interested in the potential impact of the Facebook Papers and the testimony of Frances Haugen on the public’s perception of Facebook and its role in society. As a part of our ongoing Social Habit research, we surveyed 1,114 social media users, ages 18+, between October 29 and November 18, 2021, to determine just how aware social media users were of Haugen’s testimony, and what their current perceptions of both Facebook and social media in general were after the initial news had been given a month to settle. We wondered if people would recall the story, and more importantly, if it still resonated, and with whom.
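For readers curious how much sampling error a survey of this size carries, the standard formula gives a rough answer. The sketch below assumes a simple random sample at the most conservative proportion (50%); the article does not describe Edison Research's actual weighting or design effects, so this is an illustrative approximation, not the study's published methodology.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample
    of size n, at an observed proportion p (0.5 is the worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# For the survey's n = 1,114 respondents:
moe = margin_of_error(1114)
print(round(moe * 100, 1))  # ≈ 2.9 percentage points
```

In other words, under these simplifying assumptions, topline percentages from a sample of 1,114 carry roughly a ±3-point margin of error, which is worth keeping in mind when comparing subgroups below.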
What we found was truly surprising. In a modern political climate in which Democrats and Republicans (and more broadly, Liberals and Conservatives) hold diametrically opposed views on nearly every political issue, Facebook appears to be a galvanizing force that both can agree upon.
First, we asked social media users if they were aware of the Frances Haugen testimony, and more importantly, whether or not they believed her. The answers were remarkably similar, especially in a climate in which we are continually exposed to polling about vaccines, climate change, and other issues that shows radically different responses:
The majority of social media users (54%) were aware of Haugen’s allegations. While Liberals were more likely to be aware of them than Conservatives, the degree to which either believed Haugen was roughly the same, which becomes clearer when you look at the above question only among those who were aware of the allegations:
Among those aware of the Haugen story, over two-thirds of Republicans AND Democrats, Liberals AND Conservatives, believe her allegations against her former employer. What’s more, there are clear indications that the Facebook Papers have had a negative impact on how much people trust Facebook. It should be noted that even before this story broke, both Republicans and Democrats had trust issues with the service:
Among social media users, three in ten Democrats, and nearly half of Republicans, already didn’t trust Facebook. While there are variances apparent on this graph, the overall view is similar — not many people would say that they trust Facebook “a lot,” which in itself is probably not a surprising finding. However, even in this climate, the Haugen testimony and related documents appear to have damaged perceptions of Facebook even further:
Among those aware of the Haugen allegations, nearly two-thirds say that they trust Facebook less (and a third “much less”) as a result. The data from Liberals and Conservatives is statistically identical. And while there is a slight trust gap apparent between Democrats and Republicans in these data, keep in mind the graph immediately preceding — Republicans already trusted Facebook less than Democrats. The Haugen revelations simply leveled the playing field.
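A claim like "statistically identical" can be checked with a standard two-proportion z-test. The sketch below is illustrative only: the article does not publish the underlying subgroup counts, so the sample sizes and percentages used here are hypothetical stand-ins, not the survey's actual cell values.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions,
    using the pooled-proportion standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical example: 64% of 300 Liberals vs. 62% of 280 Conservatives
# saying they trust Facebook less. These counts are illustrative only.
z = two_proportion_z(0.64, 300, 0.62, 280)
print(round(abs(z), 2))  # ≈ 0.5, well below 1.96
```

A |z| under 1.96 means the two percentages are not distinguishable at the conventional 5% significance level, which is the sense in which two subgroup figures can be called statistically identical.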
For many, these allegations were simply one more story that supported a conclusion they had already drawn about Facebook’s role in society; for others, this may have been the first time that they had actively questioned whether or not the company was doing more harm than good. In either case, at the end of 2021, we are witness to a remarkable landscape in terms of public opinion: while poll after poll shows the American public to be significantly polarized on issue after issue, the Facebook situation is the rare cause that can potentially unite the electorate.
Democrats and Republicans often poll extremely differently on many social issues, but their attitudes about Facebook are astonishingly similar. For instance, the perceptions of social media users about Facebook’s impact on our mental health (and the mental health of our children) are virtually identical between Democrats and Republicans:
Partisan politics has led Americans to be incredibly divided about many things, including how we raise and educate our children. But regardless of what side of the aisle they vote with, more American social media users agree that Facebook is bad for our mental health — and that of our children — than disagree. This perception, along with an increasingly widespread awareness of the platform’s role in spreading misinformation, has contributed to a growing sense that Facebook’s role in our society may be greater than we bargained for, and more than we would like, as this graph details:
The majority of Republicans and Democrats agree that Facebook has “too much power,” and that agreement is again statistically nearly identical.
This leads us to perhaps the most remarkable data point from this Social Habit exploration. The recent polarization surrounding the proposed Build Back Better legislation is emblematic of the deep divisions that our elected officials, and the electorate itself, have about what should and should not be legislated by the government. Jonathan Haidt’s research into the moral roots of Conservatives and Liberals pinpoints the core values that underpin both ideologies, including their differing views on change/stability and fairness/reciprocity. These differences have been expressed in legislation for decades, as both sides have markedly different perspectives about what the government can and should legislate, and what it should “leave alone.” Often, there is no rational explanation for shifting perspectives on the role of legislation, with both sides choosing to define themselves ideologically simply as standing against the other. Pick an issue, from abortion to trans rights to immigration to guns, and you will discover fundamental disagreements not only on the core issue itself, but also on the role of government to intervene.
Against that backdrop, we have the case of Facebook. When we asked American social media users if they thought that Facebook should be regulated, the results were incredibly similar:
While there are legions of issues that would show a differential in the “strongly disagree” column, Facebook is not one of them. The problem of Facebook is one that is recognized by both Democrats and Republicans, and the degree to which they want the government to intervene is something upon which, however improbably, they agree in equal measure.
Those who subscribe to the ideology of either party will have their own views on the defining issues of our current period in history. For some, our defining issue is climate change; for others, it’s immigration and economic preservation. It’s highly unlikely that regulating Facebook and moderating the power it holds over Americans is seen as a crucial issue by either party. But in terms of something both can agree upon, it might be near the top of the list.