In the memo, he said, “We will continue to face scrutiny – some of it fair and some of it unfair. But we should also continue to hold our heads up high.”
Here’s Mr. Clegg’s full memo:
Our position on polarization and elections
You will have seen the series of articles published about us in the Wall Street Journal in recent days, and the public interest it has provoked. This Sunday night, the former employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the piece is likely to assert that we contribute to polarization in the United States, and to suggest that the extraordinary steps we took for the 2020 election were relaxed too soon and contributed to the horrific events of January 6th in the Capitol.
I know some of you – especially those of you in the US – are going to get questions from friends and family about these things, so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.
Facebook and polarization
People are understandably anxious about the divisions in society and are looking for answers and ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it is natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization is not supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.
The rise of polarization has been the subject of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.
The rise in political polarization in America predates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it rise wherever Facebook is popular. It hasn’t. In fact, polarization has declined in a number of countries with high social media use even as it has risen in the US.
In particular, we expect the reporting to suggest that changes to Facebook’s News Feed ranking algorithm were responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI) – so that you would see more content in your News Feed from friends, family, and groups you are part of. This change was heavily driven by internal and external research showing that meaningful engagement with friends and family on our platform was better for people’s well-being, and we further refined and improved it over time, as we do with all ranking metrics. Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views we disagree with – that’s life – and the change meant you were more likely to come across their posts too. Even so, we have developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.
But the simple fact remains that changes to algorithmic ranking systems on one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking whatsoever, including private messaging apps like iMessage and WhatsApp.
Elections and democracy
There is perhaps no topic that we as a company have been more vocal about than our work to dramatically change the way we approach elections. Starting in 2017, we began building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.
Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts – identifying almost all of them before anyone flagged them to us. And, from March to Election Day, we removed more than 265,000 pieces of content on Facebook and Instagram in the US for violating our voter interference policies.
Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so-called “break glass” measures – and spoke about them publicly – ahead of and after Election Day to respond to specific and unusual signals we were seeing on our platform, and to keep potentially violating content from spreading before our content reviewers could assess it against our policies.
These measures were not without trade-offs – they are blunt instruments designed to deal with specific crisis scenarios. It’s like shutting down an entire city’s streets and highways in response to a temporary threat that may be lurking somewhere in one particular neighborhood. In implementing them, we knew we were impacting a significant amount of content that did not violate our rules, in order to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted might relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also affected a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We would not take such crude, catch-all measures in normal circumstances, but these were not normal circumstances.
We only rolled back these emergency measures – based on careful, data-driven analysis – when we saw a return to more normal conditions. We left some of them in place for a longer period, through February this year, and others, like not recommending civic, political, or new groups, we have decided to retain permanently.
Fighting hate groups and other dangerous organizations
Let me be absolutely clear: We work to limit hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization; in fact, just the opposite. We do not allow dangerous organizations, including militarized social movements and violence-inducing conspiracy networks, to organize on our platform. And we remove content that praises or supports hate groups, terrorist organizations, and criminal groups.
We have been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. But our work to crack down on these hate groups began years earlier. We removed tens of thousands of QAnon pages, groups, and accounts from our apps, removed the original #StopTheSteal group, and removed references to Stop the Steal in the run-up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content that violated our policies on terrorism and more than 19 million pieces that violated our policies on organized hate. We designated the Proud Boys as a hate organization in 2018, and we continue to remove praise, support, and representation of them. Between August of last year and January 12 of this year, we identified nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of pages, groups, events, Facebook profiles, and Instagram accounts associated with these groups.
This work will never be complete. There will always be new threats and new problems to address, in America and around the world. That’s why we remain vigilant and alert – and always will.
That is also why the suggestion sometimes made that the violent insurrection on January 6 would not have occurred if not for social media is so misleading. To be clear, responsibility for those events rests squarely with the perpetrators of the violence and with those, in politics and elsewhere, who actively encouraged them. Mature democracies in which social media is widely used hold elections all the time – for instance Germany’s election last week – without the disfiguring presence of violence. We actively share with law enforcement any material we find on our services related to these traumatic events. But to reduce the complex causes of polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.
We will continue to face scrutiny – some of it fair and some of it unfair. We will continue to be asked hard questions. And many people will continue to doubt our motives. That is what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. That is why we do the kind of research that has been the subject of these stories in the first place. And we will continue to look for ways to respond to the feedback we hear from the people who use our products, including testing ways to make sure political content doesn’t dominate their News Feeds.
But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and on people’s lives. And you have every reason to be proud of that work.