Meta, the parent company of Facebook, announced its own set of “policies and safeguards” for political discourse ahead of the 2022 midterms.
In a blog post Tuesday, Meta announced a series of policies for the 2022 midterm elections that are “consistent” with those the platform used during the 2020 election. They include “Preventing Election and Voter Interference,” “Connecting People With Reliable Information,” and “Transparency Around Elections and Advertising.”
“With the 2022 US midterms on the horizon, we are setting out how our approach applies in this election cycle, which is largely consistent with the policies and safeguards we had in place during the 2020 US Presidential election,” Meta President of Global Affairs Nick Clegg wrote in the announcement.
“Our approach to the 2022 US midterms applies learnings from the 2020 election cycle and exceeds the measures we implemented during the last midterm election in 2018. This includes advanced security operations to fight foreign interference and domestic influence campaigns, our network of independent fact-checking partners, our industry-leading transparency measures around political advertising and pages, as well as new measures to help keep poll workers safe,” the announcement added.
Under the heading “Preventing Election and Voter Interference,” Meta touted its existing policies around removing hate speech, noting that it had already banned 270 white supremacist organizations and removed more than 2.5 million posts for hate speech in just the first quarter of 2022. The company then touted its relationships with state and local election officials, federal agencies and industry peers.
It went on to outline its content removal policies. “As was the case in the US in 2020, election-related content we will remove includes misinformation about the dates, locations, times, and methods of voting; misinformation about who can vote, whether a vote will be counted, and qualifications for voting; and calls for violence related to voting, voter registration, or the administration or outcome of an election. We will reject ads encouraging people not to vote or calling into question the legitimacy of the upcoming election.”
The company also announced that it would prohibit new political, electoral, and social issue ads during the final week before the election.
Meta announced additional policies under the heading “Connecting People With Reliable Information.” The company touted its continuing feature that sends voting-information notifications to Facebook users, as well as its partnerships with election officials.
Meta also announced it would be implementing Spanish-language features for voters who interact with Spanish-language content. It also outlined changes to its election fact-checking system. “We have 10 fact-checking partners in the US to address viral misinformation,” the company said. “We add warning labels to content they debunk so that people can decide for themselves what to read, trust and share. We’re also investing an additional $5 million in fact-checking and media literacy initiatives ahead of the midterms.” That investment includes fact-checking services on WhatsApp, funding for fact-checkers to increase their capacity during the elections, and the development of “media literacy resources to teach people how to identify misinformation for themselves.”
The changes to Meta’s policies come as the company moves to decrease the reach of political content on Facebook more broadly. Republican House Minority Leader Kevin McCarthy (R-CA) blasted the platform in a tweet earlier this month. “Facebook made sure no one saw the Hunter Biden laptop story before the 2020 election,” McCarthy wrote. “But now that America has record inflation, rising crime, & a border crisis — all as a result of Dem policies — Facebook is shutting down more ‘political content’ to hide the truth from Americans.”
Twitter introduced a similar set of policies to “protect civic conversation” ahead of the midterms last week, as The Daily Wire reported.