Facebook co-founder, Chairman and CEO Mark Zuckerberg testifies before the House Energy and Commerce Committee in the Rayburn House Office Building on Capitol Hill April 11, 2018 in Washington, DC.
With less than two months to go until the 2020 U.S. election, Facebook is struggling to assure users and employees that it has everything under control.
Four years ago, the social network took no action as Russian operatives posed as political groups with polarizing agendas, even going so far as to organize real-world rallies, according to the FBI. Their goal: to divide the American populace and help elect Donald Trump president. Right after the 2016 election, Facebook CEO Mark Zuckerberg dismissed as “crazy” the idea that fake news on Facebook could have influenced the election. A year later, Zuckerberg said he regretted saying that.
Since then, the company has insisted that it has learned from its mistakes.
But over the past 12 months, Facebook has frustrated some users and employees with its policy decisions around speech on its platform, and its haphazard enforcement of those policies.
These decisions loom large considering that Facebook has 198 million daily active users in the U.S. and Canada. Adding to the concern is the limited oversight over Zuckerberg’s handling of the company. Zuckerberg holds more than 51% of voting shares, and in 2019, he pushed out several directors who questioned his authority, according to The Wall Street Journal.
“Elections are different now and so are we. We’ve created new products, partnerships, and policies to make sure this election is secure,” a Facebook spokesman said in a statement. “We’ve faced criticism from Republicans for being biased against conservatives and Democrats for not taking more steps to restrict the exact same content. Our job is to create one consistent set of rules that applies equally to everyone.”
Stumbles so far
Here’s a rundown of some Facebook moves that have drawn criticism:
Misinformation in political ads. Facebook began to rile up observers late last year when it announced that it would allow political candidates to run ads containing misinformation. The company cast the policy as a defense of free speech, saying users should see for themselves what candidates had to say, whether or not it was true. The decision proved contentious: a group of employees wrote a letter to Zuckerberg and other Facebook executives, later leaked to The New York Times, strongly objecting to the policy.
“When the looting starts, the shooting starts.” In late May, Trump posted criticism of the Black Lives Matter protests, saying that “when the looting starts, the shooting starts.” Facebook decided to leave the post up in its entirety while rival Twitter limited the visibility of Trump’s post.
Numerous Facebook employees publicly criticized the decision to leave Trump’s post up, arguing that the post violated the company’s community standards, which do not allow the incitement of violence. The employees protested the decision by staging a virtual walkout.
Nazi symbolism. Days later, John Buysse of Fortune pointed out that Facebook had allowed the Trump campaign to run ads containing symbols used by the Nazis to identify political prisoners. The company eventually removed the ads, but only after numerous users had spotted and flagged them. A week later, Facebook announced that it would prohibit hate speech in its ads.
The Kenosha militia takedown. Facebook’s enforcement of its policies once again came under fire last month following the killing of two people during a Black Lives Matter protest in Kenosha, Wisconsin.
In late August, Facebook introduced a new policy that would allow the company to remove militia groups and groups that seek to incite violence. Despite this new policy and hundreds of user reports, Facebook failed to remove the page for a militia group in Kenosha and an event created by the group.
Zuckerberg said the company removed the group and its event after the killings at the protest, calling the failure to take the pages down proactively “an operational mistake.” But days later, BuzzFeed reported that Facebook had not actually removed the event page; rather, an administrator for the militia group had taken it down, and Facebook later removed the militia’s page. A Facebook spokeswoman apologized “for the error.”
Trump’s voting suggestion. Employees criticized the company yet again a few days later after Trump suggested that mail-in voters go to their polling place to ask if their vote had been counted, and if there was no record, to demand they be allowed to vote in person. It is illegal to vote twice in the same election.
Trump’s post came shortly after Zuckerberg said in a Facebook post that the company would remove explicit and implicit misrepresentations about voting that could lead to voter suppression. Numerous Facebook employees criticized the company within its internal social network for allowing Trump’s post to remain live, according to BuzzFeed. Eventually, the company placed a label on the post stating that voting by mail is a trustworthy process in the U.S., but the post remains up.
A consequential election
The election rhetoric and intensity will only heat up in the final weeks before Nov. 3. Facebook says it’s ready for complicated outcomes, including the possibility that the presidential election might not have a clear winner on election night.
“It’s particularly important that people know not just where the vote is, what’s happening, what’s going to happen next and that there is a process in place that is working to get to an accurate and fair result,” Facebook security policy chief Nathaniel Gleicher told reporters in August.
Meanwhile, many critics have said that Facebook and Zuckerberg simply have too much power over speech — Facebook co-founder Chris Hughes last year decried Zuckerberg’s “unilateral control of speech” — and proposed a breakup of the company as a solution. That message seems to be resonating with lawmakers in Washington and around the country, as antitrust investigations mount.
Some lawmakers have also proposed changing a key rule from the 1990s that shields online sites from liability for what their users post and allows them to moderate those posts in good faith without fear of reprisal. Changing that law, Section 230 of the Communications Decency Act, could upend the company’s entire business model.
There have also been legislative proposals to regulate political ads online, such as the bipartisan Honest Ads Act, introduced in 2017. The bill would require online political ads to carry the same disclosures and safeguards that already apply to ads in other media, such as print and television.
Zuckerberg and other Facebook executives have repeatedly called for lawmakers to address these issues even as the company has made moves to solve them itself. Meanwhile, other social media companies, such as Twitter and Pinterest, have gone further and banned political ads outright. But lawmakers have shown little appetite to pass any major regulation of the tech industry, meaning Facebook continues to chart its own course.
Facebook’s ability — or inability — to control speech on its own platform in the election season could be a critical factor in how these lawmakers view the company in the years to follow.