
Op-Ed Contributors

Facebook, Free Expression and the Power of a Leak

By Margot E. Kaminski and Kate Klonick


The First Amendment protects our right to use social networks like Facebook and Twitter, the Supreme Court declared last week. That decision, which overturned a North Carolina law barring sex offenders from social networks, called social media “the modern public square” and “one of the most important places” for the exchange of views. The holding is a reminder of the enormous role such networks play in our speech, our access to information and, consequently, our democracy. But while the government cannot block people from social media, these private platforms can.

In some ways, online platforms can be thought of as the new speech governors: They shape and allow participation in our new digital and democratic culture in ways that we typically associate with governments. Even Facebook’s recently updated mission statement acknowledges this important role, with its vow to give “people the power to build community and bring the world closer together.” But social media sites are not bound by the First Amendment to protect user speech. Facebook’s mission statement says as much, with its commitment to “remove bad actors and their content quickly to keep a positive and safe environment.”

Until recently, the details of the types of posts Facebook prohibited were a mystery. That changed on May 21, when The Guardian released more than 100 pages of leaked documents revealing Facebook’s internal rules. This newfound transparency could make it possible to hold Facebook accountable to the public for its decisions about user speech.

Facebook has often been pressured to explain or alter its approach to moderating users’ speech, in cases involving topics like breast-feeding pictures, Donald Trump’s posts about banning Muslims from entering the United States and the video of a Cleveland murder. But before this leak, nobody outside the company could say exactly how it made those decisions, and it was under no legal obligation to share.

This leak provides some answers: Facebook’s content policies resemble United States law. But they also have important differences.

For example, Facebook generally allows the sharing of images of animal abuse, a category of speech the Supreme Court deemed protected in 2010. But diverging from First Amendment law, Facebook will remove that same imagery if a user shows sadism, defined as the “enjoyment of suffering.”

Similarly, Facebook’s manual on credible threats of violence echoes First Amendment law on incitement and true threats by focusing on the imminence of violence, the likelihood that it will actually occur, and an intent to credibly threaten a particular living victim.

But there are also crucial distinctions. Where First Amendment law protects speech about public figures more than speech about private individuals, Facebook does the opposite. If a user calls for violence, however generic, against a head of state, Facebook deems that a credible threat against a “vulnerable person.” It’s fine to say, “I hope someone kills you.” It is not fine to say, “Somebody shoot Trump.” While the government cannot arrest you for saying it, Facebook will remove the post.

These differences are to be expected. Courts protect speech about public officials because the Constitution gives them the job of protecting fundamental individual rights in the name of social values like autonomy or democratic self-governance. Facebook probably constrains speech about public officials because it and other sites, as large corporate actors with meaningful assets, can be pressured into cooperating with governments.

Unlike in the American court system, there’s no due process on these sites. Facebook users don’t have a way to easily appeal if their speech gets taken down. And unlike a government, Facebook doesn’t respond to elections or voters. Instead, it acts in response to bad press, powerful users, government requests and civil society organizations.

That’s why the transparency provided by the Guardian leak is important. If there’s any hope for individual users to influence Facebook’s speech governance, they’ll have to know how this system works — in the same way citizens understand what the Constitution protects — and leverage that knowledge.

For example, before the Guardian leak, a private Facebook group, Marines United, circulated nude photos of female Marines and other women. This prompted a group called Not in My Marine Corps to pressure Facebook to remove related pages, groups and users. Facebook announced in April that it would increase its attempts to remove nonconsensual nude pictures. But the Guardian leaks revealed that the pictures circulated by Marines United were largely not covered by Facebook’s substantive “revenge porn” policy. Advocates using information from the leaks have begun to pressure Facebook to do more to prevent the nonconsensual distribution of private photos.

Civil liberties groups and user rights groups should do just this: Take advantage of the increased transparency to pressure these sites to create policies advocates think are best for the users they represent.

Today, as social media sites are accused of spreading false news, influencing elections and allowing horrific speech, they may respond by increasing their policing of content. Clarity about their internal speech regulation is more important now than ever. The ways in which this newfound transparency is harnessed by the public could be as meaningful for online speech as any case decided in a United States court.

Margot E. Kaminski is an assistant professor at the Ohio State University Moritz College of Law. Kate Klonick is a Ph.D. candidate at Yale Law School.
