You do not have to be a fan of conspiracy theories to worry about the precedent that Apple, YouTube and Facebook set by banning Alex Jones and Infowars.com. Jones complained that it was like ‘something out of “1984”’, saying he did not know how to rebut Facebook’s decision. The company does not divulge how it makes such decisions, for fear that users would deliberately push the boundaries of whatever rules it disclosed.
Jones is not alone in being concerned about the seemingly authoritarian power of tech giants to decide what gets removed in an opaque content-moderation process. As de facto arbiters of speech and access to audiences, companies with such consolidated publishing power have unique responsibilities to be transparent with the public.
Instead of making decisions in private isolation, companies would do better by engaging one another and their users to shape content-moderation policies in a more transparent and consistent way. We propose a deliberative body, a ‘content congress’ where stakeholders—including companies, civil-society groups and even constituencies of end-users—could hash out best practices, air grievances, and offer rebuttals.
How would it work? Multistakeholder initiatives take varied forms, and we are not advocating for a specific one. We offer multistakeholderism because simple calls for government regulation or self-regulation are not enough—the public wants input. Such a body should not be a legally binding authority but an arena for transparent coordination, public representation and human engagement in an industry dominated by algorithms and machines. Companies want feedback; government wants more insight into decision-making; and people want to be heard.
The content congress should have issue-based working groups, which would proactively or reactively address complicated use cases. For instance, how do you address videos depicting unjust violence or death? Facebook faced backlash for censoring videos, posted by social-justice groups, of police brutality that resulted in on-camera deaths. How do you curb pornography without censoring images of female breasts in instances of breastfeeding or health advocacy?
And what should the protocol be when social media stars inspire violent trolls among their followings? A content congress could facilitate a transparent process in handling Jones, whose fans harassed the parents of Sandy Hook victims, or Milo Yiannopoulos, whose followers hacked actress Leslie Jones’s website and harassed her until she left Twitter. Given the public nature of a content congress, Messrs. Jones and Yiannopoulos would probably have a say too.
For private companies to be part of such a loose organisation might sound strange, but there is a precedent. The Internet Corporation for Assigned Names and Numbers (ICANN), a nonprofit governance body, maintains the functions of the Domain Name System (DNS) through multistakeholder decision-making. Early Internet advocates rejected direct government control and sought a more open and inclusive way to make policy; ICANN, with its sometimes-slow or complex decision-making, was the answer. Stakeholders include businesses, nonprofits, activists and governments.
Another example is the Global Network Initiative, an international nonprofit dedicated to Internet freedom and privacy rights. Drawing from the expertise of stakeholders ranging from the Harvard Berkman-Klein Center to the Committee to Protect Journalists, it helps companies like Facebook and Vodafone resist government requests to surveil or censor users and their communications.
Beyond tech, there are similar examples in the finance industry’s coordination to fix systemic issues and avoid disaster. Algorithmically enhanced trading once threatened healthy financial markets by fuelling volatile high-frequency trading. Proactive self-regulation through the Financial Industry Regulatory Authority brought industry stakeholders together to devise solutions, staving off disaster and further government regulation.
Multistakeholder initiatives can be complex and slow, but they are preferable to the current slew of black-box content-moderation practices, or to heavy-handed government regulation. Many communities online—left, right and centre—have experienced the arbitrary decision-making of powerful content platforms. These frustrated communities are starting to call their representatives to demand government action.
Companies would do well to be proactive by engaging each other and stakeholders in solution-making before governments or courts step in. Recently, key industry players expressed their willingness to collectively discuss their content moderation practices in a more open forum. This participation is a signal of goodwill and an opportunity to create a sustainable, innovative multistakeholder body within the industry.
Danielle Tomson is the director of the Personal Democracy Forum. David Morar recently received his doctorate from George Mason University’s Schar School. This blog post first appeared in the Opinion / Commentary section of the Wall Street Journal and is being re-published here with the permission of the authors.