
Content Moderation: Facebook Oversight Board Announced


Some eighteen months after Facebook announced plans for an independent external review board, the first twenty members of its new “Oversight Board” were revealed on 6 May 2020. The Board, embodying an idea Mark Zuckerberg once described as “almost like a Supreme Court”, will hear appeals against some content removal decisions made by Facebook’s in-house moderation teams. The Board will also make recommendations regarding Facebook’s content policies. The Board members include a Nobel Peace Prize laureate, a former prime minister, lawyers, journalists, and free speech advocates. The full list of initial members can be viewed here.

What Will the Oversight Board Do?  

The Board’s Charter states its purpose is “to protect free expression by making principled, independent decisions about important pieces of content and by issuing policy advisory opinions on Facebook’s content policies”.  

In practice, it will decide whether Facebook’s content moderators correctly applied Facebook’s existing content policies in choosing to remove certain content from its platforms.  Facebook has committed to treating the Board’s appeal decisions as final and binding.  In reaching decisions, the Board will view its own past decisions as highly persuasive and will “pay particular attention to the impact of removing content in light of human rights norms protecting free expression”. The Board may also issue advisory opinions recommending that Facebook amend its content policies.   

What Content and Decisions Are In Scope?

At the outset, the Board will review only decisions to remove individual pieces of content (like posts, photos, videos, and comments) from Facebook or Instagram.  

Everything else is out of scope for now. The Board will not initially review decisions by Facebook to leave content up on its platform following receipt of a take-down request. It is intended that the Board’s remit will expand in the future to include leave-up decisions and decisions to remove groups, pages, and user accounts. The Board will not review content on Facebook’s other products. Unsurprisingly, the Board will not have jurisdiction over content Facebook removes because it believes it is legally obliged to do so.

Importantly for businesses, creators, and owners of intellectual property rights, the Board will not review reports involving copyright, trade marks, or counterfeit goods offered for sale (e.g. on Facebook Marketplace). At least for now, the hundreds of thousands of intellectual property infringement reports Facebook receives every year will continue to be decided internally at Facebook, and any lingering disputes will continue to be resolved offline by the parties involved.

How Did We Get Here?

Social media platforms typically avail of internet intermediary immunity laws in many jurisdictions. These include the hosting defence in the EU’s eCommerce Directive, which shields a host from liability provided it acts expeditiously to remove unlawful content once notified of it. Such frameworks led platforms to rely on notice-and-takedown systems for unlawful content, while leaving them free to govern other user-generated content on their platforms as they saw fit.


“Techlash” & Regulation

Since coming to prominence as stewards of public space, leading platforms have faced criticism as vectors of problematic content, including bullying, hate speech, and misinformation. “Fake news” and the virality of content such as footage of the Christchurch mosque shootings in 2019 have brought unprecedented scrutiny. At the same time, platforms have come under fire for over-censoring content. Criticism has often been accompanied by calls for greater platform accountability and transparency.

Some governments have responded to those calls with more stringent regulation. Germany led the way in Europe with its Network Enforcement Act (NetzDG), which requires the removal of “obviously illegal” content within 24 hours of notification and imposes transparency-reporting obligations. France’s “Avia Law” proposes a similar regime, and the European Commission has announced plans for a Digital Services Act in a similar spirit. The United Kingdom is consulting on proposals to regulate online harms, including a statutory duty of care for platforms. In Ireland, the outgoing Government approved the outline of an Online Safety and Media Regulation Bill that would include regulation of harmful online content.


Facebook’s Response

Facebook has reacted by doubling its content moderation workforce, allowing internal appeals against content decisions, and publishing more detailed versions of the internal guidelines its teams use to enforce its content policies. Facebook has also publicised its use of artificial intelligence to proactively identify and remove harmful content and to “down-rank” sensational and provocative “borderline” content. The final piece of this jigsaw is the Oversight Board, first promised in November 2018 and now, with its founding documents published and members revealed, close to coming into operation.

What to Look Out For

How much the Board pushes back on Facebook’s decisions, and its published reasons for doing so, will be closely watched. Because the Board will initially review only removal decisions, any appeal it upholds will see content restored, so its early decisions may be read as victories for free speech advocates. That view may evolve if and when the Board’s role expands to reviewing “leave-up” decisions: the Board will then have the opportunity to tell Facebook to remove or down-rank content it would otherwise have left up.

Reaching decisions that harmonise Facebook’s content policies with human rights norms, while meeting the expectations of diverse political communities, will undoubtedly be a challenge. If the speed, quality, and impact of the Board’s decisions go some way towards meeting that challenge, it is possible to imagine the Board’s role expanding and, if other platforms adopt a similar approach, the appetite for greater regulation of online content tapering off. Success is not guaranteed, however, and implementing the Board’s decisions at scale will remain a significant obstacle. Missteps, controversies, or delays in the first months and years may see the Board’s role overtaken by the drive for stricter regulation in Europe and elsewhere.

William Fry has dedicated Dispute Resolution, Intellectual Property, and Technology practices with deep expertise in alternative dispute resolution, intellectual property, and social media regulation.  If you have any queries or would like to know more, please reach out to Leo Moore, Laura Scott or your usual William Fry contact.  


Contributed by John Sugrue