Facebook outlines proposal for online platform liability reform

Washington (AFP) –
Facebook chief Mark Zuckerberg on Wednesday urged US lawmakers to reform the liability rules for online platforms, making their protection conditional on having systems in place to remove unlawful content.

The proposal, outlined in testimony prepared for a congressional hearing, detailed Facebook's idea for reforming a law known as Section 230, which shields internet services from liability for content posted by others.

The comments come amid growing pressure across the political spectrum to hold online platforms accountable for misinformation, incitements to violence and abusive content.

Zuckerberg said in his written remarks released by a House of Representatives panel that "people of all political persuasions want to know that companies are taking responsibility for combatting unlawful content and activity on their platforms."

He maintained that Congress "should consider making platforms' intermediary liability protection for certain types of unlawful content conditional on companies' ability to meet best practices to combat the spread of this content."

Instead of being given blanket immunity, Zuckerberg said "platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it."

He maintained that online services should not be held liable "if a particular piece of content evades its detection," saying perfect enforcement is not feasible for platforms handling billions of posts per day, but that they "should be required to have adequate systems in place."

Zuckerberg said the requirements should be "proportionate to platform size and set by a third-party" so that the biggest services don't have an advantage over new startups.

The comments come ahead of what was expected to be another contentious hearing with the CEOs of Facebook, Google and Twitter, appearing remotely, to address the problem of online disinformation.

The largest online platforms have seen a growing backlash over what many see as a failure to clamp down on false and misleading content which can have real-world consequences.

A statement from the Energy and Commerce Committee said the big platforms "maximize their reach -- and advertising dollars -- by using algorithms or other technologies to promote content... (and) often elevate or amplify disinformation and extremist content."