What does ‘regulating Facebook’ mean? Here’s an example
May 21, 2019 · By Jill T Frey
Many officials claim that governments should regulate Facebook and other social platforms, but few describe what that would actually involve. A few days ago, France released a report that outlines what France, and perhaps the European Union, plans to do when it comes to content moderation.
It’s an insightful 34-page document with a nuanced take on toxic content and how to deal with it, and it contains some genuinely new ideas worth exploring. Instead of moderating content directly, the regulator in charge of social networks would give Facebook and other platforms a list of objectives. For instance, if a racist photo goes viral and reaches 5 percent of monthly active users in France, the regulator could consider that the social network has failed to fulfill its obligations.
This isn’t just wishful thinking: in case of a systemic failure to moderate toxic content, the regulator would be able to fine a company up to 4 percent of its global annual turnover.
The government plans to turn the report into new pieces of regulation in the coming months. And France doesn’t plan to stop there. It is already lobbying other countries (in Europe, the Group of 7 nations and beyond) so that they can come up with cross-border regulation together and have a real impact on moderation processes. So let’s dive into the future of social network regulation.
Facebook first opened its doors
When Facebook CEO Mark Zuckerberg testified before Congress in April 2018, it felt like regulation was inevitable. The company itself has been aware of this for a while.