France’s lower chamber of parliament has voted in favor of a controversial law against hate speech on social networks and online platforms. As I described last year, online platforms will have to remove flagged illicit content within 24 hours. Otherwise, companies will face hefty fines every time they infringe the law.
What do they mean by illicit content? Essentially, anything that would be considered an offense or a crime in the offline world is now considered illicit content when it appears on an online platform. Among other things, think of death threats, discrimination, Holocaust denial…
For the most extreme categories, terrorist content and child sexual abuse material, online platforms must react within an hour.
While online hate speech has been getting out of control, many fear that online platforms will censor content a bit too quickly. Companies don’t want to risk a fine, so they might delete content that doesn’t actually infringe the law simply because they’re not sure.
Essentially, online platforms have to regulate themselves. The government then checks whether they’re doing a good job or not. “It’s just like banking regulators. They check that banks have implemented systems that are efficient, and they audit those systems. I think that’s how we should think about it,” France’s digital minister Cédric O told me in an interview last year.
There are multiple levels of fines. They start at hundreds of thousands of euros but can reach up to 4% of a company’s global annual revenue for the most severe cases. The Superior Council of the Audiovisual (CSA) is the regulator in charge of those cases.
Germany has already passed similar legislation, and there are ongoing discussions at the European Union level.