UK warns Facebook to focus on safety as minister eyes faster criminal sanctions for tech CEOs

The UK’s recently appointed secretary of state for digital, Nadine Dorries, has signalled she wants to take a tougher line on social media platforms than her predecessor — telling a parliamentary committee she’s looking at speeding up the application of criminal sanctions for breaches of incoming UK online safety legislation.

A provision to hold named individuals criminally accountable for failing to tackle illegal or harmful content spreading on their platforms was included in the draft Online Safety bill — but deferred for two years.

In an evidence session earlier today, Dorries told the joint committee examining the bill that she wants to accelerate that timeframe — potentially cutting the deferral of criminal liability powers down to as little as three to six months after the bill becomes law.

The bill was published in draft form back in May and last month prime minister Boris Johnson suggested it will be presented to parliament before Christmas — although reports have suggested that timeframe may slip. Nonetheless, given Johnson’s whacking Commons majority, it seems likely the bill will make it onto the statute books in 2022.

So, if Dorries gets her way, criminal liability for tech CEOs could be coming to the UK as soon as next year.

Dorries argued that tech giants already know the changes they need to make to remove illegal content (such as terrorism material) and legal but harmful content (such as pro-anorexia or self-harm content) from their platforms — implying they are holding off for financial reasons. Hence the tacit suggestion is that a pressing threat of criminal sanctions is needed to concentrate tech giants’ minds.

“To platforms, take note now,” she warned. “It will not be two years — we are looking at truncating it to a very much shorter timeframe.”

“That’s one of the areas as the secretary of state that I want to go further with this bill,” she went on.

“I think it’s a nonsense that platforms have been given two years to get themselves ready for what would be criminal action. They know what they are doing now. They actually have the ability to put right what they’re doing wrong now. They have the ability now to abide by their own terms and conditions. They could remove harmful algorithms tomorrow.”

The need for Facebook to make changes to its algorithm to reduce virality and prevent the amplification of misinformation has been a key call of the Facebook whistleblower, Frances Haugen, who came forward last month as the source of a trove of leaked internal documents and has accused the tech giant of prioritizing profit over safety.

The notion of harmful algorithms and a systematic lack of attention to safety appears to have resonated with UK lawmakers who have spent years drafting legislation that’s intended to reboot platforms’ attitudes around content by enforcing a duty of care.

Pointing to Facebook’s recent rebranding to ‘Meta’ — and its self-trumpeted pivot toward directing massive resources into building “the metaverse” — Dorries said the tech giant should instead apply the labour of the circa 10k-20k engineers it wants to put on developing metaverse technology to online safety and to protecting children from Internet content horrors.

Here she was directly echoing a point made by Haugen in her own earlier evidence to the UK committee — when she described herself as “shocked” that Facebook wants to “double down on the metaverse” — and suggested that the under-resourced safety teams inside the company, such as the civic integrity team where she worked, could have done many things to improve safety and reduce algorithmic amplification of toxic content if they had been given 10,000 engineers.

Instead, Haugen said, Facebook’s safety teams struggled with chronic under-resourcing because the tech giant simply views safety “as a cost”, emphasizing that it is up to regulators to force platforms to prioritize safety.

Firing her own rocket at Facebook’s founder Mark Zuckerberg and his chief spin doctor, Nick Clegg — who happens to be a former Liberal Democrat MP (and UK deputy PM) — Dorries warned: “Rebranding does not work.

“When harm is caused we are coming after it. Put those 10k or 20k engineers now onto abiding by your terms and conditions and to removing your harmful algorithms because if you don’t this bill will be watertight.”

“Platforms know now, they know today, they know what they’re doing wrong. They have a chance to put that right absolutely right now — why would we give them two years?” she added, hammering her point home by reiterating the question.

“People like Mark Zuckerberg and Nick Clegg who are wanting to take off into the metaverse,” Dorries went on. “My advice would be: ‘Stay in the real world’. Because this bill is going to be an act very, very soon and it’s the algorithms that do the harm, and this act will be there and you will be accountable to this act.”

She caveated that her preferred truncated timeframe for criminal liability still requires sign-off through standard parliamentary and legislative procedures. But she told the committee she’s pressing hard for three to six months for criminal liability — which opens up a path for senior tech executives to be prosecuted for safety failures in the near future.

(Although it’s fair to say that the risk of criminal liability would most likely shrink the chance of Zuckerberg or Clegg ever personally setting foot back on UK soil down to zero. After all, the Facebook founder has already shown he’s willing to avoid the UK entirely in order to evade parliamentary scrutiny. Enforced exile for Clegg, who is British, may sting rather more though…)

During her testimony to the committee, Dorries also predicted that the Online Safety Bill will change Internet culture for good.

“I believe that there will be huge, huge [change]. This will set off a culture change in terms of our online environments and landscape,” she said. “There will be huge kickback. Because you have to follow the money — people are making a huge amount of money from these platforms and sites. And of course there will be kickback. But we must not forget the world is watching what we are doing in terms of legislating to hold those platforms to account. That is why it has to be watertight.”

The committee questioned Dorries closely on whether the bill gives enough powers to the independent regulator, Ofcom, which will be responsible for overseeing and enforcing the online safety rules — to ensure platforms can’t just wriggle through gaps and loopholes. Notably they were concerned about risk assessments, an area on which Haugen had given detailed suggestions.

The secretary of state said she was “assured” that the bill does give Ofcom adequate powers, including around transparency issues. She also emphasized the “substantial” fines it can issue, noting that the bill’s regime of financial penalties scales to up to 10% of global turnover.

“Ofcom, I believe, with the regulatory framework we have set, has the powers to request full transparency of how those algorithms are used and promoted,” she said. “We think that Ofcom have the powers to lift the lid on the algorithms — and they have the powers to set huge fines.”

But Dorries also said she’s keen to see the committee recommendations for beefing up the bill — and would be “open” to relevant suggestions.

“I can reassure you that if you have recommendations and they can pass parliamentary counsel and they will not dilute the scope of the bill and that they are in line with the three core principles and they will help to make this bill watertight and groundbreaking and assist with the novel nature of this bill — if there is something that we can see helps deliver we will be very much open to your recommendations,” she added.

Given what she described as the “novel” nature of the online safety legislation, Dorries said it’s her view that ongoing monitoring will be vital — suggesting a joint parliamentary committee format may be required to “keep a watching brief”.

“I don’t believe the bill goes far enough in terms of scrutiny,” she said. “For example there is a clause in the bill which says that in two to five years we have to re-examine — that is not good enough.

“The reason why it’s not good enough is because when this bill, the idea of this bill, the genesis [was conceived], TikTok had not even been heard of. It’s a rapidly changing landscape and therefore I think it’s exceptional in terms of the parliamentary scrutiny required.”

“We don’t know what is going to happen the day after this bill receives royal assent… I think there is a role for a committee like yours to continue to work, in the same way that the human rights joint committee does, to scrutinise this bill moving forward,” she added.
