Surprise! Elon Musk’s tenure at Twitter is already shaping up to be confusing and contradictory.
Whether this dynamic ends up being more self-defeating for him and his new company than harmful for the rest of humanity and human civilization remains tbc.
On the one hand, a fresh report today suggests Musk is preparing major staff cuts: 25%, per the Washington Post. (Last week he denied an earlier report by the same newspaper — which suggested he’d told investors he planned to slash costs by cutting a full 75% of staff — so how radical a haircut he’s planning is still unclear, even as reports of fired staffers trickle onto Twitter.)
But, also today, Reuters reported that Twitter’s new CEO — the self-styled “Chief Twit” — reached out to the European Union last week to assure local lawmakers that the platform will comply with an incoming flagship reboot of the bloc’s rules on digital governance around illegal content.
A move that will, self-evidently, demand a beefed-up legal, trust and safety function inside Twitter if Musk is to actually deliver compliance with the EU’s Digital Services Act (DSA) — at a time when he is sharpening the knives to cut headcount.
DSA compliance for a platform like Twitter will likely require a whole team in and of itself. A team that should be starting work ASAP. The comprehensive EU framework for regulating “information society services” and “intermediary services” across the bloc spans 93 articles and 156 recitals — and is due to start applying as soon as next year for larger platforms. (It’s February 17, 2024, for all the rest.)
Penalties for violations of the incoming regime can scale up to 6% of global annual turnover — which, on Twitter’s roughly $5 billion in full-year revenue for 2021, implies potential fines of up to around $300 million apiece. So there should be plenty of incentive to comply and avoid such costly regulatory risk. (Er, unless Musk’s strategy for “saving” Twitter involves dismantling the business entirely and running its revenue into the ground.)
Yet one of Musk’s first moves as the platform’s new owner was to fire a number of senior execs, including Vijaya Gadde, its head of Legal, Policy, Trust and Safety.
Musk had been critical of her role in a decision by Twitter, back in October 2020, to — initially — limit the distribution of a controversial New York Post article reporting on emails and other data supposedly originating on a laptop belonging to U.S. president Joe Biden’s son, Hunter. The action led to accusations that Twitter was censoring journalism and demonstrating a pro-Democrat bias, even though the company subsequently rowed back on the restrictions and revised its policies.
Targeted harassment
Musk waded into the saga earlier this year with a tweet that branded the Post’s story “truthful” and dubbed Twitter’s actions “incredibly inappropriate.” He also doubled down shortly afterward by retweeting a meme targeting Gadde by name — which led to a vicious pile-on by his followers and prompted former Twitter CEO Dick Costolo to tweet at Musk publicly, asking why he was encouraging targeted harassment of the Twitter exec.
Put another way, a former Twitter CEO felt forced to call out the (now current) CEO of Twitter for encouraging targeted harassment of a senior staffer — who also happens to be a woman and a person of color. To say that this bodes badly for Twitter’s compliance with EU rules that are intended to ensure platforms act responsibly toward users — and drive accountability around how they are operated — is an understatement.
While the EU’s DSA is mostly focused on governance rules for handling illegal content, goods and the like — rather than tackling the grayer area of online disinformation, election interference and “legal but harmful” content (abuse, bullying, etc.), areas where the EU has other mechanisms and approaches in the works — larger platforms can be designated as a specific category (VLOPs, or very large online platforms) and must then comply with a set of additional obligations.
These extra requirements for VLOPs include carrying out mandatory risk assessments in areas such as whether the application of their terms and conditions and content moderation policies have any negative effects on “civic discourse, electoral processes and public security,” for example; and a follow-on requirement to mitigate any risks — by putting in place “reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified” (including where risks are impacting users’ fundamental rights, so stuff like respect for human dignity and equality; nondiscrimination; respect for diversity, etc., among other core rights listed in the EU charter).
The implication is that a VLOP would face major challenges under the DSA if it were to ignore risks to fundamental rights flowing from, say, a decision to apply a “free speech absolutist” approach to content moderation — as Musk has, at times, claimed is his preference (though, ever mercurial, he’s also said that, as Twitter CEO, he would comply with all legal requirements everywhere in the world they apply).
Whether Twitter will be classed as a VLOP is now one very burning question for EU citizens and lawmakers.
The Commission hasn’t specified either way — but internal market commissioner Thierry Breton has (at least) heavily implied Musk’s Twitter will face meaningful checks and balances under the DSA. Which suggests it will be designated and regulated as a VLOP.
Hence Breton’s quick schooling of Musk last week — when, in response to Musk’s “free speech” base-inflaming “the bird is freed” tweet, the commissioner pointedly rejoined: “In Europe the bird will fly by our [EU] rules.”
Musk did not respond publicly to Breton’s schooling at the time. But, according to a Reuters report today, he reached out to the Commission to “assure” it the platform will abide by the EU’s rules on “illegal online content policing,” as the news agency puts it.
“In a previously unreported exchange last week, Musk told [Breton] that he planned to comply with the region’s Digital Services Act,” Reuters also wrote, adding: “The self-described free speech absolutist agreed to hold a meeting with Breton, a former French finance minister, in the coming weeks, [per] two EU officials familiar with the discussions.”
A spokesman for commissioner Breton confirmed Reuters’ reporting — but did not offer any further detail about what was (or will be) discussed.
Flights of fancy
In theory, Musk could seek to comply with the letter of the EU law on illegal content — such as by ensuring the necessary “notice and action” mechanisms are in place under the DSA, and by providing adequate information to users on any restrictions or other interventions applied — while ignoring lawmakers’ wider expectations and civic-duty entreaties to act responsibly on the risks attached to fuzzier types of not-illegal but typically toxic and/or socially damaging content (such as targeted abuse and harassment). He could then still try to claim that’s compliance.
One concerning sign for the future quality of information surfaced on Twitter is a rumor that Musk intends to charge users to be verified — and to strip existing verification from any users who won’t pay up. Depending on how it’s applied and policed, that risks being a boon to scammers, fraudsters and imposters, who could simply pay to claim a temporary stamp of faux authority in the form of a verification badge, while public service–oriented individuals, such as journalists, might be less able to afford to pay Musk for ongoing verification. At a stroke, a switch to paid verification could thus degrade the accessibility of quality public information on Twitter by making it harder for users to sift genuine sources from impersonators and other disinformation peddlers.
At the time of writing, Twitter had not responded to questions about Musk’s interaction with Breton last week — or about the agenda of his forthcoming meeting with the EU. So it remains to be seen how he’s going to play this one.
But — so far at least — the public noises Musk’s been making in response to EU outreach have been surprisingly, er, compliant-sounding. So he’s either planning to contradict himself (on the “free speech absolutism” side), or he’s playing for time — by telling the EU what it wants to hear while getting on with doing the opposite behind closed doors (i.e., ripping up Twitter’s trust & safety resource).
There is undoubtedly danger here, for the bloc and its citizens generally — and for the Commission, specifically, as the key enforcer of major parts of the DSA.
If Musk ends up essentially trolling the EU’s shiny new rulebook — by trying to fly an intentionally under-resourced Twitter under the bloc’s radar while ignoring the wider societal harms that could fast-follow denuded T&Cs tacitly sanctioning toxic speech — it could not just take the shine off the regulation but undermine the whole reboot.
That said, the Commission does have some leeway here, as the DSA allows it to factor in a range of information as it takes decisions on designating VLOPs — which triggers the extra leverage of systemic risk mitigation requirements. (“The Commission shall take its decision on the basis of data reported by the provider of the online platform or of the online search engine pursuant to Article 24(2) [of the DSA], or information requested pursuant to Article 24(3) or any other information available to the Commission,” runs Article 33 — emphasis ours.)
So — on paper — the EU does have tools it can use to shrink the risk posed by Musk owning Twitter by saddling his platform with extra layers of accountability red tape.
If Musk still just ignores these requirements, then the EU could plausibly resort to appropriately large penalties and sanctions. And a heavily leveraged billionaire can’t swallow that many fines and still be able to service the big debt he’s taken on in order to buy Twitter.
Threats to democracy and global security
The Commission will certainly be under huge pressure to ensure the DSA delivers by holding platforms accountable for acting on and reining in harms — and online hate and harassment do, at least, tend to be fairly visible harms, making negative impacts harder to conceal and harder for regulators to ignore. So it’s also possible that a misbehaving Musk might actually help EU lawmakers screw their courage to the sticking place when it comes to enforcement against platforms’ systemic, societal risks.
After all, he’s a polarizing figure — and attracts plenty of hate himself — while his love of shitposting and trolling regulators raises the risk he’ll make very public missteps that make it easier for the Commission to enforce against such an obvious target.
There is also already a clamor over Musk’s Twitter takeover in Europe. A number of European civil society groups are calling for action over the billionaire’s ownership of the platform. A dozen NGOs, led by Defend Democracy, have penned an open letter to the EU — highlighting what they describe as the “alarming implications for democracy and security.” So plenty of external eyes will be trained on Musk — watching and waiting for any fresh missteps.
Top of the list of concerns raised by the NGOs is “global security” — with the letter pointing to Musk’s recent amplification of Kremlin narratives aimed at undermining Ukraine’s territorial integrity; his threats to terminate Starlink service in Ukraine; and the public support he’s expressed for China (where the letter notes he has “substantial economic interests”) by “suggesting that Beijing should be handed some control over Taiwan.”
The NGOs also raise concerns over online standards and safeguards — citing the firing of Gadde; pointing to reports of plans for mass Twitter sackings; and flagging fears that Musk will unban former U.S. president Donald Trump, who was permanently banned after the U.S. Capitol riots in January 2021.
“The announcement of major staff cuts, the lifting of account bans and the rejection of content moderation is directly contrary to the letter and spirit of new EU legislation adopted earlier this year that seeks to make big tech companies take greater responsibility for hate speech and disinformation on their platforms,” they argue.
They further warn over the implications of a Musk-owned Twitter for users’ data security and privacy — pointing to his “apparent closeness to a number of repressive regimes, and given his dismissive attitude to enforcing applicable legislation on a range of public policies” — before suggesting: “This could impact the work and lives of democratic politicians, human rights activists, democracy defenders, journalists, lawyers, disinformation experts, minorities and ordinary citizens everywhere.”
The NGOs’ letter, which is addressed to the presidents of the Commission, Council and Parliament, calls on the EU institutions to undertake “an urgent security and regulatory review of this takeover,” adding: “We demand urgent and adequate measures to protect European citizens, democracy and security from its harmful implications.”
We reached out to the Commission, Council and Parliament for comment on the NGOs’ call for an urgent review of Musk’s ownership of Twitter — but at the time of writing, none of the EU institutions had responded.