X claims it doesn’t automate content moderation in Europe—so why did it hint that’s why Navalny’s wife was banned?


Alexei Navalny, a Russian anti-corruption activist and nationalist politician who had been described by some as Russian President Vladimir Putin’s number one opponent, died in a Siberian penal colony on Feb. 16.

The reasons for his death are unclear, but speculation by Western officials puts the blame squarely on Putin. That theory was endorsed by his wife, Yulia Navalnaya, who claimed in a video statement that his body is being withheld so authorities can wait for traces of the deadly nerve agent Novichok to leave it.

Navalnaya made the allegation in a video she posted to X on Monday, in which she also said that Putin had killed her husband and that she would continue his work.

“I want to live in a free Russia, I want to build a free Russia,” Reuters reported her saying in the video, which is in Russian.

Navalnaya quickly racked up hundreds of thousands of followers and tens of thousands of messages of sympathy—but on Tuesday her account was briefly suspended.

A few hours passed before the account was reinstated. In the interim, X exploded, with users accusing its owner, Elon Musk, of sympathizing with Putin and citing recent reporting on his restriction of Starlink in Ukraine.

According to a statement by X’s Safety team, the “platform’s defense mechanism against manipulation and spam mistakenly flagged @yulia_navalnaya as violating our rules. We unsuspended the account as soon as we became aware of the error, and will be updating the defense.”

X’s statement doesn’t explicitly say that the decision about Navalnaya’s account was made by an automated system, but blaming the suspension on a “defense mechanism” and promising to “update the defense” seemed to some information researchers to imply that a human wasn’t involved in the initial decision to shut down the account.

They quickly flagged the statement, doubting whether an explanation blaming the suspension, even implicitly, on an automated decision could be accurate.

“Fun statement given X reports under the Digital Services Act that they don’t do automated content moderation,” reacted Michael Veale, an associate professor of Digital Rights & Regulation at University College London’s Faculty of Laws, in a quote tweet.

The Digital Services Act (DSA) is an EU regulation, adopted in October 2022, that aims to combat illegal content, make advertising transparent, and fight disinformation. One of its provisions requires platforms to upload moderation decisions to the DSA Transparency Database, providing details like the grounds for the decision, the type of content being moderated, and whether the decision was made through automated means.

According to a 2023 paper by researchers at the University of Bremen, which examined moderation decisions uploaded to the database over a single day, X reported using only human moderation for its decisions. Because of that, the platform logged far fewer moderation decisions than other platforms on the day the researchers examined.

A review of the database by the Daily Dot found that every single moderation decision uploaded by X to the database has the Not Automated tag.

None of X’s 673,828 decisions to date have the Fully Automated or Partially Automated tag.

X’s content moderation reports include the keyword “Inauthentic Accounts” 8,881 times, though none of them mention the “manipulation” or “spam” cited in the statement from X’s Safety team.
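The database publishes daily dumps of the “statements of reasons” platforms submit, so counts like these can be spot-checked. Below is a minimal sketch in Python using pandas; the file name is a placeholder, and the column names and label values (platform_name, automated_decision, and the AUTOMATED_DECISION_* tags) are assumptions about the export schema rather than confirmed field names.

```python
# Minimal sketch: tally the automation labels on X's entries in one daily
# dump from the DSA Transparency Database.
# Assumptions (not confirmed against the live schema): the dump is a CSV with
# a "platform_name" column and an "automated_decision" column whose values
# look like AUTOMATED_DECISION_NOT_AUTOMATED / _PARTIALLY / _FULLY.
import pandas as pd

DUMP_PATH = "sor-global-2024-02-20-full.csv"  # placeholder file name

df = pd.read_csv(DUMP_PATH, low_memory=False)

# Keep only the statements of reasons submitted by X.
x_rows = df[df["platform_name"] == "X"]

# How many decisions does X report as fully, partially, or not automated?
print(x_rows["automated_decision"].value_counts())

# The Daily Dot's finding corresponds to every X row carrying the
# "not automated" label and zero rows carrying the automated ones.
```

If the real column names differ, the same filter-and-count pattern applies once they are known.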

“X did not report any automated detections or decisions,” Daria Dergacheva, one of the authors of the paper, told the Daily Dot.

The paper only covers content based in the EU, however.

Navalnaya currently lives in an “undisclosed location” outside of Russia, according to a report in the Guardian. She created her X account in February and first posted on the 19th, when she was in Brussels meeting with EU leadership about her husband’s death. Four days ago she was in Germany at the Munich Security Conference, where she publicly addressed her husband’s death.

“[X] can claim they do automated moderation in other territories depending [on] how they view Navalnaya’s account,” Dergacheva said, offering a possible explanation. 

“Theoretically if FBK and Yulia created it in Germany [though] they should not have done automated detection or decision,” Dergacheva continued, referencing Navalny’s Anti-Corruption Foundation, which was dissolved in Russia in 2021.

X’s Safety team couldn’t be reached to answer questions about what type of moderation tools were used to review Navalnaya’s account.


Source: https://www.dailydot.com/debug/navalny-wife-auto-ban-x/