X Claims Victory in Australian eSafety Commission Case
X says that the Commission has conceded that it was wrong to request the removal of stabbing footage.
X’s continued legal challenges to government removal orders are having an impact on the broader industry, though whether that impact is a good one depends on how you look at it.
Last week, X publicly criticized Australia’s eSafety Commissioner once again for her attempts to force X to remove video footage of a violent stabbing of a religious leader in Sydney back in April.
At the time, Australian officials were concerned that the distribution of the footage could exacerbate religious tensions, leading to further violence in response. And while there were various violent clashes in the wake of the attack, X, along with other social platforms, did agree to remove the footage for Australian users.
The eSafety Commissioner then requested that X remove the footage for all users worldwide, which X refused to do, arguing that Australian officials have no right to push for global censorship of content.
The case is difficult, in that X does make a valid point with regard to officials from one country making calls on global censorship. But then again, X does remove content at the behest of various governments, with its own reports indicating that it “globally deleted 40,331 items of content” between October 2023 and March 2024, in compliance with the E.U. Digital Services Act.
So while there is seemingly a case there, X is also picking and choosing which requests it will fight, and which it will comply with. And in the case of a violent stabbing incident, which could inflame tensions unnecessarily, the question all along has been: “Why not remove it?”
What, in this case, could be the argument for keeping this footage active?
Specifics aside, X opted to take the eSafety Commission to court, with the Commission and X eventually agreeing to settle the case.
In its public statement, X said:
“X welcomes the decision of the Australian eSafety Commissioner to concede that it should not have ordered X to block the video footage of the tragic attack on Bishop Mar Mari Emmanuel.”
Which doesn’t exactly align with what the eSafety Commission posted about the agreement:
“With agreement of both parties, the Administrative Appeals Tribunal has today made orders to resolve the proceedings brought by X Corp in relation to a removal notice issued to it by eSafety requiring the company to take all reasonable steps to ensure removal of the material depicting a declared terrorist attack on a religious leader. eSafety believes that rather than test the interaction of the National Classification Scheme and the Online Safety Act in the context of this particular case, it is more appropriate to await the Federal Government’s consideration of a pending review of Australia’s statutory online safety framework.”
So the Commission hasn’t conceded that it was wrong to request removal, but has instead deferred a decision, pending a broader review of the related laws in this case.
But still, X seems pretty pleased with the outcome:
“Six months later, the eSafety Commissioner has conceded that X was correct all along and Australians have a right to see the footage. It is regrettable the Commissioner used significant taxpayer resources for this legal battle when communities need more than ever to be allowed to see, decide and discuss what is true and important to them.”
Essentially, the case is another example of X picking its battles, taking on governments and regulators in regions where X owner Elon Musk has personal grievances and seemingly wants to put pressure on the sitting government.
Yet X, overall, is complying with far more removal requests than previous Twitter management did, while it’s also now censoring certain political material that seemingly conflicts with Musk’s own stated leanings.
So while X makes a lot of noise about standing up for free speech, and being more open to truth than Twitter was, really, it’s just realigning its approach along Musk’s own ideological lines.
The question now is whether X’s continued litigation will prompt hesitation from governments and regulators on future requests of this type. And if so, is that a good thing?