U.S. busts Russian AI bot farm spreading disinformation on X


The U.S. Justice Department has revealed it has taken action to take down a sophisticated AI-powered information operation, reportedly orchestrated by Russia. According to the department, the operation involved nearly 1,000 accounts on the social platform X that assumed American identities.

The operation is said to be linked to Russia’s state-run RT network and managed by the country’s Federal Security Service. The intention was to “disseminate disinformation to sow discord in the United States and elsewhere,” according to court documents.

The Justice Department today announced the seizure of two domain names and the search of 968 social media accounts used by Russian actors to create an AI-enhanced social media bot farm that spread disinformation in the United States and abroad. Learn more: https://t.co/ibmqaruf5U

— FBI (@FBI) July 9, 2024

These X accounts were allegedly set up to disseminate pro-Russian propaganda. However, they were not operated by humans but were automated “bots.” RT, formerly known as Russia Today, broadcasts in English among other languages and is notably more influential online than through traditional broadcast methods.

The initiative for this bot operation was reportedly traced back to RT’s deputy editor-in-chief in 2022 and received backing and funding from an official at the Federal Security Service, the primary successor to the KGB. The Justice Department also took control of two websites instrumental in managing the bot accounts and forced X to surrender details on 968 accounts believed to be bots.

Russian bot farms using AI

The FBI, along with Dutch intelligence and Canadian cybersecurity officials, warned about “Meliorator,” a tool capable of creating “authentic appearing social media personas en masse.” It can also generate text and images, and echo disinformation from other bots.

Court documents revealed that AI was used to create the accounts to disseminate anti-Ukraine sentiments.

“Today’s actions represent a first in disrupting a Russian-sponsored generative AI-enhanced social media bot farm,” said FBI Director Christopher Wray.

“Russia intended to use this bot farm to disseminate AI-generated foreign disinformation, scaling their work with the assistance of AI to undermine our partners in Ukraine and influence geopolitical narratives favorable to the Russian government,” Wray added.

The accounts have since been removed by X, and screenshots provided by FBI investigators showed that they had attracted very few followers.

Faking American identities

The Washington Post reported a major loophole that allowed the bots to bypass X’s security measures: according to the news outlet, the bots “can copy-paste OTPs from their email accounts to log in.”

The Justice Department said the operation’s use of U.S.-based domain names violates the International Emergency Economic Powers Act, and that the financial transactions supporting it infringe U.S. federal money laundering laws.

Image: Screenshot of an alleged fake account, under the name Ricardo Abbott, shared by the FBI

Many of these fabricated profiles copied American identities, using U.S.-sounding names and specifying locations across the U.S. on X. The Justice Department pointed out that these profiles generally featured headshots against gray backgrounds, which appeared to have been created using AI.

For example, a profile under the name Ricardo Abbott, claiming to be from Minneapolis, circulated a video of Russian President Vladimir Putin defending Russia’s involvement in Ukraine. Another, named Sue Williamson, shared a video of Putin explaining that the conflict in Ukraine was not about territory but about “principles on which the New World Order will be based.” These posts were subsequently liked and shared by fellow bots within the network.

Further details from the Justice Department explained that the linked email accounts could be created because the operators owned the underlying internet domains. For instance, control over the domain www.example.com allows any email address at that domain to be opened.

In this case, the perpetrators managed and used the domain names “mlrtr.com” and “otanmail.com,” both registered through a U.S.-based service, to set up email servers that supported the creation of fake social media accounts through their bot farm technology.
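To illustrate the mechanic (a minimal sketch, not drawn from the court filings): whoever controls a domain’s DNS decides which mail server receives messages for any address under that domain, which is why owning mlrtr.com and otanmail.com was enough to stand up mailboxes for mass account sign-ups. The short Python snippet below, which assumes the third-party dnspython package, simply looks up a domain’s MX records to show where its mail is routed; the queried domain is a placeholder.

# Minimal sketch: a domain owner's DNS MX records determine which mail server
# receives messages for every address under that domain, so controlling a
# domain is enough to create arbitrary mailboxes for account registrations.
# Assumes the third-party dnspython package (pip install dnspython).
import dns.resolver

def mail_servers_for(domain: str) -> list[str]:
    """Return the mail exchanger hostnames that accept mail for `domain`."""
    answers = dns.resolver.resolve(domain, "MX")
    return [str(record.exchange).rstrip(".") for record in answers]

if __name__ == "__main__":
    # "example.com" is a placeholder; swap in any domain of interest.
    try:
        for host in mail_servers_for("example.com"):
            print(host)
    except dns.resolver.NoAnswer:
        print("Domain publishes no MX records")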

Featured image: Canva / Ideogram