Europe gives Meta, TikTok six days to share information on response to Israel-Hamas conflict

The EU has formally asked Meta and TikTok to hand over information on how they're tackling misinformation about the Israel-Hamas war.

Facebook parent company Meta and Chinese-owned social media app TikTok have both been given a deadline of Oct. 25 by the European Commission to share information on their response to the Israel-Hamas war, which has seen misinformation in the digital sphere spread alongside the physical conflict.

The European Commission on Thursday said it is making both requests under the Digital Services Act.

It asked Meta to provide more information "on the measures it has taken to comply with obligations related to risk assessments and mitigation measures to protect the integrity of elections and following the terrorist attacks across Israel by Hamas, in particular with regard to the dissemination and amplification of illegal content and disinformation."

TikTok was likewise requested to provide further information on steps taken over its "risk assessments and mitigation measures against the spreading of illegal content, in particular the spreading of terrorist and violent content and hate speech."

The commission made a similar request last week of Elon Musk's social media platform X, previously known as Twitter.

A TikTok spokesperson told CNBC the firm had "just heard from the European Commission this morning and our team is currently reviewing the RFI [request for information]."

"We'll publish our first transparency report under the DSA next week, where we'll include more information about our ongoing work to keep our European community safe," the TikTok spokesperson added.

Meta was not immediately available for comment when contacted by CNBC.

The EU wants to see how Meta and TikTok have worked to keep misinformation about the war off their platforms. In addition to Facebook, Meta owns Instagram, WhatsApp and Oculus.

In particular, the EU is asking to see the measures that Meta and TikTok have taken to comply with their obligations under the Digital Services Act, or DSA. The DSA is a landmark piece of legislation introduced by the EU which seeks to ensure internet giants rid their platforms of illegal and harmful content.

Meta and TikTok both have until Oct. 25 to share information related to their response to the crisis in Israel, the commission said. Meta must also share details on its measures to ensure the integrity of elections by Nov. 8, while TikTok must do the same for both elections and the protection of minors online.

Threat of big fines

If and when Meta and TikTok submit the requested information, the commission will consider next steps.

That may entail the "formal opening of proceedings pursuant to Article 66 of the DSA," the commission said Thursday. "Pursuant to Article 74 (2) of the DSA, the Commission can impose fines for incorrect, incomplete or misleading information in response to a request for information."

"In case of failure to reply by Meta [and TikTok], the Commission may decide to request the information by decision," the commission said. "In this case, failure to reply by the deadline could lead to the imposition of periodic penalty payments."

Meta is one of several firms that have been designated a Very Large Online Platform, or VLOP, by the European Union. This means it is of a size and scale at which it can be scrutinized by the bloc under its strict new rules, the DSA.

If a company is found to be in breach of the EU's DSA, it can face fines of as much as 6% of its total annual revenue. For a company as large as Meta, that runs into the billions: roughly $7 billion, based on the company's global revenue for fiscal year 2022.
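As a rough check on that figure (the roughly $116.6 billion revenue total used here comes from Meta's publicly reported fiscal 2022 results, not from the commission's announcement):

\[
0.06 \times \$116.6\ \text{billion} \approx \$7\ \text{billion}
\]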

Meta says it has taken steps to limit the spread of disinformation during the Israel-Hamas conflict. In a blog post this week, the company said that in the three days following Oct. 7, it removed or marked as disturbing more than 795,000 pieces of content in Hebrew and Arabic for violating its policies.

The company says it also deleted seven times as many posts on a daily basis for violating its dangerous organizations and individuals policy.

In an update Wednesday, Meta said it was taking further action to counter the spread of harmful content during the Israel-Hamas conflict, including temporarily changing the default setting for who can comment on newly created posts from people in the region to friends or established followers only.

TikTok, meanwhile, says it is adding more Arabic- and Hebrew-speaking moderators to review content related to the war and is enhancing its automated detection systems to identify and remove graphic and violent content in real time, "so that neither our moderators nor our community members are exposed to it."

WATCH: Three decades after inventing the web, Tim Berners-Lee has some ideas on how to fix it
