Social media platforms asked to combat surge of pro-Hamas content

Israeli social threat intelligence firms claim fake profiles are disseminating pro-Hamas content on major platforms such as Facebook, X, Telegram and TikTok

Offline and online: Israeli soldiers walk through Kibbutz Be'eri four days after the 7 October Hamas attacks near the border with Gaza (photo: Getty Images)
IANS

As pro-Hamas accounts flood social media platforms amid the ongoing Israel-Hamas war, governments are taking cognisance of the delicate situation and warning big tech companies such as Meta, X and Telegram to either fix their content moderation systems or face action.

Israel-based social threat intelligence firm Cyabra claims that of the more than 162,000 profiles engaged in conversations about the Gaza war, 25 per cent — more than 40,000 profiles — are fake.

Those fake profiles allegedly disseminated over 312,000 pro-Hamas posts and comments across major social media platforms, with some accounts publishing hundreds of posts per day.

During the first week of the conflict (7-14 October), US-based for-profit organisation NewsGuard analysed the 250 most-engaged posts (likes, reposts, replies, and bookmarks) that promoted one of 10 prominent false or unsubstantiated narratives relating to the war.

It found that verified users with blue badges were the ones spreading the vast majority of misinformation about the Israel-Hamas war on X.

The results revealed that 186 of these 250 posts — 74 per cent — were posted by accounts verified by X.

“Nearly three-fourths of the most viral posts on X advancing misinformation about the Israel-Hamas war are being pushed by ‘verified’ X accounts,” according to the analysis.

The European Commission formally opened an inquiry into X over the alleged spread of illegal content and disinformation, in particular violent content and hate speech, in the wake of the Israel-Hamas war. The Commission sent X a formal request for information under the Digital Services Act (DSA).

X CEO Linda Yaccarino said the platform is "coordinating with industry peers to respond to terrorist content being distributed online" and "taking action to remove content and bad actors that violate our policies".

She stated that the company is actively working with partners, governments, regulators and policymakers to combat misinformation.

Earlier, EU commissioner Thierry Breton had warned Elon Musk, saying that his X platform “is being used to disseminate illegal content and disinformation in the EU” following the 7 October Hamas attacks in Israel.

Responding to the EU's request, Yaccarino said the microblogging platform has removed hundreds of "Hamas-linked accounts" and "taken action to remove or label tens of thousands of pieces of content" since the attack on Israel.

Commissioner Breton also wrote to Meta CEO Mark Zuckerberg, telling him to remove pro-Hamas content across his platforms and to “be very vigilant”, or risk putting the company in violation of new EU regulations.

In his letter to Zuckerberg, Breton urged Meta to remove "illegal terrorist content and hate speech" amid the ongoing war in Israel.

He said the European Commission had seen “a surge of illegal content and disinformation being disseminated in the EU.”

“I urgently invite you to ensure that your systems are effective. Needless to say, I also expect you to be in contact with the relevant law enforcement authorities and Europol, and ensure that you respond promptly to any requests,” he noted.

In a statement, a Meta spokesperson said the company has created a “special operations center with experts, including fluent Hebrew and Arabic speakers” after the Hamas attacks on Israel, and curbed the spread of misinformation about the war on its platforms.

“Content containing praise for Hamas, which is designated by Meta as a Dangerous Organisation, or violent and graphic content, for example, is not allowed on our platforms. We can make errors and that is why we offer an appeals process for people to tell us when they think we have made the wrong decision, so we can look into it,” it said last week.

In the three days following 7 October, Meta removed or marked as disturbing more than 795,000 pieces of content for violating these policies in Hebrew and Arabic.

Another social media platform, Telegram, blocked channels used by Hamas — but only in the Android version of its app, owing to violations of Google’s app store guidelines.

The Jerusalem Post reported that the official Telegram channels of Hamas and the al-Qassam Brigades became “inaccessible for Telegram users who downloaded the app through the Google Play Store”.

Telegram acknowledged the block in a notice to affected users: “Some of the channels you are following may stop being accessible in your version of Telegram because of Google Play’s guidelines.”

Telegram has allegedly served as one of the central platforms for Hamas, its supporters, and other Palestinian militant groups to publish statements and propaganda, according to the report.

Chinese short-video app TikTok said it had removed over 500,000 videos and shut down 8,000 livestreams since the Hamas attack on Israel for violating the company's guidelines.
