IEMed Mediterranean Yearbook 2025


Gaza Infocide: Disinformation and New Narratives Wars

Basim Tweissi

Professor
Doha Institute for Graduate Studies

The conflict between Israel and Hamas, which began on 7 October 2023, is not merely an asymmetric war, nor simply the 21st-century conflict with the heaviest civilian casualties and the most limited international intervention. It also represents a novel model of sophisticated information and disinformation warfare in the digital and AI age. This war has given rise to the concept of “infocide,” defined as actions and practices that eliminate objective facts and deny the conflicting parties equal access to information, effectively depriving one side of the freedom, accessibility and reach to narrate events and present its own account to the world.

A New Generation of Disinformation Wars

Israel’s media response to the 7 October 2023 attack, which resulted in 1,200 Israeli deaths, mostly civilians, was an unprecedented media and propaganda offensive. Despite the intelligence and military failure of that day, coupled with a breakdown of its information systems and advanced surveillance technology, Israel was well prepared for a large-scale information and media war. The asymmetry in this conflict extended beyond Israel’s superior military, security institutions, weaponry and technological prowess; it also rested on Israel’s immense media power and control in information warfare, media narratives and the management of global public opinion.

From the outset, Israel demonstrated a strong ability to construct a war narrative and “manufacture consent” on a global scale, supported by a vast network of media and diplomatic institutions. Conversely, Hamas’s capabilities in information warfare and media narratives appeared globally limited, effective primarily within Arab public opinion due to support from major Arab news channels like Al-Jazeera. However, as Israeli aerial bombardment intensified, ground operations began, civilian casualties mounted and Gaza’s cities and infrastructure suffered massive destruction, the media discourse supporting the Palestinian narrative gained strength. This situation has been globally described as genocide.

In the initial weeks of the Gaza war, many major media outlets adopted the “manufacturing consent” approach. This concept, introduced by Walter Lippmann and further developed by Edward Herman and Noam Chomsky, posits that contemporary democratic public opinion can be manipulated by the media, which is often influenced by political leaders, especially during crises and wars. From the start, several mainstream Western media organizations fell prey to political orchestration, echoing narratives from Israeli and Western, particularly American, officials, often repeating misleading information and false reports.

The war represented an unprecedented case of disinformation, characterized by a massive surge in misleading content. Altered or decontextualized images and videos flooded social media, distorting the truth. Some users attempted to downplay the devastation, while false and misleading visuals spread widely, misrepresenting the conflict’s realities. Numerous fabricated stories gained traction, such as claims of beheaded children or raped women during Hamas’s attack on settlements surrounding Gaza. Many mainstream media outlets later retracted or corrected these reports once they were debunked. For instance, a widely circulated image of a burned child was later proven to be AI-generated. Consequently, news organizations like The Los Angeles Times either removed or amended such stories, as seen in their coverage of alleged sexual assaults by Hamas fighters.

Israel has used social networks in its military confrontations with Gaza over the past decade, but the current campaign features unprecedented momentum and the use of advanced technological tools, amplified by paid promotion. Israel paid to promote dozens of ads containing brutal and emotional images and videos it claimed depicted the events of 7 October. In the first weeks, Israel’s Foreign Ministry released over 75 paid, multilingual posts and advertisements, including privately filmed videos, targeting Western audiences (Martin, 2023). These advertisements portrayed Hamas as a “vile terrorist organization,” drawing direct parallels with ISIS, and highlighted alleged violations with graphic imagery, such as a lifeless, naked woman in a pickup truck, accompanied by slogans like “The world defeated ISIS. The world will defeat Hamas.” A series of circulated videos spliced footage of ISIS executions with scenes purportedly showing Hamas fighters and other Palestinians, further blurring distinctions and fuelling a narrative based on fear and dehumanization.

US President Joe Biden twice repeated misleading claims that originated on social media and were propagated by senior Israeli officials. On 10 October, Biden stated he had seen images proving Hamas beheaded 40 children during the 7 October attack, a claim later debunked. That same evening, the White House clarified that the President had not, in fact, seen such images. On 28 October, Biden expressed scepticism about casualty figures released by Gaza’s Hamas-run Ministry of Health, which were later found to be largely accurate (Accorsi, 2024).

During the first week of the war, manipulated CNN footage paired with a fabricated audio clip was put into circulation, making it appear as if the airstrikes CNN was covering had been staged and that the media was orchestrating a non-existent attack. A 2015 video of a 16-year-old Guatemalan girl being burned to death was falsely linked to the war. Deceptive quotes and misleading posts on X and other platforms used this video as supposed evidence of Palestinian atrocities, as illustrated by a post claiming: “Do we have a people that are dirtier, hateful and more brutal than this?!! They burned a 16-year-old girl alive in Israel.” Such visual disinformation was used to delegitimize the opposing side of the conflict and, more critically, to portray “the other” as a cohesive, monolithic group of villains capable of extreme and barbaric acts (Hameleers, 2025).

Conversely, Hamas initially lacked the capacity and resources to craft a narrative that could penetrate global public opinion. The 7 October attack served as its most forceful message. However, the Palestinian public, and later a rapidly growing global audience supporting the Palestinian narrative, began generating significant media momentum, especially on digital platforms. This media push eventually exerted substantial pressure on both Israel and the United States. This narrative, however, was not free from disinformation. For example, some efforts downplayed the scale and impact of the 7 October attack. Additionally, old videos were repurposed and falsely linked to current events in Gaza, such as footage of aid convoys being bombed. One widely shared clip even claimed to show a Hamas fighter shooting down an Israeli helicopter, but it was later revealed to be high-quality footage from the video game Arma 3, deceptively edited and circulated as real.

At various points during the war, Israeli information campaigns shifted from deliberate disinformation to selective misinformation, creating ambiguity and confusion. A prominent example is the 17 October 2023 bombing of the Al-Ahli Arab Hospital, initially attributed to a misfired Palestinian rocket. The attack resulted in approximately 400 civilian deaths and caused confusion even among major international media outlets. On 24 October, The New York Times published an investigative report casting doubt on the Israeli account and presenting evidence that the projectile seen in the widely circulated video was unlikely to be the cause of the hospital explosion (Willis, 2023). Similarly, the French newspaper Le Monde confirmed that the munition that struck the hospital had been launched from within Israeli territory (Mas, 2023). The same pattern applied to repeated Israeli raids on Al-Shifa Hospital, where it was alleged that Hamas’s command centre was located beneath the facility. These claims were supported by incomplete or misleading images and reports, contributing to an atmosphere of uncertainty. This shift towards ambiguity often coincides with the presence of undeniable facts. By generating confusion and competing narratives, such tactics activate confirmation bias, particularly among Western audiences, who are more likely to embrace narratives aligning with their preexisting beliefs and ideological leanings.

Artificial Intelligence and Battle Amplification

Israel employed strategic artificial intelligence systems in its military operations, notably targeting and tracking programmes such as Lavender and Gospel. These tools were instrumental in identifying tens of thousands of Palestinians in Gaza as potential targets for elimination. AI technologies were also used to compile lists of potential airstrike targets. Beyond military applications, Israel deployed new AI-driven models, used for the first time in information warfare, to control and shape the war’s narrative.

Unit 8200 of the Israeli military, along with reservists employed at major tech companies like Google, Microsoft and Meta, established “the Studio,” an innovation hub designed to connect experts with AI-driven projects related to information warfare and narrative control (Frenkel, 2025). Within this hub, they built an Arabic-language large language model during the war’s first few months, powering a chatbot capable of scanning and analysing text messages, social media posts and other Arabic-language data, and designed to identify optimal strategies for influence and intervention campaigns aimed at controlling and redirecting the narrative. The system was tested following Israel’s assassination of Hezbollah leader Hassan Nasrallah in September 2024: the chatbot analysed reactions across the Arabic-speaking world, distinguishing between different Lebanese dialects to gauge public sentiment, which allowed Israel to assess whether there was significant public pressure for a retaliatory response.

Shortly after the war began, Israel launched an AI-powered application aimed directly at the public, called “Words of Iron.” This platform leveraged the efforts of volunteers and pro-Israel advocates worldwide to amplify and globally disseminate the Israeli narrative. Developed by the Akooda team, “Words of Iron” has been described by its creators as a digital “Iron Dome” for Israel (Calcalistech, 2023). The app ingests a vast number of posts about Israel from various sources, including Israeli ones, boosting the visibility of positive content while suppressing or removing anti-Israel narratives. This war has become the most extensive experiment to date in the use of artificial intelligence models within narrative warfare: AI-generated content, including images, text and video, has been widely used to support official narratives, often resulting in the dissemination of unverified or misleading information.

The Era of Information Genocide (Infocide)

The disinformation landscape is rapidly expanding, driven by the accelerating pace of digital technology and compounded by ongoing political, economic, strategic and unconventional crises. In 2020 and 2021, humanity experienced an “infodemic” due to the overwhelming spread of misinformation and conspiracy theories surrounding Covid-19. The world then slid into what can now be described as an “information genocide,” or infocide —a systematic erosion of truth and facts— first signalled during the Russia-Ukraine war and reaching its peak with the outbreak of the 2023 Gaza war.

Information manipulation has long been an established military strategy, often overlooked or ignored by international humanitarian law. However, with the rise of social media, artificial intelligence and increasingly right-wing political regimes, information warfare has become a far more prominent element in modern armed conflicts. This evolution directly impacts civilian protection and undermines civilians’ ability to make informed decisions for their own safety (Morris, 2024). These developments require a fundamental reassessment of the nature and dangers of contemporary information warfare, as well as a critical examination of how international humanitarian law addresses this.

For example, while using sound, such as music, to unsettle enemy soldiers was once considered a legitimate tactic, the deployment of sonic weapons is no longer deemed acceptable due to the suffering they inflict. The practice becomes even more inhumane and illegal when used to deceive civilians and lead them to their death. Residents of the Nuseirat refugee camp in Gaza reported that some drones broadcast the sounds of crying women and children, with the deceptive aim of luring Palestinian youth into the open, where they could be targeted and killed (Fatafta, 2024).

The political and technological environment preceding and coinciding with this war created fertile ground for the spread of disinformation and the strategic shaping of media narratives. Since its war with Hamas in 2021, Israel has conducted covert social media campaigns to improve its image among both Israeli and Western audiences—efforts that intensified and peaked with the outbreak of war in Gaza in 2023. Israel also strengthened its digital diplomacy and engagement with major Western technology companies. An internal audit by Meta revealed that the company censored Arabic-language content at a significantly higher rate than Hebrew-language content, apparently in response to requests from the Israeli government.

Disinformation control systems implemented by social media platforms have proven largely ineffective, and in some cases, only partially functional. According to organizations like the European Digital Media Observatory (EDMO), these systems have failed to prevent the widespread dissemination of false and misleading content. This war coincided with early transformations that led to the deterioration of social media platforms and, consequently, a radical shift in the digital battlefield. Key developments included Elon Musk’s acquisition of Twitter in October 2022, Meta’s mass layoffs of thousands of employees, and the company’s withdrawal from content moderation, supervision and fact-checking efforts since 2023. Prior to these changes, social media algorithms had already evolved to favour content that provoked strong emotions —especially anger and polarization.

The Gaza war has emerged as a stark example of a new form of asymmetrical and unjust warfare. As international recognition grows that Israel’s actions may constitute genocide, the disinformation propagated during this conflict also amounts to an “erasure of truth.” When the more powerful party leverages its political, economic and technological might and global influence to suppress the opposing narrative, it effectively silences the other side’s voice. This was compounded by an extensive blockade on information: the denial of access to Gaza for international journalists, the continued killing of local reporters and persistent efforts to delegitimize Palestinian suffering. One striking example is the renewed and intensified promotion of the term “Pallywood” —a portmanteau of “Palestine” and “Hollywood”—, a disinformation campaign that accuses Palestinians of fabricating or exaggerating their suffering and casualty numbers during their war with Israel.

References

Accorsi, Alessandro. “How Israel Mastered Information Warfare in Gaza.” Foreign Policy, 11 March 2024. https://foreignpolicy.com/2024/03/11/israel-gaza-hamas-netanyahu-warfare-misinformation/.

Calcalistech. “The Akooda team’s innovative Words of Iron advocacy project has become an online ‘Iron Dome’ for Israel.” Calcalistech, 16 November 2023. www.calcalistech.com/ctechnews/article/rkrug57na.

Fatafta, Marwa and Leufer, Daniel. “Artificial Genocidal Intelligence: how Israel is automating human rights abuses and war crimes.” Access Now, 9 May 2024. www.accessnow.org/publication/artificial-genocidal-intelligence-israel-gaza/.

Frenkel, Sheera and Odenheimer, Natan. “Israel’s A.I. Experiments in Gaza War Raise Ethical Concerns.” The New York Times, 25 April 2025. www.nytimes.com/2025/04/25/technology/israel-gaza-ai.html.

Hameleers, Michael. “The visual nature of information warfare: the construction of partisan claims on truth and evidence in the context of wars in Ukraine and Israel/Palestine.” Journal of Communication, Volume 75, Issue 2, April 2025, Pages 90–100. https://doi.org/10.1093/joc/jqae045.

Martin, Liv; Goujard, Clothilde and Fuchs, Haley. “Israel floods social media to shape opinion around the war.” Politico, 17 October 2023. www.politico.eu/article/israel-social-media-opinion-hamas-war/.

Mas, Liselotte; Eydoux, Thomas and Le Monde’s video investigation team. “Gaza hospital: What detailed image analysis reveals about deadly blast.” Le Monde, 19 October 2023. www.lemonde.fr/en/international/article/2023/10/19/gaza-hospital-what-detailed-image-analysis-reveals-about-deadly-al-ahli-arab-hospital-blast_6189013_4.html.

Morris, Tamer. “Israel – Hamas 2024 Symposium – Information Warfare and the Protection of Civilians in the Gaza Conflict.” The Lieber Institute for Law & Warfare, 23 January 2024. https://lieber.westpoint.edu/information-warfare-protection-civilians-gaza-conflict/.

Willis, Haley; Toler, Aric; Mellen, Riley; Cardia, Alexander; Reneau, Natalie; Barnes, Julian E. and Koettl, Christoph. “A Close Look at Some Key Evidence in the Gaza Hospital Blast.” The New York Times, 24 October 2023 (updated 26 October 2023). www.nytimes.com/2023/10/24/world/middleeast/gaza-hospital-israel-hamas-video.html.