Fact check: AI-generated images of children in Gaza
Not all reports and pictures about the suffering in Gaza are accurate. Some are downright fakes generated by AI
For days, certain pictures of small children lying huddled together on muddy ground or in front of tents have been shared on social media platforms such as TikTok, Instagram and X, formerly known as Twitter. They are often accompanied by a Palestinian flag or comments suggesting the children are located in the Gaza Strip.
That people in Gaza — and children in particular — are suffering in dire conditions without sufficient access to food, clean water and medical care has been well documented by the United Nations, human rights organizations, international media and the people themselves.
DW contacted several aid organizations with staff in the besieged territory and heard about living conditions for displaced people there. Matt Sugrue, Save the Children's director of program operations in Rafah, said children and families were living in makeshift shelters or struggling to find places to spend the night, and that there was a lack of toilets and clean water.
The UN has estimated that 85% of Gaza's population of 2.2 million people have been displaced by the Israeli military campaign against Hamas, which has been classified as a terrorist organization by Germany and the European Union, along with the United States and many other countries. Many of the displaced people currently live in emergency shelters.
Amid this suffering, unknown parties have chosen to create and circulate fake images about the situation on the ground using artificial intelligence. DW Fact Check examined the following three images and concluded that they had been created with the use of AI.
Characteristic AI mistakes
A picture of two boys wearing identical pajamas and huddled together under a turquoise blanket in a blue tent has been seen millions of times. They lie in mud, surrounded by brown puddles of water. But this image was generated by AI, and this is not clearly indicated in the post seen by DW, or in many similar posts.
DW Fact Check circled the parts where there is evidence that AI was used: The two boys each have a foot with only four toes, which is a characteristic AI mistake. The right foot of the boy on the left also appears quite large.
Furthermore, their interlocked fingers look too uniform, and their wrists are not bent enough and sit at the wrong angle. There is also something wrong with the back of the head and neck of the boy on the left: these body parts merge with the canvas of the tent and are angled toward his sternum.
The lighting in the photo also seems staged. Considering that the lamp lighting up the scene is presumably hanging from the ceiling, it provides a remarkably even light, as seen in the reflections. Photo editing software can help achieve such effects, but achieving this in an original shot would require good lighting conditions and complex equipment.
The same applies to this picture, which has appeared on X, Instagram, TikTok and other platforms. The reflections on the bottom of the bottle in the bottom-right corner of the picture seem particularly unnatural. However, generally, the typical AI mistakes often found on the body's extremities are more subtle.
On close inspection, the second toe of the lower foot of the girl on the right seems very large. And the bottoms of both girls' left feet are unusually straight, as if they were standing on the ground or were extremely flat-footed. Moreover, the girls' skin seems flawless, as is often the case in AI-generated images.
In the third picture that DW examined, the light seems quite natural, and the skin of the girls appears realistic. In this photo, the girls don't seem to resemble each other as much as the children in the other two pictures.
But obvious errors highlight the use of AI: their bodies seem to be fused together, and the girl in front appears to have no legs. This could be the case in reality, particularly after months of bombing by the Israeli military, but the patterns of the fabrics are also blurred in the encircled area. DW has concluded that this could also be an AI-generated mistake.
Questionable use of AI imagery
It's highly questionable to use AI imagery to illustrate real events, such as those happening in the Israel-Hamas war, especially if pictures aren't labeled as such. Such pictures have not only appeared on social media platforms, but also on certain news sites like The Palestinian Information Center and Nordhessen-Journal, a regional German news outlet.
AI images don't document objective facts — they are computer-generated images created according to parameters set by a person. DW categorizes such images that are published and disseminated without being labeled as AI-generated as fake.