<h1>Google&rsquo;s AI Nano Banana Pro accused of generating racialised &lsquo;white saviour&rsquo; visuals | Artificial intelligence (AI)</h1>

<p>Published 4 December 2025 &middot; <a href="https://www.newsbeep.com/ie/175338/">newsbeep.com</a></p>

<p>Nano Banana Pro, Google&rsquo;s new AI-powered image generator, has been accused of creating racialised and &ldquo;white saviour&rdquo; visuals in response to prompts about humanitarian aid in Africa &ndash; and sometimes appends the logos of large charities.</p>

<p>Asking the tool tens of times to generate an image for the prompt &ldquo;volunteer helps children in Africa&rdquo; yielded, with two exceptions, a picture of a white woman surrounded by Black children, often with grass-roofed huts in the background.</p>

<p>In several of these images, the woman wore a T-shirt emblazoned with the phrase &ldquo;Worldwide Vision&rdquo; and with the UK charity World Vision&rsquo;s logo. In another, a woman wearing a Peace Corps T-shirt squatted on the ground, reading The Lion King to a group of children.</p>

<p>AI-generated image using the tool with the prompt &lsquo;volunteer helps children in Africa&rsquo;. Illustration: Google</p>

<p>The prompt &ldquo;heroic volunteer saves African children&rdquo; yielded multiple images of a man wearing a vest with the logo of the Red Cross.</p>

<p>Arsenii Alenichev, a researcher at the Institute of Tropical Medicine in Antwerp who studies the production of global health images, said he noticed these images, and the logos, when experimenting with Nano Banana Pro earlier this month.</p>

<p>&ldquo;The first thing that I noticed was the old suspects: the white saviour bias, the linkage of dark skin tone with poverty and everything. Then something that really struck me was the logos, because I did not prompt for logos in those images and they appear.&rdquo;</p>

<p>Examples he shared with the Guardian showed women wearing &ldquo;Save the Children&rdquo; and &ldquo;Doctors Without Borders&rdquo; T-shirts, surrounded by Black children, with tin-roofed huts in the background. These were also generated in response to the prompt &ldquo;volunteer helps children in Africa&rdquo;.</p>

<p>In response to a query from the Guardian, a World Vision spokesperson said: &ldquo;We haven&rsquo;t been contacted by Google or Nano Banana Pro, nor have we given permission to use or manipulate our own logo or misrepresent our work in this way.&rdquo;</p>

<p>Kate Hewitt, the director of brand and creative at Save the Children UK, said: &ldquo;These AI-generated images do not represent how we work.&rdquo;</p>

<p>An image generated with the prompt &lsquo;volunteer helps children in Africa&rsquo;. Illustration: Google</p>

<p>She added: &ldquo;We have serious concerns about third parties using Save the Children&rsquo;s intellectual property for AI content generation, which we do not consider legitimate or lawful. We&rsquo;re looking into this further along with what action we can take to address it.&rdquo;</p>

<p>AI image generators have been shown repeatedly to <a href="https://www.brookings.edu/articles/rendering-misrepresentation-diversity-failures-in-ai-image-generation/">replicate</a> &ndash; and at times <a href="https://www.nature.com/articles/d41586-024-00674-9">exaggerate</a> &ndash; US social biases. Models such as Stable Diffusion and OpenAI&rsquo;s Dall-E <a href="https://www.wired.com/story/dall-e-2-ai-text-image-bias-social-media/">offer</a> <a href="https://www.bloomberg.com/graphics/2023-generative-ai-bias/">mostly</a> images of white men when asked to depict &ldquo;lawyers&rdquo; or &ldquo;CEOs&rdquo;, and <a href="https://www.wired.com/story/dall-e-2-ai-text-image-bias-social-media/">mostly</a> images of men of colour when asked to depict &ldquo;a man sitting in a prison cell&rdquo;.</p>

<p>Recently, AI-generated images of extreme, racialised poverty have <a href="https://www.theguardian.com/global-development/2025/oct/20/ai-generated-poverty-porn-fake-images-being-used-by-aid-agencies">flooded</a> stock photo sites, leading to <a href="https://fairpicture.org/poverty-porn-in-the-era-of-generative-ai-whitepaper-checklist/">discussion</a> in the NGO community about how AI tools replicate harmful images and stereotypes, ushering in an era of &ldquo;poverty porn 2.0&rdquo;.</p>

<p>It is unclear why Nano Banana Pro adds the logos of real charities to images of volunteers and scenes depicting humanitarian aid.</p>

<p>In response to a query from the Guardian, a Google spokesperson said: &ldquo;At times, some prompts can challenge the tools&rsquo; guardrails and we remain committed to continually enhancing and refining the safeguards we have in place.&rdquo;</p>