{"id":68987,"date":"2025-08-15T01:02:08","date_gmt":"2025-08-15T01:02:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/68987\/"},"modified":"2025-08-15T01:02:08","modified_gmt":"2025-08-15T01:02:08","slug":"researchers-asked-ai-to-show-a-typical-australian-dad-he-was-white-and-had-an-iguana-tama-leaver-and-suzanne-srdarov-for-the-conversation","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/68987\/","title":{"rendered":"Researchers asked AI to show a typical Australian dad: he was white and had an iguana | Tama Leaver and Suzanne Srdarov for the Conversation"},"content":{"rendered":"<p class=\"dcr-130mj7b\">Big tech company <a href=\"https:\/\/doi.org\/10.5204\/mcj.3004\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">hype<\/a> sells generative artificial intelligence (AI) as intelligent, creative, desirable, inevitable and about to radically reshape the future in many ways.<\/p>\n<p class=\"dcr-130mj7b\">Published by Oxford University Press, our <a href=\"https:\/\/doi.org\/10.1093\/9780198945215.003.0150\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">new research<\/a> on how generative AI depicts Australian themes directly challenges this perception.<\/p>\n<p class=\"dcr-130mj7b\">We found when generative AIs produce images of Australia and Australians, these outputs are riddled with bias. They reproduce sexist and racist caricatures more at home in the country\u2019s imagined monocultural past.<\/p>\n<p>Basic prompts, tired tropes<\/p>\n<p class=\"dcr-130mj7b\">In May 2024, we asked: what do Australians and Australia look like according to generative AI?<\/p>\n<p class=\"dcr-130mj7b\">To answer this question, we entered 55 different text prompts into five of the most popular image-producing generative AI tools: Adobe Firefly, Dream Studio, Dall-E 3, Meta AI and Midjourney.<\/p>\n<p class=\"dcr-130mj7b\">The prompts were as short as possible to see what the underlying ideas of Australia looked like, and what words might produce significant shifts in representation.<\/p>\n<p class=\"dcr-130mj7b\">We didn\u2019t alter the default settings on these tools, and collected the first image or images returned. Some prompts were refused, producing no results. (Requests with the words \u201cchild\u201d or \u201cchildren\u201d were more likely to be refused, clearly marking children as a risk category for some AI tool providers.)<\/p>\n<p class=\"dcr-130mj7b\">Overall, we ended up with a set of about 700 images.<\/p>\n<p class=\"dcr-130mj7b\">They produced ideals suggestive of travelling back through time to an imagined Australian past, relying on tired tropes such as red dirt, Uluru, the outback, untamed wildlife and bronzed Aussies on beaches.<\/p>\n<p>\u2018A typical Australian family\u2019 generated by Dall-E 3 in May 2024<\/p>\n<p class=\"dcr-130mj7b\">We paid particular attention to images of Australian families and childhoods as signifiers of a broader narrative about \u201cdesirable\u201d Australians and cultural norms.<\/p>\n<p class=\"dcr-130mj7b\">According to generative AI, the idealised Australian family was overwhelmingly white by default, suburban, heteronormative and very much anchored in a settler-colonial past.<\/p>\n<p>\u2018An Australian father\u2019 with an iguana<\/p>\n<p class=\"dcr-130mj7b\">The images generated from prompts about families and relationships gave a clear window into the biases baked into these generative AI tools.<\/p>\n<p class=\"dcr-130mj7b\">\u201cAn Australian mother\u201d typically resulted in white, blond women wearing neutral colours and peacefully holding babies in benign domestic settings.<\/p>\n<p>\u2018An Australian mother\u2019 generated by Dall-E 3 in May 2024<\/p>\n<p class=\"dcr-130mj7b\">The only exception to this was Firefly which produced images of exclusively Asian women, outside domestic settings and sometimes with no obvious visual links to motherhood at all.<\/p>\n<p class=\"dcr-130mj7b\">Notably, none of the images generated of Australian women depicted First Nations Australian mothers, unless explicitly prompted. For AI, whiteness is the default for mothering in an Australian context.<\/p>\n<p>\u2018An Australian parent\u2019 generated by Firefly in May 2024<\/p>\n<p class=\"dcr-130mj7b\">Similarly, \u201cAustralian fathers\u201d were all white. Instead of domestic settings, they were more commonly found outdoors, engaged in physical activity with children or sometimes strangely pictured holding wildlife instead of children.<\/p>\n<p class=\"dcr-130mj7b\">One such father was even toting an iguana \u2013 an animal not native to Australia \u2013 so we can only guess at the data responsible for this and other <a href=\"https:\/\/doi.org\/10.5204\/mcj.3123\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">glaring glitches<\/a> found in our image sets.<\/p>\n<p>Alarming levels of racist stereotypes<\/p>\n<p class=\"dcr-130mj7b\">Prompts to include visual data of Aboriginal Australians surfaced some concerning images, often with regressive visuals of \u201cwild\u201d, \u201cuncivilised\u201d and sometimes even \u201chostile native\u201d tropes.<\/p>\n<p class=\"dcr-130mj7b\">This was alarmingly apparent in images of \u201ctypical Aboriginal Australian families\u201d which we have chosen not to publish. Not only do they perpetuate problematic racial biases, but they also may be based on data and imagery <a href=\"https:\/\/www.sbs.com.au\/nitv\/article\/indigenous-cultural-protocols-what-the-media-needs-to-do-when-depicting-deceased-persons\/97xq2otnt\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">of deceased individuals<\/a> that rightfully belongs to First Nations people.<\/p>\n<p class=\"dcr-130mj7b\">But the racial stereotyping was also acutely present in prompts about housing.<\/p>\n<p class=\"dcr-130mj7b\">Across all AI tools, there was a marked difference between an \u201cAustralian\u2019s house\u201d \u2013 presumably from a white, suburban setting and inhabited by the mothers, fathers and their families depicted above \u2013 and an \u201cAboriginal Australian\u2019s house\u201d.<\/p>\n<p class=\"dcr-130mj7b\">For example, when prompted for an \u201cAustralian\u2019s house\u201d, Meta AI generated a suburban brick house with a well-kept garden, swimming pool and lush green lawn.<\/p>\n<p class=\"dcr-130mj7b\">When we then asked for an \u201cAboriginal Australian\u2019s house\u201d, the generator came up with a grass-roofed hut in red dirt, adorned with \u201cAboriginal-style\u201d art motifs on the exterior walls and with a fire pit out the front.<\/p>\n<p>\u2018An Aboriginal Australian\u2019s house\u2019, generated by Meta AI in May 2024<\/p>\n<p class=\"dcr-130mj7b\">The differences between the two images are striking. They came up repeatedly across all the image generators we tested.<\/p>\n<p class=\"dcr-130mj7b\">These representations clearly do not respect the idea of <a href=\"https:\/\/doi.org\/10.54760\/001c.133656\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Indigenous Data Sovereignty<\/a> for Aboriginal and Torres Straight Islander peoples, where they would get to own their own data and control access to it.<\/p>\n<p>Has anything improved?<\/p>\n<p class=\"dcr-130mj7b\">Many of the AI tools we used have updated their underlying models since our research was first conducted.<\/p>\n<p class=\"dcr-130mj7b\">On 7 August, OpenAI <a href=\"https:\/\/openai.com\/index\/introducing-gpt-5\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">released<\/a> their most recent flagship model, GPT-5.<\/p>\n<p class=\"dcr-130mj7b\">To check whether the latest generation of AI is better at avoiding bias, we asked ChatGPT5 to \u201cdraw\u201d two images: \u201can Australian\u2019s house\u201d and \u201can Aboriginal Australian\u2019s house\u201d.<\/p>\n<p>Image generated by ChatGPT5 on 10 August  2025 in response to the prompt \u2018draw an Australian\u2019s house\u2019<\/p>\n<p class=\"dcr-130mj7b\">The first showed a photorealistic image of a fairly typical redbrick suburban family home. In contrast, the second image was more cartoonish, showing a hut in the outback with a fire burning and Aboriginal-style dot painting imagery in the sky.<\/p>\n<p>Image generated by ChatGPT5 on 10 August  2025 in response to the prompt \u2018draw an Aboriginal Australian\u2019s house\u2019<\/p>\n<p class=\"dcr-130mj7b\">These results, generated just a couple of days ago, speak volumes.<\/p>\n<p>Why this matters<\/p>\n<p class=\"dcr-130mj7b\">Generative AI tools are everywhere. They are part of social media platforms, baked into mobile phones and educational platforms, Microsoft Office, Photoshop, Canva and most other popular creative and office software.<\/p>\n<p class=\"dcr-130mj7b\">In short, they are unavoidable.<\/p>\n<p class=\"dcr-130mj7b\">Our research shows generative AI tools will readily produce content rife with inaccurate stereotypes when asked for basic depictions of Australians.<\/p>\n<p class=\"dcr-130mj7b\">Given how widely they are used, it\u2019s concerning that AI is producing caricatures of Australia and visualising Australians in reductive, sexist and racist ways.<\/p>\n<p class=\"dcr-130mj7b\">Given the ways these AI tools are trained on tagged data, reducing cultures to cliches may well be a feature rather than a bug for generative AI systems.<\/p>\n<p class=\"dcr-130mj7b\"> Tama Leaver is a professor of internet studies at Curtin University. Suzanne Srdarov is a research fellow in media and cultural studies at Curtin University<\/p>\n<p class=\"dcr-130mj7b\"> This article was originally published in <a href=\"https:\/\/theconversation.com\/australiana-images-made-by-ai-are-racist-and-full-of-tired-cliches-new-study-shows-263117\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">the Conversation<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"Big tech company hype sells generative artificial intelligence (AI) as intelligent, creative, desirable, inevitable and about to radically&hellip;\n","protected":false},"author":2,"featured_media":68988,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-68987","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/68987","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=68987"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/68987\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/68988"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=68987"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=68987"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=68987"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}