<h1>ChatGPT Hallucinated a Feature, Forcing Human Developers to Add It</h1>

<p>In what might be a first, a programmer added a feature to a piece of software because ChatGPT hallucinated it and customers kept trying to use it. The developers of <a href="https://www.soundslice.com/?ref=404media.co" rel="nofollow noopener" target="_blank">Soundslice</a>, a site that lets people digitize and edit sheet music, added functionality because the LLM kept telling people it existed. Rather than fight the LLM, Soundslice indulged the hallucination.</p>

<p>Adrian Holovaty, one of Soundslice's developers, noticed something strange in the site's error logs a few months ago. Users kept uploading ASCII tablature, a basic system for notating guitar music, even though Soundslice wasn't set up to process it and had never advertised that it could. The error logs included pictures of what users had uploaded, and many of them were screenshots of ChatGPT conversations in which the LLM had churned out ASCII tabs and told the users to send them to Soundslice.</p>

<p>"It was around 5-10 images daily, for a period of a month or two. Definitely enough where I was like, 'What the heck is going on here?'" Holovaty told 404 Media. Rather than fight the LLM, Soundslice decided to add the feature ChatGPT had hallucinated. Holovaty said it took his team only a few hours to write the code, which was a major factor in the decision to add the feature.</p>

<p>"The main reason we did this was to prevent disappointment," he said. "I highly doubt many people are going to sign up for Soundslice purely to use our ASCII tab importer […] we were motivated by the, frankly, galling reality that ChatGPT was setting Soundslice users up for failure. I mean, from our perspective, here were the options:</p>

<p>"1. Ignore it, and endure the psychological pain of knowing people were getting frustrated by our product for reasons out of our control.</p>

<p>"2. Put annoying banners on our site saying: 'On the off chance that you're using ChatGPT and it told you about a Soundslice ASCII tab feature, that doesn't exist.' That's disproportional and lame.</p>

<p>"3. Just spend a few hours and develop the feature."</p>

<p>There's also no way to tell ChatGPT the feature doesn't exist. In an ideal world, OpenAI would have a formal procedure for removing content from its model, similar to the ability to <a href="https://developers.google.com/search/docs/crawling-indexing/remove-information?ref=404media.co" rel="nofollow noopener" target="_blank">request the removal</a> of a site from Google's index. "Obviously with an LLM it's much harder to do this technically, but I'm sure they can figure it out, given the absurdly high salaries their researchers are earning," Holovaty said.</p>

<p>He added that the situation made him realize how powerful ChatGPT has become as an influencer of consumer behavior. "It's making product recommendations—for existent and nonexistent features alike—to massive audiences, with zero transparency into why it made those particular recommendations. And zero recourse."</p>

<p>This may be the first time developers have added a feature to a piece of software because ChatGPT hallucinated it, but it won't be the last. In a personal blog post, developer Niki Tonsky <a href="https://tonsky.me/blog/gaslight-driven-development/?ref=404media.co" rel="nofollow noopener" target="_blank">dubbed this phenomenon</a> "gaslight-driven development" and shared a recent experience similar to Holovaty's.</p>

<p>One of Tonsky's projects is a <a href="https://github.com/instantdb/instant?ref=404media.co" rel="nofollow noopener" target="_blank">database for frontends</a> called Instant. An update method for the app used a text document called "update," but LLMs that interacted with Instant kept calling the file "create." Tonsky told 404 Media that, rather than fight the LLMs, his team just added the text file with the name the systems wanted. "In general I agree `create` is more obvious, it's just weird that we arrived at this through LLM," he said.</p>

<p>He told 404 Media that programmers will probably need to account for the "tastes" of LLMs in the future. "You kinda already have to. It's not programming for AI, but AI as a tool changes how we do programming," he said.</p>

<p>Holovaty doesn't hate AI (Soundslice uses machine learning to do its magic), but he has mixed feelings about LLMs. He compared his experience with ChatGPT to dealing with an overzealous sales team selling a feature that doesn't exist. He also doesn't trust LLMs to write code: he experimented with them but found they caused more problems than they solved.</p>

<p>"I don't trust it for my production Soundslice code," he said. "Plus: writing code is fun! Why would I choose to deny myself fun? To appease the capitalism gods? No thanks."</p>

<p>About the author</p>

<p>Matthew Gault is a writer covering weird tech, nuclear war, and video games. He's worked for Reuters, Motherboard, and the New York Times.</p>