{"id":386565,"date":"2026-01-23T18:47:10","date_gmt":"2026-01-23T18:47:10","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/386565\/"},"modified":"2026-01-23T18:47:10","modified_gmt":"2026-01-23T18:47:10","slug":"professor-reports-that-openai-deleted-his-work-world-laughs-in-his-face","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/386565\/","title":{"rendered":"Professor Reports That OpenAI Deleted His Work, World Laughs in His Face"},"content":{"rendered":"<p>We\u2019ve all been there. We lost some part of our digital life, perhaps because we accidentally deleted it or a system failed us in some way. Well, a professor in Germany lost a large amount of work recently after changing his settings with OpenAI\u2019s ChatGPT, writing about it this week in <a href=\"https:\/\/www.nature.com\/articles\/d41586-025-04064-7\" rel=\"nofollow noopener\" target=\"_blank\">Nature<\/a>. But social media users don\u2019t seem very sympathetic. In fact, they\u2019re now repeatedly dunking on him for using AI in the first place.<\/p>\n<p>Marcel Bucher, a professor of plant sciences at the University of Cologne, writes that he signed up for a paid ChatGPT plan two years ago and found the AI tool tremendously useful.<\/p>\n<p>\u201cHaving signed up for OpenAI\u2019s subscription plan, ChatGPT Plus, I used it as an assistant every day \u2014 to write e-mails, draft course descriptions, structure grant applications, revise publications, prepare lectures, create exams and analyse student responses, and even as an interactive tool as part of my teaching,\u201d wrote Bucher.<\/p>\n<p>He acknowledged that ChatGPT, like all large language models, could be inaccurate, but liked it because it could remember the context of conversations, and he valued the \u201ccontinuity and apparent stability of the workspace.\u201d Then he tinkered with the settings for data consent.<\/p>\n<p>From Nature:<\/p>\n<p>But in August, I temporarily disabled the \u2018data consent\u2019 
option because I wanted to see whether I would still have access to all of the model\u2019s functions if I did not provide OpenAI with my data. At that moment, all of my chats were permanently deleted and the project folders were emptied \u2014 two years of carefully structured academic work disappeared. No warning appeared. There was no undo option. Just a blank page. Fortunately, I had saved partial copies of some conversations and materials, but large parts of my work were lost forever.<\/p>\n<p>Bucher went on to explain that he initially thought it was a mistake and assumed that he would be able to recover his years of data. He reinstalled the app, tried different browsers, and tinkered with more settings. But nothing worked. He then tried to contact OpenAI but was predictably met with an AI agent, which couldn\u2019t help him. He eventually was able to contact a human, but they couldn\u2019t help him either. The data was gone.<\/p>\n<p> <img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2000713323\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2026\/01\/Screenshot-2026-01-23-at-8.45.01\u202fAM.jpg\" alt=\"Screenshot of a Bluesky post dunking on a professor who used AI. It features a tardigrade playing a tiny violin.\" width=\"598\" height=\"646\"  \/>\u00a9 Screenshot from Bluesky <\/p>\n<p>Again, this is the kind of story that would\u2019ve likely elicited some sympathy in another era. 
But here in 2026, when AI is often seen as a slop machine for generating wrong answers and <a href=\"https:\/\/gizmodo.com\/groks-sexual-deepfakes-will-become-illegal-in-the-uk-this-week-2000709212\" rel=\"nofollow noopener\" target=\"_blank\">child sexual abuse material<\/a>, there are more than a few people who will revel in someone losing all their AI chats.<\/p>\n<p>\u201cAmazing sob story: \u2018ChatGPT deleted all the work I hadn\u2019t done\u2019,\u201d one Bluesky user <a href=\"https:\/\/bsky.app\/profile\/dreadships.bsky.social\/post\/3mczxiy7v4c2f\" rel=\"nofollow noopener\" target=\"_blank\">wrote<\/a>.<\/p>\n<p>\u201cMaybe next time, actually do the work you are paid to do *yourself*, instead of outsourcing it to the climate-killing, suicide-encouraging plagiarism machine,\u201d wrote <a href=\"https:\/\/bsky.app\/profile\/spencerfleury.bsky.social\/post\/3md24vuzlwk2l\" rel=\"nofollow noopener\" target=\"_blank\">another<\/a>. Others floated the possibility that the essay in Nature wasn\u2019t even written by Bucher.<\/p>\n<p>\u201cThis is the dumbest shit I\u2019ve read in a quite a while,\u201d a Bluesky user <a href=\"https:\/\/bsky.app\/profile\/mathijsvdsande.bsky.social\/post\/3md265pus7s2s\" rel=\"nofollow noopener\" target=\"_blank\">wrote<\/a>. \u201c(But, in his defense: there is no particular reason to assume that the guy who published this actually wrote it himself.)\u201d<\/p>\n<p>Bucher did make the point that he was being encouraged to use AI in his work, and there\u2019s validity to that complaint. Large institutions are telling their workers to incorporate AI more often under the theory that it\u2019s some kind of inevitable future:<\/p>\n<p>We are increasingly being encouraged to integrate generative AI into research and teaching. Individuals use it for writing, planning and teaching; universities are experimenting with embedding it into curricula. 
However, my case reveals a fundamental weakness: these tools were not developed with academic standards of reliability and accountability in mind.<\/p>\n<p>If a single click can irrevocably delete years of work, ChatGPT cannot, in my opinion and on the basis of my experience, be considered completely safe for professional use.<\/p>\n<p>It remains to be seen whether generative AI will truly transform the workplace in ways that actually matter, especially as workers are more skeptical and bosses try to insist on its use. Whatever happens, there will likely be plenty of AI skeptics around to celebrate when someone loses a bunch of work.<\/p>\n","protected":false},"excerpt":{"rendered":"We\u2019ve all been there. We lost some part of our digital life, perhaps because we accidentally deleted it&hellip;\n","protected":false},"author":2,"featured_media":386566,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,1921,874,86,56,54,55],"class_list":{"0":"post-386565","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-chatgpt","12":"tag-openai","13":"tag-technology","14":"tag-uk","15":"tag-united-kingdom","16":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/386565","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=386565"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/
386565\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/386566"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=386565"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=386565"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=386565"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}