{"id":457506,"date":"2026-02-04T09:08:09","date_gmt":"2026-02-04T09:08:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/457506\/"},"modified":"2026-02-04T09:08:09","modified_gmt":"2026-02-04T09:08:09","slug":"an-ai-afterlife-is-now-a-real-option-but-what-becomes-of-your-legal-status","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/457506\/","title":{"rendered":"An \u2018AI afterlife\u2019 is now a real option \u2013 but what becomes of your legal status?"},"content":{"rendered":"<p>Would you create an interactive \u201cdigital twin\u201d of yourself that can communicate with loved ones after your death?<\/p>\n<p>Generative artificial intelligence (AI) has made it possible to <a href=\"https:\/\/theconversation.com\/should-ai-be-allowed-to-resurrect-the-dead-272643\" rel=\"nofollow noopener\" target=\"_blank\">seemingly resurrect the dead<\/a>. So-called griefbots or <a href=\"https:\/\/theconversation.com\/can-you-really-talk-to-the-dead-using-ai-we-tried-out-deathbots-so-you-dont-have-to-268902\" rel=\"nofollow noopener\" target=\"_blank\">deathbots<\/a> \u2013 AI-generated voices, video avatars or text-based chatbots trained on the data of a deceased person \u2013 proliferate in the <a href=\"https:\/\/www.theatlantic.com\/ideas\/2026\/02\/deadbots-ai-grief-obsolete\/685811\/\" rel=\"nofollow noopener\" target=\"_blank\">booming digital afterlife industry<\/a>, also known as grief tech.<\/p>\n<p>Deathbots are usually created by the bereaved, often as part of the grieving process. But there are also services that allow you to <a href=\"https:\/\/www.mindbank.ai\/\" rel=\"nofollow noopener\" target=\"_blank\">create a digital twin of yourself<\/a> while you\u2019re still alive. 
So why not <a href=\"https:\/\/afterlife.ai\/services\" rel=\"nofollow noopener\" target=\"_blank\">create one for when you\u2019re gone<\/a>?<\/p>\n<p>As with any application of new technology, the idea of such digital immortality raises many legal questions \u2013 and most of them don\u2019t have a clear answer.<\/p>\n<p>Your AI afterlife<\/p>\n<p>To create an AI digital twin of yourself, you can sign up for a service that provides this feature and answer a series of questions to provide data about who you are. You record stories, memories and thoughts in your own voice, and you might also upload your visual likeness in the form of images or video.<\/p>\n<p>The AI software then creates a digital replica based on that training data. After you die and the company is notified of your death, your loved ones can interact with your digital twin.<\/p>\n<p>But in doing this, you\u2019re also delegating agency to a company to create a digital AI simulation of yourself after death.<\/p>\n<p>From the get-go, this is different to using AI to \u201cresurrect\u201d a dead person who can\u2019t consent. Instead, a living person is essentially licensing data about themselves to an AI afterlife company before they\u2019ve died. They\u2019re engaging in a deliberate, contractual creation of AI-generated data for posthumous use.<\/p>\n<p>However, there are many unanswered questions. What about copyright? <a href=\"https:\/\/www.thehastingscenter.org\/griefbots-are-here-raising-questions-of-privacy-and-well-being\/\" rel=\"nofollow noopener\" target=\"_blank\">What about your privacy?<\/a> What happens if the technology becomes outdated or the business closes? Does the data get sold on? Does the digital twin also \u201cdie\u201d, and what effect does this second loss have on the bereaved?<\/p>\n<p>What does the law say?<\/p>\n<p>Currently, Australian law doesn\u2019t protect a person\u2019s identity, voice, presence, values or personality as such. 
In contrast to the United States, Australians don\u2019t have a <a href=\"https:\/\/theconversation.com\/ai-deepfakes-threaten-democracy-and-peoples-identities-personality-rights-could-help-251267\" rel=\"nofollow noopener\" target=\"_blank\">general publicity or personality right<\/a>. This means Australian citizens currently have no legal right to own or control their identity \u2013 the use of their voice, image or likeness.<\/p>\n<p>In short, the law doesn\u2019t recognise a proprietary right in most of the unique things that make you \u201cyou\u201d.<\/p>\n<p>Under copyright law, the concept of your presence or self is abstract, much like an idea. Copyright doesn\u2019t offer protection for \u201cyour presence\u201d or \u201cthe self\u201d as such. That\u2019s because copyright only exists in specific categories of works in material form: tangible things, such as books or photos.<\/p>\n<p>However, the typed responses and voice recordings submitted to the AI for training are in material form. This means the data used to train the AI to create your digital twin would likely be protectable. But fully autonomous, AI-generated output is <a href=\"https:\/\/copyrightalliance.org\/faqs\/artificial-intelligence-copyright-ownership\/\" rel=\"nofollow noopener\" target=\"_blank\">unlikely to have any copyright attached to it<\/a>. Under current Australian law, it would likely be considered authorless because it didn\u2019t originate from the \u201cindependent intellectual effort\u201d of a human, but from a machine.<\/p>\n<p><a href=\"https:\/\/www.artslaw.com.au\/information-sheet\/moral-rights\/\" rel=\"nofollow noopener\" target=\"_blank\">Moral rights in copyright<\/a> protect a creator\u2019s reputation against false attribution and against derogatory treatment of their work. However, they wouldn\u2019t apply to a digital twin. 
This is because moral rights attach to actual works created by a human author, not to AI-generated output.<\/p>\n<p>So where does that leave your digital twin? Although it\u2019s unlikely copyright applies to AI-generated output, companies may, in their terms and conditions, assert ownership of the AI-generated data, grant users rights in outputs, or reserve extensive reuse rights for themselves. It\u2019s something to look out for.<\/p>\n<p>There are ethical risks, too<\/p>\n<p>Using AI to make digital copies of people \u2013 living or dead \u2013 <a href=\"https:\/\/theconversation.com\/deadbots-can-speak-for-you-after-your-death-is-that-ethical-182076\" rel=\"nofollow noopener\" target=\"_blank\">also raises ethical risks<\/a>. For example, even though the training data for your digital twin might be locked upon your death, others will access it in the future by interacting with the twin. What happens if the technology misrepresents the deceased person\u2019s morals and ethics?<\/p>\n<p>Because AI output is probabilistic and algorithmic, there\u2019s a risk of creep or distortion, where the responses drift over time. The deathbot could lose its resemblance to the original person. It\u2019s not clear what recourse the bereaved may have if this happens.<\/p>\n<p>AI-enabled deathbots and digital twins <a href=\"https:\/\/www.scientificamerican.com\/article\/can-ai-griefbots-help-us-heal\/\" rel=\"nofollow noopener\" target=\"_blank\">can help people grieve<\/a>, but the evidence so far is largely anecdotal \u2013 more study is needed. At the same time, there\u2019s potential for bereaved relatives to <a href=\"https:\/\/doi.org\/10.1007\/s11948-022-00417-x\" rel=\"nofollow noopener\" target=\"_blank\">form a dependence<\/a> on the AI version of their loved one, <a href=\"https:\/\/doi.org\/10.1007\/s11245-023-09995-2\" rel=\"nofollow noopener\" target=\"_blank\">rather than processing their grief in a healthier way<\/a>. 
If the outputs of AI-powered grief tech cause distress, how can this be managed, and who will be held responsible?<\/p>\n<p>The current state of the law clearly shows more regulation is needed in this burgeoning grief tech industry. Even if you consent to the use of your data for an AI digital twin after you die, it\u2019s difficult to anticipate how new technologies might change the way your data is used in the future.<\/p>\n<p>For now, it\u2019s important to always read the terms and conditions if you decide to create a digital afterlife for yourself. After all, you are bound by the contract you sign.<\/p>\n","protected":false},"excerpt":{"rendered":"Would you create an interactive \u201cdigital twin\u201d of yourself that can communicate with loved ones after your death?&hellip;\n","protected":false},"author":2,"featured_media":457507,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-457506","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/457506","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=457506"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/457506\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json
\/wp\/v2\/media\/457507"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=457506"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=457506"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=457506"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}