{"id":271829,"date":"2025-11-08T21:07:11","date_gmt":"2025-11-08T21:07:11","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/271829\/"},"modified":"2025-11-08T21:07:11","modified_gmt":"2025-11-08T21:07:11","slug":"scientists-say-theyve-figured-out-how-to-transcribe-your-thoughts-from-an-mri-scan","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/271829\/","title":{"rendered":"Scientists Say They\u2019ve Figured Out How to Transcribe Your Thoughts From an MRI Scan"},"content":{"rendered":"<p class=\"mb-4 text-lg md:leading-8 break-words\">We\u2019re racing towards a future in which devices <a href=\"https:\/\/futurism.com\/professor-welcomes-future-employers-reading-brain\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:will be able to read our thoughts;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">will be able to read our thoughts<\/a>.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">You see signs of it everywhere, from <a href=\"https:\/\/tech.yahoo.com\/ai\/meta-ai\/articles\/sam-altman-funding-biomedical-startup-180226192.html\" data-ylk=\"slk:brain-computer interfaces;elm:context_link;itc:0;sec:content-canvas;outcm:mb_qualified_link;_E:mb_qualified_link;ct:story;\" class=\"link  yahoo-link\" rel=\"nofollow noopener\" target=\"_blank\">brain-computer interfaces<\/a> to algorithms that <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S2667305324000152\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:detect emotions from facial scans;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">detect emotions from facial scans<\/a>. 
And though the tech remains imperfect, it\u2019s getting closer all the time: now a team of scientists say they\u2019ve developed a model that can generate descriptions of what people\u2019s brains are seeing by simply analyzing a scan of their brain activity.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">They\u2019re calling the technique \u201cmind captioning,\u201d and it may represent an effective way of transcribing what someone\u2019s thinking, with impressively comprehensive and accurate results.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">\u201cThis is hard to do,\u201d Alex Huth, coauthor of a <a href=\"https:\/\/www.science.org\/doi\/10.1126\/sciadv.adw1464\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:new study;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">new study<\/a> in the journal Science Advances, and a computational neuroscientist at the University of California, Berkeley, <a href=\"https:\/\/www.nature.com\/articles\/d41586-025-03624-1\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:told Nature;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">told Nature<\/a>. \u201cIt\u2019s surprising you can get that much detail.\u201d<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">The implications of such technology are a double-edged sword: on the one hand, it could give a voice to people who struggle to speak due to stroke, aphasia, and other medical difficulties, but on the other hand, it may threaten our <a href=\"https:\/\/futurism.com\/neoscope\/scientists-mind-reading-rights\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:mental privacy;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">mental privacy<\/a> in an age when many other facets of our lives are surveilled and codified. But the team stresses that the model can\u2019t decode your private thoughts. 
\u201cNobody has shown you can do that, yet,\u201d Huth added.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">The researchers\u2019 new technique relies on several AI models. To train them, first a deep language model analyzed the text captions of more than 2,000 short-form videos, generating a unique \u201cmeaning signature\u201d for each. Then another AI tool was trained on the MRI brain scans of six participants while they watched the same videos, matching the brain activity to the signatures.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Combined, the resulting brain decoder could analyze a new brain scan from someone watching a video and predict the meaning signature, while an AI text generator searched for sentences that matched the predicted signature, creating dozens of candidate descriptions and refining them along the way.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">While it sounds like an elaborate chain of guessing games, the results were remarkably descriptive and mostly on the money. According to Nature, by analyzing the brain activity of a participant who watched a video of someone jumping from the top of a waterfall, the AI model initially predicted the string \u201cspring flow,\u201d refined that into \u201cabove rapid falling water fall\u201d on the tenth guess, and finally landed on \u201ca person jumps over a deep water fall on a mountain ridge\u201d on the 100th guess.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Overall, the generated text descriptions achieved 50 percent accuracy in identifying the correct video out of 100 possibilities. That\u2019s significantly higher than random chance, which would be around one percent, and impressive in the context of essentially divining coherent thoughts out of brain patterns.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">The researchers aren\u2019t the only ones to claim they\u2019ve developed a technique for scanning thoughts. 
But other attempts produced only crude descriptions of key words rather than detailed context, study coauthor Tomoyasu Horikawa, a computational neuroscientist at NTT Communication Science Laboratories in Kanagawa, Japan, told Nature. Others used AI models to form the sentences directly, blurring the line between the person\u2019s actual thoughts and AI-generated text.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Other techniques were wildly impractical. Meta, for example, <a href=\"https:\/\/tech.yahoo.com\/science\/articles\/meta-appears-invented-device-allowing-144529649.html\" data-ylk=\"slk:created a device that lets you type text with your brain;elm:context_link;itc:0;sec:content-canvas;outcm:mb_qualified_link;_E:mb_qualified_link;ct:story;\" class=\"link  yahoo-link\" rel=\"nofollow noopener\" target=\"_blank\">created a device that lets you type text with your brain<\/a> by combining a deep learning AI model with a magnetoencephalography scanner. 
But such a machine is prohibitively expensive and bulky, and can only be used inside a room shielded from the Earth\u2019s magnetic field.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">While this latest approach relied on scans from an MRI machine, which is no less impractical for daily use, the researchers hope that their technique could eventually be paired with brain implants that would provide the readings.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">\u201cIf we can do that using these artificial systems, maybe we can help out these people with communication difficulties,\u201d Huth told Nature.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">More on brain tech: <a href=\"https:\/\/tech.yahoo.com\/science\/articles\/neuralink-head-surgery-says-robot-210500560.html\" data-ylk=\"slk:Neuralink Head of Surgery Says Robot-Human Interface Happening \u201cVery Soon\u201d;elm:context_link;itc:0;sec:content-canvas;outcm:mb_qualified_link;_E:mb_qualified_link;ct:story;\" class=\"link  yahoo-link\" rel=\"nofollow noopener\" target=\"_blank\">Neuralink Head of Surgery Says Robot-Human Interface Happening \u201cVery Soon\u201d<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"We\u2019re racing towards a future in which devices will be able to read our thoughts. 
You see signs&hellip;\n","protected":false},"author":2,"featured_media":271830,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[157959,64,63,55069,157958,105],"class_list":{"0":"post-271829","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-technology","8":"tag-alex-huth","9":"tag-au","10":"tag-australia","11":"tag-brain-activity","12":"tag-computational-neuroscientist","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/271829","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=271829"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/271829\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/271830"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=271829"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=271829"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=271829"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}