{"id":583154,"date":"2026-04-05T05:09:09","date_gmt":"2026-04-05T05:09:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/583154\/"},"modified":"2026-04-05T05:09:09","modified_gmt":"2026-04-05T05:09:09","slug":"wetware-ai-living-brain-cells-trained-to-run-chaos-math","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/583154\/","title":{"rendered":"Wetware AI: Living Brain Cells Trained to Run Chaos Math"},"content":{"rendered":"<p>Summary: The line between biology and computer science just got even blurrier. Researchers have successfully trained living rat neurons to perform complex machine learning tasks. The study integrated cultured neuronal networks into a \u201creservoir computing\u201d framework.<\/p>\n<p>Using a technique called FORCE learning, the team taught these biological circuits to generate intricate mathematical patterns\u2014including the chaotic Lorenz attractor\u2014proving that living \u201cwetware\u201d can serve as a functional, real-time computational resource.<\/p>\n<p>Key Facts<\/p>\n<p>Reservoir Computing: This framework uses the \u201cnatural\u201d messiness and complexity of a network (the reservoir) to process data. Instead of training every single neuron, scientists only train the \u201creadout\u201d layer that interprets the network\u2019s activity.FORCE Learning: A method used to adjust output signals in real-time based on errors. This is the first time it has been successfully applied to a Biological Neural Network (BNN) to generate time-series data.The \u201cChaos\u201d Test: The living neurons didn\u2019t just learn simple sine waves; they successfully reproduced the Lorenz attractor, a complex set of equations used to model chaotic systems like weather patterns.Microfluidic Precision: Researchers used tiny \u201cplumbing\u201d (microfluidics) to guide how the neurons grew. 
By creating modular \u201cneighborhoods\u201d of cells, they prevented the neurons from all firing at once (synchronization), which is critical for high-level computing.<\/p>\n<p>Versatility: The same biological system was flexible enough to learn waves with periods ranging from 4 to 30 seconds, demonstrating that living networks are remarkably adaptable.<\/p>\n<p>Source: Tohoku University<\/p>\n<p>A research team at Tohoku University and Future University Hakodate has demonstrated that living biological neurons can be trained to perform a supervised temporal pattern learning task previously carried out by artificial systems. <\/p>\n<p>By integrating cultured neuronal networks into a machine learning framework, the team showed that these biological systems can generate complex time-series signals, marking a significant step forward in both neuroscience and bio-inspired computing.<\/p>\n<p>The study was published online in Proceedings of the National Academy of Sciences (PNAS) on March 12, 2026, highlighting a novel intersection between living neural systems and computational technology. The findings suggest that biological neural networks (BNNs) may serve as viable alternatives or complements to existing machine learning models.<\/p>\n<p>Artificial neural networks (ANNs) and spiking neural networks (SNNs) have long been used in machine learning and neuromorphic hardware. A framework known as reservoir computing has emerged as an efficient approach for processing time-dependent data by leveraging the dynamic properties of recurrently connected ANNs and SNNs.<\/p>\n<p>In conventional ANN-based reservoir computing, methods such as First-Order Reduced and Controlled Error (FORCE) learning enable real-time adaptation by continuously adjusting output signals in response to errors.<\/p>\n<p>These techniques allow artificial systems to generate a wide range of temporal patterns, including periodic and chaotic signals. 
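<\/p>\n<p>A minimal sketch may make the FORCE idea concrete. The toy below uses a rate-based echo-state reservoir as a stand-in for a recurrent network and trains only a linear readout with the recursive-least-squares update that FORCE is built on; every parameter here (network size, gain, the one-unit-period sine target) is an illustrative assumption, not a value from the study.<\/p>

```python
import numpy as np

# Toy FORCE learning on a rate-based reservoir (illustrative
# parameters; a stand-in for the biological network, not the
# paper's actual setup).
rng = np.random.default_rng(0)
N, dt, g = 300, 0.01, 1.5                 # reservoir size, Euler step, chaos gain
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # fixed recurrent weights
w_fb = rng.uniform(-1.0, 1.0, N)          # fixed feedback weights
w = np.zeros(N)                           # trainable linear readout
P = np.eye(N)                             # RLS inverse-correlation estimate
x = 0.5 * rng.standard_normal(N)          # reservoir state

T = 4000
target = np.sin(2 * np.pi * np.arange(T) * dt)  # sine with a 1-unit period
errors = []
for t in range(T):
    r = np.tanh(x)                        # firing rates
    z = w @ r                             # readout output, fed back into the net
    x += dt * (-x + J @ r + w_fb * z)     # leaky reservoir dynamics
    # RLS step: nudge only the readout so z tracks the target online
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w -= (z - target[t]) * k
    errors.append(abs(z - target[t]))     # instantaneous training error
```

<p>The point of the sketch is the property the article highlights: the recurrent weights J are never touched, only the readout w is trained, which is what makes the scheme transferable to a network whose internal connections cannot be reprogrammed.<\/p>\n<p>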
However, whether similar approaches could be applied to biological neural networks has remained an open question.<\/p>\n<p>To address this gap, the researchers constructed biological neural networks using cultured rat cortical neurons and incorporated them into a reservoir computing framework.<\/p>\n<p>By applying FORCE learning to optimize the system\u2019s readout layer, the team successfully trained the biological networks to produce complex temporal signals comparable to those involved in motor control.<\/p>\n<p>A key innovation in the study was the use of microfluidic devices to precisely guide neuronal growth and control network connectivity. This approach enabled the researchers to create modular network architectures that minimized excessive synchronization, thereby promoting the rich, high-dimensional dynamics required for effective reservoir computing.<\/p>\n<p>Using this system, the BNN-based framework was able to generate a variety of time-series patterns, including sine waves, triangular waves, square waves, and even chaotic trajectories such as the Lorenz attractor. Notably, the network demonstrated flexibility by learning and stably reproducing sine waves with periods ranging from 4 to 30 seconds within the same system.<\/p>\n<p>\u201cThis work shows that living neuronal networks are not only biologically meaningful systems but may also serve as novel computational resources,\u201d said Hideaki Yamamoto, a professor at Tohoku University.<\/p>\n<p>\u201cBy bridging neuroscience and machine learning, we are opening a pathway toward new forms of computing that leverage the intrinsic dynamics of biological systems.\u201d<\/p>\n<p>Looking ahead, the research team aims to improve the stability of signal generation after training has concluded. Future efforts will focus on reducing feedback delays and refining the FORCE learning algorithm. 
In parallel, the platform may be expanded into a microphysiological system for studying drug responses and modeling neurological disorders, further extending its impact across both scientific and medical fields.<\/p>\n<p>Key Questions Answered:<\/p>\n<p>Q: Are we basically building \u201cCyborg\u201d computers now?<\/p>\n<p class=\"schema-faq-answer\">A: We\u2019re moving in that direction! This is called \u201cWetware Computing.\u201d Unlike traditional silicon chips, these biological reservoirs use the intrinsic, \u201cnoisy\u201d physics of living cells to solve problems. They are incredibly energy-efficient and can adapt to new information in ways that rigid AI models often struggle with.<\/p>\n<p>Q: How do you \u201cteach\u201d a dish of cells to do math?<\/p>\n<p class=\"schema-faq-answer\">A: It\u2019s like a conductor leading an orchestra. The \u201creservoir\u201d of neurons is already playing a million different notes. The researchers use FORCE learning to listen to those notes and \u201creward\u201d the ones that fit the pattern they want (like a sine wave). Over time, the output layer learns exactly which neurons to \u201clisten\u201d to in order to get the right result.<\/p>\n<p>Q: What is the benefit of using real neurons over a standard AI?<\/p>\n<p class=\"schema-faq-answer\">A: Biology is the ultimate master of parallel processing. A single biological network can handle massive amounts of time-dependent data with very little power. 
Additionally, these systems could be used to test how drugs affect \u201cthinking\u201d circuits or to model neurological diseases in a dish without needing animal testing.<\/p>\n<p>Editorial Notes: This article was edited by a Neuroscience News editor. Journal paper reviewed in full. Additional context added by our staff.<\/p>\n<p>About this AI and neuroscience research news<\/p>\n<p class=\"has-background\" style=\"background-color:#ffffe8\">Author:\u00a0Public Relations Office<br \/>Source:\u00a0<a href=\"https:\/\/www.tohoku.ac.jp\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Tohoku University<\/a><br \/>Contact:\u00a0Public Relations Office \u2013 Tohoku University<br \/>Image:\u00a0The image is credited to Neuroscience News<\/p>\n<p class=\"has-background\" style=\"background-color:#ffffe8\">Original Research:\u00a0Open access.<br \/>\u201c<a href=\"https:\/\/dx.doi.org\/10.1073\/pnas.2521560123\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Online supervised learning of temporal patterns in biological neural networks under feedback control<\/a>\u201d by Yuki Sono, Hideaki Yamamoto, Yusei Nishi, Takuma Sumi, Yuya Sato, Ayumi Hirano-Iwata, Yuichi Katori, and Shigeo Sato.\u00a0PNAS<br \/>DOI: 10.1073\/pnas.2521560123<\/p>\n<p>Abstract<\/p>\n<p>Online supervised learning of temporal patterns in biological neural networks under feedback control<\/p>\n<p>In vitro biological neural networks (BNNs) provide well-defined model systems for constructively investigating how living cells interact with their environments to shape high-dimensional dynamics that can be used to generate coherent temporal outputs, such as those required for motor control.<\/p>\n<p>Here, we develop a real-time closed-loop BNN system that is capable of generating periodic and chaotic temporal signals by integrating cultured 
cortical neurons with microfluidic devices and high-density microelectrode arrays.<\/p>\n<p>We show that training a simple linear decoder with fixed feedback weights enables the system to learn and autonomously generate diverse temporal patterns. When feedback is switched on, the irregular activity in the BNNs is transformed into low-dimensional, structured dynamics, producing coherent trajectories that are characterized by stable transitions between different neural states.<\/p>\n<p>BNNs trained on various target periods\u2014ranging from 4 to 30 s\u2014learn to sustain oscillations at distinct frequencies, demonstrating their adaptability. Importantly, top\u2013down control of the self-organized network formation with microfluidic devices is the key to suppressing excessive synchronization and increasing dynamic complexity in BNNs, facilitating the training process and the generation of robust outputs.<\/p>\n<p>This work offers a biologically inspired platform for understanding the physical basis of cortical computations and for advancing energy-efficient neuromorphic computing paradigms.<\/p>\n","protected":false},"excerpt":{"rendered":"Summary: The line between biology and computer science just got even blurrier. 
Researchers have successfully trained living rat&hellip;\n","protected":false},"author":2,"featured_media":583155,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[62,276,277,223025,49,48,25061,223026,796,9078,92079,61,79720,223027],"class_list":{"0":"post-583154","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-biological-neural-networks","12":"tag-ca","13":"tag-canada","14":"tag-deep-learning","15":"tag-force-learning","16":"tag-machine-learning","17":"tag-neuroscience","18":"tag-reservoir-computing","19":"tag-technology","20":"tag-tohoku-university","21":"tag-wetware-computing"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/583154","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=583154"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/583154\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/583155"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=583154"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=583154"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=583154"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":tru
e}]}}