{"id":330593,"date":"2026-03-05T17:06:09","date_gmt":"2026-03-05T17:06:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/ie\/330593\/"},"modified":"2026-03-05T17:06:09","modified_gmt":"2026-03-05T17:06:09","slug":"thermodynamic-computing-advances-with-design-and-training","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ie\/330593\/","title":{"rendered":"Thermodynamic Computing Advances with Design and Training"},"content":{"rendered":"<p>\n\t\t\t\t\t\t\t\t\t\tBYLINE: Elizabeth Ball\t\t\t\t\t\t\t\t\t\t<\/p>\n<p>Newswise \u2014 What if the thermal noise that hinders the efficiency of both classical and quantum computers could, instead, be used as a power source? What if computers could make use of the noise instead of suppressing or overcoming it? These are the goals of a relatively new branch of computing known as thermodynamic computing. A collaboration between researchers at the Molecular Foundry and the National Energy Research Scientific Computing Center (NERSC), both U.S. Department of Energy (DOE) user facilities located at Lawrence Berkeley National Laboratory (Berkeley Lab), is bringing them closer to reality. In <a href=\"https:\/\/www.nature.com\/articles\/s41467-025-67958-0\" rel=\"nofollow noopener\" target=\"_blank\">a paper published in Nature Communications<\/a>, the researchers have proposed a design and training framework for a type of thermodynamic computer that mimics a neural network, which could drastically reduce the energy requirements of machine learning.\u00a0<\/p>\n<p>Modern computing requires energy: a single Google search, for example, consumes enough energy to power a six-watt LED for three minutes. This is partly because computers must contend with thermal noise \u2014 that is, the vibration of charge carriers, mostly electrons, within electronically conductive materials. 
In classical computers, even the smallest devices, such as transistors and gates, operate at energy scales thousands of times larger than that of this vibration. This difference in scale between signal and noise enables the consistent output that makes computation possible, but it comes at an energy cost: classical computers require large amounts of power to work reliably, and they operate far above the threshold of thermodynamic efficiency.

Both classical and quantum computing seek to eliminate or tamp down thermal noise. But thermodynamic computing, a branch of unconventional computing, inverts the paradigms of both and uses those same fluctuations as its power source. This drastically reduces the amount of external energy required to perform computations and allows for operation at room temperature, unlike many quantum computers. In this way, thermodynamic computing is an exciting example of Beyond-Moore's-Law microelectronics and low-power, energy-aware computing.

"Thermodynamic computing is noise-powered," said Molecular Foundry staff scientist Stephen Whitelam, an author on the paper. "The premise of thermodynamic computing is that if you take a physical device with an energy scale comparable to that of thermal energy and leave it alone, it will change state over time, driven by thermal fluctuations. The goal is to program it so that this time evolution does something useful. Classical and quantum computing fight noise; thermodynamic computing is powered by it."

Overcoming roadblocks

Thus far, two primary challenges have stood in the way of thermodynamic computing as a practical framework for computation. First, existing thermodynamic computers are designed to do computation at thermodynamic equilibrium, meaning researchers must wait for the computer to settle into its lowest-energy configuration before they can perform a calculation.
Even if a system's ground state is well-defined, the amount of time it takes to reach equilibrium is unpredictable — and it can be too long to be practical for day-to-day computational use.

Additionally, the range of computations that can be performed using thermodynamic computing has been limited to linear algebra problems. For thermodynamic computing to be useful for general-purpose computation, systems will also need to be able to perform nonlinear calculations.

In their paper, Whitelam and his colleague Corneel Casert of NERSC address these challenges, using digital simulations to demonstrate that nonlinear computations — like those performed by neural networks — are indeed possible using thermodynamic computers that are not working at equilibrium.

According to Whitelam and Casert, when the components of the computer are themselves nonlinear, it becomes possible to train a thermodynamic computer to perform nonlinear computations at specified times, regardless of its equilibrium status. This means the computer operates more like a classical computer, without the need to wait for equilibrium. It also expands the set of thermodynamic algorithms to the same types of complex, nonlinear problems a neural network can solve, meaning thermodynamic computing could be an appropriate tool for machine-learning workloads that have previously been outside its capabilities.

"A nonlinear thermodynamic circuit can behave like a neuron in a neural network," said Whitelam. "Nonlinearity is what gives a neural network its expressive power.
What we reasoned is that if you build these thermodynamic neurons into a connected structure, then that structure should have the expressive power to mimic a neural network and so be able to do machine learning."

Together, these solutions expand what thermodynamic computing can do.

Inverted training

The challenge, then, becomes training such a system. A thermodynamic computer is a stochastic system, meaning that no two runs on a thermodynamic computer look the same, and the methods used for training digital neural networks don't apply. But Whitelam and Casert have offered a solution there as well.

To train Whitelam's model of the thermodynamic computer, Casert engineered a large-scale computational framework. Using 96 GPUs in parallel on the Perlmutter supercomputer at NERSC, Casert built and ran massively parallel evolutionary simulations, evaluating billions of noisy dynamical trajectories per generation to discover the most effective network parameters.

In particular, he used a framework known as a genetic algorithm: beginning with a set of different thermodynamic neural networks and evaluating the effectiveness of each, he selected the best-performing networks, mutated them by adding random noise to their parameters, and evaluated them again. Ultimately, Casert simulated more than a trillion runs of a thermodynamic computer on Perlmutter's GPUs. This training framework is considerably more costly than the methods used to train digital networks, but it yields a computer that can operate using very little energy after it's built and trained.

"It's a very different way of optimizing a neural network.
Training a thermodynamic neural network by simulating it digitally is expensive, but once it is trained and built as physical hardware, we can perform inference on that hardware at a very low energy cost," said Casert.

The combination of design and training shows that a machine-learning computer that uses far less energy is possible.

More hardware, more algorithms

The field of thermodynamic computing is relatively young — so where does it go from here? According to Whitelam, it's important to work out how to realize these designs in hardware. Currently, the team is looking for experimental partners to make both hardware and software a reality — another step in exploring what's possible with thermodynamic computing.

Another step, he says, is more algorithms. Existing algorithms are meant for systems working at equilibrium; with that requirement no longer a roadblock, new ones will need to be developed. The field will also need new algorithms for nonlinear computations, mirroring the ones used for digital neural networks.

"It's an exciting field," said Whitelam. "We're looking for more efficient ways of computing, and thermodynamic computing is definitely one of them."

###

Lawrence Berkeley National Laboratory (Berkeley Lab; https://www.lbl.gov/) is committed to groundbreaking research focused on discovery science and solutions for abundant and reliable energy supplies. The lab's expertise spans materials, chemistry, physics, biology, earth and environmental science, mathematics, and computing. Researchers from around the world rely on the lab's world-class scientific facilities for their own pioneering research.
Founded in 1931 on the belief that the biggest problems are best addressed by teams, Berkeley Lab and its scientists have been recognized with 17 Nobel Prizes. Berkeley Lab is a multiprogram national laboratory managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
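Sidebar: the select-and-mutate loop described in the "Inverted training" section can be illustrated in a few lines of code. This is a minimal sketch of a generic genetic algorithm only, not the researchers' implementation: the population size, mutation scale, and the toy noisy objective (which stands in for simulating a stochastic thermodynamic circuit) are all invented for the example.

```python
import random

def evaluate(params, trials=32, rng=random):
    """Noisy fitness: negative squared distance to a hidden target,
    averaged over stochastic trials (a toy stand-in for evaluating
    many noisy dynamical trajectories of a thermodynamic circuit)."""
    target = [0.5, -1.0, 2.0]  # hypothetical "ideal" parameters
    total = 0.0
    for _ in range(trials):
        total += -sum((p - t + rng.gauss(0.0, 0.1)) ** 2
                      for p, t in zip(params, target))
    return total / trials

def train(pop_size=32, n_keep=8, generations=60, sigma=0.2, seed=0):
    rng = random.Random(seed)
    # Begin with a population of random candidate parameter vectors.
    population = [[rng.uniform(-3.0, 3.0) for _ in range(3)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluate the effectiveness of each candidate on the noisy objective.
        ranked = sorted(population, key=lambda p: evaluate(p, rng=rng),
                        reverse=True)
        parents = ranked[:n_keep]  # select the best-performing candidates
        # Mutate the survivors by adding random noise to their parameters.
        children = [[x + rng.gauss(0.0, sigma) for x in rng.choice(parents)]
                    for _ in range(pop_size - n_keep)]
        population = parents + children
    # Return the best candidate found.
    return max(population, key=lambda p: evaluate(p, rng=rng))

best = train()
```

Because the loop only ever compares averaged fitness scores, it never needs gradients, which is what makes this style of optimization usable for stochastic systems where no two runs look the same.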