{"id":10612,"date":"2025-09-09T19:58:16","date_gmt":"2025-09-09T19:58:16","guid":{"rendered":"https:\/\/www.newsbeep.com\/ie\/10612\/"},"modified":"2025-09-09T19:58:16","modified_gmt":"2025-09-09T19:58:16","slug":"china-unveils-brain-inspired-ai-for-next-gen-efficient-computing-2","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ie\/10612\/","title":{"rendered":"China unveils brain-inspired AI for next-gen efficient computing"},"content":{"rendered":"<p>BEIJING &#8212; Breaking from models like ChatGPT, a team of Chinese researchers has developed a novel AI system that mimics brain neurons, charting a new course for next-gen energy-efficient computing and hardware.<\/p>\n<p>Scientists from the Institute of Automation under the Chinese Academy of Sciences introduced &#8220;SpikingBrain-1.0,&#8221; a large-scale model trained and inferred entirely on home-grown GPU computing.<\/p>\n<p>Unlike mainstream generative AI systems that rely on the resource-intensive Transformer architecture &#8212; where intelligence grows with ever-larger networks, computing budgets and datasets &#8212; the novel model pursues a different path, allowing intelligence to emerge from spiking neurons.<\/p>\n<p>This model enables highly efficient training on extremely low data volumes. 
Using only about 2 percent of the pre-training data required by mainstream large models, it achieves performance comparable to that of multiple open-source models on language understanding and reasoning benchmarks, according to the team.<\/p>\n<p>By harnessing event-driven spiking neurons at the inference stage, one SpikingBrain variant is shown to deliver a 26.5-fold speed-up over Transformer architectures when generating the first token from a one-million-token context.<\/p>\n<p>The model&#8217;s ability to handle ultra-long sequences offers clear efficiency gains for tasks such as legal or medical document analysis, high-energy particle-physics experiments and DNA sequence modeling.<\/p>\n<p>The research team has open-sourced the SpikingBrain model, launched a public test page and released a large-scale, industry-validated bilingual technical report.<\/p>\n<p>&#8220;This large model opens up a non-Transformer technical path for the new generation of AI development,&#8221; said Xu Bo, director of the Institute of Automation. 
&#8220;It might inspire the design of next-generation neuromorphic chips with lower power consumption.&#8221;<\/p>\n<p>In a study reported last year in Nature Communications, scientists from the institute, working with Swiss counterparts, developed an energy-efficient sensing-computing neuromorphic chip that mimics the neurons and synapses of the human brain.<\/p>\n<p>The chip, dubbed &#8220;Speck,&#8221; has a resting power consumption of just 0.42 milliwatts, meaning it consumes almost no energy when there is no input.<\/p>\n<p>The human brain, capable of processing incredibly intricate and expansive neural networks, operates with a total power consumption of merely 20 watts, significantly lower than that of current AI systems.<\/p>\n","protected":false},"excerpt":{"rendered":"BEIJING &#8212; Breaking from models like ChatGPT, a team of Chinese researchers has developed a novel AI system&hellip;\n","protected":false},"author":2,"featured_media":10613,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[21],"tags":[220,84,114,241,61,60,10800,80],"class_list":{"0":"post-10612","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-computing","8":"tag-ai","9":"tag-brain","10":"tag-china","11":"tag-computing","12":"tag-ie","13":"tag-ireland","14":"tag-model","15":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/10612","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/comments?post=10612"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ie\/
wp-json\/wp\/v2\/posts\/10612\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media\/10613"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media?parent=10612"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/categories?post=10612"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/tags?post=10612"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}