{"id":147573,"date":"2025-09-10T22:48:10","date_gmt":"2025-09-10T22:48:10","guid":{"rendered":"https:\/\/www.newsbeep.com\/us\/147573\/"},"modified":"2025-09-10T22:48:10","modified_gmt":"2025-09-10T22:48:10","slug":"china-unveils-brain-inspired-ai-for-next-gen-efficient-computing","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/us\/147573\/","title":{"rendered":"China unveils brain-inspired AI for next-gen efficient computing"},"content":{"rendered":"<p><img fetchpriority=\"high\" decoding=\"async\" aria-describedby=\"caption-attachment-1277425\" class=\"size-full wp-image-1277425\" src=\"https:\/\/www.newsbeep.com\/us\/wp-content\/uploads\/2025\/09\/brain-inspired-AI1.jpg\" alt=\"\" width=\"1000\" height=\"701\"\/><\/p>\n<p id=\"caption-attachment-1277425\" class=\"wp-caption-text\">A visitor interacts with a robot dog of Lenovo at the exhibition area of the 2025 Global Industrial Internet Conference in Shenyang, northeast China\u2019s Liaoning Province on Sept 6, 2025. \u2013 Xinhua photo<\/p>\n<p>BEIJING (Sept 11): Breaking from models like ChatGPT, a team of Chinese researchers has developed a novel AI system that mimics brain neurons, charting a new course for next-gen energy-efficient computing and hardware.<\/p>\n<p>Scientists from the Institute of Automation under the Chinese Academy of Sciences introduced \u201cSpikingBrain-1.0\u201d, a large-scale model trained and run for inference entirely on home-grown GPU computing hardware.<\/p>\n<p>Unlike mainstream generative AI systems that rely on the resource-intensive Transformer architecture \u2013 where intelligence grows with ever-larger networks, computing budgets and datasets \u2013 the new model pursues a different path, allowing intelligence to emerge from spiking neurons.<\/p>\n<p>The model supports highly efficient training on very small data volumes. 
Using only about 2 percent of the pre-training data required by mainstream large models, it achieves performance comparable to multiple open-source models on language understanding and reasoning challenges, according to the team.<\/p>\n<p>By harnessing event-driven spiking neurons at the inference stage, one SpikingBrain variant is shown to deliver a 26.5-fold speed-up over Transformer architectures when generating the first token from a one-million-token context.<\/p>\n<p>The model\u2019s ability to handle ultra-long sequences offers clear efficiency gains for tasks such as legal or medical document analysis, high-energy particle-physics experiments and DNA sequence modeling.<\/p>\n<p>The research team has open-sourced the SpikingBrain model, launched a public test page and released a large-scale, industry-validated bilingual technical report.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-1277426\" class=\"size-full wp-image-1277426\" src=\"https:\/\/www.newsbeep.com\/us\/wp-content\/uploads\/2025\/09\/brain-inspired-AI2.jpg\" alt=\"\" width=\"1000\" height=\"659\" \/><\/p>\n<p id=\"caption-attachment-1277426\" class=\"wp-caption-text\">A humanoid robot interacts with visitors during the World Smart Industry Expo 2025 in southwest China\u2019s Chongqing Municipality on Sept 7, 2025. 
\u2013 Xinhua photo<\/p>\n<p>\u201cThis large model opens up a non-Transformer technical path for the new generation of AI development,\u201d said Xu Bo, director of the Institute of Automation.<\/p>\n<p>\u201cIt might inspire the design of next-generation neuromorphic chips with lower power consumption.\u201d<\/p>\n<p>In work reported last year in Nature Communications, scientists from the institute, working with Swiss counterparts, developed an energy-efficient sensing-computing neuromorphic chip that mimics the neurons and synapses of the human brain.<\/p>\n<p>The chip, dubbed \u201cSpeck\u201d, has a resting power consumption of just 0.42 milliwatts, meaning it consumes almost no energy when there is no input.<\/p>\n<p>The human brain, capable of processing intricate and expansive neural networks, operates on a total power consumption of merely 20 watts, significantly lower than that of current AI systems. \u2013 Xinhua<\/p>\n","protected":false},"excerpt":{"rendered":"A visitor interacts with a robot dog of Lenovo at the 
exhibition&hellip;\n","protected":false},"author":2,"featured_media":147574,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[46],"tags":[182,181,144,191,74,34379],"class_list":{"0":"post-147573","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-computing","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-china","11":"tag-computing","12":"tag-technology","13":"tag-xinhua"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/147573","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/comments?post=147573"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/147573\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media\/147574"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media?parent=147573"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/categories?post=147573"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/tags?post=147573"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}