{"id":610382,"date":"2026-04-16T09:11:11","date_gmt":"2026-04-16T09:11:11","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/610382\/"},"modified":"2026-04-16T09:11:11","modified_gmt":"2026-04-16T09:11:11","slug":"boston-dynamics-robot-dog-now-reads-gauges-and-thermometers-with-googles-ai","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/610382\/","title":{"rendered":"Boston Dynamics\u2019 robot dog now reads gauges and thermometers with Google&#8217;s AI"},"content":{"rendered":"<p>Robots such as Boston Dynamics\u2019 four-legged Spot can now accurately read analog thermometers and pressure gauges while roaming around factories and warehouses. Those improvements come courtesy of Google DeepMind\u2019s newest robotic AI model, which aims to enhance robotic capabilities for \u2018embodied reasoning\u2019 when interacting with physical environments.<\/p>\n<p>The new <a href=\"https:\/\/deepmind.google\/blog\/gemini-robotics-er-1-6\/\" rel=\"nofollow noopener\" target=\"_blank\">Gemini Robotics-ER 1.6<\/a> model, announced on April 14, serves as a \u201chigh-level reasoning model for a robot\u201d that can plan and execute tasks, according to Google DeepMind. The model also unlocks the ability to accurately read instruments such as complex gauges and to perform visual inspections through sight glasses, which provide a transparent window to peek inside tanks and pipes\u2014a performance upgrade that came about through <a href=\"https:\/\/bostondynamics.com\/blog\/boston-dynamics-google-deepmind-form-new-ai-partnership\/\" rel=\"nofollow noopener\" target=\"_blank\">Google DeepMind\u2019s ongoing collaboration<\/a> with robotics company Boston Dynamics.<\/p>\n<p>Boston Dynamics has a keen interest in testing both quadruped and humanoid robotic workers in a wide range of industrial facilities, including the automotive factories of the robotics company\u2019s corporate owner, Hyundai Motor Group. 
The company\u2019s robot \u201cdog,\u201d Spot, is being trialled as a robotic inspector that roams throughout industrial facilities to monitor equipment and instruments. Such inspection duties require \u201ccomplex visual reasoning\u201d to interpret the needles, liquid levels, container boundaries, tick marks and accompanying text on various instruments.<\/p>\n<p>The model driving it<\/p>\n<p>To handle such tasks, the Gemini Robotics-ER 1.6 model provides robots with \u201cagentic vision,\u201d which combines visual reasoning with the ability to execute code, creating a \u201cvisual scratchpad\u201d for inspecting and manipulating images. Such <a href=\"https:\/\/blog.google\/innovation-and-ai\/technology\/developers-tools\/agentic-vision-gemini-3-flash\/\" rel=\"nofollow noopener\" target=\"_blank\">agentic vision<\/a> was introduced in Google\u2019s <a href=\"https:\/\/arstechnica.com\/google\/2025\/12\/google-releases-gemini-3-flash-promising-improved-intelligence-and-efficiency\/\" rel=\"nofollow noopener\" target=\"_blank\">Gemini 3.0 Flash model<\/a> back in January 2026.<\/p>\n<p>The agentic vision capability reportedly boosts robotic accuracy on instrument-reading tasks from 23 percent with the older <a href=\"https:\/\/arstechnica.com\/google\/2025\/09\/google-deepmind-unveils-its-first-thinking-robotics-ai\/\" rel=\"nofollow noopener\" target=\"_blank\">Gemini Robotics-ER 1.5 model<\/a> to 98 percent with the new Gemini Robotics-ER 1.6 model. For comparison, Gemini 3.0 Flash delivered just 67 percent accuracy.<\/p>\n<p>The baseline Gemini Robotics-ER 1.6 model can still achieve 86 percent accuracy in reading instruments even without agentic vision. That is because the model points to different elements within an image to work through complex tasks, such as counting items or identifying the most salient features. 
It also supposedly delivers an improved \u201cmulti-view reasoning\u201d capability that allows a robotic system to use multiple camera streams to better understand its environment.<\/p>\n","protected":false},"excerpt":{"rendered":"Robots such as Boston Dynamics\u2019 four-legged Spot can now accurately read analog thermometers and pressure gauges while roaming&hellip;\n","protected":false},"author":2,"featured_media":610383,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-610382","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/610382","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=610382"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/610382\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/610383"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=610382"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=610382"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=610382"}],"curies":[{"name":"wp","href":"h
ttps:\/\/api.w.org\/{rel}","templated":true}]}}