<h1>OpenAI is throwing everything into building a fully automated researcher</h1>

<p>“I think it’s going to be a long time before we can really be like, okay, this problem is solved,” he says. “Until you can really trust the systems, you definitely want to have restrictions in place.” Pachocki thinks that very powerful models should be deployed in sandboxes, cut off from anything they could break or use to cause harm.</p>

<p>AI tools have already been used to come up with novel cyberattacks. Some worry that they will be used to design synthetic pathogens that could be used as bioweapons. You can insert any number of evil-scientist scare stories here. “I definitely think there are worrying scenarios that we can imagine,” says Pachocki.</p>

<p>“It’s going to be a very weird thing. It’s extremely concentrated power that’s in some ways unprecedented,” says Pachocki. “Imagine you get to a world where you have a data center that can do all the work that OpenAI or Google can do. Things that in the past required large human organizations would now be done by a couple of people.”</p>

<p>“I think this is a big challenge for governments to figure out,” he adds.</p>

<p>And yet some people would say governments are part of the problem. The <a href="https://www.technologyreview.com/2026/03/12/1134243/defense-official-military-use-ai-chatbots-targeting-decisions/" rel="nofollow noopener" target="_blank">US government wants to use AI on the battlefield</a>, for example. The recent showdown between Anthropic and the Pentagon revealed that there is little agreement across society about where we draw red lines for how this technology should and should not be used, let alone who should draw them. In the immediate aftermath of that dispute, <a href="https://www.technologyreview.com/2026/03/02/1133850/openais-compromise-with-the-pentagon-is-what-anthropic-feared/" rel="nofollow noopener" target="_blank">OpenAI stepped up to sign a deal with the Pentagon</a> instead of its rival. The situation remains murky.</p>

<p>I pushed Pachocki on this. Does he really trust other people to figure it out, or does he, as a key architect of the future, feel personal responsibility? “I do feel personal responsibility,” he says. “But I don’t think this can be resolved by OpenAI alone, pushing its technology in a particular way or designing its products in a particular way. We’ll definitely need a lot of involvement from policymakers.”</p>

<p>Where does that leave us? Are we really on a path to the kind of AI Pachocki envisions? When I asked the Allen Institute’s Downey, he laughed. “I’ve been in this field for a couple of decades and I no longer trust my predictions for how near or far certain capabilities are,” he says.</p>

<p>OpenAI’s stated mission is to ensure that artificial general intelligence (a hypothetical future technology that many AI boosters believe will be able to match humans on most cognitive tasks) will benefit all of humanity. OpenAI aims to do that by being the first to build it. But the only time Pachocki mentioned AGI in our conversation, he was quick to clarify what he meant by talking about “economically transformative technology” instead.</p>

<p>LLMs are not like human brains, he says: “They are superficially similar to people in some ways because they’re kind of mostly trained on people talking. But they’re not formed by evolution to be really efficient.”</p>

<p>“Even by 2028, I don’t expect that we’ll get systems as smart as people in all ways. I don’t think that will happen,” he adds. “But I don’t think it’s absolutely necessary. The interesting thing is you don’t need to be as smart as people in all their ways in order to be very transformative.”</p>