{"id":372692,"date":"2026-04-10T08:56:07","date_gmt":"2026-04-10T08:56:07","guid":{"rendered":"https:\/\/www.newsbeep.com\/nz\/372692\/"},"modified":"2026-04-10T08:56:07","modified_gmt":"2026-04-10T08:56:07","slug":"ai-products-are-reaching-further-into-our-lives-does-it-matter-who-controls-the-companies-behind-them-van-badham","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/nz\/372692\/","title":{"rendered":"AI products are reaching further into our lives. Does it matter who controls the companies behind them? | Van Badham"},"content":{"rendered":"<p class=\"dcr-130mj7b\">The joke on the internet asks: \u201cWhat are the seven most terrifying words in the English language?\u201d The answer: \u201cRonan Farrow\u2019s been asking questions about you.\u201d<\/p>\n<p class=\"dcr-130mj7b\">The investigative journalist has a piece in The New Yorker this week, in which the subject of said inquiries is <a href=\"https:\/\/www.theguardian.com\/technology\/sam-altman\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">Sam Altman<\/a>, the billionaire founder and CEO of OpenAI, the company that owns ChatGPT.<\/p>\n<p class=\"dcr-130mj7b\">Farrow\u2019s new piece raises timely, broader questions of who has power, who should have it, who absolutely shouldn\u2019t \u2026 and what we do if they have it, anyway.<\/p>\n<p class=\"dcr-130mj7b\">OpenAI\u2019s products now reach into everything, from your smartphone to defence contracts to law enforcement. 
Its operations have a growing hunger for electric power; its datacentres are spreading across the planet; and the labour market implications of its potential to replace jobs suggest an industrial upheaval for white-collar workers on a world-changing scale.<\/p>\n<p class=\"dcr-130mj7b\">The commercial momentum of this company is such that, despite a projected loss of $14bn in 2026 reported in early March \u2013 tripling estimates made in 2025 \u2013 OpenAI still held an eye-watering market valuation of $852bn by March\u2019s end.<\/p>\n<p><a data-link-name=\"standard link button Primary\" data-spacefinder-role=\"inline\" data-ignore=\"global-link-styling\" href=\"https:\/\/www.theguardian.com\/email-newsletters?CMP=copyembed&amp;CMP=emailbutton\" class=\"dcr-svb9qg\" rel=\"nofollow noopener\" target=\"_blank\">Sign up for the Breaking News Australia email<\/a><\/p>\n<p class=\"dcr-130mj7b\">Farrow\u2019s piece claims the OpenAI board had doubts about whether they could trust Altman when they fired him in 2023.<\/p>\n<p class=\"dcr-130mj7b\">According to Farrow, Altman then convened a \u201cwar room\u201d comprising crisis communicators \u2013 and some influential company investors \u2013 to defend his reputation. 
He was <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/nov\/22\/sam-altman-openai-ceo-return-board-chatgpt\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">reinstated five days later<\/a>; reportedly, pressure from investor Microsoft and a threat from 700 staff to resource any competing Altman venture were critically persuasive in discussions.<\/p>\n<p class=\"dcr-130mj7b\">Three years later, the company, with a CEO its own board allegedly did not trust, has publicly concluded a deal with the US military <a href=\"https:\/\/www.theguardian.com\/technology\/2026\/mar\/03\/openai-pentagon-ceo-sam-altman-chatgpt\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">to use its technology in classified operations<\/a>.<\/p>\n<p class=\"dcr-130mj7b\">The deal was announced in the wake of its AI rival, Anthropic, expressing concern that the US government could, potentially, employ its own proprietary AI tools as instruments of \u201cmass surveillance\u201d and for \u201cfully autonomous weapons\u201d.<\/p>\n<p class=\"dcr-130mj7b\">The Trump administration emphatically <a href=\"https:\/\/www.bbc.com\/news\/articles\/cn48jj3y8ezo\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">ceased business with Anthropic<\/a>, and OpenAI leapt in.<\/p>\n<p class=\"dcr-130mj7b\">Facing a backlash, Altman described the original deal OpenAI concluded with Pete Hegseth\u2019s department as \u201copportunistic and sloppy\u201d. 
The company subsequently released a statement <a href=\"https:\/\/www.bbc.com\/news\/articles\/c3rz1nd0egro\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">reassuring the public<\/a> its Pentagon agreement had \u201cmore guardrails than any previous agreement for classified AI deployments, including Anthropic\u2019s\u201d.<\/p>\n<p class=\"dcr-130mj7b\">By OpenAI\u2019s own word, the company believes \u201cstrongly in democracy\u201d and that the \u201conly good path forward requires deep collaboration between AI efforts and the democratic process\u201d.<\/p>\n<p class=\"dcr-130mj7b\">How perplexing! As <a href=\"https:\/\/www.techpolicy.press\/five-unresolved-issues-in-openais-deal-with-the-department-of-defense\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Jake Laperruque from Tech Policy<\/a> observes, OpenAI\u2019s cited \u201cred lines\u201d against mass domestic surveillance, direct autonomous weapons systems and high-stakes automated decisions seem to be largely indistinguishable from those \u201cthat caused the planned Anthropic agreement not only to fail, but to explode in shocking fashion\u201d.<\/p>\n<p class=\"dcr-130mj7b\">I\u2019m also curious about the company\u2019s interpretation of \u201cdeep collaboration\u201d with the democratic process. 
Perhaps we could glean its nature from the revelation that OpenAI top executive Greg Brockman was a <a href=\"https:\/\/www.wired.com\/story\/openai-president-greg-brockman-political-donations-trump-humanity\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">$25m donor<\/a> to a Trump fundraising vehicle in January.<\/p>\n<p class=\"dcr-130mj7b\">Brockman is also a participant in an <a href=\"https:\/\/www.cnbc.com\/2026\/01\/30\/ai-industry-super-pac-raises-campaign-money.html#:~:text=Super%20PAC%20Leading%20the%20Future,to%20help%20elevate%20that%20message.%22\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">AI \u201cSuperPAC\u201d<\/a> fundraising vehicle that in 2025 raised $125m to further its goal of <a href=\"https:\/\/www.cnbc.com\/2025\/11\/20\/trump-ai-executive-order-state-funding.html\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">backing candidates who support national AI regulations rather than state-by-state rules<\/a>.<\/p>\n<p class=\"dcr-130mj7b\">In December last year, Trump signed an executive order limiting state regulations of AI, preferring a \u201c<a href=\"https:\/\/www.pbs.org\/newshour\/show\/trumps-executive-order-limits-state-regulations-of-artificial-intelligence\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">minimally burdensome national standard<\/a>\u201d to regulate technology.<\/p>\n<p class=\"dcr-130mj7b\">I\u2019m sure it\u2019s just a coincidence. 
<a href=\"https:\/\/youtu.be\/cPpEotcZf1A?si=JXolhHsvSRNNnJDh\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">So are they all, all honourable men<\/a>.<\/p>\n<p class=\"dcr-130mj7b\">And yet, somehow, concerns do nag about the character of decision-making processes regarding a technology that OpenAI\u2019s own staff researchers believe is a \u201c<a href=\"https:\/\/www.afr.com\/technology\/humanity-threatening-ai-development-before-sam-altman-s-ouster-sources-20231123-p5emej\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">threat to humanity<\/a>\u201d.<\/p>\n<p class=\"dcr-130mj7b\">Ethical anxiety has inspired activist\/historian <a href=\"https:\/\/www.theguardian.com\/commentisfree\/2026\/mar\/04\/quit-chatgpt-subscription-boycott-silicon-valley\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Rutger Bregman to start a \u201cQuitGPT\u201d campaign for a worldwide boycott of Altman\u2019s company<\/a>. Meanwhile, questions remain over the role of AI tools such as Palantir\u2019s Maven in US strikes on Iran, including the <a href=\"https:\/\/www.theguardian.com\/news\/2026\/mar\/26\/ai-got-the-blame-for-the-iran-school-bombing-the-truth-is-far-more-worrying?CMP=Share_iOSApp_Other\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">bombing of a girls\u2019 school in Minab<\/a>.<\/p>\n<p class=\"dcr-130mj7b\">The rubble of that school is the grotesque terrain over which the question of who gets entrusted with power over tools that could kill us all must be asked &#8211; because AI is just one of the mechanisms for our own mass annihilation now proliferating.<\/p>\n<p class=\"dcr-130mj7b\">Those who gain power over these may be good people, bad people, misunderstood people or the overwhelmingly more common mixture of every kind of person on any given day. 
Whether their talents are for computer programming or demagoguery, every social organisation, from the local tech startup to the collective representatives of nation-states, has to affirm meaningful social, political, legal and economic guardrails that channel their available options away from human fallibility and collectively minimise the harm they can do.<\/p>\n<p class=\"dcr-130mj7b\">Dear god, haven\u2019t we learned by now that self-regulated enterprises do not regulate in the interest of anyone or anything beyond their commercial or political self-interest? Sanctions, recalls, suspensions and multiple supervisory stakeholders with the authority to enforce these are what keeps us alive.<\/p>\n<p class=\"dcr-130mj7b\">The moment demands a global and unified willingness to regulate the complex risks posed. It\u2019s a problem we cannot outsource to Farrow or AI. Our shared fates depend on sitting down with one another and all our human fallibilities, and working it out for ourselves.<\/p>\n","protected":false},"excerpt":{"rendered":"The joke on the internet asks: \u201cWhat are the seven most terrifying words in the English language?\u201d 
The&hellip;\n","protected":false},"author":2,"featured_media":372693,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[365,363,364,111,139,69,145],"class_list":{"0":"post-372692","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-new-zealand","12":"tag-newzealand","13":"tag-nz","14":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/372692","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/comments?post=372692"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/372692\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media\/372693"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media?parent=372692"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/categories?post=372692"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/tags?post=372692"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}