Opinion | Can an A.I. Company Ever Be Good?

Published April 26, 2026

Over three decades of watching the tech industry, and watching big companies grow from tiny teams to global powers, I've observed the same pattern: Ethics don't scale up. Tech companies like to start with a mission. Google wanted to connect the world's information; Microsoft wanted to put a computer on every desktop; Twitter wanted to give all people a platform to publish their thoughts. These are good ideas, the stuff of TED Talks. But users show up with their own beliefs and ideas, by the millions. As a tech founder, you end up putting enormous work into making users behave (and stopping them from breaking the law). Lawsuits pour in, saying you did wrong, some because you're a convenient target.

All the while, money keeps gushing in. You start out transparent, sharing your journey, but then, before an initial public offering of shares, you must honor the S.E.C.-mandated quiet period and restrict promotional communications. After that, the transparency never quite returns. The market demands a rising stock price. Your company still makes a lot of software, but a huge amount of time goes to tax strategy and compliance.

At that scale, people start to blur together, and human users can become aggregate pools of statistics and growth vectors that go up and down: a mulch into which you plant your products.

The entire culture of American technology is built around two terms: disruption and, of course, scale. But ethics are constraints on disruption and scale. Truly ethics-bound organizations, such as the U.S. justice system, the American Medical Association and the Catholic priesthood, have hard scaling limits. Their rules run deep, and their requirements to serve are so onerous that only a few people can do the job. Punishments for transgressors include losing their licenses, being defrocked and being disbarred. Software industry people might have good degrees and are often good people, but they are making it up as they go along. They take no oath, are inconsistently certified and can only be fired, not exiled from the trade.

OpenAI set out to be inherently good: a dot-org. But it stumbled into a seam of pure digital gold in the form of large language models. To develop that technology further, it has made a painful, awkward transition to being a dot-com. (OpenAI says the for-profit arm continues to be overseen by the original nonprofit entity.) The subsequent level of drama has been difficult to behold. A few years ago, Mr. Altman publicly called for industry regulation, and he still does (https://thehill.com/policy/technology/5817906-openai-ai-policy-recommendations), but OpenAI has also lobbied against it, for example by supporting an Illinois bill (https://www.wired.com/story/openai-backs-bill-exempt-ai-firms-model-harm-lawsuits/) that, if it becomes law, would limit the liability of A.I. companies in cases of mass deaths.