{"id":397015,"date":"2026-01-29T14:38:12","date_gmt":"2026-01-29T14:38:12","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/397015\/"},"modified":"2026-01-29T14:38:12","modified_gmt":"2026-01-29T14:38:12","slug":"inside-openais-unit-economics","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/397015\/","title":{"rendered":"Inside OpenAI&#8217;s unit economics"},"content":{"rendered":"<p>AI companies are being priced into the hundreds of billions. That forces one awkward question to the front: do the unit economics actually work?<\/p>\n<p>Jevons\u2019 paradox suggests that as tokens get cheaper, demand explodes. You\u2019ve likely felt some version of this in the last year. But as usage grows, are these models actually profitable to run?<\/p>\n<p>In our collaboration with <a href=\"https:\/\/epoch.ai\/\" rel=\"nofollow noopener\" target=\"_blank\">Epoch AI<\/a>, we tackle that question using OpenAI\u2019s GPT-5 as the case study. What looks like a simple margin calculation is closer to a forensic exercise: we triangulate reported details, leaks, and Sam Altman\u2019s own words to bracket plausible revenues and costs.<\/p>\n<p>Here\u2019s the breakdown.<\/p>\n<p>\u2014 Azeem<\/p>\n<p>Originally published on <a href=\"https:\/\/epoch.ai\/gradient-updates\/can-ai-companies-become-profitable\" rel=\"nofollow noopener\" target=\"_blank\">Epoch AI\u2019s blog<\/a>. 
Analysis by <a href=\"https:\/\/open.substack.com\/users\/22111350-jaime-sevilla?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Jaime Sevilla&quot;,&quot;id&quot;:22111350,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substackcdn.com\/image\/fetch\/$s_!-E-h!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b11760e-8d38-4dcf-869e-f1452cee0371_564x564.jpeg&quot;,&quot;uuid&quot;:&quot;b5ce8663-b7ba-459b-b44a-72978a0f2f6a&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Jaime Sevilla<\/a>, Exponential View\u2019s <a href=\"https:\/\/open.substack.com\/users\/296829881-hannah-petrovic?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Hannah Petrovic&quot;,&quot;id&quot;:296829881,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substackcdn.com\/image\/fetch\/$s_!gwHU!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1df07d07-1752-4012-928e-8d02ed473e94_1989x1989.jpeg&quot;,&quot;uuid&quot;:&quot;e61dcbda-83f8-46d2-8b9c-86b971096576&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Hannah Petrovic<\/a>, and <a href=\"https:\/\/open.substack.com\/users\/327131465-anson-ho?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Anson Ho&quot;,&quot;id&quot;:327131465,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substackcdn.com\/image\/fetch\/$s_!YpJm!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc9a56c6b-d918-48a9-b335-58313d2bb76f_3097x3182.jpeg&quot;,&quot;uuid&quot;:&quot;a96ca451-9c60-4299-9c1b-f9e36776c0d6&quot;}\" 
data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Anson Ho<\/a><\/p>\n<p>Are AI models profitable? If you ask <a href=\"https:\/\/www.axios.com\/2025\/08\/15\/sam-altman-gpt5-launch-chatgpt-future\" rel=\"nofollow noopener\" target=\"_blank\">Sam Altman<\/a> and <a href=\"https:\/\/open.substack.com\/pub\/cheekypint\/p\/a-cheeky-pint-with-anthropic-ceo?r=5erk95&amp;selection=97f244b2-8e86-48a6-a4d4-ea366c90bc83&amp;utm_campaign=post-share-selection&amp;utm_medium=web&amp;aspectRatio=instagram&amp;textColor=%23ffffff&amp;bgImage=true\" rel=\"nofollow noopener\" target=\"_blank\">Dario Amodei<\/a>, the answer seems to be yes \u2014 it just doesn\u2019t appear that way on the surface.<\/p>\n<p>Here\u2019s the idea: running each AI model generates enough revenue to cover its own R&amp;D costs. But that surplus gets outweighed by the costs of developing the next big model. So, despite making money on each model, companies can lose money each year.<\/p>\n<p>This is big if true. In fast-growing tech sectors, investors typically accept losses today in exchange for big profits down the line. So if AI models are already covering their own costs, that would paint a healthy financial outlook for AI companies.<\/p>\n<p>But we can\u2019t take Altman and Amodei at their word \u2014 you\u2019d expect CEOs to paint a rosy picture of their company\u2019s finances. And even if they\u2019re right, we don\u2019t know just how profitable models are.<\/p>\n<p>To shed light on this, we looked into a notable case study: using public reporting on OpenAI\u2019s finances, we made an educated guess on the profits from running GPT-5, and whether that was enough to recoup its R&amp;D costs. Here\u2019s what we found:<\/p>\n<p>Whether OpenAI was profitable to run depends on which profit margin you\u2019re talking about. 
If we subtract the cost of compute from revenue to calculate the gross margin (on an accounting basis), it seems to be about 50% \u2014 lower than the norm for software companies (where 60-80% is typical) but still higher than many industries.<\/p>\n<p>But if you also subtract other operating costs, including salaries and marketing, then OpenAI most likely made a loss, even without including R&amp;D.<\/p>\n<p>Moreover, OpenAI likely failed to recoup the costs of developing GPT-5 during its 4-month lifetime. Even using gross profit, GPT-5\u2019s tenure was too short to bring in enough revenue to offset its own R&amp;D costs. So if GPT-5 is at all representative, then at least for now, developing and running AI models is loss-making.<\/p>\n<p>This doesn\u2019t necessarily mean that models like GPT-5 are a bad investment. Even an unprofitable model demonstrates progress, which attracts customers and helps labs raise money to train future models \u2014 and that next generation may earn far more. What\u2019s more, the R&amp;D that went into GPT-5 likely informs future models like GPT-6. So these labs might have a much better financial outlook than it might initially seem.<\/p>\n<p>Let\u2019s dig into the details.<\/p>\n<p>To answer this question, we consider a case study which we call the \u201cGPT-5 bundle\u201d. This includes all of OpenAI\u2019s offerings available during GPT-5\u2019s lifetime as the flagship model \u2014 GPT-5 and GPT-5.1, GPT-4o, ChatGPT, the API, and so on. We then estimate the revenue and costs of running the bundle.<\/p>\n<p>Revenue is relatively straightforward: since the bundle includes all of OpenAI\u2019s models, this is just <a href=\"https:\/\/epoch.ai\/data\/ai-companies#explore-the-data\" rel=\"nofollow noopener\" target=\"_blank\">their total revenue<\/a> over GPT-5\u2019s lifetime, from August to December last year. 
This works out to $6.1 billion.<\/p>\n<p>At first glance, $6.1 billion sounds healthy, until you juxtapose it with the costs of running the GPT-5 bundle. These costs come from four main sources:<\/p>\n<p>Inference compute: $3.2 billion. This is based on public <a href=\"https:\/\/www.theinformation.com\/articles\/openai-spend-100-billion-backup-servers-ai-breakthroughs?rc=spkbjw\" rel=\"nofollow noopener\" target=\"_blank\">estimates<\/a> of OpenAI\u2019s total inference compute spend in 2025, and assuming that the allocation of compute during GPT-5\u2019s tenure was proportional to the fraction of the year\u2019s revenue raised in that period.<\/p>\n<p>Staff compensation: $1.2 billion, which we can back out from <a href=\"https:\/\/epoch.ai\/data\/ai-companies?dataView=staff&amp;yAxis=Staff+count#explore-the-data\" rel=\"nofollow noopener\" target=\"_blank\">OpenAI staff counts<\/a>, reports on <a href=\"https:\/\/www.wsj.com\/tech\/ai\/openai-is-paying-employees-more-than-any-major-tech-startup-in-history-23472527\" rel=\"nofollow noopener\" target=\"_blank\">stock compensation<\/a>, and things like <a href=\"https:\/\/h1bgrader.com\/h1b-sponsors\/openai-opco-llc-60d97wl6k8\/salaries\/2025\" rel=\"nofollow noopener\" target=\"_blank\">H1B filings<\/a>. One big uncertainty with this: how much of the stock compensation goes toward running models, rather than R&amp;D? We assume 40%, matching the fraction of <a href=\"https:\/\/www.theinformation.com\/articles\/openai-spend-100-billion-backup-servers-ai-breakthroughs\" rel=\"nofollow noopener\" target=\"_blank\">compute<\/a> that goes to inference. 
Whether staffing follows the same split is uncertain, but it\u2019s our best guess.<\/p>\n<p>Sales and marketing (S&amp;M): $2.2 billion, assuming OpenAI\u2019s spending on this grew between the first and second halves of last year.<\/p>\n<p>Legal, office, and administrative costs: $0.2 billion, assuming this grew to between 1.6\u00d7 and 2\u00d7 their <a href=\"https:\/\/www.theinformation.com\/articles\/openai-projections-imply-losses-tripling-to-14-billion-in-2026\" rel=\"nofollow noopener\" target=\"_blank\">2024 expenses<\/a>. This accounts for <a href=\"https:\/\/aibusiness.com\/responsible-ai\/openai-signs-partnership-with-uk-government-plans-office-expansion\" rel=\"nofollow noopener\" target=\"_blank\">office expansions<\/a>, <a href=\"https:\/\/www.reuters.com\/world\/asia-pacific\/openai-open-office-seoul-amid-growing-demand-chatgpt-2025-05-26\/\" rel=\"nofollow noopener\" target=\"_blank\">new office setups<\/a>, and rising <a href=\"https:\/\/www.reuters.com\/world\/asia-pacific\/openai-open-office-seoul-amid-growing-demand-chatgpt-2025-05-26\/\" rel=\"nofollow noopener\" target=\"_blank\">administrative costs<\/a> with their growing workforce.<\/p>\n<p>So what are the profits? One option is to look at gross profits. This only counts the direct cost of running a model, which in this case is just the inference compute cost of $3.2 billion. Since the revenue was $6.1 billion, this gives a gross profit of $2.9 billion, or a gross profit margin of 48%, in line with other estimates. This is lower than at other software businesses (typically <a href=\"https:\/\/www.key.com\/content\/dam\/kco\/documents\/businesses___institutions\/2024_kbcm_sapphire_saas_survey.pdf\" rel=\"nofollow noopener\" target=\"_blank\">70-80%<\/a>) but high enough to eventually build a business on.<\/p>\n<p>On the other hand, if we add up all four cost types, we get close to $6.8 billion. 
That\u2019s somewhat higher than the revenue, so on these terms the GPT-5 bundle made an operating loss of $0.7 billion, with an operating margin of -11%.<\/p>\n<p>Stress-testing the analysis with more aggressive or conservative assumptions doesn\u2019t change the picture much:<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/substackcdn.com\/image\/fetch\/$s_!nHfd!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00de449c-3935-431a-ba8f-0d94358f631e_2546x790.jpeg\" data-component-name=\"Image2ToDOM\" rel=\"nofollow noopener\" class=\"image-link image2 can-restack\"><img decoding=\"async\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2026\/01\/https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/00de449c-3935-431a-ba8f-0d94358f631e_2546.jpeg\" width=\"726\" height=\"225.37912087912088\" data-attrs=\"{&quot;src&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/00de449c-3935-431a-ba8f-0d94358f631e_2546x790.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:452,&quot;width&quot;:1456,&quot;resizeWidth&quot;:726,&quot;bytes&quot;:212631,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image\/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https:\/\/www.exponentialview.co\/i\/185640868?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00de449c-3935-431a-ba8f-0d94358f631e_2546x790.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}\" alt=\"\"   loading=\"lazy\" class=\"sizing-normal\"\/><\/a>Confidence intervals are obtained from a Monte Carlo analysis. 
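<\/p>\n<p>The headline arithmetic can be reproduced in a few lines. The sketch below uses the point estimates quoted above (in billions of dollars); the confidence intervals in the chart come from a fuller Monte Carlo over the underlying assumptions.<\/p>

```python
# Point estimates from the article, in $ billions (these are the
# estimates published here, not official OpenAI figures).
revenue = 6.1          # total revenue over GPT-5 tenure (Aug-Dec)
inference = 3.2        # inference compute
staff = 1.2            # staff compensation attributed to running models
sales_marketing = 2.2  # sales and marketing
admin = 0.2            # legal, office and administrative

# Gross margin: revenue minus the direct cost of serving models.
gross_profit = revenue - inference
gross_margin = gross_profit / revenue

# Operating margin: revenue minus all four running-cost categories.
operating_profit = revenue - (inference + staff + sales_marketing + admin)
operating_margin = operating_profit / revenue

print(round(gross_profit, 1), round(gross_margin * 100))          # 2.9 48
print(round(operating_profit, 1), round(operating_margin * 100))  # -0.7 -11
```

<p>Swapping in more aggressive or conservative cost figures, as in the stress test above, shifts these margins only modestly. 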
<\/p>\n<p>And there\u2019s one more hiccup: OpenAI signed a deal with Microsoft to hand over about <a href=\"https:\/\/www.theinformation.com\/articles\/openai-gain-50-billion-cutting-revenue-share-microsoft-partners?rc=spkbjw\" rel=\"nofollow noopener\" target=\"_blank\">20%<\/a> of their $6.1 billion revenue, making their losses larger still. This doesn\u2019t mean that the revenue deal is entirely harmful to OpenAI \u2014 for example, Microsoft also shares revenue back to OpenAI. And the deal probably shouldn\u2019t significantly affect how we see model profitability \u2014 it has more to do with OpenAI\u2019s economic structure than with anything fundamental to AI models. But the fact that OpenAI and Microsoft <a href=\"https:\/\/techcrunch.com\/2025\/03\/05\/u-k-s-competition-authority-says-microsofts-openai-partnership-doesnt-quality-for-investigation\/\" rel=\"nofollow noopener\" target=\"_blank\">have been<\/a> <a href=\"https:\/\/blogs.microsoft.com\/blog\/2025\/10\/28\/the-next-chapter-of-the-microsoft-openai-partnership\/\" rel=\"nofollow noopener\" target=\"_blank\">renegotiating<\/a> this deal suggests it\u2019s a real drag on OpenAI\u2019s path to profitability.<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/substackcdn.com\/image\/fetch\/$s_!CYan!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba92684e-3bf4-455f-a05c-cbc08f811d1a_1280x1280.jpeg\" data-component-name=\"Image2ToDOM\" rel=\"nofollow noopener\" class=\"image-link image2 is-viewable-img can-restack\"><img decoding=\"async\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2026\/01\/https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/ba92684e-3bf4-455f-a05c-cbc08f811d1a_1280.png\" width=\"621\" height=\"621\" 
data-attrs=\"{&quot;src&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/ba92684e-3bf4-455f-a05c-cbc08f811d1a_1280x1280.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1280,&quot;width&quot;:1280,&quot;resizeWidth&quot;:621,&quot;bytes&quot;:127328,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image\/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https:\/\/www.exponentialview.co\/i\/185640868?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba92684e-3bf4-455f-a05c-cbc08f811d1a_1280x1280.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}\" alt=\"\"   loading=\"lazy\" class=\"sizing-normal\"\/><\/a><\/p>\n<p>In short, running AI models is likely profitable in the sense of having decent gross margins. But OpenAI\u2019s operating margin, which includes marketing and staffing, is likely negative. For a fast-growing company, though, operating margins can be misleading \u2014 S&amp;M costs typically grow sublinearly with revenue, so gross margins are arguably a better proxy for long-run profitability.<\/p>\n<p>So our numbers don\u2019t necessarily contradict Altman and Amodei yet. But so far we\u2019ve only seen half the story \u2014 we still need to account for R&amp;D costs, which we\u2019ll turn to now.<\/p>\n<p>Let\u2019s say we buy the argument that we should look at gross margins. On those terms, it was profitable to run the GPT-5 bundle. But was it profitable enough to recoup the costs of developing it?<\/p>\n<p>In theory, yes \u2014 you just have to keep running them, and sooner or later you\u2019ll earn enough revenue to recoup these costs. But in practice, models might have too short a lifetime to make enough revenue. 
For example, they could be outcompeted by products from rival labs, forcing them to be replaced.<\/p>\n<p>So to figure out the answer, let\u2019s go back to the GPT-5 bundle. We\u2019ve already figured out its gross profits to be around $3 billion. So how do these compare to its R&amp;D costs?<\/p>\n<p>Estimating this turns out to be a finicky business. We estimate that OpenAI spent $16 billion on R&amp;D in 2025, but there\u2019s no conceptually clean way to attribute some fraction of this to the GPT-5 bundle. We\u2019d need to make several arbitrary choices: should we count the R&amp;D effort that went into earlier reasoning models, like o1 and o3? Or what if experiments failed, and didn\u2019t directly change how GPT-5 was trained? Depending on how you answer these questions, the development cost could vary significantly.<\/p>\n<p>But we can still do an illustrative calculation: let\u2019s conservatively assume that OpenAI started R&amp;D on GPT-5 after o3\u2019s <a href=\"https:\/\/openai.com\/index\/introducing-o3-and-o4-mini\/\" rel=\"nofollow noopener\" target=\"_blank\">release<\/a> last April. Then there\u2019d still be four months between then and GPT-5\u2019s release in August, during which OpenAI spent around $5 billion on R&amp;D. But that\u2019s still higher than the $3 billion of gross profits. 
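<\/p>\n<p>The gap can be made concrete with a simple split of the R&amp;D budget. The sketch below assumes, purely for illustration, that the $16 billion of 2025 R&amp;D spend was spread evenly across the year.<\/p>

```python
# Illustrative recoup check, in $ billions. Assumes 2025 R&D spend was
# roughly uniform across the year, which is a simplification.
annual_rnd = 16.0
months = 4                             # o3 release (April) to GPT-5 launch (August)
rnd_window = annual_rnd * months / 12  # about 5.3, close to the ~$5bn cited
gross_profit = 2.9                     # GPT-5 bundle gross profit over its tenure

shortfall = rnd_window - gross_profit
print(round(rnd_window, 1), round(shortfall, 1))  # 5.3 2.4
```

<p>On this uniform split, four months of R&amp;D comes to about $5.3 billion against $2.9 billion of gross profit, a shortfall of roughly $2.4 billion. 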
In other words, OpenAI spent more on R&amp;D in the four months preceding GPT-5, than it made in gross profits during GPT-5\u2019s four-month tenure.<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/substackcdn.com\/image\/fetch\/$s_!VthJ!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf319274-744f-49fc-bb1e-6310f0dc2166_1024x1280.png\" data-component-name=\"Image2ToDOM\" rel=\"nofollow noopener\" class=\"image-link image2 is-viewable-img can-restack\"><img decoding=\"async\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2026\/01\/https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/af319274-744f-49fc-bb1e-6310f0dc2166_1024.png\" width=\"579\" height=\"723.75\" data-attrs=\"{&quot;src&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/af319274-744f-49fc-bb1e-6310f0dc2166_1024x1280.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1280,&quot;width&quot;:1024,&quot;resizeWidth&quot;:579,&quot;bytes&quot;:96073,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image\/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https:\/\/www.exponentialview.co\/i\/185640868?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faf319274-744f-49fc-bb1e-6310f0dc2166_1024x1280.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}\" alt=\"\"   loading=\"lazy\" class=\"sizing-normal\"\/><\/a><\/p>\n<p>So in practice, it seems like model tenures might indeed be too short to recoup R&amp;D costs. 
Indeed, GPT-5\u2019s short tenure was driven by external competition \u2014 <a href=\"https:\/\/www.wired.com\/story\/openai-gpt-launch-gemini-code-red\/\" rel=\"nofollow noopener\" target=\"_blank\">Gemini 3 Pro<\/a> had arguably surpassed the GPT-5 base model within three months.<\/p>\n<p>One way to think about this is to treat frontier models like rapidly depreciating infrastructure: their value must be extracted before competitors or successors render them obsolete. So to evaluate AI products, we need to look at both inference profit margins and the time it takes for users to migrate to something better. In the case of the GPT-5 bundle, we find that it\u2019s decidedly unprofitable over its full lifecycle, even from a gross margin perspective.<\/p>\n<p>So the finances of the GPT-5 bundle are less rosy than Altman and Amodei suggest. And while we don\u2019t have as much direct evidence on models from other labs, they\u2019re plausibly in a similar boat \u2014 for instance, Anthropic has <a href=\"https:\/\/www.theinformation.com\/articles\/anthropic-lowers-profit-margin-projection-revenue-skyrockets\" rel=\"nofollow noopener\" target=\"_blank\">reported<\/a> gross margins similar to OpenAI\u2019s. So it\u2019s worth thinking about what it means if the GPT-5 bundle is at all representative of other models.<\/p>\n<p>The most crucial point is that these model lifecycle losses aren\u2019t necessarily cause for alarm. AI models don\u2019t need to be profitable today, as long as companies can convince investors that they will be in the future. That\u2019s standard for fast-growing tech companies.<\/p>\n<p>Early on, investors value growth over profit, believing that once a company has captured the market, they\u2019ll eventually figure out how to make it profitable. 
The archetypal example of this is Uber \u2014 they accumulated a <a href=\"https:\/\/www.sec.gov\/Archives\/edgar\/data\/1543151\/000154315123000010\/uber-20221231.htm\" rel=\"nofollow noopener\" target=\"_blank\">$32.5 billion deficit<\/a> over 14 years of net losses, before their first profitable year in 2023. By that measure, OpenAI is thriving: revenues are tripling annually, and <a href=\"https:\/\/epoch.ai\/gradient-updates\/openai-is-projecting-unprecedented-revenue-growth\" rel=\"nofollow noopener\" target=\"_blank\">projections<\/a> show continued growth. If that trajectory holds, profitability looks very likely.<\/p>\n<p>And there are reasons to even be really bullish about AI\u2019s long-run profitability \u2014 most notably, the sheer scale of value that AI could create. <a href=\"https:\/\/www.wired.com\/story\/sam-altman-says-the-gpt-5-haters-got-it-all-wrong\/\" rel=\"nofollow noopener\" target=\"_blank\">Many<\/a> <a href=\"https:\/\/www.darioamodei.com\/essay\/machines-of-loving-grace#:~:text=However%2C%20I%20do%20think%20in%20the%20long%20run%20AI%20will%20become%20so%20broadly%20effective%20and%20so%20cheap%20that%20this%20will%20no%20longer%20apply.%20At%20that%20point%20our%20current%20economic%20setup%20will%20no%20longer%20make%20sense%2C%20and%20there%20will%20be%20a%20need%20for%20a%20broader%20societal%20conversation%20about%20how%20the%20economy%20should%20be%20organized.\" rel=\"nofollow noopener\" target=\"_blank\">higher<\/a>&#8211;<a href=\"https:\/\/www.youtube.com\/watch?v=PqVbypvxDto&amp;t=2309s\" rel=\"nofollow noopener\" target=\"_blank\">ups<\/a> <a href=\"https:\/\/x.com\/elonmusk\/status\/1980765809338147193\" rel=\"nofollow\">at<\/a> AI companies expect AI systems to outcompete humans across virtually all economically valuable tasks. 
If you truly believe that in your heart of hearts, that means potentially capturing <a href=\"https:\/\/epoch.ai\/epoch-after-hours\/ai-in-2030#:~:text=So%20I%20think%20one,trying%20to%20capture%20that.\" rel=\"nofollow noopener\" target=\"_blank\">trillions of dollars<\/a> from labor automation. The resulting revenue growth could dwarf development costs even with thin margins and short model lifespans.<\/p>\n<p>That\u2019s a big leap, and some investors won\u2019t buy the vision. Or they might doubt that massive revenue growth automatically means huge profits \u2014 what if R&amp;D costs scale with revenue? These investors might pay special attention to the profit margins of current AI models, and want a more concrete picture of how AI companies could be profitable in the near term.<\/p>\n<p>There\u2019s an answer for these investors, too. Even if you doubt that AI will become good enough to spark the intelligence explosion or <a href=\"https:\/\/www.darioamodei.com\/essay\/machines-of-loving-grace#1-biology-and-health\" rel=\"nofollow noopener\" target=\"_blank\">double human lifespans<\/a>, there are still ways that AI companies could turn a profit. For example, OpenAI is now <a href=\"https:\/\/x.com\/openai\/status\/2012223373489614951\" rel=\"nofollow\">rolling out ads<\/a> to some ChatGPT users, which could add between <a href=\"https:\/\/www.theinformation.com\/articles\/openais-international-conundrum\" rel=\"nofollow noopener\" target=\"_blank\">$2 billion and $15 billion<\/a> in yearly revenue even without any user growth. They\u2019re moving beyond individual consumers and increasingly <a href=\"https:\/\/www.youtube.com\/watch?v=tUVSuFT301U\" rel=\"nofollow noopener\" target=\"_blank\">leaning on enterprise adoption<\/a>. 
Algorithmic innovations mean that running models could get many times <a href=\"https:\/\/epoch.ai\/data-insights\/llm-inference-price-trends\" rel=\"nofollow noopener\" target=\"_blank\">cheaper<\/a> each year, and <a href=\"https:\/\/www.exponentialview.co\/p\/ai-is-ready-is-your-company\" rel=\"nofollow noopener\" target=\"_blank\">possibly much faster<\/a>. And there\u2019s still a lot of <a href=\"https:\/\/www.exponentialview.co\/p\/can-openai-reach-100-billion-by-2027\" rel=\"nofollow noopener\" target=\"_blank\">room to grow<\/a> their user base and usage intensity \u2014 for example, ChatGPT has close to <a href=\"https:\/\/epochai.substack.com\/p\/the-changing-drivers-of-llm-adoption\" rel=\"nofollow noopener\" target=\"_blank\">a billion users<\/a>, compared to around <a href=\"https:\/\/www.itu.int\/itu-d\/reports\/statistics\/2025\/10\/15\/ff25-internet-use\/\" rel=\"nofollow noopener\" target=\"_blank\">six billion<\/a> internet users. Combined, these could add many <a href=\"https:\/\/www.exponentialview.co\/p\/can-openai-reach-100-billion-by-2027\" rel=\"nofollow noopener\" target=\"_blank\">tens of billions of revenue<\/a>.<\/p>\n<p>It won\u2019t necessarily be easy for AI companies to do this, especially because individual labs will need to come face-to-face with AI\u2019s \u201cdepreciating infrastructure\u201d problem. In practice, the \u201cstate-of-the-art\u201d is often <a href=\"https:\/\/www.exponentialview.co\/p\/openais-narrow-100-billion-path\" rel=\"nofollow noopener\" target=\"_blank\">challenged<\/a> within months of a model\u2019s release, and it\u2019s hard to make a profit from the latest GPT if Claude and Gemini keep drawing users away.<\/p>\n<p>But this inter-lab competition doesn\u2019t stop all AI models from being profitable. Profits are often high in oligopolies because consumers have limited alternatives to switch to. 
One lab could also pull ahead because they have some kind of algorithmic \u201csecret sauce\u201d, or they have more compute. Or they develop <a href=\"https:\/\/epoch.ai\/gradient-updates\/the-huge-potential-implications-of-long-context-inference\" rel=\"nofollow noopener\" target=\"_blank\">continual learning<\/a> techniques that make it <a href=\"https:\/\/epoch.ai\/epoch-after-hours\/luis-garicano-not-so-simple-macroeconomics-of-ai#:~:text=So%20I%20would%20think%20that%20that%20layer%20remains%20quite%20competitive%2C%20with%20one%20caveat%2C%20which%20is%20the%20introduction%20of%20switching%20costs%20through%20memory.%20If%20the%20system%20starts%20to%20remember%20you%20and%20starts%20to%20know%20who%20you%20are%2C%20then%20switching%20systems%20is%20going%20to%20be%20costly.\" rel=\"nofollow noopener\" target=\"_blank\">harder for consumers to switch<\/a> between model providers.<\/p>\n<p>These competitive barriers can also be circumvented. Companies could form their own niches, and we\u2019ve already seen that to some degree: Anthropic is <a href=\"https:\/\/www.businessinsider.com\/anthropic-ceo-dario-amodei-drags-openai-and-google-code-red-2025-12\" rel=\"nofollow noopener\" target=\"_blank\">pursuing<\/a> something akin to a \u201ccode is all you need\u201d mission, Google DeepMind wants to \u201c<a href=\"https:\/\/www.technologyreview.com\/2016\/03\/31\/161234\/how-google-plans-to-solve-artificial-intelligence\/\" rel=\"nofollow noopener\" target=\"_blank\">solve intelligence<\/a>\u201d and use that to solve everything from cancer to climate change, and Meta strives to make <a href=\"https:\/\/www.meta.com\/superintelligence\/\" rel=\"nofollow noopener\" target=\"_blank\">AI friends too cheap to meter<\/a>. This lets individual companies gain revenue for longer.<\/p>\n<p>So will AI models (and hence AI companies) become profitable? We think it\u2019s very possible. 
While our analysis of the GPT-5 bundle is less rosy than Altman and Amodei suggest, what matters more is the trend: compute costs are falling, enterprise deals are getting stickier, and models can stay relevant longer than the GPT-5 cycle implies.<\/p>\n<p>Authors\u2019 note: We\u2019d like to thank <a href=\"https:\/\/open.substack.com\/users\/1078663-js-denain?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;JS Denain&quot;,&quot;id&quot;:1078663,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/d0ad9cbb-9d01-4424-8b6c-9635c7d26b1b_899x873.jpeg&quot;,&quot;uuid&quot;:&quot;5a59dce6-0fbe-4799-b444-8b5545e8f9b0&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">JS Denain<\/a>, Josh You, David Owen, <a href=\"https:\/\/open.substack.com\/users\/80637143-yafah-edelman?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Yafah Edelman&quot;,&quot;id&quot;:80637143,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substackcdn.com\/image\/fetch\/$s_!9Nz7!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9e2fa11f-2e88-4062-aeb5-efe06fdb452e_144x144.png&quot;,&quot;uuid&quot;:&quot;88d2f6a2-b6aa-4f1e-889e-77e241827abd&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Yafah Edelman<\/a>, Ricardo Pimentel, <a href=\"https:\/\/open.substack.com\/users\/117491-marija-gavrilov?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Marija 
Gavrilov&quot;,&quot;id&quot;:117491,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/e5a5c2ab-df72-45bc-b433-8b9d6cd50a44_1124x844.jpeg&quot;,&quot;uuid&quot;:&quot;2f030e8c-f333-4c56-a654-e7f709e91b2d&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Marija Gavrilov<\/a>, <a href=\"https:\/\/open.substack.com\/users\/84749728-caroline-falkman-olsson?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Caroline Falkman Olsson&quot;,&quot;id&quot;:84749728,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/98e2a6b9-33b2-44e5-a2ba-bdf7827d937c_1024x1024.webp&quot;,&quot;uuid&quot;:&quot;a8e1ffdd-1cf0-4b61-8ae9-c359652373da&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Caroline Falkman Olsson<\/a>, Lynette Bye, Jay Tate, <a href=\"https:\/\/open.substack.com\/users\/4281466-dwarkesh-patel?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Dwarkesh Patel&quot;,&quot;id&quot;:4281466,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substackcdn.com\/image\/fetch\/$s_!5eJb!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb715ffd1-f7d7-4755-af88-c48efe647f5b_400x400.jpeg&quot;,&quot;uuid&quot;:&quot;97500a66-6fbc-4517-95e9-ed3a1fe2952f&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Dwarkesh Patel<\/a>, Juan Garc\u00eda, Charles Dillon, Brendan Halstead, Isabel Johnson and Markov Gray for their feedback and support on this post. 
Special thanks to <a href=\"https:\/\/open.substack.com\/users\/710379-azeem-azhar?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Azeem Azhar&quot;,&quot;id&quot;:710379,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/bucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com\/public\/images\/09961c12-4209-4296-8a12-0762a41809a3_400x400.jpeg&quot;,&quot;uuid&quot;:&quot;a2457017-ad3d-4f2f-bb0e-a86f14ad4bd7&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Azeem Azhar<\/a> for initiating this collaboration and vital input, and <a href=\"https:\/\/open.substack.com\/users\/8567449-benjamin-todd?utm_source=mentions\" target=\"_blank\" rel=\"noopener nofollow\" data-attrs=\"{&quot;name&quot;:&quot;Benjamin Todd&quot;,&quot;id&quot;:8567449,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https:\/\/substackcdn.com\/image\/fetch\/$s_!-kPF!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d0e4705-0c8a-46a2-98e8-1dff8d79dcbd_1366x1366.jpeg&quot;,&quot;uuid&quot;:&quot;066bd77a-1c04-4e47-b3c2-90971c15c118&quot;}\" data-component-name=\"MentionUser\" class=\"mention-pnpTE1\">Benjamin Todd<\/a> for in-depth feedback and discussion.<\/p>\n","protected":false},"excerpt":{"rendered":"AI companies are being priced into the hundreds of billions. 
That forces one awkward question to the front:&hellip;\n","protected":false},"author":2,"featured_media":397016,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-397015","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/397015","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=397015"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/397015\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/397016"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=397015"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=397015"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=397015"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}