51 Comments
CAT:

Do you still run a company or is it just a banger factory now?

Ethan Ding:

firstly, claude is really the true author of this

secondly, claude is also running my company

cdman:

I stopped reading when you said "disney makes margins on in-park purchases, so they make the hotels, parking, and flights cheaper." Have you been to Disney? Parking is $35. The hotels are the most expensive in the area; the Grand Californian is like $1k per night or something insane. Disney charges a premium on in-park purchases and charges a premium on hotels and parking, and I'm certain they would charge a premium on airline tickets if they owned an airline.

Ethan Ding:

think the author's citing the joel essay, which was written in 2002

the author also hasn’t been to disneyland

Spouting Thomas:

The media business got a lot tougher so DIS is now a hospitality company masquerading as a beloved movie studio. As a result, they jacked the prices way up across the board at the parks, which are their profit center.

Which still seems to fit the thesis here: what has been commoditized, in relative terms, is content, which increasingly serves as advertising for DIS parks instead of a profit center in its own right. They're not quite giving it away, but the streaming business is running at a 5% operating margin compared to 30% for the hospitality business. TV networks are still there but are a melting ice cube.

Henry:

$35 is cheaper than parking near Fisherman's Wharf

Chris B:

I know what the benchmarks say and I have no doubt that GPT-5 can one-shot "demo" programs better than Claude.

But for my production-sized repos, Claude (both Opus and Sonnet) just does a better job of following patterns and getting it right the first time, which saves me more hours than GPT-5 (and I really put GPT-5 through its paces while it's free in Cursor).

Dev time is very expensive. If Claude stays ahead in saving me time, the price would have to be quite extreme for me to consider switching, even if other options become free.

Rbbb:

You say “here are the 4 places where all of tech’s profit lives:”, then you don’t include Nvidia, Apple, or TSMC?

Jake:

He focused on software companies. But the companies you listed are hardware. And accordingly, they make their software free (at least for their own platforms).

Rbbb:

You think Advertising and Cloud infrastructure are software?

Jake:

Advertising is software, but you are right that cloud isn't primarily software.

Earl Lee:

The commoditizing complements framework doesn't perfectly fit OpenAI's move here because the customer that allows it to generate advertising revenue (consumers) isn't the same as the customer buying inference through the API—unless OpenAI starts serving ads through its API as well. That's not impossible, but it's unlikely.

Selling inference at or below cost does not drive more consumer attention to OpenAI's ad inventory, ChatGPT. If anything, it has the opposite impact: making inference cheap makes application-layer tools more abundant and powerful, drawing attention away from generic, horizontal tools for accessing intelligence, i.e., ChatGPT.

Taken to the extreme, imagine a world where inference was insanely expensive. So expensive that AI-powered apps were few and far between, or they had to charge insane amounts to pass the cost down to consumers. There would be much less usage of such apps, and consumers would instead just use ChatGPT—it's free! OpenAI obviously can't take this stance because that just drives more of the inference business to other providers. It's kind of a Catch-22.

Here's another way to look at this: Google and Amazon are in both the advertising and cloud infrastructure business, but they're not subsidizing the latter to drive profits through the former. Doing so would have no impact on advertising revenue—does making GCP cheaper drive more consumer searches? (Aside: Amazon actually has a massive ad business from the consumer attention it gets on Amazon.com. It might even generate a significant portion, if not most, of the profits in its retail business. Driving prices down through Amazon Basics, the 3rd-party marketplace, making shipping faster, reducing friction to purchase, etc.—these are all great examples of commoditizing complements to focus on making margin on ads.)

That said, here's what I think could tie this story together a bit more. OpenAI just wants to kill anything that can draw attention away from its advertising surface ChatGPT. Claude.ai (the chat interface) is one of them. So by making inference cheap, they weaken Anthropic's ability to subsidize Claude.ai and perhaps even turn the company as a whole into a long-term unsustainable business.

Long-term, though, I don't see how this works out, because there are still other players, e.g., Google and Meta, who don't need to make money on inference and can offer consumers access to general intelligence, and as a leading frontier lab, Anthropic can continue to raise funding to subsidize Claude.ai.

As an aside, I agree with and enjoyed reading your previous posts! I'm based in NYC if you ever want to jam on or get feedback on ideas like these.

Siebe Rozendal:

Can't we just assume that OpenAI's pricing means they're trying to conquer API market share, rather than burning it or maximizing short-term profits? And hurting Anthropic short-term.

Also, it's a huge assumption to believe most revenue lies in advertising. AGI is much more valuable, and requires broad tool use & integration with large-scale industries.

Jose Pons Vega:

Sir, I believe only Sam Altman is allowed to not capitalize sentences. It would make your article much easier to read, otherwise you might as well drop the periods.

Bruce Lambert:

Why don’t you capitalize?

Julius Gonzalez:

Makes it more authentic

Lauri Elias:

No plan to move off Claude Code as of 2025-08-13. Could change quickly, granted.

Gijs Verheijke:

The only thing you didn’t really take into account here is Google. With Gemini 2.5 they released a model that was on par with, and at times outperformed, Claude, which was then at 3.7.

For a time it was the preferred model for many in Cursor. Gemini 2.5 Pro is the same price as GPT-5.

Rodrigo:

Really good article, hopefully you can fix your caps key soon.

Sam:

> casinos make money on gambling, so they make the hotel room, the food, and the drinks cheaper

This was true in the past, but I don’t think it really describes modern Vegas casinos, which now make most of their money from entertainment/hotels/dining/etc.

Also great article!

Ethan Ding:

i don't go to any of those places often enough to know, but interesting that they don't do this anymore

wonder what the counteracting force is

airlines definitely do this, loss lead the fuck out of their volume to boost credit card margins

Flume, Nom de:

He gets the same thing wrong about Disney and its parks.

Romil:

phenomenal read. but the aws and azure cloud revenue numbers seem off. as of q2 2025, aws has an annual cloud revenue run rate of about $124 billion, while azure’s is about $75 billion.

Dan:
Aug 12 (edited)

Why is the API business a complement to an AI-driven consumer advertising business? It doesn’t feel like the same analogy, where more browser usage drives more demand for search & search ads. When Lovable grows as a result of a cheaper GPT API, their consumers don’t naturally flow into OpenAI’s consumer product, right?

In fact, wouldn’t the more obvious subsidy be to simply make the max-inference version of consumer GPT free to everyone and insert ads? Let everyone run unlimited Deep Research, but make it ad-supported. Deep Research on nutrition should have Vitamin Shoppe banner ads interleaved.

Nathan:

that's a good point! i'd think of it less as "commoditize to reduce friction" and more as a "commoditization attack". look at why microsoft bundled IE and ruthlessly crushed netscape by commoditizing the browser: they did so because the web was a platform for future applications, just like the assistant/chat wars have become the battleground where OpenAI, Anthropic, Google, and Meta are currently duking it out for eyeballs.

If OpenAI can become a big bully that aggressively destroys the competition's business model before they get a chance to pivot to something else, they win big. Diverting Anthropic's focus away from Claude the App into "we need to save our core inference business!" would fulfill strategic goals. But I'm not 100% convinced of my own argument here. Just riffing.

On (2): yeah, I believe that's the end destination Ethan is thinking this lands in.

Matt:
Aug 14 (edited)

So, as we already knew, Anthropic has faults but OpenAI is the bad guy: trying to turn Web 3.0 into an even worse dystopia than Web 2.0.

Marciano:

“microsoft did it first with pcs and ms-dos. they wanted to sell software, so they made hardware a commodity. they didn't build pcs - they made pcs so cheap to build that compaq and dell fought each other to the death while microsoft collected windows licenses from every corpse.” If Microsoft didn’t build pcs, how did they “make pcs so cheap”?

DotProduct:

Amusingly, it is their own businesses that the hyperscalers will see AI consume. Google’s Search, Microsoft’s Office, Salesforce, SAP, and so on, possibly even Meta’s social media: AI will become the platform through which all of this is used, and the incumbents won’t all be winners. They are currently betting the house in terms of data centre expenditures. Next step: the sharks, in their frenzy, will start to eat each other.