Apple's accidental moat: How the "AI Loser" may end up winning
walterbell
99 points
82 comments
April 13, 2026
Related Discussions
Found 5 related stories in 63.6ms across 4,351 title embeddings via pgvector HNSW
- Apple Just Lost Me syx · 444 pts · March 25, 2026 · 61% similar
- How the AI Bubble Bursts martinvol · 355 pts · March 30, 2026 · 57% similar
- How "Hardwired" AI Will Destroy Nvidia's Empire and Change the World amelius · 19 pts · March 14, 2026 · 56% similar
- AI Will Be Met with Violence, and Nothing Good Will Come of It gHeadphone · 331 pts · April 12, 2026 · 56% similar
- The AI Bubble Is an Information War spking · 26 pts · March 03, 2026 · 56% similar
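The related-stories list above comes from nearest-neighbor search over title embeddings (pgvector with an HNSW index, per the header). As a rough illustration of what the similarity percentages mean, here is a minimal exact-search sketch in NumPy; treating "61% similar" as cosine similarity, and the toy vectors below, are assumptions for illustration — an HNSW index approximates this ranking without scanning all 4,351 vectors:

```python
import numpy as np

def top_k_related(query: np.ndarray, corpus: np.ndarray, k: int = 5):
    # Cosine similarity of the query against every title embedding:
    # 1.0 means identical direction, 0.0 means orthogonal.
    scores = corpus @ query / (
        np.linalg.norm(corpus, axis=1) * np.linalg.norm(query)
    )
    # Brute-force exact ranking; HNSW trades exactness for speed here.
    order = np.argsort(-scores)[:k]
    return [(int(i), float(scores[i])) for i in order]
```

In pgvector itself this ranking would be an `ORDER BY embedding <=> query LIMIT k` query against the HNSW index; the sketch above is the exact-search baseline it approximates.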
Discussion Highlights (19 comments)
grtteee
This is the classic Apple approach: wait to understand what the thing is actually capable of (i.e., let others make the sunk investments), envision a solution that is way better than the competition, and then architect a path to a leapfrog product that opens up a large lead.
bigyabai
I just realized that next year Apple's Neural Engine will be 10 years old, just like the "NPUs will change AI forever!" puff pieces. Here's to another 10 years of scuffed Metal Compute Shaders, I guess.
worthless-trash
Don't worry, when Apple introduces it, it'll be revolutionary and 10% thinner.
livinglist
But why do I feel like the quality of Apple's software has declined sharply in recent years? The Liquid Glass design feels very unpolished and not well thought out almost everywhere… seems like even Apple can't resist falling victim to AI slop.
javchz
What I think was a wasted opportunity was not bringing the Xserve back, as one of the few end-to-end solutions out there at scale.
pram
I've had it turned off since Sequoia, and this I truly appreciate. It hasn't nagged me once to turn it or Siri on, and it isn't mandatory. When I open up JIRA or Slack I am always greeted with multiple new dialogues pointing at some new AI bullshit, in comparison. We hates it precious
hapticmonkey
Apple aren’t in the business of building chatbots to impress investors (other than some WWDC2024 vaporware they’d rather not talk about any more). They’re in the business of consumer hardware. Consumers want iPhones and (if Apple are right) some form of AR glasses in the next decade. That’s their focus. There’s a huge amount of machine learning and inference that’s required to get those to work. But it’s under the hood and computed locally. Hence their chips. I don’t see what Apple have to gain by building a competitor to what OpenAI has to offer.
46493168
Apple is almost 2 years out from their announcement of Apple Intelligence. It has barely delivered on any of the hype. New Siri was delayed and barely mentioned in the last WWDC; none of the features are released in China. In other news, people keep buying iPhones, and Apple just had its best quarter ever in China. AAPL is up 24% from last year.
nl
> Then Stargate Texas was cancelled, OpenAI and Oracle couldn’t agree terms, and the demand that had justified Micron’s entire strategic pivot simply vanished. Micron’s stock crashed.

Well... no. The Stargate *expansion* was cancelled; the originally planned 1.2 GW (!) datacenter is going ahead:

> The main site is located in Abilene, Texas, where an initial expansion phase with a capacity of 1.2 GW is being built on a campus spanning over 1,000 acres (approximately 400 hectares). Construction costs for this phase amount to around $15 billion. While two buildings have already been completed and put into operation, work is underway on further construction phases, the so-called Longhorn and Hamby sections. Satellite data confirms active construction activity, and completion of the last planned building is projected to take until 2029.

> The Stargate story, however, is also a story of fading ambitions. In March 2026, Bloomberg reported that Oracle and OpenAI had abandoned their original expansion plans for the Abilene campus. Instead of expanding to 2 GW, they would stick with the planned 1.2 GW for this location. OpenAI stated that it preferred to build the additional capacity at other locations. Microsoft then took over the planning of two additional AI factory buildings in the immediate vicinity of the OpenAI campus, which the data center provider Crusoe will build for Microsoft. This effectively creates two adjacent AI megacampus locations in Abilene, sharing an industrial infrastructure. The original partnership dynamics between OpenAI and SoftBank proved problematic: media reports described disagreements over site selection and energy sources as points of contention.

https://xpert.digital/en/digitale-ruestungsspirale/

> Micron’s stock crashed. [the link included an image of dropping to $320]

Micron’s stock is back to $420 today.

> One analysis found a max-plan subscriber consuming $27,000 worth of compute with their $200 Max subscription.

Actually, no. They'd miscalculated and consumed $2,700 worth of tokens. The same place that checked that claim also points out:

> In fact, Anthropic’s own data suggests the average Claude Code developer uses about $6 per day in API-equivalent compute.

https://www.financialexpress.com/life/technology-why-is-clau...

I like Apple's chips, but why do we put up with crappy analysis like this?
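The correction above is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch — the $27,000, $2,700, $200, and $6/day figures are taken from the comment; the 30-day month is an assumption:

```python
# Figures quoted in the comment above.
claimed_spend = 27_000       # the debunked "max-plan subscriber" figure, USD
corrected_spend = 2_700      # the recalculated API-equivalent spend, USD
subscription_price = 200     # Max plan price, USD/month
avg_daily_api_equiv = 6      # Anthropic's reported per-developer average, USD/day

# The viral figure overstated actual usage by an order of magnitude.
overstatement = claimed_spend / corrected_spend       # 10x

# Even the corrected outlier is well above the subscription price...
outlier_ratio = corrected_spend / subscription_price  # 13.5x

# ...but the *average* developer lands roughly at break-even.
avg_monthly = avg_daily_api_equiv * 30                # $180, just under $200
```

So the corrected numbers tell a much less dramatic story: one heavy user at ~13x the plan price, and an average user near break-even.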
rvz
Apple never competed in the "AI race" in the first place, because they already knew they were at the finish line. This was really unsurprising [0]. [0] https://news.ycombinator.com/item?id=40278371
amazingamazing
Gemma 4, in my view, is good enough to do things similar to Gemini 2.5 Flash, meaning if I point it at code and ask for help and there is a problem with the code, it'll answer correctly in terms of suggestions, but it's not great at using all the tools or one-shotting things that require a lot of context or "expert knowledge". If, after a couple more iterations of this, say Gemma 6 is as good as the current Opus and runs completely locally on a Mac, I won't really bother with the cloud models. That's a problem. For the others, anyway.
int32_64
Nvidia restricts gamer cards in data centers through licensing, eventually they will probably release a cheaper consumer AI card to corner the local AI market that can't be used in data centers if they feel too much of a threat from Apple. Imagine a future where Nvidia sells the exact same product at completely different prices, cheap for those using local models, and expensive for those deploying proprietary models in data centers.
sublinear
> Pure strategy, luck, or a bit of both? I keep going back and forth on this, honestly, and I still don’t know if this was Apple’s strategy all along, or if they didn’t feel in a position to make a bet and are just flowing as the events unfold, maximising their optionality.

Maximizing the available options is in fact a "strategy", and often a winning one when it comes to technology. I would love to be reminded of a list of tech innovators who were first and are still the best. Anyway, hasn't this always been Apple's strategy?
microslop2026
I like how we are acting like this market is so novel and emergent, revering the luck of some while lamenting the failures of others, when it was all "roadmapped" a decade ago. It's like watching a Shaanxi shadow puppet show with artificial folklore about the origins of the industry. I hate reality television!
asdev
Apple is just waiting for all the slop to inevitably crash to see what actually works
ajross
This seems mistaken to me. The core idea is that LLMs are commoditizing and that the UI (Siri in this case) is what users will stick with. But... what's the argument that the bulk of "AI value" in the coming decade is going to be... Siri Queries?! That seems ridiculous on its face. You don't code with Siri, you don't coordinate automated workforces with Siri, you don't use Siri to replace your customer service department, you don't use Siri to build your documentation collation system. You don't implement your auto-kill weaponry system in Siri. And Siri isn't going to be the face of SkyNet and the death of human society. Siri is what you use to get your iPhone to do random stuff. And it's great. But ... the world is a whole lot bigger than that.
jayd16
My capex is even lower than Apple's, I can ship to users' Apple hardware, and I can't access iPhone users' photos either... so really I'm the winner.
-1
Maybe they thought an investment in a product with lots of substitutes & high capital requirements wasn't very attractive.
nielsbot
> I am actually of the opinion that without some kind of bailout, OpenAI could be bankrupt in the next 18-24 months, but I am horrible at predictions

I find this intriguing… Does anyone here have enough insight to speculate more?