Anthropic expands partnership with Google and Broadcom for next-gen compute

l1n 208 points 93 comments April 06, 2026
www.anthropic.com · View on Hacker News

Discussion Highlights (7 comments)

mikert89

There's no limit to the algorithms. People don't understand yet. They can learn the whole universe with a big enough compute cluster. We built a generalizable learning machine

Eufrat

Can someone explain why everything is being marketed in terms of power consumption?

skybrian

I guess gigawatts are how we roughly measure computing capacity at the datacenter scale? Also saw something similar here:

> Costs and pricing are expressed per “token”, but the published data immediately seems to admit that this is a bad choice of unit because it costs a lot more to output a token than input one. It seems to me that the actual marginal quantity being produced and consumed is “processing power”, which is apparently measured in gigawatt hours these days. In any case, I think more than anything this vindicates my original decision not to get too precise. [...]

https://backofmind.substack.com/p/new-new-rules-for-the-new-...

Is it priced that way, though? I assume next-gen TPUs will be more efficient?
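The input/output asymmetry the quote describes is easy to see with a toy calculation. A minimal sketch, using made-up per-token prices (the numbers below are assumptions for illustration, not any provider's published rates):

```python
# Hypothetical per-token pricing illustrating why "per token" is a lossy
# unit: output tokens often cost several times more than input tokens.
INPUT_PRICE_PER_MTOK = 3.00    # $ per million input tokens (assumed)
OUTPUT_PRICE_PER_MTOK = 15.00  # $ per million output tokens (assumed)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in dollars of one request under the assumed prices."""
    return (input_tokens * INPUT_PRICE_PER_MTOK
            + output_tokens * OUTPUT_PRICE_PER_MTOK) / 1_000_000

# Same total token count, very different bills:
long_prompt_short_answer = request_cost(10_000, 500)   # mostly input
short_prompt_long_answer = request_cost(500, 10_000)   # mostly output
print(long_prompt_short_answer)   # 0.0375
print(short_prompt_long_answer)   # 0.1515, ~4x the cost
```

Under this (assumed) 5:1 price ratio, two requests with identical total token counts differ in cost by roughly 4x, which is the commenter's point: the marginal quantity being sold isn't really "tokens."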

cebert

I’m surprised Anthropic wanted to partner with Broadcom when they have such a negative reputation from antics such as their VMware acquisition.

ketzo

$19B -> $30B annualized revenue in a month? Feels like the lede is buried here!

mahadillah-ai

Interesting to see Anthropic investing in compute infrastructure. The bottleneck I keep hitting is not raw compute but where that compute lives — EU customers increasingly need guarantees their data stays in-region. More sovereign compute options in Europe would unlock a lot of enterprise AI adoption.

holografix

I don’t understand Claude Code’s moat here. What can it do that opencode can’t or couldn’t fairly easily implement?
