Anthropic expands partnership with Google and Broadcom for next-gen compute
l1n
208 points
93 comments
April 06, 2026
Related Discussions
Found 5 related stories in 49.0ms across 3,752 title embeddings via pgvector HNSW
- Anthropic Subprocessor Changes tencentshill · 56 pts · March 26, 2026 · 61% similar
- The Anthropic Institute paulpauper · 11 pts · March 14, 2026 · 54% similar
- Anthropic chief back in talks with Pentagon about AI deal ajam1507 · 33 pts · March 05, 2026 · 54% similar
- Anthropic is burning more and more dev goodwill tosh · 59 pts · April 06, 2026 · 52% similar
- IBM Announces Strategic Collaboration with Arm bonzini · 265 pts · April 02, 2026 · 52% similar
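The related-story lookup described above (title embeddings queried through a pgvector HNSW index) roughly corresponds to a nearest-neighbor query ordered by cosine distance. Below is a minimal Python sketch of that pattern; the table and column names (stories, title_embedding), the embedding dimension, and the use of psycopg are assumptions for illustration, not the site's actual schema or code.

```python
# Hypothetical sketch: k-nearest titles by cosine similarity using pgvector's
# HNSW index. Schema and names below are assumed, not taken from the source.
import psycopg

# One-time setup (pgvector >= 0.5 supports HNSW indexes), run once in SQL:
#   CREATE EXTENSION IF NOT EXISTS vector;
#   CREATE TABLE stories (id bigserial PRIMARY KEY, title text,
#                         title_embedding vector(384));
#   CREATE INDEX ON stories USING hnsw (title_embedding vector_cosine_ops);

def related_stories(conn: psycopg.Connection,
                    query_embedding: list[float],
                    k: int = 5) -> list[tuple[str, float]]:
    """Return the k nearest titles with similarity = 1 - cosine distance."""
    # pgvector accepts vectors as a bracketed text literal, e.g. "[0.1,0.2,...]".
    vec = "[" + ",".join(str(x) for x in query_embedding) + "]"
    rows = conn.execute(
        """
        SELECT title, 1 - (title_embedding <=> %s::vector) AS similarity
        FROM stories
        ORDER BY title_embedding <=> %s::vector
        LIMIT %s
        """,
        (vec, vec, k),
    ).fetchall()
    return rows
```

The `<=>` operator is pgvector's cosine-distance operator, so ordering by it and reporting `1 - distance` yields the "% similar" figures shown in the list above; the HNSW index makes the ordering an approximate rather than exact scan, which is how a few thousand embeddings can be searched in tens of milliseconds.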
Discussion Highlights (7 comments)
mikert89
There's no limit to the algorithms. People don't understand yet. They can learn the whole universe with a big enough compute cluster. We built a generalizable learning machine.
Eufrat
Can someone explain why everything is being marketed in terms of power consumption?
skybrian
I guess gigawatts is how we roughly measure computing capacity at the datacenter scale? Also saw something similar here:
> Costs and pricing are expressed per "token", but the published data immediately seems to admit that this is a bad choice of unit because it costs a lot more to output a token than input one. It seems to me that the actual marginal quantity being produced and consumed is "processing power", which is apparently measured in gigawatt hours these days. In any case, I think more than anything this vindicates my original decision not to get too precise. [...]
https://backofmind.substack.com/p/new-new-rules-for-the-new-...
Is it priced that way, though? I assume next-gen TPUs will be more efficient?
cebert
I’m surprised Anthropic wanted to partner with Broadcom, given Broadcom's negative reputation after antics such as its VMware acquisition.
ketzo
$19B -> $30B annualized revenue in a month? Feels like the lede is buried here!
mahadillah-ai
Interesting to see Anthropic investing in compute infrastructure. The bottleneck I keep hitting is not raw compute but where that compute lives — EU customers increasingly need guarantees their data stays in-region. More sovereign compute options in Europe would unlock a lot of enterprise AI adoption.
holografix
I don’t understand Claude Code’s moat here. What can it do that opencode can’t or couldn’t fairly easily implement?