Nvidia Will Spend $26B to Build Open-Weight AI Models
bigwheels
25 points
7 comments
March 11, 2026
Related Discussions
Found 5 related stories in 54.7ms across 3,471 title embeddings via pgvector HNSW
- Nvidia backs AI data center startup Nscale as it hits $14.6B valuation voxadam · 52 pts · March 09, 2026 · 65% similar
- Nvidia's Huang pitches AI tokens on top of salary wmat · 19 pts · March 20, 2026 · 58% similar
- Nvidia is reportedly planning its own open source OpenClaw competitor mikece · 14 pts · March 11, 2026 · 58% similar
- OpenAI closes funding round at an $852B valuation surprisetalk · 398 pts · March 31, 2026 · 57% similar
- Jensen Huang says Nvidia is pulling back from OpenAI and Anthropic jnord · 91 pts · March 05, 2026 · 57% similar
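The "similar" percentages above come from comparing title embeddings; pgvector's cosine-distance operator (`<=>`) returns a distance, and a score like "65% similar" plausibly corresponds to `1 - distance` (an assumption about how this site derives the percentage, not something the page states). A minimal sketch of that similarity computation, with toy vectors in place of real embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# pgvector's `<=>` operator yields cosine *distance*; a displayed
# "65% similar" would then be 1 - distance = 0.65 (assumption).
query_title = [1.0, 0.0, 1.0]   # toy embedding, not a real model output
candidate   = [1.0, 1.0, 0.0]
similarity = cosine_similarity(query_title, candidate)  # 0.5
```

In practice the HNSW index lets pgvector answer such nearest-neighbor queries approximately without scanning all 3,471 embeddings, which is why the lookup completes in tens of milliseconds.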
Discussion Highlights (3 comments)
gigatexal
Why would they do this? Ahh, to keep the AI bubble afloat. Got it.
lemonish97
Some of the Nemotron models are really good. Hope this encourages more open-weight/source models from the West.
ivanvoid
I'll just leave this article here on why open-weight is not open-training. When I use a model, I want to be able to see and modify it; I don't want another 12 GB black box. https://www.workshoplabs.ai/blog/open-weights-open-training