Graphs that explain the state of AI in 2026
bryanrasmussen
84 points
52 comments
April 18, 2026
Related Discussions
Found 5 related stories in 72.7ms across 4,930 title embeddings via pgvector HNSW
- The beginning of scarcity in AI gmays · 40 pts · April 16, 2026 · 59% similar
- We're running out of benchmarks to upper bound AI capabilities gmays · 15 pts · April 10, 2026 · 58% similar
- The first 40 months of the AI era jpmitchell · 156 pts · March 28, 2026 · 57% similar
- If AI has a bright future, why does AI think it doesn't? JCW2001 · 15 pts · March 06, 2026 · 56% similar
- Stanford report highlights growing disconnect between AI insiders and everyone ZeidJ · 233 pts · April 13, 2026 · 56% similar
Discussion Highlights (12 comments)
amelius
Also nobody will ever have a moat, so the graph of investor stupidity is going through the roof.
hydrocomplete
I still don't understand the State of AI in 2026.
bix6
China’s robotics lead holy cow.
cloud-oak
> Training AI models can generate enormous carbon emissions

Sure, but what I'd really like to see is a graph of how much carbon is generated serving these models globally.
HelloMcFly
Besides China's lead in robotics, those Grok emissions charts are the things that most leap off the page.
xnx
The "China leads in robotics" seems to be unaffected by AI. The China line is basically on the same trajectory since 2012. The chart does no belong in the article.
fyrn_
Worth calling out AI sentiment among young people is not nearly so rosy: https://news.gallup.com/poll/708224/gen-adoption-steady-skep...
themafia
Profits generated by AI: <not graphed>

The absence speaks volumes.
ChrisArchitect
[dupe] https://news.ycombinator.com/item?id=47758028

Source: https://hai.stanford.edu/ai-index/2026-ai-index-report
eulgro
> The report estimates that carbon emissions from models with the least efficient inference are over 10 times as high as those with the most efficient inference. DeepSeek’s V3 models were estimated to consume around 23 watts when responding to a “medium-length” prompt, while Claude 4 Opus was estimated to consume about 5 watts.

This makes absolutely no sense. Watts measure power, not energy. I suppose they meant watt-hours, and even then that's a weird way to explain carbon emissions...
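If eulgro is right that the figures are watt-hours per response rather than watts, a back-of-the-envelope sketch shows how they would map onto carbon. The grid intensity of 400 gCO2 per kWh below is an illustrative assumption, not a figure from the report or the thread:

```python
# Back-of-the-envelope: convert per-response energy (assumed to be
# watt-hours, as eulgro suggests, not watts) into grams of CO2.
# The grid intensity is an assumed illustrative value.
GRID_G_CO2_PER_KWH = 400  # assumption, varies widely by grid

def grams_co2(energy_wh: float) -> float:
    """Grams of CO2 for one model response, given its energy in Wh."""
    return energy_wh / 1000 * GRID_G_CO2_PER_KWH

deepseek_v3 = grams_co2(23)   # 23 Wh per "medium-length" prompt
claude_opus = grams_co2(5)    # 5 Wh per "medium-length" prompt
print(f"DeepSeek V3: {deepseek_v3:.1f} g/response")
print(f"Claude 4 Opus: {claude_opus:.1f} g/response")
print(f"Ratio: {deepseek_v3 / claude_opus:.1f}x")
```

Single-digit grams per response either way, which is why the report's framing in terms of carbon rather than energy reads oddly: the carbon figure depends entirely on an assumed grid mix.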
i_love_retros
Stating "Software engineers are all-in on AI" because of an increase in github projects being created is hilarious. I didn't realise creating a github repo made someone a software engineer. If only I had known this I wouldn't have bothered learning all the other stuff!
tqi
> The report estimates that training the latest frontier large language models, such as xAI’s Grok 4, can generate over 72,000 tons of carbon-equivalent emissions.

That seems pretty trivial relative to the ~38 billion tons emitted globally per year?
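tqi's comparison is easy to check. Taking the report's 72,000 tons per training run and the commenter's ~38 billion tons of annual global emissions:

```python
# Sanity check on tqi's comparison: one frontier training run
# versus annual global emissions (using the commenter's ~38 Gt figure).
TRAINING_RUN_TONS = 72_000
GLOBAL_ANNUAL_TONS = 38e9

fraction = TRAINING_RUN_TONS / GLOBAL_ANNUAL_TONS
print(f"One run is {fraction:.2e} of annual global emissions")
print(f"That is about {fraction * 100:.5f}%")
```

A single run comes out to roughly two millionths of annual global emissions, which supports the "trivial at that scale" reading, though it says nothing about aggregate emissions across all training runs or inference.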