AMD AI director says Claude Code is becoming dumber and lazier since update
Logans_Run
37 points
10 comments
April 08, 2026
Related Discussions
Found 5 related stories in 57.3ms across 3,961 title embeddings via pgvector HNSW
- Anthropic Races to Contain Leak of Code Behind Claude AI Agent sonabinu · 21 pts · April 01, 2026 · 58% similar
- Claude Code users hitting usage limits 'way faster than expected' samizdis · 293 pts · March 31, 2026 · 55% similar
- A leak reveals that Anthropic is testing a more capable AI model "Claude Mythos" Tiberium · 11 pts · March 27, 2026 · 54% similar
- AI Team OS – Turn Claude Code into a Self-Managing AI Team cronus1141 · 40 pts · March 21, 2026 · 53% similar
- Anthropic's AI tool Claude central to U.S. campaign in Iran, amid a bitter feud spenvo · 24 pts · March 04, 2026 · 53% similar
Discussion Highlights (5 comments)
e3df
Lol OAI and AMD did a deal together so whatever. In reality as they scale up, the models lose nuance and become noisier. The boosters do not want to admit this. We need highly-specialised models/interfaces. Not one thing and trying to force-fit it.
ratg13
Boris from the Claude Code team explained this on HN 2 days ago https://news.ycombinator.com/item?id=47664442
SunshineTheCat
Ok, I thought I was going insane. On the last two larger coding tasks I gave Claude Code, it left about 35% of my request completely undone or done sloppily. Because of this, for the next larger task I ran its work through Codex, which identified 7 glaring unfinished parts of the task. The trend was starting a part of the task but then leaving a "skeleton" of what I had requested without any of the actual working parts. The way I would describe it is a kid cramming his 3-month project into a Sunday evening for Monday's due date.
niobe
I felt it had been enshittified 1-2 weeks back, not in February. But it's very subjective.
zambelli
Anyone expecting a higher-tier subscription to be announced given this current reduction? Cynicism aside, I do wonder what the future will hold, given that current token burn rates aren't sustainable without VC cash. Anthropic even pushed us to use Haiku for "many" Claude Code tasks in our enterprise training, so I'm wondering if this is a company-level need of sorts to reduce the burn?