12k AI-generated blog posts added in a single commit
noslop
143 points
143 comments
April 04, 2026
Related Discussions
Found 5 related stories in 40.2ms across 3,558 title embeddings via pgvector HNSW
- Things I've Done with AI shepherdjerred · 80 pts · March 09, 2026 · 51% similar
- Show HN: We scored 50k PRs with AI – what we learned about code complexity chuboy · 11 pts · March 30, 2026 · 51% similar
- 6 Practices that turned AI from prototyper to workhorse (106 PRs in 14 days) waleedk · 15 pts · March 01, 2026 · 50% similar
- Y Combinator's CEO says he ships 37,000 lines of AI code per day jcbhmr · 13 pts · April 03, 2026 · 49% similar
- A GitHub Issue Title Compromised 4k Developer Machines edf13 · 368 pts · March 05, 2026 · 48% similar
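The related-stories list above is produced by a cosine-similarity search over title embeddings, served by a pgvector HNSW index. As a rough sketch of the underlying idea only, here is a brute-force version in plain Python; the titles, vectors, and scores below are made up for illustration (real embeddings come from an embedding model, and pgvector answers the same query approximately without scanning every row):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k_related(query_vec, titles, k=5):
    # Brute-force nearest neighbours; an HNSW index returns roughly the
    # same top-k results without comparing against every stored vector.
    scored = [(cosine_similarity(query_vec, vec), title)
              for title, vec in titles.items()]
    scored.sort(reverse=True)
    return scored[:k]

# Hypothetical toy embeddings, for illustration only.
titles = {
    "Things I've Done with AI": [0.9, 0.1, 0.2],
    "Unrelated cooking post":   [0.0, 1.0, 0.1],
}
print(top_k_related([1.0, 0.0, 0.3], titles, k=1))
```

The brute-force scan is O(n) per query; the point of the HNSW index is to trade a little recall for sublinear search over thousands of embeddings.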
Discussion Highlights (20 comments)
ThrowawayR2
If the dead Internet theory wasn't true before, it sure will be soon.
MattGaiser
One of the issues is that the purpose of business internet writing is not to be read, but to be ranked well.
r_lee
I've seen this blog slop on Google for the last month or so, with no action taken whatsoever. It's mostly bullshit or regurgitated info from docs. Google, or their Search team, really doesn't seem to care at all; all of a sudden a random blog website just happens to rank on the first page for every topic.
miyuru
The commit author is here and has posted only slop here as well: https://news.ycombinator.com/submitted?id=ndhandala I wonder when he will submit these posts here.
fn-mote
I thought somebody counted them… incredibly, the log message admits to committing 12,000 articles. I guess that means the log message was authored by AI as well. Figures.
cachius
At which URL(s) are the blog posts visible?
gib444
"Showing 1 - 25 of 45488 posts" I miss the days when we could assume that's just a pagination code bug
cebert
What is the point of this?
StrLght
I am so glad DuckDuckGo allows blocking specific sites from the search. Just did this for a domain linked in this repository.
WJW
GitHub only reports 5,012 changed files, though.
tadfisher
> Showing 1-25 of 58891 posts

I have to imagine that one quality post worth reading would be linked in multiple places, and thus would beat tens of thousands of slop articles for SEO purposes?
wartywhoa23
AI is the stellar moment for all mediocrity and conmen.
ConceitedCode
I suspect we'll address this by going back to older ranking algorithms for search, where the primary signal of good content is links from trusted sources. People gaming the content-based algorithms will eventually cause their own downfall.
sigmonsays
When AI accidentally starts training itself on AI-generated content, we all lose...
ieie3366
Ironically, due to slop I feel like we are regressing as a civilization.

2020: want to know how to use Redix for Redis connections in Elixir? Google it, and the results were most likely high quality, written by senior engineers who knew what they were doing.

Today: google that, and it will be endless amounts of slop.
antiloper
"Nawaz Dhandala"
arcza
So whatever OneUptime is, I now know it has zero integrity and is something I should avoid.
hirako2000
> All content must be original and not published anywhere else.

Do what I say, not what I do.
ugiox
Now we know why GitHub has a hard time with stability and reliability. Because of this AI slop BS inflicted on us by the Silicon Valley tech bros and all their followers.
username223
[GitHub] platform activity is surging. — https://twitter.com/kdaigle/status/2040164759836778878