White-collar AI apocalypse narrative is just another bullshit

mmiliauskas 59 points 99 comments March 23, 2026
martynasm.com · View on Hacker News

Discussion Highlights (19 comments)

robotswantdata

The current deployments of chatbots are not the bar to compare with. There’s an incoming wave of extremely capable agents and process reimagining that is going to be highly disruptive. Been in this space over a decade and this time really is different. It’s hard for humans to perceive the exponential, it will be slow then sudden.

bluegatty

It's already been an apocalypse for some jobs, and that will continue on. But probably not broadly.

intellectronica

Every morning the turkey rejoiced and said to himself, "Oh joy, I'm such a lucky turkey. I don't have to do anything, the food is plentiful, I just eat and shoot the breeze the whole day long. What an awesome life!" Until one morning, the day before Thanksgiving, the turkey rejoiced about the awesome day he's about to have ... just to be picked up 5 minutes later and dragged to the slaughterhouse.

kensai

Just watch this for some depressive vibes: https://x.com/TechLayoffLover

monegator

I love how the comments are always full of doomposting. Get prepared. Something is coming *soon*. And how any even slightly skeptical comment gets downvoted to hell. One may start thinking there are bots promoting the narrative.

faangguyindia

Wait, what? I am in India and jobs are being lost by the thousands every day in the IT field. In fact, I go and implement dumb AI models in many companies, and executives immediately show "how many people they can fire with this advancement".

rando77

It's worth taking actions now that account for the possibility of large-scale job losses due to AI in the future, even if now is not the time.

aurareturn

Let's suppose you are a medium sized business. You've always wanted to provide top quality customer service but couldn't do it before because you'd need to hire 5 people to do it right. Instead, you strategically decided to not provide quality customer service and sell the product at a lower price than competitors. So you have no customer service person in the company. Service is bad. It limits growth. But it was strategic to not provide good service in order to gain an advantage somewhere else in the business.

But now, you can hire 1 customer service person, who could then use AI agents to provide the top quality customer service. Previously, you needed to hire 5 people, which wasn't worth it. So you went from no customer service employee to 1. I suspect that this is what will happen. Many companies will hire their first customer service person or more. Many big companies will lay off most of their customer service people. The net effect might actually increase total customer service employment.

I suspect that job openings for customer service employees will actually be higher than now, but companies won't be able to find enough AI-skilled people to fill them. We're going to read about how there are more job openings than ever but companies can't find the AI skillset they need. This is why I think people who adopt AI now, learn it, understand it, and get good at it will be in high demand.

fabian2k

I think the argument here is a bit of a strawman, though there is a good point in there as well. AI will not automate all customer support, but it has the potential to automate a large fraction of it. The anecdote in there is about complex B2B enterprise software. That's not the majority of customer support, and is very heavy on escalating to actual experts. You don't have to remove 100% of the jobs to have huge effects. Automating large parts of a few sectors would already create significant disruptions.

jdalsgaard

> Because the remaining 10% is what required most of the CS team’s time. They built an FAQ you can talk to. These days it's hard to get people to read an email longer than 5 lines - yet people are super excited about abundant masses of text generated by LLMs. It does not compute....

jonathanstrange

My biggest worry currently isn't even job-related, it's that corporations and authorities will use AI for customer/client relations but that this AI will not be allowed to make any significant changes and is therefore an utter waste of time. In many places, this could turn an already dire situation into an absolute nightmare. What might make it even worse is that authorities - and probably also corporations - will likely ban or block user AI agents, so you cannot even use your own AI to negotiate with their AI. That's something that needs to be addressed by lawmakers ASAP. There needs to be a right to speak to a human, or (the perhaps overly tech optimistic route) a prohibition of AI that doesn't have adequate decision-making power.

rspoerri

People tend to forget how fast technological development advances. Even if you lived through it, you tend to forget how recently the world looked very different:

- before 2012 there was no smartphone
- before 2001 there was no Wikipedia
- before 1995 less than 10 percent of home users in rich countries had internet
- before 2023 there was no AI available to home users

Hardware has been getting faster by a factor of ~100 in 10 years and ~10,000 in 20 years. AI currently develops faster because of a combination of software and hardware improvements. Even if the best current system is only right 1 time in 100 right now, it's likely nearly always accurate in 10 years. I also like to remind people that the phone I am writing this on (iPhone 12) has the same computing power as the Earth Simulator in 2003, which was the fastest computer on Earth back then. Imagine this development and think what changes might come.

keiferski

Bifurcation is the right model and it’s already happening: For things where the end customer doesn’t care if they’re interacting with an AI, reading content by an AI, etc. – or if the company doesn’t care what the customer thinks (see: automated phone customer support lines for the last twenty years) – the work will be replaced by AI work. Examples are any kind of rote documentation, generic digital asset creation like blog images, low level customer support, and most things where the company doesn’t really care about the customer, because the company is getting paid regardless. If it does matter what the end customer thinks, the role will become increasingly humanistic in nature. Examples are high-end enterprise sales, personality and expertise-driven media and content, and anything where being “revealed” as an AI is perceived negatively.

lelanthran

To be perfectly honest, the majority of work is going to see a restructure soon anyway. "Triaging by LLM before sending the task to any human" can work for almost anything, not just support calls.

On another story I saw someone mention that they'd like something like an ad-blocker, but for content - a "content-blocker". It's not too hard to run even a local model that, via a browser extension, scans the current page and places it into one of several bins: read verbatim, summarise with ChatAI, ignore completely, read and mark for re-reading. Software dev? Bin a ticket into "complex", "simple", "talk to lead dev". Software proposal? Bin the proposal into "COTS available", "FOSS available", "quick dev", "too costly to proceed". Bookkeeping? Accounting? They all have tasks that can be binned.

What does this all mean, I hear you ask? Well, you no longer need as many employees if some of the bins are "ChatAI and/or agent can complete this" with human review. So, yeah, a lot of people are going to be out of work if this works like they say it does.
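The binning idea above can be sketched in a few lines. This is a minimal, hypothetical illustration: `classify` here is a keyword heuristic standing in for what would, in practice, be a single LLM call with the bin names in the prompt; the bin names and keywords are assumptions for the sake of the example.

```python
# Sketch of "triage by LLM" binning for dev tickets.
# classify() is a placeholder heuristic standing in for a real LLM call.

BINS = ("simple", "complex", "talk to lead dev")

def classify(ticket: str) -> str:
    """Route a ticket into one of the bins described above.

    In a real deployment this would be one LLM call whose prompt lists
    the bin names; here a keyword heuristic stands in.
    """
    text = ticket.lower()
    if any(w in text for w in ("architecture", "design", "security")):
        return "talk to lead dev"
    if any(w in text for w in ("crash", "data loss", "race condition")):
        return "complex"
    return "simple"

def triage(tickets):
    """Group tickets by bin; 'simple' items could go to an agent queue."""
    routed = {b: [] for b in BINS}
    for t in tickets:
        routed[classify(t)].append(t)
    return routed

tickets = [
    "Typo on the login page",
    "Intermittent crash when saving large files",
    "Proposed change to the auth architecture",
]
print(triage(tickets))
```

The disruptive part is the last step the comment describes: every ticket landing in the "simple" bin is a ticket a human no longer triages, only reviews.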

flanked-evergl

Managed decline policies of western governments are much more threatening to white-collar workers and everyone else than AI will ever be. AI will enable significantly faster economic growth, which is something the EU has been making impossible with legislation designed to destroy Europe's economic advantage.

tamimio

Of course it is, it’s just a scapegoat to lower wages, another power dynamic trick pulled on employees. I have noticed a lot of managers going with a co-op + AI combo, or outsourcing + AI, thinking it’s the ultimate goldmine to minimize expenses and maximize profits, and they soon hit a reality check. And when they do, unfortunately, instead of resolving the root-cause issue, they go and hire only one senior on the team and overload and overwork him, while praising the AI for how it increased productivity and all.

badgersnake

AI bros flagged it to death

odyssey7

The fact that a work of satire that stimulated interesting discussion has been flagged is telling.

jeremie_strand

The real question isn't whether white-collar jobs disappear overnight, but whether the skills floor keeps rising faster than most workers can adapt. Historically, new technology created adjacent roles - but the timescale was decades, not 3-5 years. That's a meaningfully different constraint.
