AI doesn't replace white collar work

amarble 58 points 91 comments March 08, 2026
www.marble.onl

Discussion Highlights (20 comments)

gjsman-1000

As I said on a different thread: everyone right now is focused on productivity gains, on AI making us faster. We are only one major incident away from this trend reversing. Now that we have AI, regulation is less burdensome to comply with, so the response will be more testing requirements, more certification requirements, more security requirements, more accessibility requirements. Everyone keeps their jobs; the bar goes up. Whenever an industry gets better tools, we raise standards instead of making more cheap junk. We make $25K cars instead of $5K cars built to 1960s engineering standards.

pjmlp

It has certainly replaced a couple of white-collar workers who used to do translation and asset creation for CMSs, on some projects that I am aware of.

spaghetdefects

Most white collar work is writing documents that no one cares about. I've replaced 99% of my non-meeting workload with AI, and it's doing a great job.

athrowaway3z

Just to throw out the counterargument here: the way AI replaces work is that there is an enormous ROI in working with fewer (and smarter) people. Social interactions are a big part of work, but they are only very rarely "the work", and they cost time. In the cases where they are required, they seem to cluster, and the ROI of having fewer social synchronization problems increases even more. But that might all be wrong; I'm not confident enough to say where we'll land. I can also see demand going up faster because of, and enabled by, the increase in supply, with the social aspect being "the real work" to be done.

simianwords

The author talks about jobs requiring a human element, but it's not always true. A job always requires you to show your work one level higher, to the manager or whoever requested it. For example, UI design can be replaced by AI. Unless UI or UX designers were bringing something like _taste_, instead of simply mechanically operating Figma, they are not keeping their jobs. I genuinely don't need to learn SQL ever in my life; I just don't need it for dashboards or analytics. A person whose main job was translating requirements into SQL for a dashboard, and nothing else, would not keep their job anymore. The person they were providing the analysis to could just perform the analysis themselves using AI. I do think most jobs will change dramatically, but some of them will certainly be eliminated completely.

laborcontract

I'm tired of arguments like this. If AI is helping you do work that you would otherwise have had to pay people to do, then it is replacing white collar work.

ctoth

It's the same error pattern every time: identify what AI is currently "bad" at, define that as the essential core of the work, declare the work safe. Wait 6 months, shocked Pikachu gif.

Bratmon

I call dibs on writing this article next week!

georgemcbay

What most of these "everything will be fine because we still need some humans in the loop" arguments never really address is that AI doesn't need to replace literally every white collar job to cause massive economic damage. The unemployment rate during the peak of the Great Depression was 25%, not 100%.

10xDev

The next revolution is coming, and it is sorely needed. Society is becoming older and more tired, and we need fresh ideas to bring a lot of fields back to life. I hope it comes soon.

woeirua

I don’t get how you can see where we started three years ago and see where we are today and then _confidently_ say AI will not continue to improve. It’s not about where we are today folks (the intercept of the line). It’s about the rate of progress (the slope of the line).

andai

Not sure how "I want to know the meaning of this word" doesn't fall into the author's second category. Or why he couldn't have asked a human about the NaN thing. I know those are arbitrary examples, but the behavior doesn't really seem to depend on the category. It might have more to do with how urgently the knowledge is needed.

bufordtwain

What about the 4000 Square employees that just got the boot?

andai

If I'm reading this right, the core thesis is that the main value in consulting is not in the correctness of the advice, but the ability to avoid taking responsibility. (And that this therefore cannot be automated by definition.) I suspect that will change as trust in automated systems increases. (For example the author seems to consider AI a source of "correctness", which implies this trust is already surprisingly high.)

jatins

> They rely on judgement, experience, and trust to set a plausible course and correct it when needed, and don’t hinge on determining a correct answer or providing facts

We need judgement when we can't verify/prove that the answer is correct, so we need a human we can trust. For example, in the author's example the pandas snippet is verifiably correct, and I don't really care about judgement in that case. When there is a verification/test that gives the AI a clear pass/fail, the AI can just keep throwing stuff at the wall until it's green, and that's good enough for a lot of use cases.
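The "keep throwing stuff at the wall until it's green" loop can be sketched roughly as below. This is a hypothetical illustration; the DataFrame, the checker, and the candidate snippets are my own stand-ins, not the article's actual pandas example:

```python
import pandas as pd

df = pd.DataFrame({"a": [1, None, 3], "b": [None, None, 6]})

def check(fn):
    """Deterministic pass/fail verifier: does fn return the NaN count per column?"""
    expected = {"a": 1, "b": 2}
    try:
        return fn(df) == expected
    except Exception:
        return False

# Candidate snippets, as if proposed by an AI one after another.
candidates = [
    lambda d: int(d.isna().sum().sum()),  # total NaN count: wrong shape, fails
    lambda d: d.isna().sum().to_dict(),   # per-column counts: passes
]

# Accept the first candidate that turns the check green; no human judgement needed.
accepted = next(fn for fn in candidates if check(fn))
```

The point being made above is that once `check` exists, trust in a human's judgement drops out of the loop entirely; the hard part shifts to writing the verifier.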

brtkwr

If your plan for humanity is to maintain the current number of jobs on one planet, sure, AI is a threat. But if you think we should be building civilisations beyond Earth, terraforming, mining the outer solar system, understanding the universe — then a 10x productivity gain per person isn't a disruption. It's barely enough to get started. We need 100x. 1000x. And we'll still be short-handed. The chain of operation never ends either. Every AI system needs someone to run it. Whatever runs it needs to be built and maintained. Follow that chain as far as you like — human agency doesn't disappear, it scales up. The universe is not running out of things that need doing. "AI will take our jobs" is not a civilisational concern. It's a failure to imagine what civilisation could actually be.

keiferski

I don’t think people actually read the article, because it makes a unique point about certain types of queries:

> I would have been interested in the experience and thoughts of someone whose opinions I respected, both as a social thing and to learn something.

In other words, some types of questions are aimed at 1) building a social connection with the person you’re asking, and 2) finding out what they, specifically, think about the topic. AI can’t really replace either of these.

AIs might function as a weak social replacement for some people, but you aren’t really going to advance in your personal or professional life by making friends with Claude. A good example of the second one is AskMeAnything-type forum posts: I don’t care what some generic celebrity/famous figure thinks about something, I care specifically about what George Clooney thinks about it. The AI will always be guessing, building a model of what George has said in the past, but it will never actually say what he thinks right now.

For a more serious and contemporary example: there are dozens of videos on YouTube right now, interviews with various experts and pundits on the situation in Iran. Many of them have hundreds of thousands of views. But why would someone watch those instead of just asking ChatGPT what’s going on in Iran? Because we want to know what this particular person thinks.

est31

In my opinion, the Block layoffs were a test, to see a) how a software company manages with only half of its employees now that there are powerful LLMs, and b) how the remaining employees react to the imminent threat of being laid off as well. If Block succeeds, we'll see more layoffs of that kind, probably even more extreme ones. You're not a top senior-level employee? Out. You don't single-handedly cause 30% of the AI spend on your 15-person team? Out. People say that in five years there won't be seniors because junior hiring stopped... in five years the seniors won't be needed either. Already today we have single-person billion-dollar exits, and high schoolers making millions from food apps. This is thanks to LLMs. The technology is there to replace most of the white collar work; it's just not applied widely enough yet. The economic system needs to adapt to labor no longer being such a big redistributor.

kypro

Technology didn't replace agricultural workers either, but you'd think it did when you consider how few people work in agriculture as a share of the population today. By providing productivity tools you do effectively replace jobs, because there's only so much of a good or service a person will want to consume. For example, just because a game dev studio can make 10x more games with AI doesn't mean the industry will make 10x more money, unless demand for video games increases. Instead, what is likely to happen as the cost of making games drops is that the price of games for consumers will drop too as competition increases, which will in turn hurt game dev profits, so game dev studios will likely have to be 10x smaller in the future – even if there are still technically people working in the industry.

However, when the work of agricultural workers became increasingly automated there were lots of other industries people could work in instead; at the time, that was factory work. Although the details will be different, I'm sure to some extent this will happen with white collar work too. But the question I'd ask today is: what is that alternative source of work, and is it as good as white collar work? Our economy went from farming -> factory work -> office work. I strongly suspect the next step will be more people working in manual labour jobs and in servant-type roles. It's hard to see where else the demand will come from.

cadamsdotcom

People used to be programmers, but the ratio of typing to problem solving eventually caught up, and now programming is just part of the job. Software engineering is falling to this trend too (somewhat).

The solution is to stop merely thinking of yourself as a software engineer and move up to the level of “manager of agents”... except that managers deal with human stuff, and this is fascinatingly mechanical; in fact, even the unpredictability of these new tools is quite predictable. So a more useful framing is “software development process engineer”. You can look at all the literature on building factories and production lines for ideas on what you’ll be doing.

You shouldn’t ever just have your agent write the software, then review and ship it. You are missing massive opportunities to take yourself out of more loops over time. What self-reflection are you and the model doing to catch opportunities to improve? What is your method for codifying your acceptance criteria, so your agents can do the work to higher quality over time without you in the loop? What’s your process for continuous improvement? How do your models know what work other team members’ models are doing simultaneously, so there’s less stepping on toes? Can THAT be automated, so you don’t need to sit in Slack trading “human-verbal locks” on areas of the architecture?

There’s immense room for creativity in the role of a software development process engineer.
