Unix Isn't for Agents

handfuloflight 18 points 15 comments March 05, 2026
pwhite.org · View on Hacker News

Discussion Highlights (12 comments)

xyzsparetimexyz

"It's not X, it's Y"-ass blog post

tensegrist

…what about, you know, sockets?

jaen

Might've fact-checked this article after letting Claude write it... Erlang processes are in no way what's commonly called "persistent": there's no way to persist them automatically (i.e. freeze them to disk and wake them later). It's even preferred to use an external database like Mnesia, or to implement persistence manually in the actor itself, for robustness. The only even slightly mainstream persistent environment is Smalltalk with its images. For Linux there's also CRIU, but that has quite a few caveats. And if the definition of "persistent" is "processes run until they're done or they crash", well, pretty much everything fulfils that, so the distinction is meaningless.

> "You can send messages to a process from a terminal, from another process, from a web socket, from anywhere."

Also technically wrong. This only works if you already have a cluster of Erlang nodes set up in full-mesh mode. It's not auto-discoverable, scalable, P2P or anything remotely like that; if you want that, it's outside the Erlang stdlib. Websockets etc. need to be manually proxied, and there's no standard protocol, security boundary or anything...

Ugh, actually I give up, there are too many mistakes to even go into. Final opinion: slop.

AreShoesFeet000

It sounds like people are being paid to push Erlang for agentic stuff. Cool story, but if I end up feeling like I need to “return to the 20th century” for abstractions, I’ll just use Lisp.

sergius

Unix is a base layer that provides enough abstractions to build services. Perhaps there are better abstractions if one looks at Plan 9... but that never took off unfortunately. What they want is a framework for persisting stuff... that is an application level service... go ahead and build one.

chrisshroba

Processes by default get stdin and stdout. If you run a command interactively from your tty, the tty shows your stdout and collects stdin from you to pass to the process. But nothing about Unix prevents you from hooking something else up to stdin and stdout. As another commenter said, you could use sockets and have a completely generic input and output stream which an arbitrary tool could read and write to. Or you could spin up a websocket server that spawns the process and converts between websocket messages and stdin/stdout bytes. Or http requests and long polled responses. Or read input from a speech to text stream and push output into a text to speech stream.

I feel like OP's main gripe is that persistent interactive sessions should be supported without a third party tool like tmux or screen or zellij, but one of the main strengths of Unix is that it provides platform building blocks which can be composed to create whatever experience you're looking for.
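The hookup described above — pointing a process's stdin/stdout at a socket instead of a tty — can be sketched in a few lines of Python. This is a minimal illustration, not anyone's production code; the name `serve_once` is mine, and `cat` stands in for an arbitrary tool:

```python
# Minimal sketch: expose a subprocess's stdin/stdout over a TCP socket.
# Any client that connects gets a generic byte stream into and out of
# the child process -- no tty involved.
import socket
import subprocess
import threading

def serve_once(command, host="127.0.0.1", port=0):
    """Accept one connection and bridge it to `command`'s stdin/stdout.

    Returns the port actually listened on (useful with port=0).
    """
    srv = socket.create_server((host, port))
    actual_port = srv.getsockname()[1]

    def run():
        conn, _addr = srv.accept()
        with srv, conn:
            proc = subprocess.Popen(
                command, stdin=subprocess.PIPE, stdout=subprocess.PIPE
            )
            # Pump socket -> child stdin in a helper thread.
            def feed():
                while chunk := conn.recv(4096):
                    proc.stdin.write(chunk)
                    proc.stdin.flush()
                proc.stdin.close()  # client shut down: EOF to the child

            t = threading.Thread(target=feed, daemon=True)
            t.start()
            # Pump child stdout -> socket in this thread.
            while chunk := proc.stdout.read1(4096):
                conn.sendall(chunk)
            t.join()
            proc.wait()

    threading.Thread(target=run, daemon=True).start()
    return actual_port
```

The same shape works for the websocket or speech-to-text variants: only the two pump loops change, the process side stays plain stdin/stdout.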

hybrid_study

This is kinda expected — OSes aren’t going to suddenly be rewritten for AI. What we’re really seeing is a hybrid model: hardware and kernel stay the same (primitives, isolation, scheduling), and on top of that you get an AI runtime / agent platform that handles task scheduling, shared state, and inter-agent coordination. Agents, tools, and workflows sit above that, orchestrating tasks. Deterministic programs remain because they’re cheaper, faster, and easier to verify; the AI layer just adds a structured way to coordinate and automate things without replacing the underlying OS. Basically, the kernel stays the substrate; the “agentic paradigm” lives above it.

yomismoaqui

Reading the article reminded me of a thing that I've seen Codex do lately. Instead of writing new files using a write tool (or apply_patch) it just does this:

    cat > lib/example.txt <<'EOF'
    Line 1
    Line 2
    EOF

I don't know if Unix isn't for agents, but surely they know how to use it.

jauntywundrkind

Coincidentally, I noticed yesterday that Deciduous (a tool, skill-set, & program for LLMs to look at git history & build decision graphs) opened a draft PR to switch from Rust to Elixir. https://github.com/notactuallytreyanastasio/deciduous/pull/1... OTP supervision is listed as one of the primary reasons for the switch. I also noticed: 15,441 lines added / 109,254 lines removed!

There are definitely a bunch of agent-related tools out there that kick off various daemons. I feel like this often happens over some kind of fancy protocol, where there is still a somewhat thick CLI client. I wanted to play around with making a very thin CLI (with a prefork model in Rust) and wrote a little sample repo for myself to explore the idea: it just passes stdin/stdout/stderr directly to the daemon for it to process. https://tangled.org/jauntywk.bsky.social/pfd
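The "thin CLI" trick mentioned above — handing the client's stdin/stdout/stderr straight to a daemon — is possible on Unix via SCM_RIGHTS file-descriptor passing over a Unix-domain socket. A rough Python sketch of just the mechanism (this is not the linked repo's Rust code; `send_stdio`/`recv_stdio` are hypothetical names, using Python 3.9's `socket.send_fds`/`recv_fds` wrappers):

```python
# Sketch of fd passing: a thin client donates its stdio descriptors to
# a daemon, which can then read/write the client's terminal directly.
import os
import socket

def send_stdio(sock, fds):
    """Client side: ship file descriptors over a Unix-domain socket."""
    socket.send_fds(sock, [b"stdio"], fds)

def recv_stdio(sock, maxfds=3):
    """Daemon side: receive the donated descriptors."""
    _msg, fds, _flags, _addr = socket.recv_fds(sock, 1024, maxfds)
    return fds
```

In the real case the client would pass fds 0, 1 and 2 and then just wait; the daemon does all the I/O with no protocol in between.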

bryanlarsen

> For AI agents, we're all running tmux. Or screen. Or nohup with tail -f.

None of the above for me. I'm using either mosh or emacs --daemon.

cap11235

no skill, no taste, didn't read

cadamsdotcom

People used to hack persistence onto IRC with long-running daemons that acted as IRC proxies and relayed stored messages when you connected. Then the paradigm changed to persistent storage owned by proprietary companies, served over proprietary protocols, and now your data isn't your own. This will probably happen to agentic coding, so enjoy your tmux agent running on your own compute while you have it. We usually can't have nice things, but thanks to an accident of the Claude Code team being ferociously in love with the terminal, we currently do.
