Reliable Software in the LLM Era
mempirate
102 points
33 comments
March 12, 2026
Related Discussions
Found 5 related stories in 46.9ms across 3,471 title embeddings via pgvector HNSW
- How I write software with LLMs indigodaddy · 69 pts · March 16, 2026 · 65% similar
- Lf-lean: The frontier of verified software engineering alpaylan · 18 pts · March 12, 2026 · 57% similar
- Taming LLMs: Using Executable Oracles to Prevent Bad Code mad44 · 32 pts · March 26, 2026 · 56% similar
- LLMs work best when the user defines their acceptance criteria first dnw · 137 pts · March 07, 2026 · 56% similar
- Llm9p: LLM as a Plan 9 file system mleroy · 15 pts · March 08, 2026 · 55% similar
Discussion Highlights (7 comments)
dude250711
AI Era, Agentic Era, LLM Era... Can we settle on Slop Decade?
sastraxi
The idea is interesting, but have some more respect for your potential readers and actually write the post. There’s so much AI sales drivel here it’s hard to see what’s interesting about your product. I’m more interested in the choices behind your design decisions than being told “trust me, it’ll work”.
_pdp_
Nothing changes in terms of how to make reliable software. You need the same things: unit tests, integration tests, monitoring tools, etc. Basically, AI now makes every product operate as if it has a vibrant open-source community with hundreds of contributions per day and a small core team with limited capacity.
OutOfHere
"Spec validation" is extremely underrated. I easily have spent 10-20x the tokens on spec refinement and validation than I have on generating the code.
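One way to make a spec "validated" rather than just refined is to make it executable: encode the spec's properties as checks that generated code must pass before acceptance. A minimal sketch of that idea; the `generated_sort` function and the sorting spec here are hypothetical stand-ins, not anything from the article:

```python
# Hypothetical sketch: an executable spec for an LLM-generated function.
# The properties below *are* the spec; generated code is accepted only
# if it satisfies them on randomized inputs.

import random

def generated_sort(items):
    # Stand-in for the LLM-generated implementation under validation.
    return sorted(items)

def validate_against_spec(fn, trials=100):
    """Check fn against the spec's properties on random inputs."""
    for _ in range(trials):
        xs = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
        out = fn(xs)
        # Property 1: output is ordered.
        assert all(a <= b for a, b in zip(out, out[1:])), "not ordered"
        # Property 2: output is a permutation of the input.
        assert sorted(xs) == sorted(out), "elements lost or invented"
    return True
```

The point is that tokens spent here buy checks that survive regeneration: if the model rewrites the function, the spec still decides whether the result is acceptable.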
esafak
I haven't even used TLA+ yet and now it's got derivatives... My understanding is: TLA+ but like C, functional, and typed.
dijit
> But here’s the hopeful part:

I hope this is a tongue-in-cheek jab at how AI writes prose, because Claude loves to prefix lines with this.
shanjai_raj7
The part that is hard is when the model gets updated and your prompts behave differently. We don't always catch it in tests because the output still looks correct, just slightly off. By the time you notice something is wrong, it has already been like that for a while.