Rob Pike’s Rules of Programming (1989)
vismit2000
901 points
423 comments
March 18, 2026
Related Discussions
Found 5 related stories in 50.9ms across 3,471 title embeddings via pgvector HNSW
- The unwritten laws of software engineering AntonZ234 · 16 pts · March 17, 2026 · 50% similar
- Can Programming Be Liberated from the von Neumann Style? (1977) [pdf] tosh · 21 pts · March 22, 2026 · 49% similar
- How to Write Unmaintainable Code (1999) downbad_ · 35 pts · April 03, 2026 · 48% similar
- Methods in Languages for Systems Programming (2023) surprisetalk · 17 pts · March 16, 2026 · 48% similar
- The Two Worlds of Programming HotGarbage · 13 pts · March 19, 2026 · 46% similar
Discussion Highlights (20 comments)
embedding-shape
"Epigrams in Programming" by Alan J. Perlis has a lot more, if you like short snippets of wisdom :) https://www.cs.yale.edu/homes/perlis-alan/quotes.html

> Rule 5. Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.

I've always preferred Perlis' version. It might be slightly over-used in functional programming to justify all kinds of hijinks, but with some nuance it works out really well in practice:

> 9. It is better to have 100 functions operate on one data structure than 10 functions on 10 data structures.
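A minimal Go sketch of Perlis' epigram, under the assumption of an illustrative word-count type (the `Counts` type and function names are invented for this example): many small free functions sharing one plain data structure, rather than one method per bespoke type.

```go
package main

import (
	"fmt"
	"sort"
)

// One plain data structure...
type Counts map[string]int

// ...and many small functions that all operate on it.
func Add(c Counts, word string) { c[word]++ }

func Total(c Counts) int {
	n := 0
	for _, v := range c {
		n += v
	}
	return n
}

// Top returns the most frequent word, breaking ties
// alphabetically so the result is deterministic.
func Top(c Counts) string {
	keys := make([]string, 0, len(c))
	for k := range c {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	best, bestN := "", -1
	for _, k := range keys {
		if c[k] > bestN {
			best, bestN = k, c[k]
		}
	}
	return best
}

func main() {
	c := Counts{}
	for _, w := range []string{"data", "dominates", "data"} {
		Add(c, w)
	}
	fmt.Println(Total(c), Top(c)) // 3 data
}
```

Each new operation is just another function over the same `map[string]int`; nothing needs to change in the data structure to grow the vocabulary of operations.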
kleiba
I believe the "premature evil" quote is by Knuth, not Hoare?!
bsenftner
Obvious. Why the elevation of the obvious?
heresie-dabord
See Tony Hoare: https://news.ycombinator.com/item?id=47325225
tobwen
Added to AGENTS.md :)
CharlieDigital
I feel like 1 and 2 are only applicable in cases of novelty. The thing is, if you build enough of the same kinds of systems in the same kinds of domains, you can kinda tell where you should optimize ahead of time. Most of us tend to build the same kinds of systems and usually spend a career or a good chunk of our careers in a given domain. I feel like you can't really be considered a staff/principal if you can't already tell ahead of time where the perf bottleneck will be just on experience and intuition.
Mercuriusdream
I never expected it to be a single HTML file, so I was kind of surprised, but it's straight to the point, to be honest.
anthk
9front is distilled Unix. I fixed Russ Cox's 'xword' to work on 9front, and I'm just a newbie. No LLMs; that's Idiocracy, like the movie. Just '9intro.us.pdf' and the man pages. LLM output will never be reproducible, by design.
piranha
> Rule 5 is often shortened to "write stupid code that uses smart objects". This is probably the worst use of the word "shortened" ever, and it should be more like "mutilated"?
keyle
Rule 5 is definitely king. Code acts on data, if the data is crap, you're already lost. edit: s/data/data structure/
DaleBiagio
The attribution to Hoare is a common error — "Premature optimization is the root of all evil" first appeared in Knuth's 1974 paper "Structured Programming with go to Statements." Knuth later attributed it to Hoare, but Hoare said he had no recollection of it and suggested it might have been Dijkstra. Rule 5 aged the best. "Data dominates" is the lesson every senior engineer eventually learns the hard way.
elcapitan
Meta: Love the simplicity of the page, no bullshit. Funny handwritten html artifact though: <title> <h1>Rob Pike's 5 Rules of Programming</h1> </title>
doe88
Great rules, but Rule 3.: WOW, so true, so well enunciated, masterful.
Devasta
> "Premature optimization is the root of all evil."

This axiom has caused far and away more damage to software development than premature optimization ever will.
nateb2022
Previous discussion: https://news.ycombinator.com/item?id=15776124 (8 years ago, 18 comments)
jcmartinezdev
Rule 6: Never disagree with AI slop
igtztorrero
Rule 4 is one I have always practiced, and demanded of junior programmers: make algorithms and structures that are simple to understand for our main user, the one who will modify this code in the future. I believe that's why Golang is a very simple but powerful language.
anymouse123456
There are very few phrases in all of history that have done more damage to the project of software development than: "Premature optimization is the root of all evil."

First, let's not besmirch the good name of Tony Hoare. The quote is from Donald Knuth, and the missing context is essential. From his 1974 paper, "Structured Programming with go to Statements":

> "Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%."

He was talking about the use of go to statements. He was talking about making software much harder to reason about in the name of micro-optimizations. He assumed (incorrectly) that we would respect the machines our software runs on.

Multiple generations of programmers have now been raised to believe that brutally inefficient, bloated, and slow software is just fine. There is no limit to the amount of boilerplate and indirection a computer can be forced to execute. There is no ceiling to the crystalline abstractions emerging from these geniuses. There is no amount of time too long for a JVM to spend starting.

I worked at Google many years ago. I have lived the absolute nightmares that evolve from the willful misunderstanding of this quote. No thank you. Never again. I have committed these sins more than any other, and I'm mad as hell about it.
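Knuth's "critical 3%" is found by measuring, not guessing, which is also Pike's Rule 2. A minimal sketch of that habit using Go's standard library (`testing.Benchmark` works outside of `go test`; the two string-building functions are illustrative stand-ins for "the code you suspect is slow"):

```go
package main

import (
	"fmt"
	"strings"
	"testing"
)

// Two ways to build the same string: repeated
// concatenation versus strings.Builder.
func concat(n int) string {
	s := ""
	for i := 0; i < n; i++ {
		s += "x"
	}
	return s
}

func builder(n int) string {
	var b strings.Builder
	for i := 0; i < n; i++ {
		b.WriteByte('x')
	}
	return b.String()
}

func main() {
	for name, fn := range map[string]func(int) string{
		"concat":  concat,
		"builder": builder,
	} {
		fn := fn
		r := testing.Benchmark(func(b *testing.B) {
			for i := 0; i < b.N; i++ {
				fn(1000)
			}
		})
		fmt.Printf("%-8s %d ns/op\n", name, r.NsPerOp())
	}
}
```

The numbers, not intuition, decide whether this loop is in the 3% worth tuning; if it measures as noncritical, leave the simple version alone.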
tasuki
The first four are kind of related. For me the fifth is the important – and oft overlooked – one: > Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.
ta20211004_1
Can't agree more on 5. I've repeatedly found that any really tricky programming problem is (eventually) solved by iterative refinement of the data structures (and the APIs they expose / are associated with). When you get it right, the control flow of the program becomes straightforward to reason about.

To address our favorite topic: while I use LLMs to assist on coding tasks a lot, I think they're very weak at this. Claude is much more likely to suggest or expand complex control flow logic on small data types than it is to recognize and implement an opportunity to encapsulate ideas in composable chunks.

And I don't buy the idea that this doesn't matter since most code will be produced and consumed by LLMs. The LLMs of today are much more effective on code bases that have already been thoughtfully designed. So are humans. Why would that change?
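The "encapsulate it in the data" point can be sketched in Go with a classic example (the table contents are standard Roman numeral facts; the function name is illustrative): once the knowledge lives in one table, the control flow that remains is trivial, which is exactly what Rule 5 predicts.

```go
package main

import "fmt"

// Instead of a ladder of if/else special cases, put the
// knowledge in a data structure: Roman numeral conversion
// driven entirely by one ordered table.
var table = []struct {
	value  int
	symbol string
}{
	{1000, "M"}, {900, "CM"}, {500, "D"}, {400, "CD"},
	{100, "C"}, {90, "XC"}, {50, "L"}, {40, "XL"},
	{10, "X"}, {9, "IX"}, {5, "V"}, {4, "IV"}, {1, "I"},
}

// roman is self-evident once the table is right: walk the
// entries in order, emitting each symbol while it still fits.
func roman(n int) string {
	out := ""
	for _, e := range table {
		for n >= e.value {
			out += e.symbol
			n -= e.value
		}
	}
	return out
}

func main() {
	fmt.Println(roman(1989)) // MCMLXXXIX
}
```

Refining the table (adding the subtractive pairs like CM and IV) is what made the algorithm simple; the alternative is to push that knowledge into branches, which is the complex-control-flow-on-simple-data pattern the comment describes.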