The Problem That Built an Industry
ShaggyHotDog
119 points
41 comments
April 11, 2026
Related Discussions
Found 5 related stories in 69.6ms across 4,259 title embeddings via pgvector HNSW
- The back story behind the first "$1.8B" dollar "AI Company" chermanowicz · 50 pts · April 06, 2026 · 49% similar
- What keeps IoT devices running for a decade mooracle · 18 pts · March 02, 2026 · 48% similar
- Five layers from writing code to writing companies joostrothweiler · 11 pts · March 13, 2026 · 48% similar
- Why Over-Engineering Happens zuhayeer · 34 pts · April 05, 2026 · 47% similar
- Fire the CEO, Introducing the AxO's boringops-dan · 78 pts · March 03, 2026 · 47% similar
Discussion Highlights (9 comments)
paulnpace
> It...handles 50,000 transactions per second with sub-100ms latency on hardware that costs a fraction of an equivalent cloud footprint. It has been doing this for 60 years. Eat that, Bitcoin.
outside1234
It is interesting to think how AI will potentially change the dynamics back to this from general-purpose software. In a world where implementation is free, will we see a return to built-for-purpose systems like this, where we define the inputs and outputs desired and AI builds it from the ground up, completely for purpose?
arrsingh
Interesting to note right at the start of the article that they sat on a plane next to each other in 1953, but the formal partnership between AA and IBM wasn't until 1959, six years later! The article makes it look like all this happened magically fast, but in reality it's a reminder that things take time!

> is almost mythological. In 1953, C.R. Smith, president of American Airlines, was seated next to R. Blair Smith, an IBM salesman, on a cross-country flight. By the time they landed, the outline of a solution had been sketched. IBM and American Airlines entered a formal development partnership in 1959.

edit: oh, and then the actual system didn't go live until another five years later, in 1964. Over a decade after the two of them sat next to each other. A reminder to myself for when my potential customers don't sign the deal 5 minutes after my pitch!
StilesCrisis
"The key insight is [...]. No daemons. No background threads. No connection state persisted in memory between transactions." Closed the tab.
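The model that quote describes can be sketched in a few lines. This is my own illustration, not TPF code: the names (`handle_transaction`, the `pnr` field) are hypothetical, and the point is only the shape of the share-nothing, run-to-completion style, where every transaction is handled start to finish and nothing survives in process memory afterward.

```python
# Minimal sketch (illustrative, not actual TPF code) of a share-nothing,
# run-to-completion transaction handler: all inputs arrive with the
# request, the handler runs to completion, and no connection state or
# background work persists between transactions.
def handle_transaction(request: dict) -> dict:
    # Everything the handler needs comes in with the request (or would be
    # read from durable storage); nothing is cached in process memory.
    record = {"pnr": request["pnr"], "status": "CONFIRMED"}
    return record  # once returned, the handler's state is gone

print(handle_transaction({"pnr": "ABC123"}))
```

The design choice being praised (or dismissed, depending on the commenter) is exactly this: no daemons or resident threads means nothing to leak, nothing to restart, and recovery is just "run the next transaction."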
cr125rider
Can you add RSS to your site? I’d love to follow but can’t.
neilv
ITA Software integrated with the mainframe network and was acquired by Google. An exec said publicly that they couldn't have done it if they hadn't used Lisp. (Today, the programming-language landscape is somewhat more powerful. Rust got some metaprogramming features informed by Lisps, for example, and the team might've been able to slog through with that.)
zer00eyz
SABRE is a reminder that things that are well designed just work. How many banks, ERPs, and accounting systems are still running COBOL? (A lot.)

Think about modern web infrastructure and how we deploy:

cpu -> hypervisor -> vm -> container -> runtime -> library code -> your code

Do we really need to stack all these turtles (abstractions) just to get instructions to a CPU? Every one of those layers has offshoots to other abstractions, tools, and functionality that only add to the complexity and convolution.

Languages like Rust and Go compiling down to an executable are a step; revisiting how we deploy (the container layer) is probably on the table next. The use case for "serverless" (and edge compute) is there, but the costs are still backwards because the software hasn't caught up yet.
andai
> TPF is not modern. It would fail every architectural review a contemporary engineering team would apply to it. It also handles 50,000 transactions per second with sub-100ms latency on hardware that costs a fraction of an equivalent cloud footprint. It has been doing this for 60 years.

What kind of review would it fail? Sounds like it's pretty well designed to me.
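As a back-of-envelope check on the quoted figures, Little's law (L = λ × W) relates throughput and latency to how much work is in flight at once. The numbers below come straight from the quote; treating 100 ms as the average residence time is my simplifying assumption, since the article only gives it as an upper bound.

```python
# Little's law sanity check on the quoted TPF figures:
# in-flight work L = throughput (lambda) * latency (W).
throughput = 50_000   # transactions per second, from the quote
latency_s = 0.100     # 100 ms, treated here as the residence time

in_flight = throughput * latency_s
print(f"~{in_flight:.0f} transactions in flight at any instant")  # → ~5000
```

That is roughly 5,000 concurrent transactions sustained continuously, which gives a sense of why the system's simplicity (no per-connection state to juggle across that concurrency) matters so much.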
conductr
Interesting they went from 90 minute manual booking time to microseconds. I’m unsure of what the landscape was really like before Unix and such, maybe this was just how software was and everything was a bespoke ordeal. But, it makes me wonder if something less fast could have been “better”. As in, faster to build or easier to maintain/improve, something where we still weren’t talking about wrestling with legacy software that runs the world, those kinds of things. Even 1 second transaction speed sounds slow today but if it’s replacing a 90 minute manual process I’d rather have that solution now than a microsecond fast solution that takes 5-10 years.