Show HN: MacMind – A transformer neural network in HyperCard on a 1989 Macintosh

hammer32 135 points 35 comments April 16, 2026
github.com

I trained a transformer in HyperCard: 1,216 parameters, on a 1989 Macintosh. And yes, it took a while.

MacMind is a complete transformer neural network: embeddings, positional encoding, self-attention, backpropagation, and gradient descent, all implemented entirely in HyperTalk, the scripting language Apple shipped with HyperCard in 1987. Every line of code is readable inside HyperCard's script editor. Option-click any button and read the actual math.

The task: learn the bit-reversal permutation, the opening step of the Fast Fourier Transform. The model has no formula to follow; it discovers the positional pattern purely through attention and repeated trial and error. By training step 193 it was oscillating between 50%, 75%, and 100% accuracy on successive steps, settling into convergence like a ball rolling into a bowl.

The whole "intelligence" is 1,216 numbers stored in hidden fields in a HyperCard stack. Save the file, quit, reopen: the trained model is still there, still correct. It runs on anything from System 7 through Mac OS 9.

As a former physics student, I count the FFT as an old friend: it sits at the heart of signal processing, quantum mechanics, and wave analysis. I built this because we're at a moment where AI affects all of us but most of us don't understand what it actually does. Backpropagation and attention are math, not magic. And math doesn't care whether it's running on a TPU cluster or a 68030 from 1989.

The repo has a pre-trained stack (step 1,000), a blank stack you can train yourself, and a Python/NumPy reference implementation that validates the math.
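For readers unfamiliar with the target task: the bit-reversal permutation maps each index to the index whose bits are in reverse order, which is how a radix-2 FFT reorders its input. A minimal Python sketch (independent of the HyperTalk implementation; the 3-bit / 8-element case here is just an illustration, not necessarily the size MacMind trains on):

```python
def bit_reverse(i, bits):
    """Reverse the lowest `bits` bits of index i.
    E.g. with bits=3: 1 (001) -> 4 (100), 3 (011) -> 6 (110)."""
    out = 0
    for _ in range(bits):
        out = (out << 1) | (i & 1)  # shift the low bit of i into out
        i >>= 1
    return out

# The permutation that reorders an 8-element input before a radix-2 FFT:
perm = [bit_reverse(i, 3) for i in range(8)]
# perm == [0, 4, 2, 6, 1, 5, 3, 7]
```

This is the closed-form rule; the point of the project is that the model is never shown it, and has to recover the same input-to-output mapping from examples alone.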

Discussion Highlights (9 comments)

gcanyon

It's strange to think how modern concepts are only modern because no one thought of them back then. This feels (to me) like germ theory being transferred back to the ancient Greeks.

DetroitThrow

This is very cool. Any more demos of inference output?

hyperhello

Hello, if there are no XCMDs it should work adequately in HyperCard Simulator. I am only on my phone but I took a minute to import it. https://hcsimulator.com/imports/MacMind---Trained-69E0132C

edwin

There’s something quietly impressive about getting modern AI ideas to run on old hardware (like OP's project or running LLM inference on Windows 3.1 machines). It’s easy to think all the progress is just bigger GPUs and more compute, but moments like that remind you how much of it is just more clever math and algorithms squeezing signal out of limited resources. Feels closer to the spirit of early computing than the current “throw hardware at it” narrative.

immanuwell

The architecture of macmind looks pretty interesting

tty456

Where's the code for the actual HyperCard and building of the .img? I only see the python validator in the repo.

rcarmo

Neat. Looks like I found my new benchmark for my ARM64 JIT for BasiliskII :) (still debugging it, but getting closer to full coverage)

watersb

This is great! I first studied back-propagation in 1988, at the same time I fell in love with HyperCard programming. This project helps me recall this elegant weapon for a more civilized age.

nxobject

I love this. From reading the nuts-and-bolts "parameters" (haha) of your implementation, I get the impression that the fundamental limit is, well, using a 32-bit platform to address the sizes of data that usually need at least 48 bits!
