Project Nomad – Knowledge That Never Goes Offline
jensgk
402 points
132 comments
March 22, 2026
Related Discussions
- Show HN: Atomic – Self-hosted, semantically-connected personal knowledge base kenforthewin · 78 pts · March 21, 2026 · 48% similar
- Offline 23 Hours a Day tinkelenberg · 21 pts · March 03, 2026 · 47% similar
- Ceno, browse the web without internet access mohsen1 · 118 pts · March 13, 2026 · 46% similar
- Show HN: Construct Computer – Agentic Cloud OS for Daily Work ankushKun · 19 pts · March 03, 2026 · 46% similar
- Show HN: s@: decentralized social networking over static sites remywang · 164 pts · March 12, 2026 · 44% similar
Discussion Highlights (20 comments)
tsss
I was expecting the game from my childhood and was disappointed.
myself248
See also: https://internet-in-a-box.org/ https://wrolpi.org/
WillAdams
Missing a chance to note (or configure for?) installation on a Raspberry Pi; that'd make an affordable option to leave powered down, but ready to go in an EMI shield / Faraday cage.
JanisIO
Anyone thought about using a Steam Deck with this? Or explored the concept of a "Nomad Deck"?
moffers
Really clever targeting of a niche. I’d be interested to hear if they find success!
shevy-java
So how does that work?
bpavuk
Turns out I have the same setup (sans local LLMs; they are pretty useless on 2018 cards), but in Obsidian :) Whatever I think might be useful later, I capture through the web clipper extension. [0]

[0]: https://obsidian.md/clipper
Yokohiii
I like the idea of an LLM that acts as a public knowledge base. But that doomsday framing on the site is pretty annoying.
mohamedkoubaa
Great premise for a science fiction story
iandanforth
I like this idea! I don't need the LLM bits, and want it to run on an old Android tablet I have lying around. Can anyone recommend similar software where I can get wikipedia / street maps / useful tutorial videos nicely packaged for offline use?
adsharma
So this thing is based on Kiwix, which is based on the ZIM file format. Meanwhile, Wikipedia ships Wikidata, which uses RDF dumps (probably 8x less compressed than it could be): https://www.wikidata.org/wiki/Wikidata:Database_download There is room for a third option leveraging commercial columnar-database research: https://adsharma.github.io/duckdb-wikidata-compression/
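adsharma's columnar point can be illustrated with a toy comparison: the same synthetic, Wikidata-shaped triples (hypothetical identifiers, not real dump data) compress better when each column is stored contiguously, which is roughly the property columnar engines like DuckDB exploit with dictionary and run-length encoding. A minimal sketch using only the standard library:

```python
import zlib

# Hypothetical Wikidata-style triples: 1000 "instance of human" statements
# and 1000 birth-year statements. Illustrative shape only, not real data.
triples = [(f"wd:Q{i}", "wdt:P31", "wd:Q5") for i in range(1000)] + [
    (f"wd:Q{i}", "wdt:P569", f'"19{i % 100:02d}"') for i in range(1000)
]

# Row-oriented layout: serialize as N-Triples-like lines, one triple per line.
row_blob = "\n".join(" ".join(t) for t in triples).encode()

# Column-oriented layout: store each column contiguously, so identical
# predicates and objects form long runs the compressor can exploit.
cols = list(zip(*triples))
col_blob = b"\x00".join("\n".join(c).encode() for c in cols)

row_size = len(zlib.compress(row_blob, 9))
col_size = len(zlib.compress(col_blob, 9))
print(f"row-oriented: {row_size} bytes, column-oriented: {col_size} bytes")
```

Even with a general-purpose compressor standing in for real columnar encodings, the column layout comes out smaller because the repetitive predicate and object values are no longer interleaved with high-entropy subject IDs.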
Lapra
In a world where this is useful, you aren't going to be spending your precious battery on running an LLM...
ZeroCool2u
See, I really want this in a simpler format: a single-file embedded database on my filesystem that I can point one or a few tools at, for my model to use when it needs.
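The single-file approach ZeroCool2u describes can be sketched with Python's built-in sqlite3 module (the table, column names, and sample rows are hypothetical; a real bundle would use an on-disk path instead of ":memory:"):

```python
import sqlite3

# One portable SQLite file on disk (":memory:" here for demonstration) that
# any SQLite-aware tool, or an LLM tool-call, can point at directly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (title TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO articles VALUES (?, ?)",
    [
        ("Water purification", "Boil water for one minute to kill pathogens."),
        ("Solar power basics", "Panels output DC; an inverter produces AC."),
    ],
)
conn.commit()

# A tool needs only one query primitive to serve a model's lookup.
rows = conn.execute(
    "SELECT title FROM articles WHERE body LIKE ?", ("%water%",)
).fetchall()
print(rows)  # → [('Water purification',)]
```

SQLite's FTS5 extension would give proper full-text ranking over the same single file, where the build includes it.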
leowoo91
It could apply some of its own wisdom and not use Node.js...
itintheory
Why does it have to have AI? Ugh.
cstaszak
I'm a fan of "civilization in a box" kinds of projects. However, the ZIM file format leaves a lot to be desired in 2026. I've been exploring a refreshed, alternative approach: https://github.com/stazelabs/oza I do think having an LLM as an optional "sidecar" is a useful approach. If you can run a meaningful Ollama instance alongside your content, great!
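The optional-sidecar idea cstaszak mentions can be sketched against Ollama's local HTTP API; the /api/generate endpoint and its model/prompt/stream fields are Ollama's, while the model name and prompt here are placeholders:

```python
import json
import urllib.request

def build_generate_request(prompt: str, model: str = "llama3.2"):
    """Build a request for a local Ollama sidecar's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("Summarize the offline article on water purification.")
# urllib.request.urlopen(req) would return the completion when Ollama is
# running; it is skipped here so the sketch works without a live sidecar.
print(req.full_url)
```

Keeping the LLM behind a plain HTTP boundary like this is what makes it genuinely optional: the knowledge base works unchanged whether or not the sidecar is up.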
amarant
> Knowledge That Never Goes Offline
> What is Project N.O.M.A.D.? Node for Offline Media, Archives, and Data

That's the first header, and the first sentence of the first paragraph, and I'm confused.
nelsonic
For anyone wanting the video explanation from the creator, watch: https://youtu.be/P_wt-2P-WBk
balkanist
This is really cool. Having offline Wikipedia + local LLMs in a single bundle is a great combo for emergency preparedness. Do you have any benchmarks on how it performs on lower-end hardware? Curious about minimum specs.
Aargau
Closing on 40 acres in Panama for an eco-resort. I was planning to build my own offline repository, but will check out this repo.