Big data on the cheapest MacBook

bcye · 333 points · 266 comments · March 12, 2026
duckdb.org · View on Hacker News

Discussion Highlights (20 comments)

TutleCpt

Oh great, the term "big data" is back.

hermanzegerman

That's an awesome way to get a bricked MacBook Neo really fast, because those idiots soldered the SSD in

BoredPositron

Cue the endless blog posts about running tech on the potato MacBook and being stunned it’s functional with massive trade-offs. Groundbreaking stuff.

opentokix

Mind blown: if you need to handle "big" data on the move, the MacBook Neo is not the right choice. Who would have guessed that outcome?

montroser

This is as much an indictment of AWS compute as it is anything else.

ody4242

I would have benchmarked with an instance that has local NVMe, like c8gd.4xlarge.

zipping1549

> TL;DR: How does the latest entry-level MacBook perform on database workloads? We benchmarked it to find out.

That's not a TL;DR, that's just a subheader.

tosh

For the TPC-DS results it would also have been nice to show how the MacBook Neo compares to the AWS instances. Or am I missing something?

Robdel12

I’ve been tempted to buy one and do “real dev work” on it just to show people it’s not this handicapped little machine. I built multiple iOS apps and went through two startup acquisitions with my M1 MBA as my primary computer, as a developer. And the Neo is better than the M1 MBA. I edited my 30-45 minute 4K race videos in FCP on that Air just fine.

ramgale

Seems completely unnecessary; there is probably zero overlap between people who buy a cheap MacBook and people running DuckDB locally.

TacticalCoder

I'm interested in one (not for big data), but only 8 GB of RAM is kinda really sad. My good old LG Gram (from 2017? 2015? I don't even remember) already had 24 GB of RAM. That was 10 years ago. A decade later I cannot see myself buying a laptop with a third of the memory.

refactor_master

I think it’s relevant to first read [1] to see why they’re doing this. It’s basically done as a meme. [1] https://motherduck.com/blog/big-data-is-dead/

clamlady

As a broke ecologist, this little computer can do everything I need in R and Word and is a phenomenal build for the price. I'm really enjoying it thus far.

varispeed

If you can fit it on a thumb drive, it's not Big Data.

nicoritschel

> compared to 3–5 GB/s

Their numbers are a bit outdated. M5 MacBook Pro SSDs are literally 5x this speed. It's wild.

onlyrealcuzzo

This is awesome. I wish more companies would do showcases like this of what kind of load you can expect from commodity-ish hardware.

tasuki

That's not Big Data. If you "need to process Big Data on the move", what you need is a network.

__mharrison__

When I teach, I use "big data" for data that won't fit on a single machine. "Small data" fits on a single machine in memory, and medium data fits on disk. Having said that, DuckDB is awesome. I recently ported a 20-year-old Python app to modern Python and made the backend swappable: Polars or DuckDB. Got a 40-80x speed improvement. Took 2 days.
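A minimal sketch of what a swappable backend like that might look like; the query, function names, and Parquet input are illustrative assumptions, not details from the original app:

    # Hypothetical swappable query backend: the same aggregation written
    # once per engine, selected by name at call time.
    import duckdb
    import polars as pl

    def top_customers_duckdb(path: str):
        # DuckDB reads Parquet directly and runs the aggregation in SQL.
        con = duckdb.connect()
        return con.execute(
            f"""
            SELECT customer_id, SUM(amount) AS total
            FROM read_parquet('{path}')
            GROUP BY customer_id
            ORDER BY total DESC
            LIMIT 10
            """
        ).fetchall()

    def top_customers_polars(path: str):
        # Polars expresses the same query as a lazy dataframe pipeline.
        return (
            pl.scan_parquet(path)
            .group_by("customer_id")
            .agg(pl.col("amount").sum().alias("total"))
            .sort("total", descending=True)
            .head(10)
            .collect()
        )

    BACKENDS = {"duckdb": top_customers_duckdb, "polars": top_customers_polars}

    def top_customers(path: str, backend: str = "duckdb"):
        return BACKENDS[backend](path)

Keeping both engines behind one function like this is what makes the comparison (and the claimed speedup measurement) cheap to run.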

aaronharnly

That c8g.metal-48xl instance costs $7.63008 per hour on demand[1], so for the price of the laptop, you could run queries on it for about ~90 hours. :shrug: as to whether that makes the laptop or the giant instance the better place to do one's work… [1] https://aws.amazon.com/ec2/pricing/on-demand/
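The back-of-the-envelope arithmetic behind that figure, with the laptop price assumed (inferred from the ~90-hour claim, not an official number):

    # Hours of c8g.metal-48xl on-demand time the laptop's price would buy.
    hourly_rate = 7.63008   # USD per hour, c8g.metal-48xl on demand
    laptop_price = 690.0    # USD, assumed for illustration only

    hours = laptop_price / hourly_rate
    print(f"{hours:.0f} hours")  # ~90 hours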

alex_creates

Funny, just yesterday I almost bought one but got cold feet and opted for a low-range MacBook with an M5 chip. The Apple sales rep was not convinced it would be enough when I described using it for vibecoding and deploying, so they kind of talked me out of getting the Neo. I normally use a mix of LLMs, then connect to GitHub and do a one-click deploy on CreateOS. Do you think I overreacted? The price of the Neo is SO attractive, a clean half price compared to what I got.
