Can I hear a difference between MP3s and uncompressed audio?

nomemory 24 points 23 comments March 24, 2026
82mhz.net · View on Hacker News

Discussion Highlights (10 comments)

PaulHoule

I will concur with that. When I first started encoding MP3s I used a 128kbps rate, which is noticeably inferior to the original CD. I noticed this in the early 2000s when I wound up listening to a CD of some music I usually listened to as a 128kbps MP3 and was blown away by how much more I heard. I'd say that 192kbps is much better, and the 320kbps that the author advocates is basically transparent.

LazyMans

Correctly identified with 100% accuracy. The author said they can't, but for me the MP3 versions have noticeable high-frequency artifacts that make the recording sound slightly less clear. Using Sony XM5s.

hxorr

Some people simply have better hearing than others. Also, you can train yourself for what to listen for, to a point.

maxwg

Pretty great demo! It'd be great to see a 128/192 comparison. I had Tidal many years back, and between Lossless and Regular I only ever noticed a difference when it came to breathy sounds and the like. I did see that Tidal would burn through something like 50GB of data monthly, though. Also, you may want to test some more modern recordings; the microphone/mastering quality of things nowadays is far better than it was two decades ago (despite what some audiophiles may claim).

oliyoung

As the author points out, it's not really an "MP3 vs uncompressed" conversation, it's a "which encoder are you using" conversation ... because any of us from the late 90s/early 2000s who used the early versions of LAME will tell you in a second how easy it was to pick the MP3 over the raw audio, even at 320kb/s.

DiskoHexyl

It was really easy to tell which was which for the vocals. On the other hand, the only sample in which I didn't hear ANY difference was Ennio Morricone's, to the point where I couldn't really tell it apart from its 56kbit/s version. Can hearing be selectively bad for some frequencies within the standard 20–20,000 Hz range, and normal for the others?

whatever1

My recommendation is: try not to pay attention. Once you hear the difference in sound quality or see the difference in image quality, you cannot undo it. I have become very picky about display resolution and text clarity, and it has not served me well. I miss the days when I was happy with a 1080p monitor.

etempleton

I was right for all but one. High frequencies give it away. I can tell the difference, but it was certainly close enough that I am not sure I care anymore.

jrmg

I wonder how likely it is that the people who are posting that they got most of them correct are just the people who happened to randomly guess correctly with 50/50 chance each time - people who guessed wrong or thought they couldn’t tell probably aren’t going to post…
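jrmg's selection-effect point is easy to put numbers on. A minimal sketch, assuming for illustration a six-comparison test and a readership of 200 pure guessers (both numbers are assumptions, not from the article):

```python
from math import comb

def p_at_least(k: int, n: int, p: float = 0.5) -> float:
    """Probability of k or more successes in n independent Bernoulli(p) trials."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Chance that a single pure guesser gets at least 5 of 6 comparisons right.
per_listener = p_at_least(5, 6)      # 7/64 ≈ 0.109

# If 200 readers take the test and only the "winners" post their results,
# this is the expected number of lucky-guess success stories in the thread.
expected_posts = 200 * per_listener  # ≈ 21.9
print(per_listener, expected_posts)
```

So even if nobody can hear a difference, a thread this size would be expected to contain a couple of dozen confident "I got almost all of them right" comments.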

kimixa

It can often depend on the encoder - things like LAME have a hard low-pass filter even on the "insane" settings [0]. This can mean that, if you're someone who can detect those high frequencies (probably not most adults), you may pretty easily be able to tell the difference when those frequencies are present in the recording.

Additionally, a lot of audio pipelines (even beyond the DAC - amplifiers and similar) can end up with artifacts and harmonics in more audible frequencies. This is often most notable at extremely high frequencies (like 96 kHz and similar) - there's honestly nothing any human can actually hear near that range - but that doesn't mean it doesn't affect the audible range when actually played back on real equipment.

The big point is that "Being Able To Tell The Difference" isn't always the same as "Better Quality". You're often just replacing one artifact of the playback pipeline with another. Neither may truly match the original performance.

[0] https://sound.stackexchange.com/questions/38109/lame-why-is-... - while not an explicit "low-pass" filter, the default option of "-Y" does something similar.
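The low-pass effect described above is easy to demonstrate numerically: energy above the cutoff is simply gone after filtering, which is exactly what a listener with good high-frequency hearing keys on when the source has content up there. A minimal sketch in pure Python, using a crude moving-average FIR as a stand-in for an encoder's low-pass (the tone frequencies and filter width are illustrative assumptions, not LAME's actual filter):

```python
import math

SR = 44100  # sample rate (Hz)
N = 4410    # 0.1 s of samples; both probe tones land on exact DFT bins

def tone_mix(n: int) -> float:
    """A 1 kHz tone plus a quieter 18 kHz tone."""
    t = n / SR
    return math.sin(2 * math.pi * 1000 * t) + 0.5 * math.sin(2 * math.pi * 18000 * t)

signal = [tone_mix(n) for n in range(N)]

def moving_average(x: list[float], width: int) -> list[float]:
    """Crude FIR low-pass: average each sample with its predecessors."""
    return [sum(x[max(0, i - width + 1):i + 1]) / width for i in range(len(x))]

filtered = moving_average(signal, 8)

def magnitude(x: list[float], freq: float) -> float:
    """DFT magnitude of x at a single probe frequency."""
    re = sum(v * math.cos(2 * math.pi * freq * n / SR) for n, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * freq * n / SR) for n, v in enumerate(x))
    return math.hypot(re, im) / len(x)

for freq in (1000, 18000):
    before, after = magnitude(signal, freq), magnitude(filtered, freq)
    print(f"{freq:>5} Hz: {before:.3f} -> {after:.3f}")
```

Running this shows the 1 kHz component passing through nearly untouched while the 18 kHz component is attenuated by roughly an order of magnitude: the "difference" between the two versions lives almost entirely above the cutoff.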
