Even a human with very good hearing and knowledge of how a song is supposed to sound cannot tell the difference between CD-quality audio and the 256 kbps AAC that iTunes uses.
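For context, the numbers work out like this (uncompressed CD PCM vs a 256 kbps stream; just back-of-the-envelope arithmetic, nothing iTunes-specific):

```python
# CD audio is 44.1 kHz sample rate, 16 bits per sample, 2 channels.
cd_bps = 44_100 * 16 * 2        # = 1,411,200 bits per second of raw PCM
aac_bps = 256_000               # a 256 kbps AAC stream
ratio = cd_bps / aac_bps
print(f"CD: {cd_bps / 1000} kbps, AAC: 256 kbps, ratio ~{ratio:.1f}x")
# So 256 kbps AAC only has to discard about 5.5x worth of data,
# and it gets to choose the least audible parts to throw away.
```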
Don’t believe all the nonsense audiophiles keep spewing. Human ears suck. If we didn’t have giant brains to compensate, we’d be practically deaf.
This. People assume that because it’s “compressed” it must sound flatter, less dynamic, or just vaguely worse than uncompressed audio, even though lossy audio compression specifically uses psychoacoustic models to remove the parts of the signal our ears and brains can’t hear in the first place.
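To make the psychoacoustics point concrete: encoders start from curves like Terhardt’s approximation of the absolute threshold of hearing, which says how loud a tone at a given frequency must be before a human can hear it at all. This is my own toy sketch of that published formula, not any encoder’s actual code:

```python
import math

def threshold_in_quiet_db(f_hz):
    """Terhardt's approximation of the absolute threshold of hearing (dB SPL)."""
    k = f_hz / 1000.0  # frequency in kHz
    return (3.64 * k ** -0.8
            - 6.5 * math.exp(-0.6 * (k - 3.3) ** 2)
            + 1e-3 * k ** 4)

# Hearing is most sensitive around 3-4 kHz; very high frequencies
# need to be dozens of dB louder before we notice them at all.
print(f"3.3 kHz: {threshold_in_quiet_db(3300):.1f} dB SPL")
print(f"16 kHz:  {threshold_in_quiet_db(16000):.1f} dB SPL")
```

Anything that falls below curves like this (plus masking from louder nearby sounds) can be quantized coarsely or dropped entirely, which is where most of the bitrate savings come from.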
I would guess that people not all listening on some kind of standard-response reference headphones has a considerably larger impact on the human-perceivable fidelity of the reproduction than any other factor.
I don’t agree. It depends on how the song was ripped and how the original was mastered. I did a lot of A/B testing at the time and found I couldn’t tell the difference between 256 kbps VBR AAC and the CD. 128 kbps MP3 sounded worse; 320 kbps MP3 is pretty safe, though LAME improved a lot over the years, so newer rips sound better. The biggest difference is the mastering. Generally, 1980s reissues of 1970s analog masters sound worst, 1990s masters sound best, and in the 2000s everything got remastered to be loud, crushing the dynamic range. The only real innovation since is Dolby Atmos on Apple Music, which really delivers on the promise of 1970s quadraphonic.
The music on iTunes is compressed and doesn’t sound as good as a CD does.
Not to mention they can revoke your access to your music on iTunes. No one can take away your CD unless they break into your house!
iTunes got rid of DRM a decade and a half ago.
Sure, but if you don’t have the song downloaded on your PC and they remove it from your library, you can’t redownload it.
Most people aren’t backing up the songs they buy on iTunes.
Thank goodness they’ll let you redownload your CD if it gets damaged…
Expectation bias is a helluva drug.
Even FLAC is compressed, just losslessly. Which is how I procure my music, because I have the storage space.
Yup, although that doesn’t stop some weirdos out there claiming that CDs sound better than FLAC.
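To illustrate “compressed but lossless”: a lossless codec round-trips bit-exactly, so the decoded samples are identical to the CD’s. FLAC actually uses linear prediction plus Rice coding internally; here I use Python’s zlib as a stand-in lossless codec just to demonstrate the round-trip property:

```python
import math
import struct
import zlib

# One second of a 440 Hz sine tone as 16-bit little-endian PCM,
# standing in for CD audio samples.
rate = 44_100
samples = [int(20_000 * math.sin(2 * math.pi * 440 * n / rate))
           for n in range(rate)]
pcm = struct.pack(f"<{len(samples)}h", *samples)

compressed = zlib.compress(pcm, level=9)   # smaller on disk...
restored = zlib.decompress(compressed)

assert restored == pcm  # ...but bit-exact after decoding: nothing audible (or inaudible) was lost
print(f"{len(pcm)} bytes raw -> {len(compressed)} bytes compressed")
```

Anyone claiming a CD sounds better than a properly ripped FLAC is claiming two identical streams of samples sound different.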