Originally Posted By: CV
Wouldn't uncompressed and losslessly compressed have to be pretty much identical? I mean, there are still ways to compromise that in certain hardware configurations, but shouldn't they tend to be exactly the same, quality-wise?


Not just pretty much identical: in theory they should be bit-for-bit identical, with absolutely no measurable difference all the way down to the last zero or one. But to know that for certain we'd have to know three things: that the compression process didn't do anything other than simply compress the audio, that the decompression likewise didn't apply any adjustments, and that the playback system treated uncompressed and compressed audio the same way. I'm not 100% confident that all three of those conditions are actually met in the real world.
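The "bit-for-bit identical" claim is easy to demonstrate in principle. Here's a minimal sketch using Python's zlib as a stand-in for a lossless audio codec like FLAC or ALAC (the ramp data is purely illustrative, not real audio):

```python
import zlib

# Simulated raw PCM audio: one second's worth of 16-bit samples at
# 44.1 kHz, filled with a ramp pattern standing in for real audio.
original = bytes((i * 7) % 256 for i in range(44100 * 2))

# Losslessly compress (zlib here; FLAC/ALAC work on the same principle).
compressed = zlib.compress(original)

# Decompress and verify the round trip is bit-for-bit identical.
restored = zlib.decompress(compressed)

print(restored == original)  # True: every bit survives the round trip
```

Of course, this only proves the codec math is lossless. It says nothing about whether a given encoder, decoder, or playback chain sneaks in gain adjustment, resampling, or dithering along the way, which is exactly the caveat above.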