Posted By: tamzarian HDCD - Explanation Pls. - 06/29/07 07:55 PM
I have read and read about the HDCD process and just can't quite get it.

A CD is not recorded at 24 bits; a CD is 16 bits. When a CD says it was taken from a 24-bit digital master, that 24-bit master was reduced to 16 bits, with dithering applied as part of the bit-depth reduction.
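
For illustration only, here is a rough Python sketch of how one 24-bit sample might be dithered down to 16 bits with TPDF (triangular) dither. The function name and numbers are my own, not anything from a real mastering tool:

import random

def dither_24_to_16(sample_24):
    # sample_24: a signed 24-bit integer (-2**23 .. 2**23 - 1).
    # Dropping 8 bits means dividing the scale by 256.
    # TPDF dither: the sum of two uniform random values spanning
    # about +/- 1 LSB of the 16-bit target, added before rounding
    # so the quantization error is decorrelated from the music.
    dither = random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5)
    sample_16 = int(round(sample_24 / 256.0 + dither))
    # Clamp to the legal 16-bit range.
    return max(-32768, min(32767, sample_16))

print(dither_24_to_16(1234567))   # something near 4823, varying by a count or so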

In an HDCD recording, 4 more bits are encoded into the regular 16-bit stream as what looks like low-level random noise. The HDCD decoding chip recognizes that this "noise" is more than just random noise, decodes the extra 4 bits, and adds them to the analog end product.
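
The actual HDCD encoding is proprietary (developed by Pacific Microsonics and never fully published), so the following Python sketch is only a toy illustration of the general idea of hiding a low-level data channel in the least significant bits of a 16-bit stream; the function names are made up:

def embed_lsb(samples_16, hidden_bits):
    # Overwrite the least significant bit of each 16-bit sample with
    # one hidden bit. To an ordinary player this just looks like a
    # tiny amount of added noise; a decoder that knows the scheme
    # can read the hidden stream back out.
    return [(s & ~1) | b for s, b in zip(samples_16, hidden_bits)]

def extract_lsb(samples_16):
    # Recover the hidden bit stream from the LSBs.
    return [s & 1 for s in samples_16]

samples = [1000, -2000, 3001, -4000]
hidden = [1, 0, 1, 1]
print(extract_lsb(embed_lsb(samples, hidden)))   # [1, 0, 1, 1]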

This seems to indicate that there are effectively 20 bits to be played back. So why is a 24-bit DAC required, rather than a 20-bit DAC?

My guess is that perhaps a separate 16-bit DAC and an 8-bit DAC are used because true 24-bit DACs are too expensive to manufacture, and that such a two-DAC arrangement can only actually provide 20 bits.

Anyone got the real answer, or have I misunderstood the whole process?
Posted By: Mojo Re: HDCD - Explanation Pls. - 06/30/07 04:56 AM
I understand HDCD to be at least a 20-bit signal encoded into 16 bits. A control signal in one of the bits (probably the least significant bit) is used by an HDCD decoder to extract the information contained in the extra 4+ bits. So the signal that goes to the DAC is still 16 bits, but if a 16-bit converter were used, linearity would only be achieved down to the 14th bit at most. That is why a 20-bit or higher DAC is used.
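
One way to picture it: once the decoder recovers the extra low-level bits, the result needs more than 16 bits of resolution, and a 16-bit converter would simply drop that detail again. Here is a toy Python sketch of that point, with made-up numbers rather than the real HDCD decode math:

def quantize_for_dac(sample_20, dac_bits):
    # A DAC narrower than the signal can't resolve the extra
    # low-order bits; model that by masking them off.
    shift = max(0, 20 - dac_bits)
    return (sample_20 >> shift) << shift

decoded = 363                          # a quiet decoded sample on a 20-bit scale
print(quantize_for_dac(decoded, 16))   # 352 -- the recovered low-level detail is lost
print(quantize_for_dac(decoded, 20))   # 363 -- a 20-bit (or wider) DAC keeps it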
Posted By: tamzarian Re: HDCD - Explanation Pls. - 06/30/07 04:10 PM
Mojo:

Thanks. Some more research indicates that you are correct, and the 24-bit DAC is in fact a real 24-bit DAC (not a 16 + 8). The extra 8 bits come from the encoding and interpolation process.
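
For what it's worth, getting the roughly 20 decoded bits into a true 24-bit DAC word is conceptually just a matter of scaling them up to the wider range, with the decoder's filtering/interpolation free to fill in the low-order bits. A trivial Python illustration of my own, not the actual chip's behaviour:

def to_24_bit_word(sample_20):
    # Shift a 20-bit value up by 4 bits so it occupies the top of a
    # 24-bit DAC word; interpolation/filtering in the decoder can
    # then populate the low-order bits with something better than zeros.
    return sample_20 << 4

print(to_24_bit_word(363))   # 5808: the same value on the 24-bit scale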