Greg Lee posted on a thread I had about going 4K, OLED, or 1080p. He mentioned a new technology they are putting in front of us now called HDR LED TVs, which are moving from the current 8-bit colour depth to 10-bit, which in theory will give a more accurate colour picture on our TVs.

To understand the technology, you first have to consider the history. Back in the early days of home computing, displays were CRTs, which in effect can show an infinite number of colours, but computers are digital and driven by 1s and 0s. So we broke colour down into a number of bits. 1-bit colour gives you either 'on' or 'off' for every pixel, so in effect it is black and white. To get grey, you used a dithering technique that fools your eyes: a black pixel beside a white pixel blends into what looks like grey. As memory was expensive, the cost of carrying extra bits of data to represent colour was prohibitive.
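To make the trick concrete, here is a toy sketch of ordered dithering (purely illustrative; the threshold matrix and image values below are made up, though real drivers used similar Bayer matrices):

```python
# Ordered (Bayer) dithering: approximate a grayscale gradient using only
# black (0) and white (255), the way 1-bit displays had to.
# Toy illustration; the matrix and image values here are made up.

BAYER_2X2 = [[0, 2],
             [3, 1]]  # threshold pattern, scaled to the 0..255 range below

def dither_1bit(gray_rows):
    """Map each grayscale pixel (0..255) to 0 or 255 via a 2x2 Bayer matrix."""
    out = []
    for y, row in enumerate(gray_rows):
        out_row = []
        for x, value in enumerate(row):
            threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) * 255 / 4
            out_row.append(255 if value > threshold else 0)
        out.append(out_row)
    return out

# A small dark-to-light horizontal gradient, repeated over four rows:
result = dither_1bit([[32, 96, 160, 224]] * 4)
for row in result:
    print(row)  # each row is a mix of 0s and 255s
```

At a normal viewing distance the eye averages each neighbourhood, so columns with more white pixels read as lighter greys.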

Now if you jump to 2-bit colour, you are still in black and white but with two additional grey intensities. This makes the image appear less grainy, but it still lacks colour.

When you jump to 4-bit colour, you now have enough memory space to get actual colour: 16 distinct colours, to be exact. As our eyes break down what we see with three types of colour cones, those 16 colours are split into 4 shades each of red, green and blue, plus black, white and two greys. With dithering, the strategic placement of those colours can make your eyes perceive many more.

But as technology moved on and computers got more memory, we moved to 8-bit colour (256 distinct colours), which made everything look more realistic. Then 16-bit (65,536 distinct colours) and finally 24-bit (16,777,216 colour variations). As more colours are added to the possible image palette, the amount of dithering required is reduced, as you can use a closer and closer approximation to the real thing. At first glance the images are very close to being the same, but if you look much closer, you can see that they are all using some form of dithering to achieve the final image.

Now in TV terms we took on a new naming convention: we started to use the number of bits for a single colour channel. So a computer's 24-bit colour = 8-bit on a TV.
The newer HEVC colour standards are now moving to 10-bit colour on the TV. This improves the number of colours available to an impressive 1 billion+.
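As a quick back-of-envelope check of those figures (a hypothetical snippet, just arithmetic):

```python
# Colours available at each bit depth. Note the naming shift described
# above: a computer's "24-bit colour" counts bits across all three
# channels, while a TV's "8-bit" counts bits per channel.

def colours(bits_per_channel):
    """Total distinct colours with R, G and B at the given per-channel depth."""
    return (2 ** bits_per_channel) ** 3

print(colours(8))   # 16777216 (TV "8-bit" == computer 24-bit)
print(colours(10))  # 1073741824, the "1 billion+" colours of 10-bit
```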

But here is where the breakdown occurs. We are now not only going up in resolution on the screen, but also wanting to increase the number of possible colours each one of those pixels represents. Flip back to the old days of computing, where colour was limited by the cost of memory; now we have a different limiting factor... bandwidth. Under the HDMI 2.0 spec, a 4K/60 frame including blanking intervals is 4400 x 2250 pixels, so the total bandwidth required for all of this data is 4400 x 2250 x 60 x 30 = 17.82 gigabits per second (Gbps).
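The arithmetic behind that 17.82 Gbps figure, as a small illustrative snippet (assuming the HDMI 2.0 total frame size of 4400 x 2250 pixels, i.e. the 3840 x 2160 picture plus blanking, at 60 fps with 10 bits per channel):

```python
# Uncompressed bandwidth for 4K/60 with 10-bit colour over HDMI 2.0.
total_pixels   = 4400 * 2250   # pixels per frame, including blanking
frames_per_sec = 60
bits_per_pixel = 3 * 10        # R, G, B at 10 bits each

gbps = total_pixels * frames_per_sec * bits_per_pixel / 1e9
print(round(gbps, 2))  # 17.82, right at HDMI 2.0's 18 Gbps ceiling
```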

H.265 is the new answer, and it uses two techniques: break the display into smaller blocks so you can define the colour of the pixels in each block relative to neighbouring pixels, and only update the things that have changed from frame to frame. But even that is not enough, as the number of blocks goes up with the resolution. The next step is like MP3 music files: start throwing away data that most people would not perceive as different. In a slow-moving scene without much going on, the compression has enough bandwidth to keep up. As action starts to happen and things move fast, the amount of change exceeds the bandwidth available and something needs to get thrown out. The easiest thing to chuck is colour depth. So on your action movie, your 10-bit colour is thrown out the window for 8-bit, then 7-bit, then 6-bit colour so the frame rate is preserved. Failing that, blocks of the screen get dropped from being updated and you get the Netflix screen-tear effect. Now, Blu-ray has a whole lot of data bandwidth compared to streaming, but at 4K you are still going to run out of bandwidth on a fast-moving film. So your 10-bit colour gets chucked in the process.
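As a rough sketch of those two ideas (a toy illustration, not the actual H.265 algorithm), block-based frame differencing looks something like this:

```python
# Toy sketch: split frames into blocks and transmit only the blocks that
# changed since the previous frame. Hypothetical illustration only; real
# codecs use larger, variable-size blocks and predict within frames too.

BLOCK = 2  # block size in pixels

def changed_blocks(prev, curr):
    """Yield (row, col, block) for each BLOCK x BLOCK region that differs."""
    for by in range(0, len(curr), BLOCK):
        for bx in range(0, len(curr[0]), BLOCK):
            block = [row[bx:bx + BLOCK] for row in curr[by:by + BLOCK]]
            same  = [row[bx:bx + BLOCK] for row in prev[by:by + BLOCK]]
            if block != same:
                yield by, bx, block

frame1 = [[0, 0, 0, 0] for _ in range(4)]
frame2 = [[0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 9, 9],
          [0, 0, 9, 9]]
updates = list(changed_blocks(frame1, frame2))
print(len(updates))  # 1: only the bottom-right block needs to be sent
```

In a static scene almost no blocks are sent; in a fast action scene nearly every block changes, which is exactly when the bandwidth runs out.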
You might also note that the increase from 2K video resolution to 4K increases the opportunity for dithering. Since 4K resolution is overkill for showing picture detail at ordinary viewing distance, you can trade some of that unneeded resolution for more colors. I looked up product literature for some 4K video cameras, and noticed that they capture images with 10-bit color and dither that down to 8-bit color in the recording.
So why is it BS, Matt? Because compression will negate the benefit of the new capture/reproduction technologies?

Netflix, or any content streaming, doesn't belong in the same sentence as quality. The method users choose to consume content is not a failure of the parent technology.

It's like saying high-quality audio is BS because top-40 pop music is popular.

It sounds like BT2020/HDR/8K/48fps will eventually be standard capture. You may wanna check out Joe Kane's recent appearance on HTGeeks. smile New video standards and tech are a busy topic these days. Easier to be an audiophile for the moment. Lol.
Originally Posted By Serenity_Now
So why is it BS, Matt? Because compression will negate the benefit of the new capture/reproduction technologies?


I am saying it is sort of BS right now, as what they are selling is more of a false bill of goods. I don't know the spec for the new Blu-ray drive, but I don't imagine that it is a quantum leap forward in speed. If you look at current Blu-ray drive data rates, they are significantly less than a computer hard drive's.

Even an SSD, your current pinnacle of speed, is rated at about 650 MB/s. The numbers I gave are just for the video; you need to account for the audio component as well. So with the amount of data needed for your video plus the latest Dolby TrueHD Atmos, you are going to need to read the data in quite a compressed format and then expand it for viewing.

The selling point is that you can get 4K video with 10-bit colour. But what I am saying is that I am very doubtful current technology can deliver that all the time. Even on today's Blu-rays, the amount of video blur and post-production compression is horrible. I have friends who own a movie theater with digital projectors. The units cost in the hundreds of thousands of dollars. The movies are stored on multi-disk RAID arrays with large SSD-style buffers to play the movies from. They are also not playing at 4K resolution, rather the same 2K that your Blu-ray is playing at. First hand, I have seen how what is on the source differs from what is sold on your Blu-ray. So compression is a definite factor. Move to 4K and it's just going to get worse.

Quote:
Netflix or any content streaming don't belong in the same sentance as quality. The method users choose to conume content is not a failure of the parent technology.


I just named Netflix as it's an easy example of how the same technology can be pushed beyond its ability to deliver. I realize that Netflix does not have nearly the same amount of bandwidth that is available on a Blu-ray, but you can and do get the same results when trying to push too much data through a small pipe. The only way for it to work is to compress, and sometimes that means lossy.

Quote:
Its like saying high quality audio is BS because top 40 pop music is popular.

It sounds like BT2020/HDR/8K/48fps will eventually be standard capture. You may wanna check out Joe Kane's recent appearance on HTGeeks. smile New video standards and tech are a busy topic these days. Easier to be an audiophile for the moment. Lol.


No, it's like saying that MP3 files don't sound as good as WAV. Some will say that MP3 at 320 kbps gets close. But try as you may, I don't think that MP3 will work with 24-bit/192 kHz recordings.

Yes, they are trying to up the spec to 8K now. Consider that 2K was dreamed up in the mid-1980s, and how long did it take to reach a real consumer market?
If you guys have seen the UHD shows on Amazon or Netflix, *and* you have adequate bandwidth to stream it, you will be impressed. I was a skeptic until I saw it with my own eyes.

SACD died out (partly?) because tests kept showing that listeners couldn't tell the difference. But when you switch between UHD and HD on Netflix and Amazon, the differences are night and day. My brother has 75 Mbps service and I have 300 Mbps service. And while it's compressed, the clarity is truly incredible.

hsb
I just can't wait until 4K laser projectors are affordable!!

One thing is for sure, Netflix is here to stay. With any luck they will expand 4k titles to more devices as they come to market so more people can try it out.
My hope is that the fibre to the home we were promised, and have been paying for on our phone bills for the past 30+ years, comes to fruition and we get proper gigabit Ethernet internet connections, so that the sort of technology we are imagining can be a reality for everyone.
I'm with you on this one Matt. The format is way ahead of either the delivery bandwidth or the hardware available to most people. Netflix does not even do real HD from what I understand.

Moar better resolution and colour are wonderful things, but I'm not tossing 10K at it, nor will the vast majority of the market.
I think the extra pixels are likely not necessary and I like an enveloping experience. I like to sit about one screen width away from a 2.35:1 image. I am very excited about the additional stuff that UHD can offer: HDR, improvement in color (more colors with more bit depth) and High Frame Rate. From what I have heard, these three total can take up less bandwidth than just adding more pixels (with some of the standards that are being tossed around). My hope is that they get the better quality pixels *first* and then worry about getting extra pixels if space is allotted, because it is quite clear even from the new UHD Blu-Ray standard that it is not possible to get all of the improvements at once. I will not rebuy any movies just because of the extra pixels.

Honestly, I think the important order of improvements should be HDR, color, HFR and then extra pixels. My second-to-last-generation Pioneer Kuro was only a 768p set that I was comparing with other 1080p sets. In particular, I had one of the best (at the time) Samsung LCDs to compare with in my house. The difference between the two was night and day. The Pioneer, even though it had about half the number of pixels, seemed to have more detail because of the improved contrast ratio. I still love how it looks. If it is possible to improve local contrast ratios even further (as it seems) and to deliver content that takes advantage of such displays, then that is something I am very excited about.

Color would be a nice additional improvement, since even on good-quality Blu-ray titles you can still see banding in colors. Lastly, HFR would be hugely welcomed by me. Think about action movies with fast action at 24 fps. It doesn't look natural to me. I find it irritating.
Originally Posted By Nick B
My hope is that they get the better quality pixels *first* and then worry about getting extra pixels ...

It's sad, but it's apparent it's not going to work that way. You can't get better pixels without getting more pixels.

Quote:

Color would a nice additional improvement, since even on good quality Blu-Ray titles you can still see banding in colors.

There's color and color. Banding is a symptom of insufficient color depth, and that's connected with HDR. More dynamic range requires more levels of brightness, specifically 10-bit color depth for current HDR. Wide color gamut lets the TV display higher-saturation colors. Although HDR TVs tend to have wider color gamuts, there isn't any technical connection between HDR and WCG, and WCG doesn't require a higher-bandwidth signal.

Like HDR, WCG can provide some benefit to picture quality even without an HDR/UHD Blu-ray source, if you are willing to trust the TV's upconversion algorithm to extrapolate the extra-bright highlights and extra-saturated colors, though the information is not actually there in the video signal.
Originally Posted By GregLee
Banding is a symptom of insufficient color depth, and that's connected with HDR. More dynamic range requires more levels of brightness, specifically, 10 bit color depth for current HDR. Wide color gamut lets the TV display higher saturation colors. Although HDR TVs tend to have wider color gamut, there isn't any technical connection between HDR and WCG, and WCG doesn't require a higher band width signal.


Sadly, that is in fact totally wrong. There are more than sufficient colours inside an 8-bit colour space. If what you said was true, then every single photograph you take with a digital camera would come out with blazing banding, and there would be riots in the streets because Instagram would have failed. Oh the HORROR.

The banding you see is due to terrible artifacts caused by excessive compression. If you look up how H.264 works, it takes this massive amount of data and tries to compress it in a manner that has the least visual impact. Sadly, it fails quite badly at it. One technique to compress an image is like a JPEG, where it cubes off a section of the image and then describes the colour of each pixel block by an offset from its neighbouring pixel block. But even that takes up quite a bit of data space. So rather than doing each and every pixel, it starts to group nearly identical colours as just being the same colour. Only when the difference exceeds the delta do you step up to another colour group. So where you might have had 20 different colours, it has grouped all of those as a single colour. Then the next group can be a good 20+ shades away, so you get a band.
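A toy way to see that grouping effect (a hypothetical sketch; real codecs quantise transform coefficients rather than raw pixel values, but the banding mechanism is the same):

```python
# Quantising a smooth gradient into coarse colour groups produces
# visible flat bands where there used to be a gradual ramp.

def quantise(values, step):
    """Snap each value to the bottom of its group of width `step`."""
    return [v - (v % step) for v in values]

gradient = list(range(0, 64))        # a smooth ramp of 64 shades
banded   = quantise(gradient, 16)    # heavy compression: groups of 16 shades

print(len(set(gradient)))  # 64 distinct shades before
print(len(set(banded)))    # 4 distinct shades after: four flat bands
```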

To see this in action, take a photograph with a camera. Load it into your computer and then successively save it as a new JPEG with a higher and higher compression ratio. At around 80% quality you will start to see it slightly, but get to around 40-45% quality and you will see a definite banding effect on just about any shadowed surface where the colour gradually changes to darker.

You must remember that with H.265 (the newer one, for UHD) they are looking to compress the video stream to 1/600 to 1/1000 of its original size, according to the spec.
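A quick back-of-envelope check of what those ratios leave you with, starting from an uncompressed 4K/60/10-bit stream of roughly 17.82 Gbps (illustrative arithmetic only):

```python
# What bitrate do 1/600 to 1/1000 compression ratios leave from an
# uncompressed ~17.82 Gbps 4K/60/10-bit stream?
uncompressed_mbps = 17.82 * 1000       # 17,820 Mbps uncompressed

best_case  = uncompressed_mbps / 600   # ~29.7 Mbps
worst_case = uncompressed_mbps / 1000  # ~17.8 Mbps
print(round(best_case, 1), round(worst_case, 1))
```

Those figures land in the same ballpark as real streaming and Blu-ray bitrates, which is why something has to give in fast scenes.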
Isn't the false contouring you're describing actually caused by compressing the gray scale? The symptom shows up in the color reproduction, IIRC. Blue skies around the sun become halos, etc.

Current TVs have modes and algorithms to handle this. HDR is supposed to eliminate it (supposedly). I really want to demo one with 4K native content. Maybe in a year or so.

Hansang, how many 4K titles are on Netflix US so far? Is the TV you are using capable of true 4K and 10-bit color with HDR? Just curious if the 4K content in FauxK is good/better/worse on a fully equipped TV vs a current model. I think there are only 3 or 4 models that are legit 4K/HDR/10-bit ready. I need to start reading up on TVs again, but the video jargon always puts me to sleep.
Originally Posted By oakvillematt

Sadly that is in fact totally wrong. There is more than sufficient colours inside an 8bit colour space.

The color gamut does not have to do with what colors fill up the 8-bit color space; that's the color depth. Color gamut concerns what that color space is: which physical wavelengths the R, G, and B primaries have. The current standard for video, Rec. 709, has its R/G/B primaries defined such that the resulting color space is substantially smaller than that of human vision. The new standard followed for UHD Blu-ray, Rec. 2020, has a wider gamut, but one still smaller than the range of colors we can see in nature.

Here is an excellent online reference on color gamut. It will help you get oriented. The Pointer's Gamut
Originally Posted By Serenity_Now

-snippage -

Hansang, how many 4K titles are on Netflix US so far? Is the tv you are using capable of true 4K and 10bit color with HDR? Just curious if the 4K content in FauxK is good/better/worse on a fully equipped TV vs a current model. I think there are only 3 or 4 models that are legit 4K/HDR/10bit ready. I need to start reading up on TVs again, but the Video jargon always puts me to sleep.


Not true 4K. It's UHD at 3840 x 2160. Amazon's own series Bosch is one of the shows in UHD. Netflix has their own shows in UHD as well, and a few others like Breaking Bad. YouTube has plenty of samples of "4K" videos too.

And of course the S6 and GoPro record in UHD 3840x2160, so that works out nicely.

However, I don't think it has 10-bit color since it's a "cheaper" 4K model. I never buy the top end anymore due to constantly changing technology. However, I do give a "thumbs up" on the remote. The built-in pointer function (the remote works as a pointer so you can click on things) is surprisingly useful.

Hansang
Yes Greg. You are so much smarter than I am and obviously know everything, so I will just shut up and defer all the answers to you.
Aren't Canadians supposed to be polite?
Originally Posted By pmbuko
Aren't Canadians supposed to be polite?

Oh shut up Peter!
laugh

I come back after a few months to this?
Who stirred that pot anyway?

Geez Matt, you react as if someone slapped your mother, when all they did was claim to hear impossible differences between amps in a completely uncontrolled, non-blind and biased listening test in their own home, and expect everyone to just "believe" them.

Of all the posts in the forums, this one was actually good for info on a non-audio topic. Greg has supplied a link to more info which is worth reading. Not everyone knows about everything.
Perhaps the linked info is worth considering and then responding to, assuming you have the knowledge or expertise to address his response.

Sheesh.

I for one have not been following the latest in video compression. I suspect that once we gear up for a new large-format TV in the media room I'll begin researching the topic again. I just don't like LED, so this waiting game for something, perhaps OLED, is just painful.
Visit your brother, chesseroo.
Originally Posted By pmbuko
Visit your brother, chesseroo.

LOL
When in doubt, blame a close family member for plausible deniability.
Got it!

Might consider traveling south next year for a short stay.
On the agenda for local attractions to see: Peter and the boys' hometown micropub crawl.
Originally Posted By oakvillematt
Yes Greg. You are so much smarter than I am and obviously know everything, so I will just shut up and defer all the answers to you.


Then Matt swung his arm through the Jenga tower. laugh Rough day, man? I thought Greg was helping too (especially after the bold "you're totally wrong". Lol.)

It's easy to get muddled up in the minutiae of video jargon. Gets me too. Don't sweat it, Matt. smile
If someone says you're wrong and you are sure you are not, then back up your position with proof. Mud slinging does not help anyone, and I certainly can't learn if the experts are not sharing useful and factual info. There is no way for me to know which of you is correct. It's a shame, since I am considering a TV upgrade and have not kept up with all the BS that goes with HDTV. For instance, am I going to be screwed if I buy a TV now and then HDCP 2.2 comes out? Will I need better cabling for the bandwidth? What other specs will I need to be aware of? I say put your egos aside and let's hear some hard facts.
We are still very much in the early adoption phase of UHD. If all you are concerned about is more pixels, then you should be safe if the TV has HDCP 2.2 and HDMI 2.0, and you may need a new HDMI cable. If you want the other stuff, then you need at least HDMI 2.0a.

Look at the early adopters of HDTVs that only had analog component video connections. A few years ago, the new cable boxes, Blu-ray players, etc. would only send out 480p to such TVs. The content providers don't care if early adopters get screwed and can't use the full potential of their expensive TVs; they only care about keeping their content safe (or having the illusion of it being safe with these extra measures).

I want a new display really badly right now. Next summer I might even have the money to buy one, but I think I will upgrade to some nice dual subwoofers instead. The display market is really in flux right now, and it will likely not be a good time to buy a new display for 2 or 3 years, unless you just want a good 1080p set and not this new stuff we are talking about.
By the time HDMI 2.2 is the norm, it will be obsolete. Just go for the cheapest, highest-resolution set you can get with the higher frame rate. By the time you calibrate it, UHD is just a lot brighter. Unless it's projector-sized, the 4K UHD details are only slightly sharper and brighter.
Though all the stuff over on H.T.S. says the bandwidth isn't up to par (only 10 or 15 Mbps) so far.
Though THEATER SHACK seems almost paid to promote a lot of products.
I get that impression too sometimes. The reviewers often talk about sponsored products, and keep them afterwards.

My posting slowed to almost nothing there, as "heck" was auto-deleted from one of my posts. Censorship at that level is not cool.

Originally Posted By brendo
By the time you calibrate it UHD is just a lot brighter. Unless its a projector size the 4k UHD details are slightly sharper and brighter.


This may be the case now, but there seems to be a big enough push from the movie studios (with content) all the way through the TV manufacturers to make UHD more than that. If that is all UHD is, then nobody will really care. If we can get high dynamic range and better, expanded colors, then that will make the whole thing very special. Just think about all the rave reviews for the Pioneer Kuro plasmas and the last-generation Panasonic plasmas. It was all about how great excellent contrast ratio and black levels look in a picture. The same is true on the projector end, where the JVC projectors are all the rage for the same reason. These displays and projectors perform well across the board, but what sets them apart from all of their competitors is the contrast ratio and black levels. To a somewhat lesser extent, even with the LG OLEDs people get excited by the picture, even though they have had issues with uniformity of color and, from what I remember of the first generation, issues with shadow detail. It seems that contrast ratio is likely the biggest factor that can improve the perceived quality of an image. The exciting thing about high dynamic range is that we can get improvements in local contrast to levels we have never seen before. That is what gets me excited. This and all of the other stuff in the UHD package is enough to make the jump from HD maybe as big as the jump from standard definition to HD, even without the extra resolution.


That is not to say that the industry can't just mess up the whole thing, like cranking the brightness and altering the expanded colors so that everything is artificially vibrant, to try to sell more TVs in the showroom and saying that this is what UHD can offer. Just letting the good TVs show off the extra performance they have in UHD with quality material should be enough to sell the whole concept.
Thanks Nick,
I agree with your thoughts. Let's just hope it becomes the normal resolution and doesn't just end up the way 3D has gone: too short-lived.
© Axiom Message Boards