The BS of HDR-TV and any high resolution video
#413556 07/19/15 12:06 PM
MMM (OP)
connoisseur
Joined: May 2014
Posts: 1,170
Likes: 6
Greg Lee posted on a thread I had about going 4K or OLED or 1080p. He mentioned the new technology they are putting in front of us now, called HDR LED TVs, which move to 10-bit colour depth from the current 8-bit and in theory will give a more accurate colour picture on our TVs.

To understand the technology, you first have to consider the history. Back in the early days of home computing, displays were CRTs that could in effect show an infinite range of colours, but computers are digital and driven by 1s and 0s, so we broke colour down into a number of bits. One-bit colour gives you either 'on' or 'off' for each pixel, so in effect it is black and white. To get grey, you used a dithering technique that fools your eyes into blending a black pixel beside a white pixel into grey. As memory was expensive, the cost of carrying extra bits of data to represent colour was prohibitive.
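To make that concrete, here is a rough sketch of the idea in Python, using the classic Floyd-Steinberg error-diffusion method as a stand-in for whatever any given display driver actually did:

Code:
import numpy as np

# Sketch: reduce a greyscale image (values 0..255) to pure black/white (1-bit)
# while pushing each pixel's rounding error onto its neighbours, so the
# average brightness of an area is preserved and the eye "sees" grey.
def dither_1bit(grey):
    img = grey.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255 if old >= 128 else 0      # snap to black or white
            out[y, x] = new
            err = old - new                     # error to diffuse
            if x + 1 < w:               img[y, x + 1]     += err * 7 / 16
            if y + 1 < h and x > 0:     img[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:               img[y + 1, x]     += err * 5 / 16
            if y + 1 < h and x + 1 < w: img[y + 1, x + 1] += err * 1 / 16
    return out

# A flat mid-grey patch comes out as a roughly even mix of black and white
# pixels that the eye averages back to grey.
print(dither_1bit(np.full((4, 4), 128)))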

Now if you jump to 2-bit colour, you are still in black and white but with two additional grey levels. This makes the image appear less grainy, but it still lacks colour.




When you jump to 4-bit colour, you finally have enough memory space to get actual colour: 16 distinct colours, to be exact. Since our eyes break down what we see with three types of colour cones, those 16 colours are split into four shades each of red, green and blue, plus black, white and two greys. With the use of dithering, you can make the eye perceive more colours through strategic placement of the ones you have.

But as technology moved on and computers got more memory, we moved to 8-bit colour (256 distinct colours), which made everything look more realistic, then 16-bit (65,536 distinct colours) and finally 24-bit (16,777,216 colour variations). As more colours are added to the available palette, the amount of dithering required is reduced, because you can use a closer and closer approximation to the real thing. At first glance the images look very much the same, but if you look much closer, you can see that they are all using some form of dithering to achieve the final result.

Now in TV terms we took on a new naming convention: we started to quote just the number of bits for a single colour channel. So a computer's 24-bit colour = 8-bit on a TV.
The newer HEVC colour standards are now moving to 10-bit colour on the TV. This raises the number of available colours to an impressive 1 billion+.
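To put numbers on that naming convention, here is the quick arithmetic (just illustrative Python, nothing vendor-specific):

Code:
# Total colours for a given number of bits per channel (R, G, B).
def total_colours(bits_per_channel):
    return 2 ** (3 * bits_per_channel)

print(total_colours(8))   # "8-bit" TV = 24-bit computer colour = 16,777,216
print(total_colours(10))  # "10-bit" TV = 30 bits total = 1,073,741,824 (the 1 billion+)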

But here is where the breakdown occurs. We are now not only going up in screen resolution but also wanting to increase the number of possible colours each one of those pixels can represent. Flip back to the old days of computing, where colour was limited by the cost of memory; now we have a different limiting factor... bandwidth. Under the HDMI 2.0 spec, the total bandwidth required for 4K at 60 Hz with 10-bit colour is 4,400 total pixels per line x 2,250 total lines (both including blanking) x 60 frames per second x 30 bits per pixel = 17.82 gigabits per second (Gbps).
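As a back-of-the-envelope check of that figure (the 4,400 and 2,250 totals are the standard 4K/60 timing including blanking intervals):

Code:
# Raw HDMI bandwidth for 4K at 60 Hz with 10 bits per colour channel.
pixels_per_line = 4400          # 3840 active + horizontal blanking
lines_per_frame = 2250          # 2160 active + vertical blanking
frames_per_sec  = 60
bits_per_pixel  = 3 * 10        # R, G, B at 10 bits each

bps = pixels_per_line * lines_per_frame * frames_per_sec * bits_per_pixel
print(bps / 1e9, "Gbps")        # -> 17.82 Gbps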

H.265 is the new answer, and it uses two main techniques: break the image into smaller blocks so the pixels in a block can be coded relative to their neighbours, and only update the things that have changed from frame to frame. But even that is not enough, because the number of blocks goes up with the resolution. The next step is like MP3 music files: start throwing away data that most people would not perceive as different. In a slow-moving scene without much going on, the compression has enough bandwidth to keep up. As the action starts to happen and things move fast, the amount of change exceeds the bandwidth available and something needs to get thrown out. The easiest thing to chuck is colour depth, so on your action movie your 10-bit colour is thrown out the window for 8-bit, then 7-bit, then 6-bit colour so the frame rate is preserved. Failing that, blocks of the screen get dropped from being updated and you get the Netflix screen-tearing effect. Now, Blu-ray has a whole lot of data bandwidth compared to streaming, but at 4K you are still going to run out of bandwidth on a fast-moving film, so your 10-bit colour gets chucked in the process.
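To see roughly how hard the encoder has to squeeze, compare the raw rate of the active pixels with what the delivery pipes can carry (the Blu-ray and streaming bitrates below are rough assumed figures for illustration, not spec values):

Code:
# Compression needed to fit raw 4K/60 10-bit video into real-world pipes.
raw_bps = 3840 * 2160 * 60 * 30            # active pixels only: ~14.93 Gbps

pipes_mbps = {
    "UHD Blu-ray video (assumed ~100 Mbps)": 100,
    "4K streaming (assumed ~16 Mbps)": 16,
}

for name, mbps in pipes_mbps.items():
    ratio = raw_bps / (mbps * 1e6)
    print(f"{name}: roughly {ratio:.0f}:1 compression")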

Last edited by oakvillematt; 07/19/15 12:40 PM.

Anthem: AVM60, Fosi DAC-Q5
Axiom: ADA1500, LFR1100 Active, QS8, EP500, M3, M3comp, M5
Re: The BS of HDR-TV and any high resolution video
MMM #413560 07/19/15 04:40 PM
veteran
Joined: Mar 2003
Posts: 144
You might also note that the increase from 2K video resolution to 4K increases the opportunity for dithering. Since 4K resolution is overkill for showing picture detail at ordinary viewing distances, you can trade some of that unneeded resolution for more colors. I looked up product literature for some 4K video cameras and noticed that they capture images with 10-bit color and dither that down to 8-bit color in the recording.
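As a rough sketch of how that trade can work when recording (a hypothetical noise-shaping approach; actual camera pipelines will differ), dithering lets neighbouring 8-bit pixels average out to the original 10-bit level:

Code:
import numpy as np

rng = np.random.default_rng(0)

# Quantise 10-bit samples (0..1023) to 8-bit (0..255) with dither, so the
# extra precision survives as fine spatial noise instead of banding.
def dither_10_to_8(samples):
    noisy = samples / 4.0 + rng.uniform(-0.5, 0.5, size=samples.shape)
    return np.clip(np.round(noisy), 0, 255).astype(np.uint8)

patch = np.full(10000, 514)                 # a 10-bit level between two 8-bit steps
out = dither_10_to_8(patch)
print(out[:12], "mean:", out.mean())        # mix of 128s and 129s, averaging ~128.5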


Greg
VP180, M80s, M22s, QS8(4), CSW S305s, EP500, Pioneer VSX-90
M2i, M3(2), Pio vsx-1020
Re: The BS of HDR-TV and any high resolution video
MMM #413563 07/19/15 05:19 PM
connoisseur
Joined: Mar 2014
Posts: 1,593
Likes: 1
So why is it BS, Matt? Because compression will negate the benefit of the new capture/reproduction technologies?

Netflix, or any content streaming, doesn't belong in the same sentence as quality. The method users choose to consume content is not a failure of the parent technology.

It's like saying high quality audio is BS because top-40 pop music is popular.

It sounds like BT.2020/HDR/8K/48fps will eventually be standard capture. You may wanna check out Joe Kane's recent appearance on HTGeeks. New video standards and tech are a busy topic these days. Easier to be an audiophile for the moment. Lol.

Re: The BS of HDR-TV and any high resolution video
AAAA #413564 07/19/15 07:37 PM
MMM (OP)
connoisseur
Joined: May 2014
Posts: 1,170
Likes: 6
Originally Posted By Serenity_Now
So why is it BS, Matt? Because compression will negate the benefit of the new capture/reproduction technologies?


I am saying it is sort of BS right now because what they are selling is more of a false bill of goods. I don't know the spec for the new Blu-ray drives, but I don't imagine it is a quantum leap forward in speed. If you look at current Blu-ray drive data rates, they are significantly lower than a computer hard drive's.

Even an SSD, the current pinnacle of speed, is rated at around 650 Mb/s. The numbers I gave are just for the video; you need to account for the audio component as well. So between the video data and the latest Dolby TrueHD Atmos track, you are going to need to read the data in quite a compressed format and then expand it for viewing.

The selling point is that you can get 4K video with 10-bit colour. But what I am saying is that I am very doubtful current technology can deliver that all the time. Even on today's Blu-rays, the amount of video blur and post-production compression is horrible. I have friends who own a movie theatre with digital cinema projectors. The units cost in the hundreds of thousands of dollars. The movies are stored on multi-disk RAID arrays with large SSD-style buffers to play the movies from. They are also not playing at 4K resolution, but rather the same 2K that your Blu-ray is playing at. First hand, I have seen how what is on the source is different from what is sold on your Blu-ray, so compression is a definite factor. Move to 4K and it's just going to get worse.
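A rough sanity check on the storage side (the disc capacity below is an assumed round figure, not a spec quote):

Code:
# Size of a 2-hour uncompressed 4K/60 10-bit movie versus a disc.
seconds    = 2 * 60 * 60
raw_bps    = 3840 * 2160 * 60 * 30          # ~14.93 Gbps, video only
raw_bytes  = raw_bps * seconds / 8
disc_bytes = 100e9                          # assumed ~100 GB triple-layer disc

print(f"uncompressed video: ~{raw_bytes / 1e12:.1f} TB")                 # ~13.4 TB
print(f"needs roughly {raw_bytes / disc_bytes:.0f}:1 compression to fit")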

Quote:
Netflix, or any content streaming, doesn't belong in the same sentence as quality. The method users choose to consume content is not a failure of the parent technology.


I just named Netflix as it's an easy example of how the same technology can be pushed beyond its ability to deliver. I realize that Netflix does not have nearly the same amount of bandwidth available as a Blu-ray, but you can and do get the same results when trying to push too much data through a small pipe. The only way for it to work is to compress, and sometimes that means lossy.

Quote:
It's like saying high quality audio is BS because top-40 pop music is popular.

It sounds like BT.2020/HDR/8K/48fps will eventually be standard capture. You may wanna check out Joe Kane's recent appearance on HTGeeks. New video standards and tech are a busy topic these days. Easier to be an audiophile for the moment. Lol.


No, it's like saying that MP3 files don't sound as good as WAV. Some will say that 320 kbps MP3 gets close. But try as you may, I don't think MP3 will work with 24-bit/192 kHz recordings.

Yes, they are trying to up the spec to 8K now. Consider that 2K was dreamed up in the mid-1980s, and how long did it take to reach a real consumer market?


Anthem: AVM60, Fosi DAC-Q5
Axiom: ADA1500, LFR1100 Active, QS8, EP500, M3, M3comp, M5
Re: The BS of HDR-TV and any high resolution video
MMM #413571 07/19/15 09:33 PM
aficionado
Joined: Feb 2007
Posts: 602
If you guys have seen the UHD shows on Amazon or Netflix, *and* you have adequate bandwidth to stream them, you will be impressed. I was a skeptic until I saw it with my own eyes.

SACD died out (partly?) because tests kept showing that listeners couldn't tell the difference. But when you switch between UHD and HD on Netflix and Amazon, the difference is night and day. My brother has 75 Mbps service and I have 300 Mbps service. And while it's compressed, the clarity is truly incredible.

hsb


--
Denon 4520, EPIC80/500/VP180 Speakers
Re: The BS of HDR-TV and any high resolution video
MMM #413572 07/19/15 10:58 PM
connoisseur
Joined: Mar 2014
Posts: 1,593
Likes: 1
I just can't wait until the 4K laser projectors are affordable!!

One thing is for sure: Netflix is here to stay. With any luck they will expand 4K titles to more devices as they come to market so more people can try it out.

Re: The BS of HDR-TV and any high resolution video
MMM #413573 07/20/15 01:21 AM
MMM (OP)
connoisseur
Joined: May 2014
Posts: 1,170
Likes: 6
My hope is that the fibre to the home we were promised, and have been paying for on our phone bills for the past 30+ years, comes to fruition and we get proper gigabit internet connections, so that the sort of technology we are imagining can be a reality for everyone.


Anthem: AVM60, Fosi DAC-Q5
Axiom: ADA1500, LFR1100 Active, QS8, EP500, M3, M3comp, M5
Re: The BS of HDR-TV and any high resolution video
MMM #413577 07/21/15 01:46 AM
axiomite
Joined: Dec 2007
Posts: 7,786
I'm with you on this one, Matt. The format is way ahead of either the delivery bandwidth or the hardware available to most people. Netflix does not even do real HD, from what I understand.

Moar better resolution and colour are wonderful things, but I'm not tossing 10K at it, nor will the vast majority of the market.


Fred

-------
Blujays1: Spending Fred's money one bottle at a time, no two... Oh crap!
Re: The BS of HDR-TV and any high resolution video
fredk #413584 07/21/15 07:25 PM
aficionado
Joined: Aug 2006
Posts: 504
I think the extra pixels are likely not necessary, and I like an enveloping experience: I like to sit about one screen width away from a 2.35:1 image. I am very excited about the additional things UHD can offer: HDR, improved color (more colors with more bit depth) and High Frame Rate. From what I have heard, these three combined can take up less bandwidth than just adding more pixels (with some of the standards being tossed around). My hope is that they get the better quality pixels *first* and then worry about extra pixels if space is allotted, because it is quite clear, even from the new UHD Blu-ray standard, that it is not possible to get all of the improvements at once. I will not re-buy any movies just because of the extra pixels.

Honestly, I think the order of importance for improvements should be HDR, color, HFR and then extra pixels. My second-to-last-generation Pioneer Kuro was only a 768p set that I was comparing with other 1080p sets; in particular, I had one of the best (at the time) Samsung LCDs to compare with in my house. The difference between the two was night and day. The Pioneer, even though it had about half the number of pixels, seemed to have more detail because of its improved contrast ratio. I still love how it looks. If it is possible to improve local contrast ratios even further (as it seems to be), and to deliver content that takes advantage of such displays, then that is something I am very excited about.

Color would be a nice additional improvement, since even on good quality Blu-ray titles you can still see banding in colors. Lastly, HFR would be hugely welcomed by me. Think about action movies with fast action at 24 fps: it doesn't look natural to me. I find it irritating.

Re: The BS of HDR-TV and any high resolution video
Nick B #413586 07/21/15 09:31 PM
veteran
Joined: Mar 2003
Posts: 144
Originally Posted By Nick B
My hope is that they get the better quality pixels *first* and then worry about getting extra pixels ...

It's sad, but it's apparent it's not going to work that way. You can't get better pixels without getting more pixels.

Quote:

Color would be a nice additional improvement, since even on good quality Blu-ray titles you can still see banding in colors.

There's color, and then there's color. Banding is a symptom of insufficient color depth, and that's connected with HDR: more dynamic range requires more levels of brightness, specifically 10-bit color depth for current HDR. Wide color gamut (WCG) lets the TV display higher-saturation colors. Although HDR TVs tend to have wider color gamuts, there isn't any technical connection between HDR and WCG, and WCG doesn't require a higher-bandwidth signal.
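A crude way to see why more dynamic range needs more bits (a linear approximation for simplicity; real HDR signals use a perceptual PQ curve, so the actual step sizes differ):

Code:
# Average brightness step between adjacent code values, linear approximation.
def step_nits(peak_nits, bits):
    return peak_nits / (2 ** bits - 1)

print(step_nits(100, 8))    # SDR range at 8 bits : ~0.39 nits per step
print(step_nits(1000, 8))   # HDR range at 8 bits : ~3.9 nits per step -> visible banding
print(step_nits(1000, 10))  # HDR range at 10 bits: ~0.98 nits per step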

Like HDR, WCG can provide some benefit for picture quality even without an HDR/UHD Blu-ray source, if you are willing to trust the TV's upconversion algorithm to extrapolate the extra brightness highlights and extra-saturated colors, even though that information is not actually there in the video signal.


Greg
VP180, M80s, M22s, QS8(4), CSW S305s, EP500, Pioneer VSX-90
M2i, M3(2), Pio vsx-1020