HDMI 2.0 IS A LIE

Okay. The title in this one pretty much reveals what this article is about. Well, sort of. It’s also a bit clickbaity, so I apologise. Please read on.

HDMI (High-Definition Multimedia Interface) 2.0 isn’t actually a lie, but the way manufacturers have represented it is a bit cheeky. One of the key promises of HDMI 2.0 was “4K@60Hz with no new cables”. That’s great for gamers, and brings HDMI up to the standard that Thunderbolt/DisplayPort and DVI (Digital Visual Interface) have been at for a while now. It’s been a long time coming, since the 4K implementation on HDMI 1.4b products was honestly half-arsed.

Except there’s one massive problem with HDMI 2.0, at least as most manufacturers are shipping it: it’s actually no faster than HDMI 1.4b. Seriously.

You might be thinking “how the hell have they managed that?”, and it’s a good question. The technical answer is long and boring (not really, it’s quite fascinating in a Sheldon Cooper-ish fashion), but it essentially comes down to two things: chroma (colour) subsampling and bit depth. HDMI 1.4b (to my knowledge) supports 4K content, albeit only up to 30Hz, at a bit depth of 12 bits and chroma subsampling up to 4:4:4. The most commonly used HDMI 2.0 configuration for high-frame-rate 4K (4K@60Hz) uses 4:2:0 chroma subsampling and only 8-bit colour. And that’s a problem.
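To get a rough feel for where the saving comes from, here’s a quick back-of-the-envelope sketch (my own illustrative numbers, nothing official from the HDMI spec) of the average bits per pixel for a few subsampling and bit depth combinations:

```python
# Rough, illustrative maths only: average bits needed per pixel for a given
# chroma subsampling scheme and bit depth. Ignores blanking intervals,
# TMDS encoding and everything else a real HDMI link adds.

SAMPLES_PER_PIXEL = {
    "4:4:4": 3.0,  # full-resolution Y, Cb and Cr for every pixel
    "4:2:2": 2.0,  # chroma resolution halved horizontally
    "4:2:0": 1.5,  # chroma resolution halved horizontally and vertically
}

def bits_per_pixel(subsampling: str, bit_depth: int) -> float:
    return SAMPLES_PER_PIXEL[subsampling] * bit_depth

for scheme, depth in [("4:2:0", 8), ("4:4:4", 8), ("4:4:4", 12)]:
    print(f"{scheme} at {depth}-bit: {bits_per_pixel(scheme, depth):.0f} bits per pixel")

# 4:2:0 at  8-bit -> 12 bits per pixel
# 4:4:4 at  8-bit -> 24 bits per pixel
# 4:4:4 at 12-bit -> 36 bits per pixel, i.e. three times the data of 4:2:0 8-bit
```

In other words, the “HDMI 2.0” signal most people are actually getting carries a third of the colour data per pixel of the format it’s being compared against.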

[Image: Colour Banding]

Colour banding is evil. Note: we’re showing the difference between 8-bit and 24-bit (wow) colour here, but in an 8-bit image, which is like trying to show the difference between 1080p and 360p on a 360p screen. It doesn’t really work.
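If you want to generate some banding of your own, here’s a quick sketch (assuming Python with NumPy installed; the script and its numbers are mine, not from any spec) that quantises a smooth gradient at different bit depths and counts how many distinct shades survive:

```python
import numpy as np

# A perfectly smooth horizontal gradient across a 4K-wide image,
# represented as floats between 0.0 and 1.0.
gradient = np.linspace(0.0, 1.0, 3840)

def quantise(values: np.ndarray, bits: int) -> np.ndarray:
    """Round every value to the nearest representable level at this bit depth."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

for bits in (8, 10, 12):
    shades = np.unique(quantise(gradient, bits)).size
    print(f"{bits}-bit: {shades} distinct shades across the gradient")

# 8-bit:  256 distinct shades -> visible steps ("bands") in a slow gradient
# 10-bit: 1024 distinct shades
# 12-bit: 3840 (limited here by the 3840 pixels, not the bit depth)
```

256 steps stretched across a 4K-wide gradient works out at roughly one new shade every 15 pixels, which is exactly the sort of thing your eye picks out as a band.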

The colour difference between 8-bit and 12-bit is enormous. 8-bit allows for a maximum of 16.7 million colours; 10-bit allows just over a billion, and 12-bit roughly 68 billion. You could be an idiot and argue “oh, but the human eye can only see 16.7 million colours”. Honestly, if you think that, please go and join the group of idiots who think the human eye maxes out at that ‘magic’ number of 24fps…
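Those figures are just powers of two, if you want to check them yourself:

```python
# Number of representable colours at a given bit depth per channel
# (three channels: red, green and blue).
for bits_per_channel in (8, 10, 12):
    colours = 2 ** (3 * bits_per_channel)
    print(f"{bits_per_channel}-bit per channel: {colours:,} colours")

# 8-bit:  16,777,216     (~16.7 million)
# 10-bit: 1,073,741,824  (~1.07 billion)
# 12-bit: 68,719,476,736 (~68.7 billion)
```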

I’ve written an article about the evils of 8-bit colour and the hideous banding it produces. In my opinion it’s an instant visual detraction, but I can see why 8-bit was chosen. The bandwidth difference between 4K@60Hz (8-bit 4:2:0) and 4K@60Hz (12-bit 4:4:4) is substantial. A 4K@60Hz 8-bit 4:2:0 signal fits within exactly the same 10.2Gb/s of bandwidth that HDMI 1.4b already offers, which helps: less data needs to be transferred, and the brains of the device can be less powerful. It’s a bit of a cheat, but it works in delivering the 4K@60Hz that consumers apparently so desperately want.
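For a rough sense of the gap, here’s the raw pixel-data rate for the two formats (my own approximation, deliberately ignoring blanking intervals and TMDS encoding overhead, so the real link rates are higher):

```python
# Approximate raw video data rate for 3840x2160 at 60Hz.
# Deliberately ignores blanking intervals and 8b/10b TMDS encoding,
# so these figures are lower than the actual rates on the wire.

WIDTH, HEIGHT, REFRESH = 3840, 2160, 60
PIXELS_PER_SECOND = WIDTH * HEIGHT * REFRESH  # ~498 million pixels/s

def data_rate_gbps(samples_per_pixel: float, bit_depth: int) -> float:
    return PIXELS_PER_SECOND * samples_per_pixel * bit_depth / 1e9

print(f"4K@60Hz  8-bit 4:2:0: {data_rate_gbps(1.5, 8):.1f} Gb/s")   # ~6.0 Gb/s
print(f"4K@60Hz 12-bit 4:4:4: {data_rate_gbps(3.0, 12):.1f} Gb/s")  # ~17.9 Gb/s
```

The 4:2:0 8-bit figure sits comfortably inside the 10.2Gb/s that HDMI 1.4b hardware already handles; the 12-bit 4:4:4 figure doesn’t, even before the encoding overhead gets added back in.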


It’s also a future-proofing problem. The latest 4K Blu-ray spec includes the ability to carry 10-bit colour information, and it’s quite likely future discs will be encoded in 10-bit. And that’s not the only problem: if NHK gets its way, by 2020 we’ll all have 8K 120Hz 12-bit broadcasts. 8-bit is also a limitation for gamers, as a lot of games now render colour at 10-bit or better. You can easily tell the difference between 8-bit and 10-bit colour on a 10-bit monitor (most TVs are actually 10 or 12-bit panels), so you might regret your choice in the future.

And I haven’t even got onto the biggest problem. HDCP.

HDMI 2.0 is meant to introduce the new HDCP 2.2 standard. I haven’t really talked about HDCP in much detail before, except for the fact that I think it does more harm than good. As far as I can tell, HDMI 2.0 Level B devices don’t always support HDCP 2.2, and this means the next generation of media players and set-top boxes may simply refuse to communicate with your television or projector, rendering your media unplayable.

HDCP 2.2 is supposedly designed to fix the ‘silent watcher’ flaw that has plagued HDCP systems before, whereby as long as a single positive HDCP signal was received by the player from the television, the player would in effect ignore any other devices in the chain, like a non-HDCP-compliant HDMI recorder. Now, of course there are those who believe HDCP 2.2 is actually some evil capitalist plot to make us throw away all our old televisions and buy new ones (it kind of does seem like it), but the fact remains: if your television is not HDCP 2.2 compliant, it will not play back protected content from an HDCP 2.2 source. What’s an even bigger bombshell?

Some ‘HDMI 2.0’ (4K@60Hz 8-bit 4:2:0/Level B) televisions and projectors don’t support HDCP 2.2, and HDMI 1.4b straight up doesn’t. As far as I can tell, only HDMI 2.0 Level A devices are required to support it; I believe it’s optional for Level B devices.

Which means notionally ‘future-proof’ televisions with HDMI 2.0 Level B still being sold today may straight up not work two years down the line. Now, I’m not saying all HDMI 2.0 televisions will be useless in the future, but I have seen some that do not support HDCP 2.2 whilst ‘supporting’ HDMI 2.0. Make sure whatever you’re buying is HDCP 2.2 compatible, or you will regret it, and always look for explicit confirmation like this:

[Image: IMG_0187]
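To spell out what ‘not compatible’ actually means in practice, here’s a toy sketch of the rule as I understand it. This is purely illustrative logic, not the real HDCP handshake, and the devices in it are made up:

```python
# Purely illustrative: not the real HDCP handshake, just the compatibility
# rule. The devices listed here are invented for the example.

def chain_can_play_hdcp_2_2(devices_in_chain):
    """A source insisting on HDCP 2.2 needs every device between it and the
    screen (AV receiver, splitter, the display itself) to support HDCP 2.2,
    otherwise it refuses to output the protected content."""
    return all(device["hdcp_2_2"] for device in devices_in_chain)

living_room = [
    {"name": "2013 AV receiver", "hdcp_2_2": False},
    {"name": "'HDMI 2.0' Level B projector", "hdcp_2_2": False},
]
bedroom = [
    {"name": "HDMI 2.0 Level A television", "hdcp_2_2": True},
]

print(chain_can_play_hdcp_2_2(living_room))  # False - the new 4K player won't play
print(chain_can_play_hdcp_2_2(bedroom))      # True
```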
