Dark Light

The consoles have always had that ‘T&C’ fine print attached to many aspects of their own ecosystem, and the trend has not stopped in the current gen, with the PS4 Pro and Xbox One going for the ‘Upscaled to 4K’ label instead of Native 4K, using some mysterious technique that gets the game up to 90% of 4K (or whatever they are doing, I’m not a technical expert). Another term that is being thrown around a lot is HDR (High Dynamic Range, for those who are curious). It is some sort of lighting technique in games that allows for flexibility in contrast without losing detail in the process (at least that’s what I understood). Describing it in detail here would be in vain (if you are very curious, I found an article about it on Game Debate), because it is too long an explanation and would require a whole dedicated post.
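If a rough picture helps, here’s a tiny Python sketch of my own (deliberately simplified: a plain linear mapping, not the actual transfer functions like PQ that real HDR pipelines use, and the nit values are just examples). The point is simply that an 8-bit SDR signal clips every bright value to the same level, while a 10-bit signal on a brighter HDR display keeps those highlights distinct.

```python
# Illustrative sketch only -- not how any console or TV actually maps values.

def to_sdr_8bit(luminance_nits, paper_white=100.0):
    """Clamp scene luminance to an SDR display's ~100-nit range, 8-bit steps."""
    clipped = min(luminance_nits, paper_white)
    return round(clipped / paper_white * 255)    # 256 levels

def to_hdr_10bit(luminance_nits, peak_nits=1000.0):
    """Map the same luminance onto a 1000-nit HDR display, 10-bit steps."""
    clipped = min(luminance_nits, peak_nits)
    return round(clipped / peak_nits * 1023)     # 1024 levels

# Shadows, mid-tones, then increasingly bright highlights (e.g. a sun glint):
for nits in (50, 100, 400, 800, 1500):
    print(nits, "nits ->", to_sdr_8bit(nits), "(SDR)  vs", to_hdr_10bit(nits), "(HDR)")
# In SDR everything above 100 nits collapses to 255; in HDR the highlights stay separate.
```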

Now, coming to the topic: a NeoGaf user has recently posted about why not to get a 4K HDR TV. Here’s the post:

As gamers, the only component of this we should be interested in is the HDR part. Right now the developers are being cagey about what they are doing: what is HDR and what is WCG (wide colour gamut).

Microsoft have stated that the Xbox One S does not use WCG for games (only movies).
Sony hasn’t said a word about what either machine is actually doing. PS3, PS4 and Xbox One all supported 10 bit deep color from their launches, yet we’ve never seen a game that utilises it.
(Xbox One actually supports 12-bit output also, which will be a requirement for Dolby Vision in future)
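As a quick aside on what those bit depths actually mean (simple arithmetic, nothing console-specific, and alpha channels ignored for simplicity):

```python
# How many gradations each colour channel gets at a given bit depth.
for bits in (8, 10, 12):
    levels_per_channel = 2 ** bits            # shades of each of R, G, B
    total_colours = levels_per_channel ** 3   # ignoring any alpha channel
    print(f"{bits}-bit: {levels_per_channel} levels/channel, {total_colours:,} colours")

# 8-bit : 256 levels/channel  -> ~16.7 million colours
# 10-bit: 1024 levels/channel -> ~1.07 billion colours ("deep colour")
# 12-bit: 4096 levels/channel -> ~68.7 billion colours (Dolby Vision territory)
```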

10-bit textures and data require more storage and more bandwidth, so there are performance implications on all the machines.
For a game to well and truly support WCG, every texture and video will need to be replaced with a 10-bit WCG version (a back-of-the-envelope sketch of the storage cost follows below).
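To put a rough, hypothetical number on that storage point: the sketch below compares an uncompressed 4096×4096 texture in a standard 8-bit-per-channel format against a 16-bit-per-channel format commonly used for HDR content. The resolution and formats are my own illustration, not from the post; real games use compressed formats, and a packed 10:10:10:2 layout would actually be the same size as RGBA8, so the true cost depends on the format chosen.

```python
# Back-of-the-envelope texture size comparison (illustrative formats only).
WIDTH = HEIGHT = 4096
PIXELS = WIDTH * HEIGHT

def texture_size_mib(bits_per_pixel):
    """Uncompressed size in MiB for a WIDTH x HEIGHT texture."""
    return PIXELS * bits_per_pixel / 8 / (1024 ** 2)

print("RGBA8   (32 bits/pixel):", texture_size_mib(32), "MiB")   #  64 MiB
print("RGBA16F (64 bits/pixel):", texture_size_mib(64), "MiB")   # 128 MiB -- double the storage and bandwidth
```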

So, we probably aren’t going to see much in the way of WCG material any time soon, yet the console manufacturers insist on talking about it like it is a very real thing. The only assumption that can be made is that they are using the HDR side of things to better control image brightness in things like sun spots and specular highlights.

So what I’ve done is put together a guideline to show where TVs are right now in the grand scheme of things. Rtings have done tests that measure a screen’s capability to display a bright image on a 2% area of the screen, i.e. something like the sun or another small light source.

[Chart: Rtings 2%-window peak brightness results for current TVs]

What we can see is that the vast majority of TVs available right now sit right at the very bottom of the chart, falling massively short of the 1000-nit peak brightness needed to hit the main HDR standard, UHD Premium, and nothing comes even close to what Dolby is suggesting for the future.
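If it helps, here is a trivial sketch of the comparison being made. The nit figures in it are placeholders I made up purely for illustration, not Rtings measurements; only the 1000-nit threshold (the UHD Premium minimum for LCD peak brightness) comes from the standard.

```python
# Hypothetical example values -- NOT Rtings data -- to show the kind of gap in question.
UHD_PREMIUM_NITS = 1000   # UHD Premium minimum peak brightness for LCD panels

example_tvs = {           # made-up figures for illustration only
    "budget 4K LCD": 300,
    "mid-range 4K LCD": 450,
    "high-end HDR LCD": 1100,
}

for name, peak in example_tvs.items():
    pct = peak / UHD_PREMIUM_NITS * 100
    verdict = "meets" if peak >= UHD_PREMIUM_NITS else "falls short of"
    print(f"{name}: {peak} nits -> {pct:.0f}% of target, {verdict} UHD Premium")
```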

What this also shows you is exactly why Sony chose to create their own standard, ‘4K HDR’, which has no minimum brightness requirement, and why they have chosen not to publish nit data for their most recent TVs (such as the X800D, which has been recommended in another thread).

Anyway, I hope this helps add some clarity to the intentional obfuscation from TV manufacturers and now console makers.

This is a whole lot of technological mish-mash (for me) that I am going to research further, and I will try to come back with a clearer explanation. Here is the link to the post. I covered this for those of you looking to get a 4K HDR TV for your consoles. Let us know in the comments below what you think about the post.
