I suspect that most people, when they hear the term "Ultra High Definition" (UHD) television, will think "4K" or increased resolution. However, the term "UHD" actually incorporates three separate upgrades from our high definition (HD) video standard. Of course, there is the increase in resolution from HD's maximum of 1080p up to UHD's 2160p (i.e., 4K). The first generation of UHD TVs and 4K projectors generally supported only the resolution upgrade, while more recent mid-to-higher end models support two or even all three of the potential upgrades. The second and third upgrades can be considered optional, and these are:
- support for Wide Color Gamut (WCG); and
- support for High Dynamic Range (HDR)
This blog discusses the state of HDR, with an emphasis on how it applies to projector-based home theaters, and where I expect HDR technology to evolve over the next year or two.
It covers the three versions of HDR that are now gaining support in consumer products and the capabilities and role of each. Just to get our terminology clear: currently, all commercial HD video and certain 4K/UHD videos are authored using Standard Dynamic Range (SDR). The following discussion applies only to UHD videos that have been authored and encoded to support High Dynamic Range (HDR), where extra-bright highlights in the image, such as sunlight reflecting off of a shiny surface, are intended to be displayed substantially brighter than (e.g., 10 times) the normal reference white picture levels.
1. HDR10
HDR10 is a version of HDR that offers a bit depth of 10 bits per color, meaning the level of each primary color is represented digitally by a 10-bit value (30 bits total for the three primary colors). While 10 bits allows for 1,024 possible values, for UHD video the range used for the video information is restricted to 64 through 940. By comparison, the 8-bit encoding used for HDTV is limited to the range 16 to 235. Going from 8-bit encoding up to 10 bits allows, for example, the grey scale going from black to maximum white to be encoded in much smaller steps. This not only reduces the occurrence of visible banding in the image, but also supports the ability to represent a wider dynamic range. HDR10 was the first version of HDR to appear in UHD TVs and projectors, with both Sony and JVC announcing HDR10-compatible projectors at CEDIA Expo in 2015. This was well before the first widely available source of UHD content with HDR10 became available in the spring of 2016, in the form of Ultra HD Blu-ray discs; HDR10 content is now also offered by several video streaming services.
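To make the step-size comparison concrete, here is a small back-of-the-envelope Python sketch of my own (assuming the "narrow range" video levels just mentioned) that simply counts the available code values from black to reference white at each bit depth:

```python
# Back-of-the-envelope comparison of 8-bit vs. 10-bit "narrow range" video levels.
# The black/white code values are the ones discussed above.

def code_values(bit_depth):
    """Return (black level, reference white level, steps in between)."""
    if bit_depth == 8:
        black, white = 16, 235      # HDTV 8-bit video range
    elif bit_depth == 10:
        black, white = 64, 940      # UHD 10-bit video range
    else:
        raise ValueError("only 8- and 10-bit shown here")
    return black, white, white - black

for bits in (8, 10):
    black, white, steps = code_values(bits)
    print(f"{bits}-bit: black at {black}, reference white at {white}, {steps} steps")

# 8-bit gives 219 steps from black to reference white; 10-bit gives 876,
# i.e., roughly 4x finer gradation, which both reduces visible banding and
# leaves room to represent a wider dynamic range.
```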
HDR10 has been standardized by the Society of Motion Picture and Television Engineers (SMPTE) industry body and has more recently been addressed in a standard from the International Telecommunication Union (ITU). HDR10 is an open standard that is royalty free.
HDR10 was the first, but technically the most limited, version of HDR, and there have been issues with how to best display HDR10-encoded content using 4K/UHD projectors that nominally support that form of HDR. Much of the problem relates to HDR10 content being authored on the assumption that it will be displayed on a very bright flat panel UHD TV, with a nominal peak brightness for the brightest highlights in the range of 1000 nits and a nominal brightness of around 100 nits for reference whites (i.e., about 1/10 the brightness of the brightest HDR highlights). In the world of home theaters using projectors, we usually talk about the picture brightness off the screen in terms of foot-lamberts, where for a conventional home theater (i.e., for displaying normal SDR content) a reference white brightness level of 16 foot-lamberts is frequently recommended. 16 foot-lamberts equals about 55 nits, and while this is certainly sufficient for watching conventional (i.e., SDR) video in a fully light-controlled home theater, it is a long way from the 1000-nit baseline for peak highlight brightness that has been envisioned as nominal for displaying HDR10-authored video.
How many foot-lamberts, or nits, you can actually achieve with a projector-based home theater setup depends on the actual lumen output of the projector, the size of the projected image (i.e., the screen size), and the gain of the projection screen's surface. With a typical home theater screen, perhaps 110 to 130 inches diagonal with a gain in the range of 1.0 to 1.3, and an HDR-capable projector having an actual light output in the 1200 to 1800 lumen range, you are looking at a maximum brightness off the screen somewhere under 50 foot-lamberts (under 170 nits), and in many cases closer to 100 nits for the larger screen sizes or a less bright projector. The industry standards for HDR10 do not specify, or otherwise offer guidance on, how to best display HDR10 content on displays that are significantly less bright than what was used for authoring the HDR10-encoded content.
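For those who want to run the numbers for their own setup, here is a rough estimate of my own making (assuming a 16:9 screen and ignoring zoom position, screen losses, and calibration) that derives off-screen brightness from the projector's lumen output, the screen diagonal, and the screen gain:

```python
import math

def screen_brightness(lumens, diagonal_inches, gain=1.0, aspect=16 / 9):
    """Estimate peak off-screen brightness in foot-lamberts and nits for a
    projector of a given light output on a 16:9 screen (flat-field estimate)."""
    height = diagonal_inches / math.sqrt(1 + aspect ** 2)   # screen height, inches
    width = height * aspect                                  # screen width, inches
    area_sqft = (width * height) / 144.0                     # square inches -> square feet
    foot_lamberts = lumens * gain / area_sqft                # lumens spread over the screen area
    nits = foot_lamberts * 3.426                             # 1 ft-L is about 3.426 nits
    return foot_lamberts, nits

# Example: a 1500-lumen projector on a 120-inch diagonal, gain 1.0 screen
ftl, nits = screen_brightness(1500, 120, gain=1.0)
print(f"{ftl:.0f} ft-L (about {nits:.0f} nits)")   # roughly 35 ft-L, about 120 nits
```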
Some projectors offer some form of "tone mapping" in an attempt to accommodate the lower brightness image, but there is no official standard for this. HDR10 encoding includes, in the data stream that is passed on to the UHD display device, information on the brightness of the monitor that was used for that specific HDR encode. So far, the Ultra HD Blu-ray discs I have checked have been authored using either a 1000-nit or a 4000-nit monitor. For HDR10 this is a static (i.e., fixed) value applicable to the entire video program. Now, just because a 1000-nit or 4000-nit monitor was used by the movie studio for the HDR10 authoring doesn't mean that the actual peak brightness of the HDR highlights will extend all the way up to this maximum value. For example, some HDR encodes that were made using a 1000-nit monitor have been reported to actually use a peak brightness level of around 800 nits.
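To illustrate what a simple tone mapping operation might look like, here is a deliberately simplified sketch of my own (the knee point, linear roll-off, and 150-nit display peak are all assumptions for illustration; actual projectors use their own proprietary, and generally smoother, curves):

```python
def tone_map(nits_in, content_peak=1000.0, display_peak=150.0, knee_frac=0.75):
    """Toy static tone map: pass shadows and midtones through unchanged, then
    linearly roll off highlights above a knee so that content_peak lands exactly
    at display_peak.  Purely illustrative -- not any manufacturer's actual curve."""
    knee = knee_frac * display_peak
    if nits_in <= knee:
        return nits_in                               # below the knee: untouched
    t = (nits_in - knee) / (content_peak - knee)     # position between knee and content peak
    return knee + t * (display_peak - knee)          # squeezed into the remaining headroom

for level in (50, 100, 200, 500, 1000):              # scene luminances in nits
    print(f"{level:4d} nits -> {tone_map(level):6.1f} nits on a ~150-nit projector")
```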
Some points to take away from this discussion are:
- Some HDR10-encoded 4K/UHD videos may be displayed substantially brighter than others due to the way they have been authored, and changes to the display's/projector's settings and/or to the Ultra HD Blu-ray player's settings will be needed to accommodate these differences. Some projectors allow the user to store multiple settings specifically for displaying different HDR content.
- For projector-based home theaters, HDR content (using any version of HDR described in this blog) generally will not display the brightest highlights dramatically brighter than the reference white levels. Most HDR-capable projector-based home theater setups will be limited to displaying the peak highlights at levels 2 to 3 times brighter than the reference white levels. While this can give HDR UHD content a dynamic look, it falls short of the 8 to 10 times increase in brightness for peak highlights that some UHD LED flat panel TVs are capable of. Using a high-gain screen and/or a really high brightness projector (such as the Sony VPL-VW5000ES) can get you closer to the multiplier between reference white level and peak brightness offered by an HDR-capable flat panel UHD TV, but these will not be the setups currently used by most home theater owners.
I must also note that Samsung has recently teamed up with Amazon to propose to the SMPTE standards group an extended version of HDR10, called HDR10+. This enhanced version would add support for dynamic metadata that defines the peak brightness on a scene-by-scene or even frame-by-frame basis, in a manner similar to the two alternative HDR formats described below.
2. Dolby Vision HDR -
First, let me mention that there are two versions of Dolby Vision: one designed for use in commercial cinemas and the other intended for consumers. What I'm talking about here is the consumer version unless otherwise noted. Also, I'm limiting my discussion to just the HDR aspect of Dolby Vision. Dolby Vision is a proprietary technology and not subject to industry standards; Dolby Laboratories specifies the requirements and licenses manufacturers, for a fee, to produce products that support Dolby Vision. Starting last year, certain UHD TVs from Vizio and LG began offering Dolby Vision support, but so far no projector has been announced that supports Dolby Vision. We may hear some news on this perhaps as early as this year's CEDIA Expo in September or next year's Consumer Electronics Show (CES) in January 2018. Dolby Vision requires specialized hardware to provide the required signal processing, so older devices that support HDR10 generally cannot be upgraded to support Dolby Vision, except in a very few cases where the required hardware was already included and just awaiting the addition of the required firmware.
While HDR10 is the baseline HDR standard for Ultra HD Blu-ray and must be supported by all players, Dolby Vision is optional, and so far only a few manufacturers have begun shipping compatible players. Ultra HD Blu-ray players currently available from Oppo and LG support, or will soon support via a firmware update, Dolby Vision. The first Ultra HD Blu-ray disc titles (i.e., movies) offering Dolby Vision are due out around mid-year, while Amazon Video, Netflix, and Vudu are already supporting Dolby Vision streaming. So 2017 is the roll-out year for Dolby Vision.
As far as the HDR capabilities offered by Dolby Vision, it builds upon HDR10 by adding data that increases the bit depth to 12 bits and metadata that dynamically defines the peak brightness on a scene-by-scene or frame-by-frame basis. As with HDR10, Dolby Vision-authored videos assume very bright displays, and there is no specification or guidance offered for how to best display Dolby Vision HDR-authored content on a substantially lower brightness display, such as would be typical for a projector-based home theater. I would note that the version of Dolby Vision used for commercial cinemas presents movies with the HDR authored for a display with a 100-nit peak brightness, which is similar to what a projector-based home theater is capable of, but consumer Dolby Vision videos are being authored for much brighter displays. As a result, for Dolby Vision to be directly applicable to a projector, Dolby would need to define the tone mapping to be applied based on the brightness of the image, and this will depend not only on the specific projector's capabilities, but also on the size and gain of the projection screen being used. Dolby has yet to indicate if such an extension to Dolby Vision will actually happen. Therefore, when or if Dolby Vision support will come to home theater projectors is unknown at this point.
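To illustrate why dynamic metadata matters on a relatively dim display, here is another toy sketch of my own (the roll-off curve, knee point, and 150-nit display peak are again assumptions, not how Dolby Vision or HDR10+ actually specify the processing): with a static, whole-film peak value, a modest highlight in a dim scene gets needlessly compressed, whereas per-scene metadata lets it pass through untouched.

```python
def roll_off(nits, stated_peak, display_peak=150.0, knee=112.5):
    """Toy highlight compression sized to a stated peak brightness.
    If the stated peak already fits within the display, nothing is touched."""
    if stated_peak <= display_peak or nits <= knee:
        return nits
    t = (nits - knee) / (stated_peak - knee)
    return knee + t * (display_peak - knee)

highlight = 130.0   # a modest highlight within a dim scene
print(roll_off(highlight, stated_peak=1000))  # static whole-film peak: ~113 nits, needlessly dimmed
print(roll_off(highlight, stated_peak=130))   # per-scene (dynamic) peak: 130 nits, left alone
```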
3. Hybrid Log-Gamma (HLG) -
The HLG version of HDR first appeared in certain 4K/UHD projector offerings from Sony and JVC in 2016. HLG was developed jointly by the UK broadcaster BBC and the Japanese broadcaster NHK and has been included in international standards issued by the ITU. This version of HDR is expected to be widely used for broadcast UHD TV services globally (certainly for terrestrial broadcast and perhaps also for broadcast satellite services), including the ATSC 3.0 standard currently being developed for broadcast UHD TV in the USA and the DVB standards for UHD TV broadcasting in Europe.
HLG is a royalty-free standard and will likely gain widespread use over the next few years for broadcast and streaming content. HLG supports a peak brightness level up to 12 times higher than the reference white level, with a nominal peak brightness of 1000 nits for the brightest HDR highlights (and up to 4000 nits supported).
HLG is designed to work with the same 10-bit depth as HDR10, and no additional metadata is needed. HLG uses a more-or-less conventional gamma curve for the darker elements of the image while applying a logarithmic curve for the brighter areas. This is said to offer a better solution for presenting HDR content while supporting displays with differing capabilities (e.g., SDR vs. HDR, and differing brightness levels). HLG is intended to be backward compatible, which is attractive to display manufacturers and broadcasters alike. I expect to see rather widespread support for HLG in upcoming 4K/UHD projectors.
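For the technically curious, the HLG transfer curve itself is simple enough to show directly. The following snippet implements the HLG OETF (the scene-light-to-signal curve) as defined in ITU-R BT.2100: below 1/12 of peak it is essentially a conventional square-root curve, and above that point the highlights are encoded logarithmically.

```python
import math

# HLG OETF (scene light -> signal) constants from ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A                     # 0.28466892
C = 0.5 - A * math.log(4 * A)     # 0.55991073

def hlg_oetf(scene_light):
    """Map normalized linear scene light in [0, 1] to an HLG signal value in [0, 1]."""
    if scene_light <= 1 / 12:
        return math.sqrt(3 * scene_light)          # square-root "gamma" part for darker content
    return A * math.log(12 * scene_light - B) + C  # logarithmic part for the highlights

for e in (0.001, 0.01, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")
```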
The Bottom Line -
I expect to see increased support in 4K/UHD projectors for the combination of HDR10 (perhaps also HDR10+) as well as HLG. Dolby Vision may show up in a few future projectors once Dolby defines how to handle the less bright images expected in projector-based home theater environments (I doubt that Dolby will license Dolby Vision for use in consumer projectors until that happens).
Hopefully, the SMPTE and ITU standards bodies will eventually create standards or guidance for how to best display HDR content on lower brightness displays. Until that happens, projector manufacturers will work to incorporate their own tone mapping solutions (i.e., specific gamma curves) into their projectors to accommodate the much higher, and widely varying, brightness levels (either static or dynamic) being used to author HDR content, as compared to the brightness levels practical in projector-based home theaters.
Editor's note: A quick thank-you to the folks at 4K Filme in Germany for allowing us to use their HDR logo, which Ron found on the web.