Ten years ago, the first 4K (Ultra HD, 3840x2160 pixel) monitors (not televisions) landed in stores. Consumers and reviewers were amazed at the improvement in image quality over Full HD displays – and why not? Ultra HDTVs offered twice the horizontal and twice the vertical resolution of then-current TV offerings, an improvement anyone could easily see. The minimum viewing distance at which display pixels could be discerned dropped to 1.5x the screen height. But who would sit that close?
Simultaneously, screen sizes began increasing as mass production of liquid-crystal display panels swung into high gear in Korea and China. 42-inch and 50-inch televisions, once considered to be “large” screens, were eclipsed by 65-inch, 70-inch, 75-inch, and even larger sizes as wholesale panel prices and retail television prices began a race to the bottom.
The introduction of high dynamic range imaging and its associated wider gamut of colors (as defined in the DCI and ITU Recommendation BT.2020 specifications) took television viewing to a new level. Camera manufacturers started promoting models with both Ultra HD and cinema 4K (4096x2160) sensors. So-called “action cameras” like the GoPro offered 4K models. The Blu-ray disc format was upgraded to an Ultra HD version, and streaming services Netflix and Amazon Prime began offering a limited number of programs in 4K with HDR.
Behind the scenes, there was an even bigger paradigm shift brewing. The Japanese NHK television network had been working for over a decade on stepping up the resolution to 8K (7680x4320 pixels) for acquisition, distribution, and viewing. A prototype Sharp 8K LCD TV showed up at CES in 2013. Compact 8K cameras and format converters were also developed by NHK engineers. By the end of the last decade, Hitachi, Sony, RED, and Sharp had all rolled out 8K production cameras. Starting in 2012, the Olympics featured limited 8K video coverage that expanded with each Games, culminating in full coverage of the COVID-delayed Tokyo Olympics last year.
8K production has created challenges of its own, with file sizes four times as large as those for 4K production and insanely high data rates.
SUCH A DEAL
Today, 4K cameras and televisions are ubiquitous. If you buy a new television larger than 43 inches, it’s likely a 4K model as there is no market demand for (nor profit in manufacturing) large display panels with Full HD resolution. In fact, 4K televisions are basically commodities now: Display Supply Chain Consultants (DSCC) reported recently that wholesale prices for 65-inch 4K LCD panels dropped to a low of $92 this past September.
As I write this, retailer Best Buy has already started its Black Friday promotions and is offering a 65-inch Insignia 4K “smart” LCD TV with Amazon’s Fire streaming interface for $380. For $20 more, you can pick up a TCL 65-inch model with HDR and a built-in Roku streaming interface. And for $50 more, a Samsung Series 7 HDR 4K set with a Tizen interface is yours. Dirt cheap!
The median screen size for television purchases in the U.S. has moved up to 55 inches as more and more families opt for bigger screens…and they’ll sit (on average) anywhere from six to fifteen feet away from those screens, well beyond the 1.5x-screen-height distance at which pixel structure becomes visible. Why would those viewers need more pixels on the screen, particularly since another long-time problem (8-bit grayscale and color rendering) is largely gone, thanks to 10-bit panel addressing?
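A quick back-of-the-envelope calculation shows why. The sketch below is a rough estimate, assuming the common 20/20-acuity rule of thumb of about one arcminute per pixel; it computes the distance beyond which individual pixels on a 65-inch screen blend together.

```python
import math

def pixel_blend_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
    """Estimate the viewing distance (in feet) beyond which individual
    pixels can no longer be resolved, assuming roughly one arcminute
    of visual acuity per pixel (the 20/20 rule of thumb)."""
    # Screen width from the diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horiz_pixels
    # Distance at which one pixel subtends one arcminute
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12

for label, pixels in [("4K", 3840), ("8K", 7680)]:
    print(f"65-inch {label}: pixels blend beyond "
          f"~{pixel_blend_distance_ft(65, pixels):.1f} ft")
```

On a 65-inch 4K screen, the pixels disappear a little more than four feet away; on an 8K screen, at barely two feet. Both distances are far closer than anyone actually sits.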
GRAB THOSE PIXELS!
To be sure, higher image capture resolution is definitely a good thing. Watch reruns of I Love Lucy, shot with three 35mm film cameras 70 years ago, and you’ll be amazed at how well they hold up in this day and age. (Compare them with the sitcoms recorded on composite video tape formats in the 1970s and 1980s, which look pretty shabby next to even 720p HD video.)
Fortunately, solid-state memory is cheap now. You can buy a 128 GB, 200 MB/s SDXC UHS-II memory card for $150 today, and 256 GB models are available for $270. Write speeds are typically in the 200 MB/s range, and you’ll need all of that to record 8K video. (1 TB will get you about 20 minutes of 8K video, depending on frame rate.) And camera prices continue to fall. Fuji’s X-H2 mirrorless camera can record 8K/30p with its 40-megapixel APS-C sensor and is currently priced just a hair under $2K. (One frame of 8K video contains 33 million pixels.)
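The arithmetic linking bitrate to recording time is straightforward. Here’s a minimal sketch; the bitrates are illustrative assumptions for different codec classes, not specifications for any particular camera.

```python
def minutes_per_terabyte(bitrate_mbps):
    """Minutes of footage that fit on 1 TB (decimal) at a given
    average video bitrate in megabits per second."""
    terabyte_bits = 1_000_000_000_000 * 8
    return terabyte_bits / (bitrate_mbps * 1_000_000) / 60

# Illustrative bitrates only -- real figures vary by codec,
# frame rate, and scene content.
for label, mbps in [("long-GOP 8K/30", 400),
                    ("all-intra 8K/30", 1300),
                    ("ProRes-class 8K/60", 6600)]:
    print(f"{label} at {mbps} Mb/s: ~{minutes_per_terabyte(mbps):.0f} min/TB")
```

At the high-bitrate end of that range, you land right around the 20-minutes-per-terabyte figure mentioned above.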
TOO MUCH OF A GOOD THING?
Ensuring some degree of future-proofing by capturing, editing, and finishing video at 8K resolution is a great idea, if the funding is there to do it. But do we really need to view those programs at home at the same resolution?
Probably not. And the typical viewing distance in the home is one reason why – we’re sitting too far away from the screen to see any real difference between 4K and 8K content. A double-blind study conducted almost three years ago by Pixar, Amazon Prime Video, LG, the American Society of Cinematographers, and Warner Bros. attempted to determine if consumers could see the difference between 8K and 4K.
A total of 139 individuals participated in the study, which was conducted over three days, viewing random clips of native 8K content interspersed with 4K content, upconverted to 8K, on an 88-inch LG OLED TV. The conclusion? Viewers largely couldn’t see any difference between the clips, and in a few cases “thought” they were watching native 8K video when they actually weren’t.
It can be argued that 8K content, downconverted to 4K, will look better than native 4K video simply because there are more photosites in an 8K sensor to capture more detail. (A similar argument was made in the past to down-convert 4K HDR to Full HD for broadcast as a way to more efficiently use bandwidth.)
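Conceptually, the simplest form of that downconversion just averages each 2x2 block of 8K pixels into one 4K pixel, which is why the result can look cleaner than a native 4K capture: the averaging suppresses sensor noise and aliasing. Real converters use more sophisticated filters, but a minimal NumPy sketch of the idea (with a hypothetical helper named here for illustration) looks like this:

```python
import numpy as np

def downconvert_8k_to_4k(frame_8k: np.ndarray) -> np.ndarray:
    """Box-filter an 8K frame (4320 x 7680 x 3) down to 4K by
    averaging each 2x2 block of pixels into one output pixel.
    Hypothetical helper for illustration only."""
    h, w, c = frame_8k.shape
    blocks = frame_8k.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3)).astype(frame_8k.dtype)

# Synthetic 8K frame, just to show the shapes involved
frame = np.random.randint(0, 256, (4320, 7680, 3), dtype=np.uint8)
uhd = downconvert_8k_to_4k(frame)
print(frame.shape, "->", uhd.shape)   # (4320, 7680, 3) -> (2160, 3840, 3)
```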
There are other logistical issues to resolve for 8K viewing at home, not the least of which is the bandwidth required to do so. With the latest codecs, streaming rates in the range of 150 to 200 Mb/s with long GOPs (Groups of Pictures) are required for 8K/60 video – not surprisingly, four times the rates required for 4K/60 video. To be sure, codecs are improving with every passing year. But it’s unlikely they’ll drop 8K/60 below 100 Mb/s any time soon (which is the peak video bit rate for Ultra HD Blu-ray, using variable bit rate encoding).
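Those figures amount to spending a roughly constant number of compressed bits on each pixel, which is why the required bandwidth scales with the pixel count. A quick arithmetic check, using 200 Mb/s for 8K/60 and one-quarter of that for 4K/60 per the numbers above:

```python
def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average compressed bits spent on each displayed pixel."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(f"8K/60 at 200 Mb/s: {bits_per_pixel(200, 7680, 4320, 60):.3f} bits/pixel")
print(f"4K/60 at  50 Mb/s: {bits_per_pixel(50, 3840, 2160, 60):.3f} bits/pixel")
```

Both work out to about 0.1 bit per pixel: the same compression efficiency applied to four times the pixels needs four times the bandwidth.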
SLIP THE JUICE TO ME, BRUCE
Aside from the higher cost of 8K LCD and OLED televisions compared to their 4K siblings, there’s another big problem with 8K televisions: power consumption. Samsung’s 65-inch Q900B Neo QLED 8K LCD TV has a power consumption rating of 370 watts, compared to that brand’s Q90B 65-incher at 270 watts.
Sony’s 75-inch BRAVIA XR Z9J 8K LCD TV has a mind-boggling power consumption specification of 520 watts, while its 4K relative draws a more modest 304 watts. (If nothing else, the comparative numbers indicate that quantum dots harnessed to LEDs are more power-efficient than large arrays of miniLEDs for rendering high dynamic range video.)
And the power consumption issue isn’t anything to be trifled with. Recently, the European Union announced that televisions sold in member countries must have an energy efficiency index of 0.9 or less, starting on March 1 of next year. As an example, a 75-inch 8K TV would be limited to a power draw of 141 watts or less to comply with this regulation. And that would knock out every 8K model in larger screen sizes, along with a few 4K models.
Ten years ago, the state of California was also setting energy consumption limits for televisions and other appliances to promote energy conservation. Those limits weren’t as draconian as what the EU is defining and don’t apply to larger televisions. Still, consumers are becoming painfully aware of rising utility bills and energy consumption costs, and many (including me) do pay attention to power consumption for every piece of electronics we buy. My mainstay LG OLED55C9 has a power consumption rating of 90 watts, quite a drop from the 42-inch Panasonic plasma it replaced.
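To put those wattage differences into household terms, here’s a quick cost sketch; the five hours of daily viewing and the 16-cents-per-kWh rate are assumptions you can swap for your own.

```python
def annual_cost_usd(watts, hours_per_day=5, usd_per_kwh=0.16):
    """Rough yearly electricity cost of a TV at a given power draw."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

# Power ratings quoted above; viewing hours and rate are assumptions
for label, watts in [("65-inch 4K (270 W)", 270),
                     ("65-inch 8K (370 W)", 370),
                     ("75-inch 8K (520 W)", 520)]:
    print(f"{label}: ~${annual_cost_usd(watts):.0f} per year")
```

Under those assumptions, the 75-inch 8K set costs roughly twice as much per year to run as the 65-inch 4K model.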
AND SO….
I can discuss 8K pros and cons until the cows come home. In my view, it makes sense as a production format for future-proofing. But it makes little sense as a home viewing format unless you’re considering buying a really big screen (>90”) and parking yourself right in front of it. (That is, if any perceived flicker from 60 fps video doesn’t drive you to distraction, and you don’t mind paying those jacked-up electric bills.)
The fact is, 4K resolution is more than adequate for home viewing, even in really large screen sizes. It’s downright affordable, uses a lot less electricity, looks spectacular with high-quality HDR content (especially on OLED sets and LCDs equipped with quantum dots) and even better with high frame rate video, and is practical to deliver to the home given the current state of broadband speeds. And content mastered in 8K can be downconverted to 4K and delivered to the home with no appreciable loss of image definition, as the earlier study showed.
Those TV brands that are promoting 8K models are largely trying to overcome the bargain prices and slim margins on 4K models. But consumers still favor screen size and price over resolution. Back in April 2021, Strategy Analytics forecast that by 2025, 72 million households worldwide would own an 8K TV, with China and the U.S. seeing the fastest adoption of the 8K format. This, despite the fact that half of all TVs sold in 2021 worldwide were 4K models, a trend that’s likely to continue this year. (It’s telling that as of this writing, no 8K TVs are part of Best Buy’s Black Friday sales promotion.)