If the many big-name television manufacturers from this year’s International CES are to be believed, the future of TV is nigh. From Sony to Samsung, Vizio to even Westinghouse, it seemed as if every company looking to dominate the home theater was focused on the next step from today’s 1080p HDTV standard.
Leading this charge were two relatively new types of televisions: Ultra HDTVs, also known as 4K TVs, and OLED TVs. Consumers are going to keep hearing plenty of hubbub about these two new standards, so now seems like a good time to look at just what each of them does, and whether they'll be changing the face of the living room. Let's see what all the fuss is about.
What’s a 4K TV?
The first thing to understand about 4K is that it’s a new resolution, not a new piece of technology. On a fundamental level, it isn’t all that different from the HDTVs many people have in their homes right now.
Actually, while we’re here, let’s take a step back and clarify what “resolution” means first. Today’s HDTVs often have resolutions of 720p or 1080p, terms that signify how many pixels their respective screens contain. For example, a 1080p HDTV uses progressive scan — basically, tech that makes the image smoother — and most often displays 1,920 pixels horizontally and 1,080 pixels vertically to make up its high-quality images. Thus, the resolution for that kind of TV is listed as 1920 x 1080. The second of those numbers, paired with the progressive scan tech, gives the set its “1080p” classification.
4K TVs, meanwhile, take the pixel grid of a standard 1080p HDTV display and double it both horizontally and vertically. That means they’ll usually have resolutions of 3840 x 2160, which equals four times as many pixels as a 1080p display. The “4K” name comes from the fact that the horizontal pixel count nearly totals four thousand. So, the roughly 2 million pixels found on a 1080p set become more than 8 million on a 4K one. That’s a lot.
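The arithmetic here is easy to check for yourself. A quick sketch (the resolutions are the standard 1080p and Ultra HD figures):

```python
# Sanity-checking the pixel math: 1080p vs. a 3840 x 2160 Ultra HD panel.

def pixel_count(width, height):
    """Total pixels on a display of the given resolution."""
    return width * height

hd = pixel_count(1920, 1080)   # 2,073,600 pixels: the "~2 million" figure
uhd = pixel_count(3840, 2160)  # 8,294,400 pixels: the "~8 million" figure

# Doubling both dimensions quadruples the total pixel count.
print(f"1080p:    {hd:,} pixels")
print(f"Ultra HD: {uhd:,} pixels ({uhd // hd}x as many)")
```

Doubling each axis multiplies the total by 2 × 2, which is where the "four times as many pixels" claim comes from.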
So what does this mean for viewing? Well, the big thing with 4K is its sharpness. Seeing an image displayed in 8 million pixels brings an awe-inspiring level of detail to the display, letting viewers see things like individual strands of arm hair and eyelashes more clearly than ever before. Think of it like the so-called “Retina displays” on many modern smartphones and laptops: with the right setup, the pixels are packed too densely for the human eye to distinguish individually, so images can appear as smooth as butter. Really high-definition butter.
In fact, many moviegoers out there are already familiar with Ultra HD, even if they don’t realize it yet. Blockbuster films like The Amazing Spider-Man and District 9 were shot in 4K, while The Hobbit was even shot in 5K. They look wonderful. On big screens like those found in a theater, these higher resolutions can really shine, since all those pixels get to stretch their legs and fill up all that space.
Ultra Problems with Ultra HD
But that’s just the thing: in order to truly take advantage of that army of pixels, viewers will need to have a cinema-esque screen big enough to house them all. That’s why the 4K TVs offered (and soon to be offered) from companies like Sony, LG, Samsung, Sharp, Westinghouse, Vizio, and Hisense are all absolutely massive — ranging anywhere from 55 inches to a whopping 110 inches. To recap, a 55-inch TV is small when it comes to 4K sets. Many families may not have enough room to fit these big fellas, limiting their utility from the get-go.
The reason all of these 4K sets are so huge is because, well, they have to be. Otherwise, they risk being virtually useless. See, the effectiveness of a display’s resolution depends on a few things: the number of pixels on the display, the size of the display, and how far the viewer sits from it. Basically, the farther someone sits from the screen, and the smaller that screen is, the fewer pixels (and therefore, the lower the resolution) needed to produce an image that looks clear to the human eye.
As noted above, a 4K TV needs lots of display real estate to be fully utilized. On top of that, the viewer also needs to be sitting relatively close to the screen; think 6 feet or less. If the screen were much smaller than 55 inches, or the viewer were sitting on a couch more than 6 feet away from it, there wouldn’t be much difference, if any, between a 4K TV and a 1080p HDTV — from that far back, the human eye simply can’t distinguish the individual pixels anymore.
In other words, cramming 6 million extra pixels into a space that was only made to fit 2 million is redundant. So unless one plans on buying a big Ultra HDTV and sitting relatively close to it, the leap from 1080p HDTV to 4K Ultra HDTV isn’t always as significant as advertised.
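That size-and-distance tradeoff can be roughed out with the common one-arcminute rule of thumb for 20/20 vision — a simplifying assumption (real eyes, real content, and real lighting all vary), but it gives ballpark numbers consistent with the point above:

```python
import math

def max_resolving_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Farthest viewing distance (in feet) at which an eye with 20/20
    acuity can still separate neighboring pixels, per the standard
    one-arcminute rule of thumb. Sit farther back than this and the
    pixels merge, so extra resolution stops being visible."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_pitch_in = width_in / horizontal_pixels   # size of one pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcminute) / 12

# On a 55-inch panel, 1080p pixels already blur together past ~7 ft,
# which is why 4K's extra detail only shows up from the closest seats.
print(round(max_resolving_distance_ft(55, 1920), 1))  # 1080p threshold
print(round(max_resolving_distance_ft(55, 3840), 1))  # 4K threshold
```

By this estimate, a 55-inch 1080p screen already looks pixel-perfect from a bit over 7 feet away — so from a typical couch, the 4K upgrade on a screen that size is largely invisible.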
This is only compounded by the fact that many of the soon-to-be-offered 4K sets are more or less the same as the HDTV models on the market today. They’re primarily LCD panels, which means that all the problems consumers have with current LCD HDTVs (like limited viewing angles and contrast ratios) are going to remain on these next-gen sets. The upgrade here is with resolution alone, for better and for worse.
Next, there’s the issue of native 4K content. In many ways, it’s the same “chicken and egg” scenario that helped murder 3D TV a few years back: content makers need a relatively sizable user base before they’ll produce 4K material, and that user base needs a relatively sizable amount of 4K material before it’ll buy 4K TVs in the first place. As it stands now, very little content is produced in 4K resolution, so getting the “pure” 4K experience is an unfortunate rarity these days. And considering that a decent chunk of U.S. households still doesn’t even own an HDTV, it may be a while before 4K content becomes the standard.
Manufacturers are doing their best to overcome this dilemma, however. Sony, for instance, is offering a 4K video distribution service with their Ultra HDTVs, while also promising to distribute 4K remastered Blu-ray discs going forward. And just about every 4K TV being advertised sports some sort of “HD upscaler,” which is said to take normal high-def content and boost its resolution to higher levels.
This all sounds nice, but there still won’t be expansive amounts of 4K content to start, the cost of 4K Blu-ray players will almost assuredly be high, and upscaled HD content still won’t reach true 4K quality, making its advantage over 1080p even less noticeable. That’s not even to mention the possibility of a mainstream television channel broadcasting in native 4K, something that probably isn’t happening on a regular basis for years at least.
Many have also brought up the idea of streaming 4K content to compatible TVs over the internet, but the amount of bandwidth such streams will eat up is going to be enormous. That’s either going to make users hit their data caps pretty quickly, or it’s just not going to stream smoothly enough to deliver the full viewing experience. Again, 4K looks wonderful in the right situation; the problem is creating that “right situation” in the first place.
But there’s still one big problem that’s going to keep 4K TVs out of the mainstream for the near future, and it’s an important one: price. Simply put, the Ultra HDTVs coming to market now are absurdly expensive. Like, existential crisis expensive. The “average” 84-inch 4K TV usually runs somewhere in the mid-$20,000 to low-$30,000 range, which is more than the cost of a new car for most people. Manufacturers are claiming that their smaller 4K TVs in the 50- to 70-inch range will be offered at more affordable prices, but expecting anything less than five figures is wishful thinking for now. Given all the potential pitfalls these Ultra HDTVs face, such a financial commitment just may not be worth it to those outside of the 1%.