If the world’s television makers are eager enough to try to convince World+Dog to buy a 4K x 2K TV, the world’s market watchers are no less keen to suggest the proponents of Ultra HD will be successful in the near future, with sales rocketing five years from now. This week’s Consumer Electronics Show (CES) in Las Vegas will …
Time would be better spent improving gamma and dynamic range.
Extra resolution is a highly desirable goal, but a better priority would be to spend time improving the gamma and dynamic range of both image sensors and displays.
1. Image sensors need a much better dynamic range than the current practical limit of about 10 stops before white clipping occurs. This would allow the camera electronics to properly emulate the Hurter–Driffield slanted-S (log exposure) curve of film, thus allowing detail to be extracted from the 'toe' (low light) and the 'shoulder' (highlights) of the image--still a major problem for electronic image sensors and television systems. At least 13 or 14 stops of dynamic range should be the short-term target for digital image sensors so we can enter the High Dynamic Range Imaging (HDRI) era.
2. Large hi-res extended-dynamic-range displays such as OLED are urgently needed to show off that extra dynamic range. (Leaving geometry and resolution aside, the best CRTs still look better than LCD displays when it comes to dynamic range, as does the best film projected by a black-body radiator (tungsten filament) light source.)
3. The colour gamut also needs to be widened--look how pathetically limited the current sRGB triangle is on the CIE 1931 chromaticity diagram [see the Wikipedia 'color gamut' article]. (Perhaps we even need research into four-primary (two greens) colour systems.)
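To put a rough number on how limited that sRGB triangle is, here's a back-of-the-envelope comparison against the wider Rec. 2020 (UHDTV) gamut, using the standard published primary chromaticities. This is only a sketch: area in CIE 1931 xy space is not perceptually uniform, so the ratio is a crude proxy for gamut size.

```python
# Compare gamut triangle areas in CIE 1931 xy chromaticity space.
# Chromaticities are the standard sRGB and Rec. 2020 primaries.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

srgb    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B

a_srgb, a_2020 = triangle_area(*srgb), triangle_area(*rec2020)
print(f"sRGB area: {a_srgb:.4f}")        # ~0.1121
print(f"Rec.2020 area: {a_2020:.4f}")    # ~0.2119
print(f"ratio: {a_2020 / a_srgb:.2f}x")  # Rec.2020 covers roughly 1.9x
```

Even the Rec. 2020 triangle still leaves a good chunk of the visible horseshoe uncovered, which is the commenter's point.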
Preoccupation with image resolution at the expense of gamut and dynamic range seems counterproductive and shortsighted. Moreover, in this digital age, we should not lose sight of how remarkably good a film negative can be when it comes to dynamic range--after all, it's had 150 years of development (although the same cannot be said about film's limited colour gamut).
Remember, your eye can accommodate (adjust to) a dynamic range of over 10^6:1, whilst the best LCDs barely manage 10^3:1 (despite the advertising blurb)!
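Those ratios are easier to compare in the photographic stops used above (each stop is a doubling of light). The snippet below just restates the 10^6 and 10^3 contrast figures from the comment; the 14-stop sensor target is the one named in point 1.

```python
import math

def contrast_to_stops(ratio):
    """Convert a contrast ratio to photographic stops (doublings of light)."""
    return math.log2(ratio)

print(f"Eye (~10^6:1): {contrast_to_stops(1e6):.1f} stops")   # ~19.9 stops
print(f"LCD (~10^3:1): {contrast_to_stops(1e3):.1f} stops")   # ~10.0 stops
print(f"14-stop sensor target: {2 ** 14:,}:1 contrast")       # 16,384:1
```

So the eye's accommodated range is roughly ten stops beyond what the best LCD panels deliver, which is the gap the commenter wants closed before chasing pixels.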
Re: Time would be better spent improving gamma and dynamic range.
Not only that, the focus (err) on resolution is further harmed by this:
"don’t underestimate the ability of codec writers to devise even more efficient ways of squeezing much larger images through barely wider pipes."
...unless they're going to improve compression efficiency by 4x without losing detail, this means that the uber-high resolution will max out at about the same level of high-frequency content as we have now.
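The arithmetic behind that point is straightforward: UHD has exactly four times the pixels of 1080p, so at an unchanged channel bitrate each pixel gets a quarter of the bits. The 10 Mbit/s and 25 fps figures below are illustrative assumptions, not broadcast facts, and real codecs spend bits unevenly across a frame.

```python
# Rough bits-per-pixel comparison at a fixed broadcast bitrate.
hd_pixels  = 1920 * 1080   # 2,073,600
uhd_pixels = 3840 * 2160   # 8,294,400 -- exactly 4x the pixels

bitrate = 10_000_000       # assume a 10 Mbit/s channel
fps = 25                   # assume 25 frames per second

bpp_hd  = bitrate / (hd_pixels * fps)
bpp_uhd = bitrate / (uhd_pixels * fps)

print(f"UHD/HD pixel ratio: {uhd_pixels / hd_pixels:.0f}x")
print(f"bits/pixel: HD {bpp_hd:.3f}, UHD {bpp_uhd:.3f}")  # UHD gets 1/4
```

Same pipe, four times the pixels: without a matching codec improvement, the per-pixel bit budget collapses and the extra resolution buys little real detail.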
Add to that the fact that broadcast HD is already crippled frequency-wise--I've seen plenty of 'HD' stations where macroblocking effectively cuts the spatial resolution to NTSC levels for anything but a static image--and the continuing push for streaming content delivery, which is even *more* limited, and... yeah, aside from gaming and direct computer output, I don't see it working really well.
And with gaming itself crippled by horribly out-of-date consoles that still struggle to deliver 1080p in some games, I don't see the next generation supplying high-quality graphics at 3840xWhatever; even the new monstrous PC cards have a tough time maxing things out at 3x1920x1200, and those cards have, I don't know, 8x or 16x the raw pixel-pushing power of current consoles?
You know what I want to see? *Actual* HDR. We've spent a ton of time trying to match the perceived spatial resolution of the real world, so, great - but as Graham points out, we're still way the hell off on color reproduction.
Show me a TV that can light up my room like an open window, while retaining near-zero blacks, and I'll start to get excited.
Re: Time would be better spent improving gamma and dynamic range.
> At least 13 or 14 stops dynamic range should be the short-term target for digital image sensors so we can enter the High Dynamic Range Imaging (HDRI) era.
Easily done... just use two cameras and a half-silvered mirror at 45°. You can get cameras such as Canon's C300 that can capture video in situations we can barely see with our own eyes. The correct balance can then be worked out in post-production. There was a good video demonstration of this technique featuring a welding torch, HDR'd to the max.
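A crude sketch of what that beam-splitter rig's post-production merge amounts to: two aligned exposures of the same scene, the longer one several stops brighter, combined into one estimate of scene radiance. The pixel values and the three-stop offset below are hypothetical, and real HDR merging uses smooth per-pixel weighting and radiometric calibration rather than a hard clip threshold.

```python
# Minimal two-exposure HDR merge. Pixels are linear-light values in
# [0, 1]; `long_exp` received `stops` stops more light than `short_exp`.

def merge_two_exposures(short_exp, long_exp, stops=3, clip=0.95):
    """Merge two aligned exposures into relative scene radiance."""
    gain = 2.0 ** stops          # how much more light the long exposure got
    merged = []
    for s, l in zip(short_exp, long_exp):
        if l >= clip:            # long exposure blown out: trust the short one
            merged.append(s)
        else:                    # average the two estimates at common exposure
            merged.append(0.5 * (s + l / gain))
    return merged

# A dim pixel plus a welding-torch highlight that clips the long exposure:
short_exp = [0.10, 0.90]
long_exp  = [0.80, 1.00]
print(merge_two_exposures(short_exp, long_exp, stops=3))  # [0.1, 0.9]
```

The point of the rig is exactly this: highlight detail survives in the short exposure while shadow detail survives in the long one, so the merge recovers both.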
More dynamic range would be good, but I don't think it is necessary to create an image that is indistinguishable from a window, at least for narrative storytelling. Time will tell. Let's see how Peter Jackson's 48fps goes down with film makers and audiences.
@David W. -- Re: Time would be better spent improving gamma and dynamic range.
David, well said.
Frankly, I'm often horrified by the lack of quality that I see in hi-res video--even in professional [film-replacement] video systems. Over-compression, compression artifacts, codec limitations, clipped highlights and so on often make images look unnatural and artificial--in fact, to me, they often look quite horrible.
In many cases, going from analog to digital seems to have been an excuse to bypass many of the norms and standards which make for high-quality images (and which the analog world took, and still takes, for granted). If one feeds, say, an optical--i.e. via the camera lens--pulse-and-bar* and grayscale signal through the complete digital camera/recording/monitor chain and just views it--leaving aside electronic measurement of the signal for the moment--the problems are glaringly obvious. One sees artifacts of all sorts--the classical (analog) overshoot, ringing etc., as well as digital noise to an extent that would have been unacceptable in professional analog systems. [Of course, I'm referring to a test signal (and distortion products) that's been appropriately scaled to the resolution and bandwidth of the specific system.]
"Show me a TV that can light up my room like an open window, while retaining near-zero blacks, and I'll start to get excited."
Right, despite the ooh-ah factor that HD video produces in many neophytes--and, for that matter, its many real benefits over older systems--digital imaging has a huge way to go before it represents a true analog of the image that it's endeavouring to reproduce.
* The Macdiarmid / BBC TV pulse-and-bar 'T' test signal goes back to the early 1950s. It's old, but its design is still relevant, as it is based on the actual optical distortion that's perceived by a viewer after an image goes through any video chain or process. As the test signal is an analog for an image, its perceived optical distortion is what matters; it's irrelevant whether the medium is digital video, analog TV, or even film for that matter.
@Dave 126 -- Re: Time would be better spent improving gamma and dynamic range.
Agreed. The issue, of course, is to get an agreed, widespread standard. And as we well know, this is no easy matter--just examine the history of the NTSC/PAL/SECAM wars of the mid-20th century.
My feeling is that we're going to have a lot of interim standards, and it'll be a considerable time before things stabilize to the extent of the NTSC/PAL/SECAM and/or 35/70mm film coexistence.
Regardless, some experienced producers of film movies who have now gone digital for convenience continually whinge and bitch about the lack of dynamic range, especially the white compression/clipping problem. In the old days of film, detail in the highlights could be extracted from the negative by the lab if the director wanted it; now it's clipped and thus does not exist (or is too compressed to use--causing banding etc.).
Anybody who thinks this will mean TV's the size of walls will do well to consider the predictions made in Fahrenheit 451...
People will blindly buy this.
Same as they blindly bought HD Ready.
Fact is, lots of people still have a standard (SCART) $ky box connected to their HD TV.
Let's face it, people buy laptops with 8GB RAM, a 750GB HDD and a Celeron CPU because of the large 8 and 750 and the small price.
You have a point, but most normal people's computer use is RAM- and storage-bound, not CPU-bound. Running a bunch of browser tabs doesn't use much CPU, but it chomps RAM like nobody's business; same with productivity stuff. Yeah, you'll be sitting waiting when you're doing Gaussian blurs in Photoshop, but even then, stuff like InDesign and Illustrator should be mostly fine.
For general non-gaming use, I'd take a relatively crap CPU, an SSD, and 8GB over an i7 with 4GB and a 5400rpm hard drive any day.
When B&O make one, I'm in...
I don't care about the content, I want the shiny shiny.
Sky have already priced it up
The "I'm on benefits" plan - standard def with XL adverts £20 pm
The "I'm on benefits and working on the side" HD plan with XL adverts £40 pm
The "I'm on benefits, working on the side with an undeclared partner" UltraHD plan with XL adverts £60 pm
The "I'm on benefits, working on the side with an undeclared partner and 2 kids" UltraHD plan + XL adverts + sports and movies £120 pm
It seems that as TVs get better, the content to watch on them gets worse.
I suspect that by the time these TVs are mainstream, I will have given up on TV altogether.
OK, if you live in the US, where the broadcasters have to bow to the commercial interests of sponsors and update their equipment to the latest standards, then yeah, it could happen.
If you live in Communist Europe (and Blighty, to a far lesser extent), where the broadcasters just don't give a toss 'cause they can rake their income off the public, then enjoy the two or three HD channels that you currently have, the one pay-TV 3D channel (e.g. Sky), and/or the few times the BBC decides to annex the BBC HD channel for 3D content.
And you'd really expect Ultra HD to be a boon here in Europe? LOL, no, not really.
Before we even begin to worry about such things, let us first be rid of all SD content.
This would at least help the credibility along a little.