Nah
Blu-ray hasn't succeeded in replacing DVD by any means, and if video streaming services like Netflix and Lovefilm continue to grow, why would people want to shell out a fortune on an expensive TV when a computer is capable of doing the job?
If the world’s television makers are eager enough to try to convince World+Dog to buy a 4K x 2K TV, the world’s market watchers are no less keen to suggest the proponents of Ultra HD will be successful in the near future, with sales rocketing five years from now. This week’s Consumer Electronics Show (CES) in Las Vegas will …
I'd imagine the next gen of 4k screens will come with a decent computer on the back to allow them to be 'connected'.
I've always thought Sony have been missing a trick by not building a PS3 into their high-end TVs. Not only would it drive Blu-ray or game sales, it would also bring people to their online shop, and as an added bonus would make all the on-screen graphics damn pretty. (Instead of all the graphics that appear to be from the 16-bit era, like all TV makers seem to have.) Also, there must be masses of duplicated hardware between a PS3 and a TV, so the savings on a combined package would be pretty good too.
What jdx said! Our bluray player has netflix, hulu et al built in.
4K is coming: they need to sell sets, and 3D was a relative flop. They can't sell that many huge 100+ inch sets, so the next thing to push was higher resolution.
Initially 4K is only really going to be for early adopters and those with a real need (photographers / videographers). They won't be dissuaded by a lack of diverse content or by compromised delivery methods (upscaled 1080p). In the short term I would expect Ultra HD 'players' to use modified Blu-ray standards: higher compression, dual-sided or dual discs (maybe two drives to allow for seamless switching mid-movie?), and/or downloadable content, cached if it cannot be streamed fast enough.
A lot was invested in Blu-ray and it still hasn't completely 'won' against DVD. I can't see it being usurped by another physical format, but streaming could do it if codecs and bandwidth (and caps) improve to the point where 4K can be streamed to the majority of people who want it. This is one area where cable and satellite companies could win: if they really wanted to, they could offer 4K VOD, but given their half-arsed 720p/1080i-with-shitty-compression approach to HD, I'm not holding my breath.
Well, if you do the math, think about the pixel density required for a 42-inch screen at 4K resolution.
I would expect that the yield would be considerably lower on the higher density screens.
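The back-of-the-envelope maths here is straightforward (assuming a 3840 x 2160 Ultra HD panel, which is what the TV makers mean by "4K"):

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# Assumed 3840x2160 Ultra HD panels at a few plausible sizes
for size in (42, 55, 84):
    print(f'{size}" 4K panel: {pixel_density(3840, 2160, size):.0f} ppi')
```

At 42 inches that works out to roughly 105 ppi, so the density itself is no worse than a decent monitor; the yield question is really about defect rates across the larger panel area.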
I agree, and would love to see something less than 50 inches at 4K resolution. Yet you will have the same issues that plague us now, where broadband (cable) resolution is stunted by the compression algorithms used.
Higher dpi may mean more dead pixels, but if the screens themselves are smaller then that will mitigate that to some degree and if the pixels are smaller they will be less noticeable.
There's already a 10-inch 4K screen, and in a few years you can expect to see them in tablets. Mid-sized 4K screens will also come; you can already get 36.4-inch UHD screens from Eizo.
Having done all the work of putting the extra pixels for 1080p in 3D, changing the screens for 4k isn't too much of a challenge and Blu-Ray upscales reasonably well.
I don't see a new media format being used for this. It looks like an excellent opportunity for premium broadband for the usual suspects, cable operators and content providers (UltraViolet), as people get used to streaming as opposed to physical media.
I'd need to do some back-of-the-envelope calculations, but a lot of people probably already have sufficient bandwidth for receiving it, though you'd want a fairly generous buffer to be sure. Serving the stuff might be another matter, but content providers have a considerable interest in recovering business already lost to Netflix/Lovefilm etc.
What was that recent Reg article? Oh yeah: if you buy one of Sony's £16,000 TVs they will lend you an HDD-based media server with a few movies on it...
If you wanted to be 80s retro about it, just imagine having a shop in every town, from where you can pick up a couple of movies on HDDs (VHS size, conveniently) on a Friday night, and drop em back Sunday. Blockbuster could see their share price rise, until everybody gets fibre broadband.... Be kind, de frag.... [Meanwhile, back in reality]
BDXL should cope with 4K, especially if there's a revamped codec. I can see a situation where, in a few years, you can rent movies on USB 3.0 flash drives from kiosks (or have your own drive filled there and then).
As for bandwidth, 4k has 4x the pixels of 1080p, although this won't translate to 4x the bandwidth. I can see it needing between 25 and 40mbps for quality (i.e. movie not tv) streaming.
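That guess is easy to sanity-check. Naively, 4K is exactly 4x the pixels of 1080p; a next-generation codec (HEVC is the obvious candidate, often claimed to be roughly twice as efficient as H.264) would claw some of that back. Assuming a decent 1080p movie stream runs at about 8 Mbit/s:

```python
def est_4k_bitrate(bitrate_1080p_mbps, codec_gain=1.0):
    """Naive 4K bitrate: 4x the pixels, divided by any assumed codec efficiency gain."""
    pixel_ratio = (3840 * 2160) / (1920 * 1080)  # exactly 4.0
    return bitrate_1080p_mbps * pixel_ratio / codec_gain

print(est_4k_bitrate(8))                  # same codec as today: 32 Mbit/s
print(est_4k_bitrate(8, codec_gain=2.0))  # assumed 2x-efficient codec: 16 Mbit/s
```

Those two figures roughly bracket the real answer, which sits comfortably in the 25-40 Mbit/s range once you allow for the fact that bitrate doesn't scale linearly with pixel count.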
Where are people supposed to get the content from?
I've not heard any mention of a BR replacement, and I assume Sky/cable/Freeview/average broadband doesn't have the bandwidth required.
So what's the point? Or are we supposed to buy into the belief that upscaled content will look enough better to justify the TV?
Blu-ray already has a standardised format capable of 128GB, and there are other, much higher-capacity Blu-ray formats in the lab; the highest I've heard of was around 500GB, but the theoretical limit is supposedly somewhere around 5TB.
But with the new codec they plan to use for 4K, 128GB discs should be sufficient.
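That claim checks out: assuming a hypothetical ~30 Mbit/s 4K stream with a next-generation codec, a two-hour film fits on a 128GB disc several times over:

```python
def film_size_gb(bitrate_mbps, hours):
    """Size of a film in decimal gigabytes at a constant assumed bitrate."""
    seconds = hours * 3600
    megabits = bitrate_mbps * seconds
    return megabits / 8 / 1000  # Mbit -> MB -> GB (decimal, as disc makers count)

print(film_size_gb(30, 2))  # 2-hour film at 30 Mbit/s: 27 GB
```

Even a three-hour epic at 40 Mbit/s is only 54GB, which already fits on a dual-layer Blu-ray, let alone BDXL.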
Sky and cable could do 4K without too much chaos, although it would likely need a new codec and STB; Freeview/Freesat would probably need more reworking due to how they are structured and regulated. Content will come from a variety of sources: you mention cable, but Virgin offers broadband that could stream 4K, and BT is heading that way with its new network. Blu-ray as a physical medium can cope with 4K, although it will need new hardware and a tweak to the standards, if not a new codec, but it won't need to be replaced altogether.
It's called hanging a decent projector from your ceiling. There you are, job done. Personally I'd rather go with a projector than obliterate half a house wall with a black rectangle until we fire up the nuclear reactor to power it.
There are quite a few companies with 4K displays / projectors out there, some since 2007: Chinese, Korean and Japanese companies among others, and none of them Apple. However, I am sure when Apple does release a 4K set it will look awesome, their propaganda department will claim it's made with pixie jizz, history will be rewritten and they will be the first to have done it. Then the lawsuits will start.
I actually have the space for it.
But here's the rub.
1) 4K would be nice, but no content.
2) No content means it's not worth buying. And any content would be on multi-disc Blu-rays, right?
Cable bandwidth is nice, but then you would have to reduce the number of channels to get the bandwidth.
Meaning the same cable that carries 500 stations would only be able to carry 100 or so channels, right?
Cable stations don't care about programme quality as long as they can sell advertising, so more channels beats higher-definition channels.
Or you could go and launch your own set of satellites which can handle the higher quality images and create your own networks.
What compression is used on BR video... it's certainly possible to get a lot more video on a DVD than the DVD standard codec allows using modern codecs so perhaps the same is true here too? Or maybe BR is already using a near-optimal codec?
"don’t underestimate the ability of codec writers to devise even more efficient ways of squeezing much larger images through barely wider pipes"
If they can get a full signal through the "barely better than dial up" that BT calls broadband where I am without it stuttering, I'll be impressed.
I had a terrible response from 4OD at my parents' house over Christmas -- someone was clearly throttling it somewhere along the line, as multi-megabyte software installs downloaded pretty damn quickly, and 4OD was fine at "unsocial hours". And I used to work in an office that became distinctly unproductive after 3:30, when schoolkids got home and choked up the BT bandwidth that we needed to connect to our data centre hundreds of miles away.
There isn't quite enough capacity on the internet as it is, and squeezing extra pixels into the next series of Strictly isn't a good use of what there is.
And besides, while I'd never underestimate the ability of codec writers, I'd never overestimate the intelligence of codec implementers either. MP3 was long derided as a useless music format because (as I understand it) the most popular encoders in commercial use for over a decade were really really bad at encoding music without degrading the signal atrociously.
And consider the whole backward-compatibility problem -- DAB is rubbish and there's a better option (DAB+), but we can't use it because there are existing sets that don't support it. The only reason we've been able to move to Freeview+ is because of HD. Freeview+ on SD would give superior results, but it would render a lot of kit obsolete.
So if the codec writers produce that amazing new mathemagic just a little bit too late, we won't get to use it....
An ultra-high-def TV like that would be great for hooking up to a PC to do:
- photoshop work on high resolution DSLR images
- CAD
- anything else requiring such large screen real estate
And you can do it sitting a few feet/metres from the screen, which means that your eyes don't have to focus as close as with a traditional screen, which in turn will be better for our eyesight.
The ability for it to play upscaled HD content (which should look better than the same HD content on a 1080p screen) is just an added benefit!
My 2 pence
All 3D means is that it can go at 120Hz instead of 60 (or 100 / 50 depending on location) and many TVs did this even before the rise of 3D, plus a few pence spent on an IR device to sync the goggles.
If you don't want your new TV to be 3D enabled, just don't buy the glasses. Or poke an eye out, whatever suits you.
These guys are crazy.
There's no way people are going to jump all over a TV format only available to people with the space and money for a 60-100 inch TV when there's no content for it. All you'll be able to watch is upscaled HD (or, heaven help you, upscaled non-HD) for years. And they're gonna have to be considerably more expensive than current TVs if the manufacturers want to make money off them (which they seem to be having a hard time with at present). The only way anyone will buy 4K in the next few years is if they're deceived about what it's capable of (which we can count on marketing to work overtime on), or if they're doing something niche (like the reader proposing using it as a computer screen).
>These guys are crazy.
You are innumerate:
Capgemini, a financial consultancy, defines a millionaire as anyone with investable assets of $1 million or more – meaning that they actually have over a million dollars, as that doesn't include the home in which they live, for instance. By this measure there are about 10 million millionaires on the planet, according to Capgemini and Merrill Lynch.
So even if just 10% of millionaires bought one each, that figure would be about right. You say lack of content? That is so easy to fix, even with existing media: just use an HDD media server; it doesn't matter if individual Blu-rays have to be loaded onto it first (the butler can do it). Or, shocker, have a media server with 3 x Blu-ray ROM drives, cos at £40 each they will really break the millionaire's bank.
Extra resolution is a highly desirable goal but a better priority would be to spend time improving both the gamma and dynamic range of both image sensors and displays.
1. Image sensors need a much better dynamic range than the current practical limit of about 10 stops before white clipping occurs. This would allow the camera electronics to properly simulate the Hurter–Driffield slanted-S (log exposure) curve of film, thus allowing detail to be extracted from the 'toe' [low light] and the 'shoulder' [highlights] of the image -- still a major problem for electronic image sensors (television systems). At least 13 or 14 stops of dynamic range should be the short-term target for digital image sensors so we can enter the High Dynamic Range Imaging (HDRI) era.
2. Large hi-res extended-dynamic-range displays such as OLED etc. are urgently needed to display the extra dynamic range. Leaving geometry and resolution aside, the best CRTs still look better than LCD displays when it comes to dynamic range, and so does the best film projected with a black-body radiator (tungsten filament) light source.
3. The colour gamut also needs to be widened--look how pathetically limited the current sRGB triangle is on the CIE 1931 color space chromaticity diagram [see Wiki--color gamut]. (Perhaps we even need research into four-coordinate (2 greens) colour systems.)
Preoccupation with image resolution at the expense of gamut and dynamic range seems counterproductive and shortsighted. Moreover, in this digital age, we should not lose sight of how remarkably good a film negative can be when it comes to dynamic range--after all, it's had 150 years development (although the same cannot be said about film's limited colour gamut).
Remember your eye can accommodate (adjust to) a dynamic range of over 10^6 whilst the best LCDs barely make 10^3 (despite the advertising blurb)!
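The stops and contrast ratios in this post line up neatly, since each photographic stop is a doubling of light (a rough sketch; real sensor and display behaviour is messier than a pure power of two):

```python
def stops_to_contrast(stops):
    """Each stop doubles the light level, so N stops ~ a 2^N : 1 contrast ratio."""
    return 2 ** stops

print(stops_to_contrast(10))  # current ~10-stop sensor limit: 1024:1, about 10^3
print(stops_to_contrast(14))  # proposed 14-stop target: 16384:1
print(stops_to_contrast(20))  # the eye's ~10^6 accommodation range is ~20 stops
```

So the eye's 10^6 range is about 20 stops, twice what today's sensors manage and still well beyond the proposed 14-stop target.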
Yes, this.
Not only that, the focus (err) on resolution is further harmed by this:
"don’t underestimate the ability of codec writers to devise even more efficient ways of squeezing much larger images through barely wider pipes."
...unless they're going to improve lossless compression by 4x, this means that the uber-high resolution will max out at about the same level of high-frequency content as we have now.
Add that to the fact that broadcast HD is already crippled frequency-wise - I've seen plenty of 'HD' stations where macroblocking effectively cuts the spatial resolution to NTSC levels for anything but a static image - and to the continuing push for streaming content delivery, which is even *more* limited, and... yeah, aside from gaming and direct computer stuff, I don't see it working really well.
And with gaming itself crippled by horribly out-of-date consoles that are still struggling to deliver 1080p for some games, I don't see the next gen able to supply high quality graphics to 3892xWhatever; even the new monstrous PC cards have a tough time maxing things out at 3x1920x1200, and those cards have, I don't know, 8x, 16x, the raw pixel pushing power of current consoles?
You know what I want to see? *Actual* HDR. We've spent a ton of time trying to match the perceived spatial resolution of the real world, so, great - but as Graham points out, we're still way the hell off on color reproduction.
Show me a TV that can light up my room like an open window, while retaining near-zero blacks, and I'll start to get excited.
David, well said.
Frankly, I'm often horrified by the lack of quality that I see in hi-res video--even professional video [film-replacement] systems. Over-compression, compression artifacts, CODEC limitations, clipped highlights etc. etc. often make images look unnatural and artificial--in fact, to me, they often look quite horrible.
In many cases, going from analog to digital seems to have been an excuse to bypass many of the norms and standards which make for high quality images (and which the analog world took and still takes for granted). If one feeds say an optical--i.e. via camera lens--Pulse & Bar* & grayscale signal through the complete digital camera/recording/monitor chain and just views it--leave aside electronic measurement of the signal for the moment--the problems are glaringly obvious. One sees artifacts of all sorts--the classical (analog) overshoot, ringing etc. as well as digital noise to the extent that would have been unacceptable in professional analog systems. [Of course, I'm referring to a test signal (and distortion products) that's been appropriately scaled to the resolution and bandwidth of the specific system.]
"Show me a TV that can light up my room like an open window, while retaining near-zero blacks, and I'll start to get excited."
Right, despite the ooh-ah factor experienced by many video neophytes to HD video--and for that matter its many real benefits over older systems--digital imaging has a huge way to go before it represents a true analog of the image that its endeavouring to reproduce.
--
* The Macdiarmid / BBC TV Pulse & Bar 'T' test signal goes back to the early 1950s. It's old, but its design is still relevant, as it is based on the actual optical distortion that's perceived by a viewer after an image goes through any video chain/process. As the test signal is an analog for an image, its perceived optical distortion is what matters; it's irrelevant whether the medium is digital video, analog TV or even film for that matter.
>At least 13 or 14 stops dynamic range should be the short-term target for digital image sensors so we can enter the High Dynamic Range Imaging (HDRI) era.
Easily done... just use two cameras and a half-silvered mirror at 45°. You can get cameras such as Canon's C300 that can capture video in situations we can barely see with our own eyes. The correct balance can then be worked out in post-production. There was a good video demonstration of this technique featuring a welding torch, HDR'd to the max.
More dynamic range would be good, but I don't think it is necessary to create an image that is indistinguishable from a window, at least for narrative storytelling. Time will tell. Let's see how Peter Jackson's 48fps goes down with film makers and audiences.
Agreed. The issue, of course, is to get an agreed widespread standard. And as we well know, this is no easy matter. Just examine the history of the NTSC/PAL/SECAM wars of the mid 20th C. for that.
My feeling is that we're going to have a lot of interim standards and it'll be a considerable time before things stabilize to the extent of NTSC/PAL/SECAM and or 35/70mm film coexistence.
Irrespective, some experienced producers of film movies who have now gone to digital for convenience, continually whinge and bitch about the lack of dynamic range, especially the white compression/clipping problem. In the old days of film, details in the highlights could be extracted out of the negative by the lab if the director wanted them, now they're clipped and thus do not exist (or are too compressed to use--cause banding etc.).
You have a point, but most normal people's computer operations are RAM / storage-bound, not CPU-bound. Running a bunch of browsers doesn't use much CPU, but it chomps RAM like nobody's business; same with productivity stuff. Yeah, you'll be sitting waiting when you're doing Gaussian blurs in Photoshop, but even then, stuff like InDesign and Illustrator should be mostly fine.
For general non-gaming use, I'd take a relatively crap CPU, an SSD and 8GB over an i7 with 4GB and a 5400rpm hard drive any day.
The "im on benefits" plan - standard def with XL adverts £20 pm
The "im on benefits and working on the side" HD plan with XL adverts £40 pm
The "im on benefits, working on the side with an undeclared partner" UltraHD plan with XL adverts £60 pm
The "im on benefits, working on the side with an undeclared partner and 2 kids" UltraHD plan + XL adverts + sports and movies £120 pm
OK, if you live in the US, where the broadcasters have to bow to the commercial interests of sponsors and update their equipment to the latest standards, then yeah, it could happen.
If you live in Communist Europe (and Blighty, to a far lesser extent), where the broadcasters just don't give a toss 'cause they can just rake their income off the public, then enjoy the two or three HD channels that you currently have, the one pay-TV 3D channel (e.g. Sky), and the few times the BBC decides to annex the BBC HD channel for 3D content.
And you'd really expect Ultra HD to be a boon here in Europe? LOL, no, not really.
Before we even begin to worry about such things, let us first be rid of all SD content.
This would at least help the credibility along a little.