If you’re a keen Reg Hardware reader, there’s a pretty good chance that you’ve already got a high definition TV, and possibly a Blu-ray player too. Technology, of course, doesn’t stop there. While the switch over from analogue 405-lines to 625-line broadcasts - and TVs - took decades, changes occur more swiftly in the digital …
maybe now we'll get some high resolution laptop screens back rather than the low resolution "hi def" crap that's been foisted upon us since HD came out.
IMO they should have settled on 720p for all TV broadcasts and 1440p for TruHD movies etc. via BD etc.
Something to be said for a nice 50% downscale/100% upscale rather than trying to fudge between 720/1080.
1080p sounded vaguely wild and tricky 7 years ago but now 1080p is holding a lot of things back (trying to buy decent computer monitors with some depth other than 1080 or the still quite pathetic 1200 for example).
"1080p sounded vaguely wild and tricky 7 years ago but now 1080p is holding a lot of things back (trying to buy decent computer monitors with some depth other than 1080 or the still quite pathetic 1200 for example)."
There's a good reason why you want a monitor or TV with a 1920x1080 resolution - if you use a higher resolution display such as 2560x1440 then a movie in 1080p will look blurry due to the resolution of the movie not matching the native resolution of the monitor (1920x1200 is an exception to this because you just get a letterbox effect).
......for totally not getting my post.
I was complaining that settling on a resolution for HD material that isn't particularly deep, or that well suited to actual computer use, is holding us back.
The gist is that 1080p isn't enough if it's to be forced wholesale on PC users who have other things to do than watch movies all day.
Now a 1440p monitor might just have been more useful.
1080p as a standard just isn't good enough for the range of applications it's being pushed at. It's allowed lazy manufacturers to be... well... lazy.
Resolution is measured in line pairs per unit of length, so 1080p has 540 line pairs per picture height while 4K's 2160 lines give about 1,000.
It's nevertheless four times the number of pixels.
It's about bloody well time.
I've been using 1280x1024 displays since the mid 80s.
Sony kindly lent me a 1920x1200 display in the very early nineties when everyone was arguing about what resolution HD would be. Sure made a nice monitor on my Sony MIPS powered Unix workstation.
Well that was damn near twenty years ago. Isn't it time for some sort of progress?
Hell you were slagging a damn phone off the other day because it only had a lousy 8MP camera, so surely it's time for the marketing droids to convince us we need more than a 2MP HD TV - you'd laugh at a 2MP camera, so why not demand your screen keeps pace?
Do I need a bigger telly? No. But when Tesco is selling 8MP TVs for a couple of hundred notes I'll be able to buy an 8MP monitor, which I do need.
You might be missing a point there. A 2MP camera will produce really low quality prints. Printing needs a lot more resolution than displaying stuff (especially displaying motion footage). So I'm not sure the argument that cameras have reached a certain resolution, hence monitors/displays/TVs should reach the same resolution, actually stands.
No, a 2MP camera will not produce really low quality prints.
It will produce good quality prints unless you start to enlarge/zoom too much.
Obviously having the spare capacity to enlarge, crop and so on is good, but a lower resolution doesn't necessarily mean the prints are not good.
Most signs and billboards are printed at a much lower resolution than a magazine - since the viewer is going to be at a much larger distance.
Same with TVs.
Seriously - who prints photos these days? I'd bet that less than one in a thousand of the digital photos taken these days are printed (and an even lower proportion of the photos taken on phones).
Print labs seem to want about 250dpi
A 2MP camera will happily produce 6x4 prints.
4MP is all that is required for 9x6
5MP is the requirement for 10x8 prints.
How big do you normally print your shots?
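Those figures line up with simple arithmetic. Here's a quick Python sketch, assuming the 250dpi print-lab target quoted above (a rule of thumb, not a hard spec):

```python
# Quick check of the megapixel figures above, assuming the quoted 250dpi
# print-lab target (a rule of thumb, not a hard spec).
def megapixels_for_print(width_in, height_in, dpi=250):
    """Pixels needed for a width_in x height_in inch print at the given dpi."""
    return (width_in * dpi) * (height_in * dpi) / 1e6

for w, h in [(6, 4), (9, 6), (10, 8)]:
    print(f"{w}x{h}in at 250dpi: {megapixels_for_print(w, h):.1f} MP")
# 6x4 needs 1.5 MP, 9x6 needs 3.4 MP, 10x8 needs 5.0 MP
```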
I can think of loads of reasons why I need a higher-res monitor. My old laptop (2002/2003, 1920x1200) was about 150dpi, which seems reasonable for a display; I'd love a 30" monitor of that resolution, please.
Anyone with the right tools can make even a 40960 x 21600 film today. All it takes is feeding the picture from a DSLR's CCD to a hard drive cluster, and then feeding it to x264 with the appropriate resolution set.
The problem is getting the file size down to a tolerable number. For example, 1080p/1080i existed under the moniker of "HDTV" forever, but it only started to matter when MPEG-4 AVC appeared.
By the way, how much 4K footage can a Blu-ray Disc fit? An hour?
Some people say that a new disc called HVD will solve the problem, but discs are dead IMO. A tiny scratch on such a disc would obliterate minutes of content, no matter the error correction. The future is in solid state drives and hard disks, and they won't reach capacities capable of holding an acceptable number of 4K movies for quite a while. And don't get me started on IPTV/Video On Demand: ISPs will never be able to provide connections that handle the extra load.
And since the proposed H.265 won't achieve more than a 50% bitrate reduction, and given that most people can't tell the difference between 4K and 1080p, we can conclude that 4K is just a silly gimmick.
"Some people say that a new disc called HVD will solve the problem, but discs are dead IMO. A tiny scratch on such a disc would obliterate minutes of content, no matter the error correction. The future is in solid state drives and hard disks, and they won't reach capacities capable of holding an acceptable number of 4K movies for quite a while."
I can see this happening where movies and other content are distributed on cheap SSD chips that you stick into a USB3 port.
You'd go to Blockbuster and come out with something a lot like the old PS2 memory cards.
Another benefit is easy writing (multiple times).
At 0.1 bits per pixel you'd get 5 hours of 4K video on a 50 GB dual layer disk. x264 will produce excellent quality at 0.1 bpp if only given enough time. The most extreme encoding I have ever seen was at 0.035 bpp, I could spot a few small artefacts here and there, but mostly it was sharp HD video.
The point being: given proper encoding, Blu-rays are way oversized for storing a single two-hour 1080p movie.
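For what it's worth, the 0.1 bits-per-pixel figure can be sanity-checked with a few lines of Python, assuming 3840x2160 at 24fps and a decimal 50GB dual-layer disc:

```python
# Sanity check of the 0.1 bits-per-pixel claim, assuming 3840x2160 at
# 24fps and a 50 GB (decimal) dual-layer disc.
def hours_on_disc(width, height, fps, bpp, disc_gb):
    """Playtime in hours for video at the given bits-per-pixel rate."""
    bitrate = width * height * fps * bpp     # average bits per second
    disc_bits = disc_gb * 1e9 * 8            # disc capacity in bits
    return disc_bits / bitrate / 3600

print(f"{hours_on_disc(3840, 2160, 24, 0.1, 50):.1f} hours")  # 5.6
```

Which comes out a little over the "5 hours" quoted above.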
I recently "downgraded" my 36" 1080p screen to a 42" 768p one, and the image on the new one looks far sharper (after turning down the insane oversharpening, obviously) and nicer than the older screen, even with 1080p sources. My living room isn't big enough - according to the chart above - for it to make any difference, and my experience certainly bears that out.
The problem is that most people don't know how to set up their TVs: sharpening on max, contrast on max and so on. A properly set up TV (a good one) can produce quite surprising results even on SD.
When it comes to 1080p vs 720p vs SD (576p): I can see the difference between 1080p and 720p at a distance of 1.5m, but beyond that it's negligible. Between 720p and SD I need to move a good 3-4m away from the screen. My screen is a 40-incher.
4K is really good, but to fully enjoy it you need a bigger screen. There's no point putting 4K on 40-90 inchers; you wouldn't see any difference.
What's really bad is that not many recent movies are worth watching, and those are the ones being shot in 4K, so the problem will be with the source material.
Even 1080p versions of old films (remastered ones) don't look as good as films shot natively in 1080p.
You can use retina calculations to work out how far from your screen you need to sit before you stop resolving individual pixels.
If most of my video material were 1080p I'd go for a 50in TV, but it's mostly SD, and that's why I've got a 40-incher and sit almost 4m away from it.
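For anyone who wants to try the retina calculation mentioned above, here's a Python sketch. It assumes a 16:9 panel and the usual figure of roughly one arcminute of resolving power for 20/20 vision:

```python
# The "retina" calculation: the distance beyond which an eye with roughly
# 1 arcminute of resolving power (about 20/20 vision) can no longer
# separate adjacent pixels. Assumes a 16:9 panel.
import math

def retina_distance_m(diagonal_in, horizontal_px):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # horizontal screen size
    pitch_in = width_in / horizontal_px              # width of one pixel
    # distance at which one pixel subtends one arcminute
    distance_in = pitch_in / math.tan(math.radians(1 / 60))
    return distance_in * 0.0254

print(f"{retina_distance_m(40, 1920):.2f} m")  # a 40in 1080p set: ~1.6m
```

Which agrees pretty well with the 1.5m figure quoted for a 40-incher above.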
Those who want tips on setting up their TV correctly can of course refer to the handy guide we published here: http://www.reghardware.com/2009/08/19/hdtv_setup_guide/
Have to agree with that. It seems to me that everywhere I go (homes, pubs, airports, medical practices etc) the TVs are always "configured" in the following manner:
Aspect Ratio: Pan and Scan (sometimes with added letterboxing to create a super stretchy picture)
When I inquire of the owner/operator whether they find that the "stretched and squashed" picture is a bit annoying and offer to fix it for them they look at me funny and tell me that it looks OK to them.
Nowadays I just grit my teeth and try to ignore it.
Surely that's the point of wide screen TV?
This is just NOT good enough, how long do I have to wait for 12k movies to watch on my 40" Retina Display monitor?
...but they're installing Sony's 4K D-Cinema projectors. The ones that, unlike the rest of the Digital Cinema world who use DLP (which works and has a nice contrast ratio and decent colour reproduction), are using LCOS - effectively a giant LCD projector.
There have been many negative reviews of Sony's system, not least stemming from the imaging panels failing very quickly in day-to-day use. And if you want to show a 3D film on a Sony projector you can't show it in 4K... you have to run it with a weird double lens that produces two less-than-2K images stacked on top of each other, because it can't switch quickly enough between left- and right-eye images for the conventional polarisation or shuttering processes to work.
Until Sony decided the world needed 4K, hardly anyone knew they "needed" it!
But of course there isn't a digital cinema standard for 4K 3D. All 3D titles are 2K.
For the Summer lovin' caption.
Oh yes, the "TV standard" that turns up periodically and has nearly as many pixels as the T221 I've been using since 2004 (and came out in 2001). CMO made a 56" 1080p panel several years back, and it's odd how people keep demoing 56" screens with this resolution...
More pixels are lovely on a monitor. For the TV, one of the biggest benefits is that 720p and 1080p both fit into it properly (by tripling and doubling pixels, respectively) rather than the current situation of 1080 sets mangling 720 content (although not as much as 1366x768 sets do). But we wouldn't have been in that mess in the first place if people had been able to agree on standards properly. Er, insert 16:9-is-a-pain rant here.
If people want to upgrade TV, they'd be much better wasting bandwidth on broadcasting 1080p at 50/60Hz (or 1080i at 100/120Hz) rather than trying to reconstruct bonus frames in the television. I love more resolution, but my TV's big enough, thanks.
Mind you, the Sony projectors in LA (that I saw in 2008) were very pretty, when sitting right next to the screen - but it was a damned big screen. And boy, were the pixels big when they showed Terminator 2...
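The tidy-scaling point above (720p tripling and 1080p doubling into a 2160-line panel) can be checked with a few lines of Python; the source and panel resolutions are the commonly quoted ones:

```python
# A sketch of the tidy-scaling point: 720p triples and 1080p doubles
# exactly into a 3840x2160 panel, while neither maps cleanly onto
# today's 1920x1080 or 1366x768 sets.
def scale_factor(src, dst):
    """Integer scale factor from src to dst resolution, or None if inexact."""
    factor = dst[0] / src[0]
    if factor == dst[1] / src[1] and factor.is_integer():
        return int(factor)
    return None

uhd = (3840, 2160)
print(scale_factor((1280, 720), uhd))           # 3: each pixel becomes a 3x3 block
print(scale_factor((1920, 1080), uhd))          # 2
print(scale_factor((1280, 720), (1920, 1080)))  # None: 1.5x, so interpolation
```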
This is technology for technology's sake. I can see it might be useful for specialist applications, where the higher resolution can be used to pick out details by zooming in, but for general entertainment "4K" is overshooting the market by far.
But a few of these stacked around me would make a VERY nice CAVE VR environment!
Well, it is for me. I have to wear glasses to look at a 20in monitor less than a yard away. The resolution is 1400x1050 and even then I can only just make out the fact that there are dots. Standard-res TV at normal distances: no dots!
I won't buy a new telly every 18 months when most channels are still showing shows made when 625 was the only choice.
Stop pissing about with the screens, it's the dodgy content that puts me off, not the picture.
Clearly 4K was made so that someone can put it on a 24" LCD (IPS please) screen for my desk, so I can actually get some work done. Just think of how many absurdly small-texted terminals I could have open without overlapping!
Who am I kidding, I still wouldn't get any work done, I'd just read El Reg - and I'd have to use my browser to scale it up so I could read it while leaning waaaaay back in my chair.
We don't even have true HD right now. Every provider compresses the shit out of it, even Blu-ray. For home use, we need less compression, not more pixels.
Where are they?
I want one to show a home video of the flying car.
With the latest kit, games are easily playable at 2560x1600; anything lower and that hardware is wasted. So I'd be interested in a 2K/4K projector :) Dunno what the lag/latency is like though, that could put a downer on things.
It's pretty annoying that monitors haven't progressed; we're stuck with low-PPI 1080 displays...
It's daft that my cheapo £100 Android phone at 800x480 has a higher-density screen than any 1080 TV out there...
HDMI 1.4a supports 4Kx2K, but not at 60Hz, not even at 30: only at 24Hz with 36bpp colour. 3D support is limited to 1080p/24 as well (full-resolution double frames) or 1080i/60.
In contrast, DisplayPort supports higher resolutions (limited only by bandwidth: 17Gbps vs 1.4a's ~10Gbps), more colour spaces, and a faster Ethernet channel (and/or a USB passthrough). It also daisy-chains up to four displays (at lower resolutions). It's also royalty-free.
HDMI 1.4 added a data channel, passthrough audio and some additional format support, but did not increase the link bandwidth. We need an HDMI 1.5/2.0 to do that, to support not just 4K but 60fps in full colour, let alone 3D.
DP is the better system, especially moving into these higher resolutions. This is confirmed by the inclusion of DP connectors on the 4K TVs shipping today. We either need a new HDMI (likely with yet ANOTHER damned cable end), or to just move to the royalty-free standard and be done with it.
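A rough sketch of the bandwidth arithmetic behind this, in Python. The 10.2 and 21.6 Gbps raw link rates (HDMI 1.4's TMDS and DisplayPort 1.2's HBR2) and the 20% 8b/10b-style coding overhead are my assumed figures, and blanking intervals are ignored for simplicity:

```python
# Back-of-envelope bandwidth figures behind the HDMI-vs-DisplayPort point.
# Assumed: 24bpp colour, HDMI 1.4's ~10.2 Gbps and DP 1.2's 21.6 Gbps raw
# link rates, a 20% 8b/10b-style coding overhead on both, and no blanking.
def payload_gbps(width, height, fps, bpp=24):
    """Raw pixel data rate in Gbps, ignoring blanking intervals."""
    return width * height * fps * bpp / 1e9

need = payload_gbps(3840, 2160, 60)  # ~11.9 Gbps of pixel data for 4K60
hdmi_14 = 10.2 * 8 / 10              # ~8.2 Gbps effective after coding
dp_12 = 21.6 * 8 / 10                # 17.28 Gbps effective

print(f"4K60 needs {need:.1f} Gbps")
print("fits HDMI 1.4?", need < hdmi_14)  # False
print("fits DP 1.2?", need < dp_12)      # True
```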
.....10 bit colour.
I like the thought of gaming at 3840 x 2160, but I'm not sure my house could take the graphics card behaving like a typhoon on full afterburner!
HDMI 1.4 only supports 4K at 24p, so movies only.
And given that more and more movies are being shot in 3D, there's no side-by-side 3D 4K support in HDMI 1.4 either.
I used to have one of these on my desk (3840x2160 56"):
Well, I say on my desk; you had to sit about 3m away from it. These displays really work well when you have a large amount of data (numbers, radar tracks etc) that you need to show on one screen. The 56" I had was used to show a coordinated air traffic picture of Europe, so I could see every single plane from southern England, all the way down to the top of Africa and all the way over to the start of Russia. And when I say I could see them, I mean that I could read the altitude data, call sign etc of over 200 planes on one screen without having to touch a mouse.
And Google maps looked sh*t hot on it also, the resolution was so high it was literally like having a proper printed map on the wall.
We have numerous 4k2k screens here in the office and I have to say that for clarity of image there's nothing like it.
I don't think there's likely to be any market for them for a good few years but still the picture quality is excellent and much better than the 1080p screens we have.
To digitise a 35mm film so accurately that it's regarded as indistinguishable from the original, you apparently need to scan it at 4000 lines. So as soon as we've all upgraded again to the fancy new 2000-line screens, bought the fancy new megaray-disc players and the super-HD remastered Star Wars & Aliens box sets, it'll be time to do it all again when the 4000-line screens are available.
"To accurately digitise a 35mm film so it's regarded as being indistinguishable from the original you apparently need to scan it at 4000 lines"
True. But... if you only need to show that photo on a PC monitor, a 1600-line scan would probably suffice nicely. OTOH, if you ever plan to take your fantastic photo and produce a 90x60cm (or larger...) print for up-close viewing, then you will probably want as much resolution as you can get your hands on. (Or, even better: shoot medium format film in the first place, or just go digital from the beginning.)
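As a rough illustration of how quickly big prints eat pixels, here's the arithmetic for that 90x60cm enlargement, assuming a 300dpi target for up-close viewing (my figure, not one from the post above):

```python
# How quickly big prints eat pixels: a 90x60cm print, assuming a 300dpi
# target for up-close viewing (an assumed figure, not one from the post).
def scan_megapixels(width_cm, height_cm, dpi=300):
    """Megapixels needed to print width_cm x height_cm at the given dpi."""
    return (width_cm / 2.54 * dpi) * (height_cm / 2.54 * dpi) / 1e6

print(f"{scan_megapixels(90, 60):.0f} MP")  # roughly 75 MP
```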
My old 30" monitor sports 2560x1600. When I need to replace it, I would prefer a good selection of 40" monitors to choose from, and then 1920x1080 is just not going to cut it! Alternatively I guess I could get two of the 30" buggers, but that's not a very elegant solution IMO.
Few things are as delicious as an NHK transport stream
While it's kind of cool to watch something like Mad Men on Blu Ray and see the grain from the film it was shot on if you stand close to the TV, if you watch this on broadcast TV all you see is compression artefacts. Try watching something fast-action like football or a pop concert with glitter falling everywhere on SD Freeview on a big telly - it looks like photos from a mid-90s digital camera (the ones that stored 20 pictures on a floppy disk). The number of pixels is irrelevant if it's sent to you at a limited bitrate. HD Freeview is better but still not exactly great.
And that's just image quality. The content's often not up to much either...
While I have my doubts as to it serving any purpose for watching TV and movies in one's living room, I think this technology is urgently needed for archiving cinema, and for cinema production. (Although at some point the up-rezzing technology used to provide "fake" HD from DVDs by interpolation might be applied to a 2x display and a 1080p source in the living room, as the ultimate deluxe home cinema experience anyone in their right mind will ever want.)
And it should also end up being adopted in theatres when practical. If people spend the money to see a movie in a theatre, they should get film quality, not slightly improved TV quality which they already have at home.
So I see it as a valuable technology, but I also agree that we'll never need it in our living rooms. Our eyes are only so sharp.
"In the US, TV maker Westinghouse produced a set with "quad HD" resolution as long ago as 2007 - for around $50,000."
How much for a version without "capacitor disease"?
I always "thought" that 2k and 4k equipment was only meant for commercial use. Lots of commercial media is shipped to digital cinemas every week (where Cinemas aren't doing digital downloads). Hollywood doesn't need to worry about someone ripping-off the next release of Star Wars because the commercial product won't play on anyone's TV or computer. At least not yet :-)
Mostly, TVs have been getting thinner. In fact, it is even a marketing point.
The TV illustrated in the first pic is big enough to have a door!
Then of course there are the IBM T220 and T221 monitors, which unfortunately are long off the market; I've been unable to find even a used example for sale. This bit of madness was a 22.2in LCD that ran at 3840x2400! Yep, you could get (almost) this full 4Kx2K resolution, with some room underneath for menu controls or something. That comes out to 206 dots per inch, and these apparently looked ASTOUNDING.
The T220 came with a Matrox G200 MMS card, since it required *4* DVI links to run at full resolution. The T221 had several variants, with various solutions allowing fewer DVI links to be hooked up to drive it.