What's the point?
Are they trying to top the 4K monitors that are barely available? How about making an affordable 4K monitor that integrated GPUs support, instead of a resolution only a handful of graphics cards can even drive?
When it comes to pixel size, monitor designers boast about who has the smallest, and the new Dell UltraSharp 27-inch display has the smallest of the lot. The Ultra-HD 5K monitor is the world's first screen with a 5120 x 2880 pixel resolution: that's 218 pixels per inch. That may sound a long way off the LG G3 phone, which offers 534 PPI …
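Both PPI figures quoted above follow directly from resolution and screen diagonal. A quick sketch (the G3's 5.5-inch diagonal is an assumption taken from its published spec, not from the article):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(5120, 2880, 27)))   # Dell UltraSharp 5K: ≈ 218
print(round(ppi(2560, 1440, 5.5)))  # LG G3 (assumed 5.5" QHD): ≈ 534
```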
Yes, it's headline-waving, phallus-stroking stuff, but it will sell and I'm happy to see it.
For most people there's virtually no point. I'll take a look at one, and so will many photographers and videographers. Some RED cameras shoot at a pseudo 6K/5K (there is a holy war in video/film over how 4K isn't 4K, because it's interpolated and subsampled from a Bayer array rather than being true 4K x RGB), and IIRC Sony has a camera that shoots 4K with a future option for 6.5K, and possibly an interpolated 8K because the pixels are slanted or some such voodoo.

So basically, if you're able to spend 50k+ on a camera (or rent them), then buy three of them, then buy a couple of sets of cine glass (reasonable cine glass tends to start around 5k a lens, way more for the better stuff), you aren't going to flinch at 2,500 for a monitor. While larger studios will probably stick to higher-end, lower-res, better-calibrated/wider-gamut monitors for a while, I can see this being very popular with indie setups, especially given the comparatively low price.

Personally I would kill for a 40-50 inch 8K monitor; being able to view shots at almost 100% resolution on screen would be epic, but I respect that very few people would share my enthusiasm for it :)
This isn't really meant for gamers or for office work. It's not meant to sell in large numbers, but I bet it will, and it gives Dell some schlong-waving time and plenty of free press. My biggest shock is how cheap it is; I would have put it at over 10k.
And if you want really crazy: I think Sharp demoed a 13.3-inch 8K screen a while back. Crazy mofos! 8K is coming, probably within a decade. It's a fair call to debate whether it's needed, and there is a very decent argument for making frame-rate/bitrate/gamut and DR improvements before we mess with even higher resolutions, but it is coming, because the general public knows 8K must be better than 4K, but WTF is 16.5 stops of DR?
Cameras are pushing way ahead, mostly because when stuff was shot on Super 35 or similar you could simply revisit the masters with a higher-resolution scanner and release a 1080p or 4K version of the film. If you shoot digital, that's it: you can interpolate higher, but in reality there's no extra information, just interpolated detail. Shooting in 8K ASAP means films can be re-released at 8K later for extra money, even if 8K isn't a realistic proposition now.
Do you really think that Chromebook/low end laptops will move away from that god awful 1366x768 resolution anytime soon?
High-end laptops etc. will get better displays, but for the masses... their crap is built down to a price, i.e. as low as possible while still returning a small profit.
I for one look forward to high-res screens. Not for 4K or 8K video, but to see the huge images my new Hasselblad shoots in glorious detail at 100% size. Mind you, I expect it will be hard to ship those screens without at least one dead pixel.
Congrats for being the only person to come up with a good reason why a 5K monitor rather than 4K. Everyone else seemed to think I was saying "current monitors are just fine" which I obviously wasn't. I want 4K monitors to become affordable, but I can see why there would be a niche market for 5K monitors for 4K video editing.
Mind you, if I was doing that job, I'd probably want a monitor bigger than 27"...:)
Per elementary sampling theory, there's quite a lot of natural high-frequency signal, even in output like text, that cannot be displayed on ~110 DPI monitors. It's not just a question of whether you can see the pixels, whatever your instincts say; low-pass filtering can always make the pixels invisible if enough colours are available, simply by discarding large parts of the signal.
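The sampling-theory point can be illustrated numerically: detail above the Nyquist limit of a given sampling grid doesn't just blur, it becomes indistinguishable from something coarser. A toy 1-D sketch (a sine tone standing in for image detail, not a display simulation):

```python
import numpy as np

# A 9 Hz tone sampled at 10 Hz (Nyquist would require >18 Hz).
fs = 10
t = np.arange(0, 1, 1 / fs)
high = np.sin(2 * np.pi * 9 * t)    # fine detail above the Nyquist limit
alias = -np.sin(2 * np.pi * 1 * t)  # a coarse 1 Hz signal

# The sampled values are identical: the fine detail is simply gone.
print(np.allclose(high, alias))
```

The same logic applies to a pixel grid: features finer than the grid either alias into artefacts or must be filtered away before display.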
A monitor like this gets much closer to looking like print than one with a quarter the resolution, which makes for a much more comfortable reading experience, and that makes it better for pretty much every purpose.
Put a Chromebook Pixel next to a regular Chromebook or a Retina MacBook next to a regular MacBook. I guarantee you'll see a huge improvement.
Eyesight has its limits, and it can be expressed as a simple rule: 152 400 divided by the pixels per inch gives the distance in millimetres beyond which individual pixels are no longer discernible by young eyes (use 101 600 for old eyes).
So anyone sitting more than 700mm from this screen will have a "retinal" display, with most of their field of view taken up by it.
This is an oft-quoted statistic, and it is essentially true that your eye cannot resolve an individual pixel past a certain point. However, that is not the whole story. We frequently print pictures at resolutions that far exceed our ability to resolve individual dots, yet we can still tell the difference between a picture at the limit and one beyond it. It's hard to explain on a forum with text (and my half-assed grasp of physics and maths), but it comes down to better gradation, more hues and microcontrast.

Take a line of black pixels on a red screen and add a white pixel in the middle of the black line. As resolution increases you will stop being able to discern the white pixel, but the line will not remain black; there will be a slightly grey area. This has been done to death in photography many times over, and when you have a source image with enough detail you can tell in prints. Often, when people say they cannot see the difference in a print, it's because the detail simply isn't there to begin with, and/or it's been printed by someone with little experience of professional printing.
As for viewing distance, that seems about right for a monitor. No problems with it taking up my entire field of view, although I completely respect that's a niche use compared to, say, gaming or using MS Word :) It might not be relevant to many, but I'm happy we are finally getting displays like this at vaguely sane prices.
... the usually quoted PPI figures for human vision are wrong, being based on shaky assumptions. There are also situations where our eyes can resolve more detail on a VDU, such as a single-pixel-wide diagonal line, of the sort often encountered when working with CAD.
http://www.cultofmac.com/173702/why-retina-isnt-enough-feature/
Basically, Steve Jobs based his "Retina display" figures on the assumption that we can resolve 1 arcminute, whereas most estimates place the figure between 0.4 and 0.6 arcminute.
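The acuity assumption changes the answer a lot. For a given PPI, the distance at which a pixel shrinks below a visual angle θ is (25.4mm / PPI) / tan(θ). A sketch for the Dell's 218 PPI under the assumptions mentioned in this thread (as an aside, the 152 400/PPI constant quoted earlier works out to roughly 0.57 arcminute, inside the 0.4-0.6 range):

```python
import math

def threshold_mm(ppi, arcmin):
    """Distance (mm) beyond which one pixel subtends less than `arcmin`."""
    pitch_mm = 25.4 / ppi                      # centre-to-centre pixel pitch
    return pitch_mm / math.tan(math.radians(arcmin / 60))

for acuity in (1.0, 0.6, 0.4):                 # Jobs' figure vs stricter estimates
    print(f"{acuity} arcmin -> {threshold_mm(218, acuity):.0f} mm")
```

At 0.4 arcminute you'd need to sit a metre back before this panel is truly "retinal", versus about 400mm under the 1-arcminute assumption.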
A 27" 16:9 has roughly the same height as a 24" 16:10; if you think of it as a 24" 16:10 with a wider field of view, it's actually not too bad (unlike 24" 16:9, which I find forces me too close to the screen for comfortable desktop work, in turn making graphics look too pixelated at the ageing but still popular 1080p resolution).
The problem with 4K monitors right now is poor support for the displays, not to mention poor refresh quality and ghosting, so the bulk of them are useless for video. And if you're buying Dell's current 4K offerings, high end or low end, they all suffer from not functioning properly: they will often draw only half the display, because they are stitched together from two video inputs. The big problem, though, is that buggy DisplayPort firmware is usually the crippling factor for these displays.

Not to mention, who is going to spend $2,000 on a display that demands high quality, only to have to send it back immediately because it can't be properly calibrated, or suffers an out-of-the-box defect in professional image quality? Let alone send it back for the firmware problems mentioned above, fixes Dell doesn't make available and instead requires you to return your monitor for, replacing it with a random refurbished unit that might once again suffer from image-quality and calibration issues.

So the only people really enjoying these monitors are people who want smoother fonts and are willing to put up with a great deal of messing around on a consistent basis.
Lots of (entirely IMHO) missing-the-point comments about 4K.
4K is basically rubbish on large screens. It's just a 1920x1080 *equivalent area* display in most cases, albeit with the ability to scale outside the natural quad density arrangement (or in Windows' case, a general inability to scale consistently across applications with Hilarious Consequences). Most use a 4K monitor as something with 4 times the detail of the equivalent "low DPI" display, for a 1920x1080 equivalent area. At 27" and using typical laptop display area / on-screen element size as a reference, that's comically bad; UI elements are very large with a feeling of much wasted space.
Worse, 4K made it look like the industry would just settle on mass producing cheapo 4K panels and we'd get stuck in a prettified version of the 1080p rut we've endured for several years already.
Fortunately - albeit in largely niche products with a price to match - manufacturers have been making 27"-30" monitors with a 2560x1600 or 2560x1440 resolution for a few years now (the former 16:10 preferred, for more area/height - e.g. widescreen video editing *plus toolbars* top/bottom). Dell's new so-called 5K display isn't a numerical contest to make 4K look outdated; it's merely the natural quad-density evolution of those predecessors. You end up with a 27-30" monitor that has a vaguely sane desktop "area" (think of it as 1440p), rather than something that's really just a crummy 1080p panel in disguise.
As someone who was using 1600x1200 CRTs in 1994, today's "FULL HD!!1!!11" seem rather pathetic and 4K overrated and overdue. The Dell announcement is great news, though I'd be even happier with a 30" display at 5120x3200 :-P
^^ This.
I had a 17-inch 1600 x 1200 of some description at uni in the mid-90s, and a wicked DIGITAL-brand laptop with the same resolution around the turn of the century. Since then, things went from bad to worse, with 1600 x 900 or even the nasty 1366 x 768 or some such bullshit widescreen lappies, and the aforementioned "FULL HDzorz" which seemed to last an eternity on the desktop.
In 2004 IBM released the world's first IPS 3840x2400 16:10 WQUXGA production display, and it remained unrivalled for nearly a decade... which might as well be a century by computing standards. After their reign and subsequent removal from the marketplace, I picked up three and use them to this day, triple-head with Mavericks. They run like a dream on my old cheesegrater.
I will never go back to anything less than 2,000 pixels on the vertical. I have already tried the Dell, the Philips and the Asus from the 28" second wave of budget UHD panels, and decided to stick with the Asus, with its rock-solid image and amazingly less-than-20ms lag in "4K". Sure, I won't be using it for photo editing, but it's dandy for CAD, and these (once $100,000) IBM beasts are only just now starting to show their age next to my 2014-model Asus. And so they should.
TL;DR
Finally, finally, the evolution switch has been thrown, and manufacturers are currently scrambling to bring a range of monitors to market. If a company like Sharp brings an 8K screen to market, then GOOD ON 'EM is what I say.