I love the 13" form factor, but dislike the terrible resolution.
Fancy a 3840 x 2160 display in your next 13in laptop? Form an orderly queue outside Sharp's offices then, and loudly demand it turns its latest prototype panel into shipping product. The 13.5in screen contains just under 8.3 million white OLED pixels filtered for RGB colour. Its dimensions yield a pixel density of 326 pixels …
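The quoted density checks out; a quick back-of-the-envelope sketch (Python, using only the figures from the article):

```python
import math

# Panel specs quoted in the article
w_px, h_px, diag_in = 3840, 2160, 13.5

# Pixels along the diagonal divided by the diagonal in inches
ppi = math.hypot(w_px, h_px) / diag_in
print(round(ppi))  # 326, matching Sharp's figure
```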
Sorry Tony, but you and I both know that even if they make it and sell it everywhere, the reztards will still complain on every laptop review.
actually it'll be the ratiotards that'll continue complaining. 16:9 widescreen is a silly aspect ratio for producing A4 portrait documents.
Even Sharp's doc is vague on whether that's actually 8.3 million white pixels filtered through a lower-resolution RGB filter of some sort (which would make it, at best, 1280x2160 in reality with an RGB stripe, or 1920x1080-ish with PenTile RGBG), or something different.
AFAIK, OLED displays don't have a front filter - that's the whole idea. 1 pixel = 1 LED.
It's nice to see someone pushing laptop screens in the right direction.
'Retina' is just a bullshit buzzword that some coke-addled Apple marketing twunt slapped on a >300 DPI display.
I believe that's the same Apple twunt's next voice-to-text app!
If it uses TFT technology, since when is it OLED?
The "TFT display" that we're all accustomed to is a liquid crystal display that's using thin film transistors.
This OLED presumably won't use liquid crystal, but I would imagine a display composed of organic LEDs could also have a use for thin-film transistors.
How will 96dpi web images look on such displays?
Will web images need to have the resolution upped to look good?
If so, that extra bandwidth is gonna cost someone.
Anyone out there shed some light on this?
The web has always been designed to scale from tiny resolutions to large ones. That's the whole point of HTML; otherwise you could have gone for a simpler standard.
So web designers will have to learn not to be idiots and specify sizes in something other than pixels, and images will simply be scaled. Once your output medium has a good resolution you won't care. (Just look at printers: you hardly ever print a picture at the printer's native resolution.)
The end result will be a more usable web, one that can be used on both your mobile phone and your retina display wall-sized desktop :)
OK, it would be nice to have a screen that could actually display a 1:1 rendition of the multi-megapixel snaps that our hyper-giga-sooper-megabyte cameras and phones (complete with their mass-produced, fixed-focus little plastic lenses) can take. But that's about it. All that will happen then is people will begin to see the Emperor's New Clothes of a 14Mpix camera that is bugger all use if the shot isn't perfectly focused, and taken with a decent lens (read: costs more than the camera) with a noise-free image sensor, and no camera-shake.
So far as looking at internet porn pictures goes, unless they get re-scaled to a suitable DPI, which obviates these extreme resolutions, they'll be about the size of a postage stamp. Text, likewise.
As for movies, even 1920x1080 formats will need to lose the benefits of all those millions of pixels just to fit properly on the screen - unless you're planning on watching 4 movies simultaneously.
Finally, who actually has eyes that could discern such high resolution? Sure, if you have eyes like an eagle and are viewing in a well-lit (but reflection-free) environment then you might possibly get some benefit from a screen like this on a 13-inch display, but for ordinary people, with or without fully corrected vision, viewing from a sensible distance, this seems like a "we'll do it because we can" strategy, just like the megapixel marketing campaigns with digital cameras.
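For what it's worth, the usual acuity figure (roughly one arcminute of visual angle for normal 20/20 vision) lets you put a number on "sensible distance"; a rough sketch, assuming that one-arcminute limit:

```python
import math

ppi = math.hypot(3840, 2160) / 13.5   # ~326 px/inch from the panel specs
pitch_mm = 25.4 / ppi                  # pixel spacing, ~0.078 mm
one_arcmin = math.radians(1 / 60)      # nominal 20/20 acuity limit, in radians
d_mm = pitch_mm / one_arcmin           # distance at which one pixel subtends 1'
print(round(d_mm))                     # ~268 mm, i.e. about 27 cm
```

So beyond roughly 27 cm a viewer with normal vision shouldn't be able to resolve individual pixels, which is exactly the argument behind the "Retina" label.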
Dunno, I ordered the higher res screen on my laptop and it all works well (1680 x 1050 on a 15" panel). The only annoying thing is that it's glossy, but it would be pointless to have hi-res with a matt front..
1920x1080 is quite simply too small.
My previous laptop had 1200 vertical pixels, and I really notice the missing screen space in many applications.
Higher screen resolution also lets you make writing easier to read - compare this text drawn with 5x7 pixels to the text on your current monitor.
Now make the dots smaller while keeping the text the same physical size (using more dots) - again, it becomes nicer to read.
When we look at a 1080p movie, the upscaling of movie pixel to screen pixel again affords the possibility of nicer pixels - when in motion you can estimate what the 'missing' pixels would have been, increasing the effective resolution and making for a nicer movie experience.
Secondly, if the film is instead digitised at a higher resolution we'll get the real detail in the film - maybe even as high as the digital projectors used in the cinema.
Movie houses aren't going to sell those discs/licences until enough people have these 'above-HD' resolution screens for it to be worthwhile.
I don't understand why. You can get >600 dpi on paper, but it doesn't look like a mirror.
I don't really see the point of any resolution screen that's glossy
I use 1680 x 1050 @ 15" on a matt screen. I loathe glossy screens with a passion. The slightest light from behind you and portions of the screen become unusable. My matt screen? Like a Duracell bunny..
(Ultrabook res? No thanks..)
It's about time OLEDs hit mainstream (on big screens that is). OK, so OLEDs have a shorter life but the dazzling colour and dynamic range make up for it.
13" laptops are a good start but I'd love to be replacing my Dell 27" IPS with an OLED one.
I'm sure there was a study done by IBM years ago that showed that most people need a resolution of about 300dpi to be able to comfortably read text for an extended period. Most printers, even silly cheap ones, do more than that, yet screens have been stuck at a miserable 96dpi forever.
While there will be some changes required, I, for one, welcome our high-resolution overlords.
"Retina" displays have 4-6x the pixels of a normal display, which is fine and dandy, but it means you need a GPU with 4-6x the horsepower and 4-6x the video RAM just to drive them at their native resolutions.
And for that you get a desktop with teeny tiny windows, because the dimensions are effectively halved in each direction. So you're forced to use bigger fonts to compensate, and then you get screwed-up layouts and other glitches. And forget running games at this resolution, since most games would chug at far lower resolutions than that.
So I'm not altogether sure what the point is unless Sharp intend to stick some kind of grill aperture over the top and use it to show 3D. I suppose the extra resolution would compensate for the need to send half to each eye.
Don't worry about the GPUs in desktop or mobile PCs; they can deal with high resolutions quite fine in 2D and have plenty of RAM. An entry-level integrated card with 64 MB of RAM can deal with 4000 x 4000 resolutions, even with triple buffering.
3D games will scale; that means using 4 screen pixels for one game pixel. Not a problem.
These high-res screens mean progress. You may not see the point, but I am literally seeing it on today's low-res screens.
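The VRAM side of this argument is easy to sanity-check. A rough sketch, assuming a plain 32-bit (4 bytes per pixel) framebuffer at the panel's native resolution:

```python
# Memory for one 32-bit framebuffer at 3840 x 2160
w, h, bytes_per_px = 3840, 2160, 4
one_buffer = w * h * bytes_per_px       # 33,177,600 bytes, ~31.6 MiB
triple = 3 * one_buffer                 # ~95 MiB with triple buffering
print(one_buffer, triple)
```

Framebuffers alone are a small slice of a modern card's memory; it's the 4x fill-rate and texture load in 3D that hurts, not 2D desktop work.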
Sounds like something for the number freaks when used as a general-purpose lappie display but for graphic designers, photo-editors, typographers and so forth, it should be wonderful for an even closer approximation of a properly set printed page.
Screen resolutions have been going backwards for years. If this can reverse the trend I am all for it!
I doubt anyone can see the difference between 300dpi and 1200dpi. I can't see a quality difference between 150dpi and 600dpi on my laser printer, however it does take twice as long to print. It's all marketing.
Maybe the problem is your printer? Maybe its physical resolution is 300dpi and it interpolates anything higher.
Or maybe the problem is your image's resolution. If you print small (size-wise) images they are going to look like cr*p even on the nicest dye-sub printer.
Er, the difference between 150dpi and 600dpi is really obvious. Even 300dpi looks jagged and crappy from a normal reading distance. 600dpi is basically the minimum; more is better if you're printing color, so you can get a good lpi. Sounds like you're in need of glasses.
...Just to clarify, I'm talking about laser printing, which has no anti-aliasing. 300dpi is fine for computer displays, which can do anti-aliasing.
Obviously these were intended for that 'cook an egg on it' new iPad which supposedly had a last minute switch.
Lucky Apple have the trademark so do they now say that anything over 263.999 DPI is now Retina?
Nice. Now, cut it down to 3840x1080, throw in an autostereoscopic filter in the front, and wham! Glassless HD 3D in a 13" space.