OS X Lion paves way for "Retina Display" monitors

Mac OS X Lion incorporates support for displays packing four times as many pixels as they do today. Right now, the so-called "HiDPI" mode remains inaccessible unless you've downloaded Apple's Xcode software development tool, which contains a graphics test application called Quartz Debug that enables HiDPI modes in Lion's …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Boffin

    Apple SDK should adopt BBC Basic...

    ... If memory serves me right, BBC Basic sported a virtual graphics area with coordinates ranging from -32768 to +32767.

    1. Anonymous Coward
      Anonymous Coward

      Density, not size

      As the article notes, you could use additional resolution to spread everything out further. The existing OS has never had a problem doing that.

      However, this change allows the additional resolution to be used to increase the pixel density of the display instead, so you have 4 pixels in the same area where there was 1. But it would be awful if all the UI elements suddenly shrank to a quarter of their normal size. This change allows all the graphics to be drawn at 4x the usual resolution, so the display is sharper (and a familiar size).
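
      A rough sketch of that mapping, purely illustrative (the type names and the fixed 2x factor are assumptions, not Apple's actual API): application code keeps working in the same logical coordinates, and only the translation to physical pixels changes underneath.

        // Illustrative only: logical points vs device pixels under a 2x HiDPI mode.
        struct LogicalPoint { var x: Double; var y: Double }
        struct DevicePixel  { var x: Int;    var y: Int }

        let backingScale = 2.0   // "4 pixels where there was 1" = 2x in each dimension

        func toDevice(_ p: LogicalPoint) -> DevicePixel {
            // The application keeps using the same point coordinates it always did;
            // only the mapping to physical pixels changes.
            DevicePixel(x: Int(p.x * backingScale), y: Int(p.y * backingScale))
        }

        let buttonOrigin = LogicalPoint(x: 100, y: 50)   // same on-screen position and size
        print(toDevice(buttonOrigin))                    // DevicePixel(x: 200, y: 100)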

    2. M Gale

      1280x1024

      IIRC you used the 1280x1024 grid to decide where to put stuff, and whatever graphics routines were in BBC Basic OS scaled that down to match your selected graphics mode.

      Anyway, wrt the article, bitmaps don't shrink with screen resolution unless you make the display of bitmaps dependent on screen resolution. If OS X has been done right, the icons will just stay the same resolution and size while everything else starts looking smoother.

    3. N13L5

      How is this going to work with Apple's anemic GPUs?

      The only reason I don't own a MacBook Pro to run Windows is that Apple, even worse than Sony, chooses out-of-date, low-end graphics cards for its whole range.

      I was just on Apple's site yesterday... for £1299, you get a retarded Intel HD 3000 (what you get in netbooks for £300)

      For £1549, you get a low-end AMD 6490M with all of 256MB of RAM... that's just sad...

      It should be a 6950M, or at least a 6850M or 6770M.

      And their website advertises how well 8-year-old WoW runs. :P

      People will have their desktop turn into a slideshow if they try to go retina, even without much in the way of 3D effects...

  2. Sir Runcible Spoon

    Sir

    Colour me purple, but there are games out there that already run at massive resolutions to support the triple-head range of products, n'est pa?

    1. Tim Parker

      @Sir Runcible Spoon

      If I may, my liege, I think what the article is referring to (arguably somewhat poorly) is the addition of higher-resolution GUI elements in Lion. These can be swapped in on higher-resolution screens in place of their existing, lower-resolution brethren. The effect is to have them scale - very, very simply - without looking like garbage... or having to take on a rather small on-screen size.

      It seems to just be a quick and dirty hack to partially get the illusion of resolution independence.
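
      Roughly what that swap looks like, sketched in Swift - the "@2x" file-name suffix and the helper below are illustrative, not a statement of what Lion actually ships:

        import Foundation

        // Illustrative sketch: on a high-density screen, prefer a pre-rendered
        // double-resolution bitmap and fall back to the ordinary one.
        func iconData(base: String, scale: Int) -> Data? {
            var candidates = ["\(base).png"]
            if scale >= 2 { candidates.insert("\(base)@2x.png", at: 0) }
            for name in candidates {
                if let data = FileManager.default.contents(atPath: name) {
                    return data   // drawn into the same point size: same layout, four times the pixels
                }
            }
            return nil
        }

        _ = iconData(base: "close-button", scale: 2)   // tries close-button@2x.png first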

      1. Sir Runcible Spoon

        Sir

        I sit on my back-side, properly corrected and educated. I'll go back to properly reading the articles then :)

    2. Chris Parsons

      Why do I need a title to reply to a post?

      N'est-ce pas - if you want to be flash, do it right, please.

  3. James Hughes 1

    Er, what?

    All this guff about problems scaling icons etc. It's all pretty simple; I don't know what all the fuss is about. People have been scaling icons etc. for donkey's years to match display resolutions. Have icon, have box to put it in; the graphics manager/card/whatever scales it to fit.

    Of course, making a display at the resolutions described, and a graphics card able to push data out to it fast enough, is another matter (at an acceptable price). Sure, it will come, though I'm not holding my breath.

  4. Glenn Booth
    WTF?

    What's new?

    Not sure what's new here - I was running three IBM T221 / Viewsonic 2290 displays (same thing, really) at 3840*2400 pixels each from a single graphics card under Linux more than five years ago. Everybody said "wow", but nobody wanted to buy it...

    I don't remember the dot pitch/pixels per inch of the IBM displays, but it's definitely 'a lot'.

    1. Anonymous Coward
      Anonymous Coward

      Wow

      204dpi according to Wikipedia. Impressive, especially for a 10-year-old product. Must've cost a fortune then, of course (even by Apple's standards! :)

      Why Apple had to invent a special case for "x2" resolutions is beyond me. Every other OS has a few different icon sizes for use at different DPIs, adjustable according to user preferences.
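
      A generic sketch of that multi-size approach (the size list and the 96dpi baseline are just examples): keep several pre-rendered icons and pick whichever best matches the size implied by the user's DPI setting.

        // Illustrative: choose the closest pre-rendered icon size for a given DPI.
        let availableIconSizes = [16, 24, 32, 48, 64, 128, 256]   // pixels, example set

        func bestIconSize(sizeAt96dpi: Int, dpi: Double) -> Int {
            let wanted = Double(sizeAt96dpi) * dpi / 96.0
            // pick the available size with the smallest distance to the wanted size
            return availableIconSizes.min { abs(Double($0) - wanted) < abs(Double($1) - wanted) }!
        }

        print(bestIconSize(sizeAt96dpi: 32, dpi: 96))    // 32
        print(bestIconSize(sizeAt96dpi: 32, dpi: 204))   // 64 - e.g. on the T221 mentioned above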

  5. Anonymous Coward
    Happy

    Almost 4,000 pixels across...

    ...but some l-user will still run MS Word maximised, with the document set to 75%. Stand back and marvel at all the wasted pixels...

  6. Dan 55 Silver badge
    Trollface

    What good is a retina display...

    ... if it's coupled with Apple's legendary knack of choosing woefully underpowered graphics cards?

    Still, I'm sure the water ripple effects will be nice, shame everything else on the screen will have them as well.

  7. Andy 70
    Facepalm

    oh god, not again.

    Leave it to Apple to "rebrand" something everyone has been able to do for years, but now that Apple do it, it's new and fantastic.

  8. Anonymous Coward
    WTF?

    Isn't this the OS's job?

    Back when I first started programming for Windows (when you had to deal directly with memory handles etc.), the whole point was that you programmed in a resolution-independent way. Has this skill been lost, or is Apple just catching up with the real world of device independence? Shouldn't Display PostScript have provided this already to OS X?

    1. AdamWill
      FAIL

      Yes.

      Has the skill been lost? To put it bluntly: 'yes, pretty much'. I have a laptop with a 160dpi display and I used to have one that was 222dpi (a Vaio P). Setting either Windows or Linux to these native densities just doesn't look very good; both OSes, and many of the apps on each, tend to assume 96dpi. They make some concessions - Linux more than Windows - but both are a long way from really complete. Websites are also terrible, utterly terrible, at resolution independence, and Firefox doesn't even have a working DPI setting that I can find; its font sizes seem to be arbitrary numbers which you tweak up and down until you think they feel right.

      Sigh.
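
      For what it's worth, the arithmetic the toolkits are skimping on is not complicated - a minimal sketch, assuming a 96dpi baseline and taking the panel densities above as givens:

        // Minimal sketch: content authored against an assumed 96 dpi,
        // multiplied up for denser panels.
        func scaleFactor(dpi: Double, baseline: Double = 96.0) -> Double {
            dpi / baseline
        }

        func scaledFontSize(points: Double, dpi: Double) -> Double {
            points * scaleFactor(dpi: dpi)
        }

        print(scaledFontSize(points: 12, dpi: 96))    // 12.0   (what most apps assume)
        print(scaledFontSize(points: 12, dpi: 160))   // 20.0   (the 160dpi laptop)
        print(scaledFontSize(points: 12, dpi: 222))   // 27.75  (the Vaio P's panel)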

  9. Badvok
    Facepalm

    Windows High DPI Aware Applications

    Let's hope Apple does a better job of this than MS. MS have been trying to get developers to make their applications "High DPI Aware" for ages.

    P.S. For those of us not born in the last decade, there is a technology called CRT that is capable of much higher DPI than all this new-fangled LCD nonsense.

    1. Frank Bough
      FAIL

      Wrong

      CRTs are woefully inadequate when it comes to high DPI. I'm using an Apple 27" LED Cinema Display here - DPI is 109 - way better than any practical CRT could manage.

      If you really believe a CRT can better an LCD for resolution, try sticking a zone plate on your CRT and look at the carnage.

    2. Sorry that handle is already taken. Silver badge

      Really?

      "P.S. For those of us not born in the last decade there is a technology called CRT that is capable of much higher DPI than all this new fangled LCD nonsense."

      Until a couple of years ago (when manufacturers switched from 16:10 to 16:9) 17" laptop screens were available with a pixel pitch of 0.19 mm (1920x1200), and the LCDs used on smaller devices can be below 0.1 mm. Were there any CRT monitors less than 0.2 mm?

      Most mainstream CRT monitors were no less than 0.25 mm. The high-resolution 27" LCDs of today are 0.23 mm. That said, I do lust for more pixels.
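
      Those pitch figures follow directly from the geometry - a quick sketch of the calculation (pitch = physical diagonal divided by the diagonal in pixels):

        // Pixel pitch in mm from diagonal size and native resolution.
        func pixelPitchMM(diagonalInches: Double, width: Int, height: Int) -> Double {
            let diagonalPixels = Double(width * width + height * height).squareRoot()
            return diagonalInches * 25.4 / diagonalPixels
        }

        print(pixelPitchMM(diagonalInches: 17, width: 1920, height: 1200))   // ≈ 0.19 mm
        print(pixelPitchMM(diagonalInches: 27, width: 2560, height: 1440))   // ≈ 0.23 mm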

      While the colour, contrast and response time of a good-quality CRT are still superior in most cases, LCDs don't suffer from scanning error, distortion, electromagnetic effects, flicker and so on.

  10. alan17
    Meh

    But aren't our retinas insufficient?

    Does it really matter having these super-resolutions when you sit 1-2 feet from a monitor? I thought the whole "magical" thing about the retina display was that it was pretty close to the maximum resolvable pixel density at the distance you'd normally hold the phone? As any fule no, the further you are from an image the lower the acceptable/discernible resolution required. Hence magazines are higher dpi than billboards. You see the same effect in reverse when you sit too close to your friend's massive HDTV and see all the jaggies.

    So the only need for such high densities and resolutions is if you are sitting close to a massive display?
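
    That intuition can be put into numbers - a rough sketch, assuming the commonly quoted ~1 arcminute resolving power for 20/20 vision (so the "retina" threshold falls as viewing distance grows):

      import Foundation

      // Rough sketch: the pixel density at which pixels become indistinguishable,
      // assuming ~1 arcminute of angular resolution for 20/20 vision.
      func retinaPPI(viewingDistanceInches d: Double) -> Double {
          let oneArcMinute = Double.pi / (180.0 * 60.0)   // radians
          return 1.0 / (d * tan(oneArcMinute))            // pixels per inch
      }

      print(retinaPPI(viewingDistanceInches: 10))    // ≈ 344 ppi - a phone held close
      print(retinaPPI(viewingDistanceInches: 24))    // ≈ 143 ppi - a desktop monitor
      print(retinaPPI(viewingDistanceInches: 120))   // ≈ 29 ppi  - the billboard end of the scale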

    1. Anonymous Coward
      Anonymous Coward

      Yes, but...

      I can see jaggies on my 1080p 16.4" laptop screen from two feet away if I don't have anti-aliasing turned on. I'm pretty sure I'm not alone on that either. Close inspection shows that the standard Windows fonts are displayed with single pixel width lines, which makes it hard to get away from those types of problems. But yes, what constitutes a "retina" display will vary based on the distance from the screen and the individual viewing it. The required pixel density for a retina display should decrease linearly with distance as we're actually concerned with a constant angular density on the retina. Looking at the specs for the iPhone 4, scaling that display up to 16.4" and decreasing the linear pixel density by a factor of 2 (assuming the iPhone is expected to be used at about 1' away) the screen resolution would be 2250x1500. That's only about 60% more pixels than the current screen I'm using, but it is denser. Actually increasing the current high-end pixel counts by a factor of four* as was discussed would certainly be overkill, but you would be able to turn off anti-aliasing for good.
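
      The 2250x1500 figure checks out - a quick sketch of that arithmetic, taking the iPhone 4 as a 3.5-inch, 960x640 panel and halving the linear density for double the viewing distance:

        // Scale the iPhone 4 panel (3.5", 960x640) to a 16.4" diagonal,
        // then halve the linear density for double the viewing distance.
        let iPhoneDiagonal = 3.5, laptopDiagonal = 16.4
        let iPhonePixels = (w: 960.0, h: 640.0)

        let scale = (laptopDiagonal / iPhoneDiagonal) / 2.0       // ≈ 2.34
        let w = iPhonePixels.w * scale, h = iPhonePixels.h * scale

        print("\(Int(w.rounded())) x \(Int(h.rounded()))")        // 2249 x 1499 - roughly 2250x1500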

      Personally, I also like the ability to "zoom" in on an image by simply moving my eyes closer to the display.

      *It is often mistakenly assumed that a screen's pixel count merely doubles when the linear density doubles. Since a screen is two-dimensional, doubling the linear density (or size) actually quadruples the pixel count.

  11. Nexox Enigma

    What?

    The article said something about "...so GUI elements will render at the same size on high dpi displays."

    But what, pray tell, is the point of higher pixel density if you're just going to have everything drawn at the same size? Clearly you can make things smaller, and thus fit more stuff in the same physical display area...

    Anyway, I can't help but be excited that we might finally get some high-DPI (presumably LCD) screens for consumers - they might finally offer something to convince me to replace my CRTs (120dpi, while high-end LCDs manage around 100).

    1. Sorry that handle is already taken. Silver badge

      I think

      The idea is to increase smoothness, not to make GUI elements so small you have to be a trained marksman to hit them.

    2. AdamWill
      Stop

      The point...

      ...is that everything looks a lot *nicer*. Using a high DPI screen to draw the same picture four times smaller is exactly the wrong way to do it. The correct way to do things is, as other commenters have pointed out, resolution independence: graphical interface elements should be vectors, and all sizes should be specified in absolute units - centimetres, inches, whatever - not pixels. It's completely the wrong way of thinking to design your UI with the thought 'okay, this menu is 36 pixels wide, this one is 224...' etc. You should specify your sizes in absolute units, and the operating system should know the attributes of the display attached to it and set the DPI to match the DPI of the monitor, so that the interface elements turn out to be the physical sizes you specified.

      Unfortunately, this never ever happens. And we wind up with ridiculous stuff like the iPhone solution - make the display's resolution four times sharper so you can just pixel double everything and have it look 'right'. *headdesk*
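
      A minimal sketch of the absolute-units idea described above - author the element in millimetres and let the toolkit convert with the display's reported DPI (the Display type and the 8 mm figure are made up for illustration):

        // Illustrative: physical sizes in, pixels out, using the display's real DPI.
        struct Display { var dpi: Double }   // ideally read from the monitor's EDID, not assumed

        func pixels(forMillimetres mm: Double, on display: Display) -> Double {
            mm / 25.4 * display.dpi
        }

        let menuHeightMM = 8.0   // the same physical size on every screen
        print(pixels(forMillimetres: menuHeightMM, on: Display(dpi: 96)))    // ≈ 30 px
        print(pixels(forMillimetres: menuHeightMM, on: Display(dpi: 326)))   // ≈ 103 px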

  12. Grifter

    Hmm

    Wouldn't it just be simpler to use vector graphics for your interface elements instead?

  13. Kristian Walsh Silver badge

    Oh look, Apple just got bitten by an eleven-year-old dog

    OS X had an excellent resolution-independent graphics engine from the very beginning in the shape of Quartz. At its introduction, the devs proudly claimed that it would allow OS X applications to be totally resolution independent.

    Meanwhile, instead of building the UI controls out of vector instructions, Apple UI "designers" put together a UI using Photoshop and (rather ironically) Flash/Shockwave.

    Because this design was (a) personally vetted by SJ ("I don't care about how it works, I just want it to look amazing for the five minutes they'll play with it in the store") and (b) entirely bitmap-based, it was unfeasible to translate it into code and gain proper resolution independence, or to change it in any way that might have helped, and the opportunity was lost.

    When I started looking into Symbian^3 development last year, I was quite surprised that it uses SVG for its icons, thus making its UI fully scalable (shame about the font scaling, though). Mac OS X, with its PDF support, should have had this from the get-go, instead of making devs ship big fat PNGs. I remember just after OS X shipped, a friend of mine remarked that he'd just created an icon bundle that was larger than the original Macintosh ROM...
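
    For the record, rendering a vector (PDF-backed) icon at an arbitrary pixel size is not much code on OS X - a sketch using Core Graphics, with the file name and sizes purely hypothetical:

      import CoreGraphics
      import Foundation

      // Sketch: rasterise page 1 of a vector PDF icon at any requested pixel size.
      func renderPDFIcon(at url: URL, pixelSize: Int) -> CGImage? {
          guard let doc = CGPDFDocument(url as CFURL),
                let page = doc.page(at: 1) else { return nil }
          let box = page.getBoxRect(.mediaBox)
          guard let ctx = CGContext(data: nil, width: pixelSize, height: pixelSize,
                                    bitsPerComponent: 8, bytesPerRow: 0,
                                    space: CGColorSpaceCreateDeviceRGB(),
                                    bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
          else { return nil }
          // Scale the page's point-based coordinates to the requested pixel size;
          // being vectors, the strokes stay crisp at 1x, 2x or anything in between.
          ctx.scaleBy(x: CGFloat(pixelSize) / box.width,
                      y: CGFloat(pixelSize) / box.height)
          ctx.drawPDFPage(page)
          return ctx.makeImage()
      }

      _ = renderPDFIcon(at: URL(fileURLWithPath: "AppIcon.pdf"), pixelSize: 256)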

