Looks like a lovely machine....
Alas, it's also way outside my current price range.
It’s kind of ironic really, that the most expensive, over-the-top version of the iMac ever released should actually turn out to be pretty good value for money.
I don't think that "upgrade" is the best word to describe a CPU option that is available at the time of ordering. It just seems wrong to use the phrase "you can upgrade" in connection with an iMac. Historically, not so much...
...Not that there's anything wrong with that.
That's the real USP - for still photos etc. there's no real need to see every pixel 1-to-1 for your work, but with video editing there is; shifting about a zoomed window just doesn't cut it.
We've already got the first of the true consumer 4K cameras out with the GoPro 4 Black; more will undoubtedly follow.
For now, my workflows are all 1080p, which my current Retina MBP and iMac 27 both allow space to view at 100% within FCPX with room for the interface, but even the GoPro 3+ shoots in 2.7K, which can't be viewed in FCPX at full size as it's about the same resolution as the whole display.
So there'll be a quick pick-up from pros shooting 4K now, and over the next year, as more consumer 4K cameras come on the market, it's going to become more and more attractive to these folk.
Video editing 4K is far easier (and much cheaper) with dual monitors (4K + whatever) - I've been experimenting with GoPro 4 footage.
All-in-ones still have their place in consumerland, but for serious use there's still enough happening with CPUs and GPUs that you want to upgrade a desktop more frequently than a monitor, so most professional developers will want to avoid the AIO format lock-in.
Apple seem to have grabbed most of the early production runs of 5K panels for iMac so we will have to wait to see what monitor pricing turns out like, I'm guessing ballpark £1000 (Dell have announced 5K but as far as I know no pricing yet - this article speculates $2500 but sounds an unlikely rumour/fud figure). Presumably Apple will refresh their monitor lineup within months too.
Incidentally, the reason for 5K (5120x2880) is the fact that it's exactly twice the width and height of the traditional 2560x1440 display, so 2x scaling works nicely.
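The arithmetic behind that is worth spelling out; a trivial sanity check (plain Python, nothing iMac-specific):

```python
# 5K "Retina" is exactly twice the traditional 27-inch 2560x1440 panel
# in each dimension, so every logical point maps onto a clean 2x2 block
# of physical pixels with no fractional scaling.
base_w, base_h = 2560, 1440
retina_w, retina_h = base_w * 2, base_h * 2

print(retina_w, retina_h)                          # 5120 2880
print((retina_w * retina_h) // (base_w * base_h))  # 4 (4x the pixels)
```

Doubling both dimensions is what makes integer 2x scaling look clean; any non-integer factor would force interpolation.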
You're ignoring the fact that this won't be worthless in a year's time. If you want to swap out your computer then it's no more effort to swap out the screen at the same time, so sell this and get the new model. The Apple backup regime makes this extremely simple, and their resale value is second to none.
The real bonus here is that real professionals don't have time to fart about with drivers and upgrades. If your time is valuable it's much better to replace the whole caboodle and carry on working so component upgrades are less likely the further up the tree you go, especially given that for professional use as a tool, £2k is really not that much money!
Disabling the remote monitor support is a nasty trick; I wonder how many people will get bitten by that? If I splurge £2k on a piece of kit I should be able to use it with all my other Apple kit.
The less cynical part of me does wonder if it's just that the function in OS X is still awaiting a tweak to handle the extra resolution.
The reason for not having a Target Monitor mode is down to the fact it requires DisplayPort 1.3 to drive it and Thunderbolt 2 only supports DisplayPort 1.2. Something to do with the sheer amount of bandwidth the display needs to move all those pixels.
"...Pushing this many pixels requires more bandwidth than DisplayPort 1.2 offers, which is what Thunderbolt 2 ports use for outputting video signals. (I wrote about this a few times.) Doing it right will require waiting until DisplayPort 1.3 in Thunderbolt 3 on Broadwell’s successor, Skylake, which isn’t supposed to come out for at least another year — and Intel is even worse at estimating ship dates than I am, so it’s likely to be longer..."
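The back-of-envelope bandwidth figures bear this out. A rough check (assuming 60Hz refresh and 24-bit colour, and ignoring blanking overhead, so the real requirement is somewhat higher still):

```python
# Uncompressed video bandwidth needed for 5120x2880 at 60 Hz, 24 bpp.
width, height, refresh_hz, bits_per_pixel = 5120, 2880, 60, 24
required_gbps = width * height * refresh_hz * bits_per_pixel / 1e9

# DisplayPort 1.2 maximum video payload: HBR2 x 4 lanes = 17.28 Gbit/s.
dp12_payload_gbps = 17.28

print(round(required_gbps, 2))            # 21.23
print(required_gbps > dp12_payload_gbps)  # True: DP 1.2 falls short
```

So even before blanking intervals are counted, the panel needs more than a single DisplayPort 1.2 link can carry, which is why the internal panel is driven differently and Target Display Mode is off the table.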
The reason I don’t really like all-in-ones like the iMac is that screens are the longest lasting part of a PC for me. I tend to use a multi-screen setup with my newest monitor as the primary monitor, the next oldest as a second monitor, and the third oldest as a third monitor (if the graphics hardware allows it). I don’t want to throw a perfectly good monitor away every time I buy a new PC. Target monitor mode at least partly alleviates this (particularly when an iMac costs about the same as an equivalent screen on its own - this is not the first time this has happened) and the lack of it is a deal-killer for me.
I’m not into conspiracy theories as to why Apple has disallowed it - the explanation is that the present version of Thunderbolt can’t handle the bandwidth. I am sure the next version of this iMac will fix this, which is a good reason to wait for it.
Displaying at 4K resolutions? So you want them to add an extra scaler for the target display mode that a minority of people use, and then have people complain when it doesn't look right? (The scaler in the iPhone 6 Plus isn't a problem, because it has 1/4 of the pixels but 1/25 of the display area, so you can't notice where it doesn't look right.)
So what could you actually display on a 5K monitor that wouldn't be as good on a 4K one?
Well, the obvious answer for the average user is images from a 5K camera (and possibly the only answer: since video refreshes at rates that make seeing each pixel impractical - not to mention impossible, and once you can read a piece of text at a reasonably sharp definition, adding more hi-def. doesn't make it any better or easier to read - otherwise nobody would be able to use "old" 27-inch 1920x1080 screens to do that).
So still images it is. But wait! Even if you take an image from your DSLR, hasn't it been de-bayered inside the camera (and squished around to turn it into JPEG), so it's not exactly WYSIWYG any more? Going further: if you choose to take a squint at the RAW format, the camera still has an anti-aliasing filter in front of the sensor to reduce all those nasty Moire patterns. So you aren't even seeing the real image then, either - TIFF, JPEG or not.
Couple of things...
So what could you actually display on a 5K monitor that wouldn't be as good on a 4K one?
As pointed out in the review, the 5k screen allows you to view 4k video at 100% in your editor of choice, while also leaving enough screen space for panels/toolbars/menus etc
if you choose to take a squint at the RAW format, the camera still has an anti-aliasing filter in front of the sensor to reduce all those nasty Moire patterns.
My Fuji X series camera has no AA filter in front of the sensor, and I know that Nikon have AA filter-less cameras on the market too. Even regardless of that, I'd say this machine definitely has a potential place in a RAW workflow, if you're the type that regularly pixel peeps at 100%+
Pixel-wide lines at anything other than vertical or horizontal (Think Illustrator or AutoCAD) would benefit.
'Retina' is based on the fallacy that 20/20 eyesight is average, whereas people's average vision is actually better. Even if this weren't so, some scenarios, such as the one above, would still benefit.
If anything, I think less than 20/20 is average. Most contact lens wearers have uncorrected astigmatism. Glasses rarely correct beyond or even to 20/20 unless the wearer has no choice but to have separate reading glasses. The main source of people with better than 20/20 vision is those who've had LASIK in the past decade, and their vision rarely exceeds 20/12. Not that being 20/20 or whatever matters once you're past your mid 40s or so and eventually have to rely on reading glasses or something to avoid needing them (like undercorrection)
A true RAW workflow needs more than pixels. What's this display's gamut? How does it compare with BenQ and NEC ones? What calibration capabilities does it have? For image professionals there are several parameters when choosing a monitor, not just how many pixels it has.
This looks like a choice for advanced amateurs without much need for calibration - those who like 'bright' images even if they're not true (and which usually look very bad when printed).
And having a computer attached can be a real nuisance - no way to really attach the hardware you need for your workflow. Also, I wouldn't really like heat from the computer components impacting display conditions.
But Apple knows that in the real professional market its displays have little space, so it's better to aim a little lower, where it can easily sell this kind of device.
I've just upgraded to a Dell 28" 4K display - £330+VAT. Guess what, it plugs straight in to my 2-year-old PC and works perfectly. Ah, the Mac aficionados will claim, but the Apple display has 5K pixels. But I already need to scale text up by 25% to make it legible on my (slightly larger) screen and individual pixels are only visible under strong magnification. How will having 5K pixels help?
Look, if owning a Mac rather than a PC (like owning a BMW* rather than a Škoda) helps you feel better about yourself, knock yourself out. But don't try to convince me what great value for money it is.
* The point being: I'm prepared to accept your view that the Beemer is a superior car - for some purposes, at least - just not that it offers superior value for money.
The 5K display in the iMac has nearly twice as many pixels as the 4K Dell monitor you've just purchased, it's not a small step up in resolution.
The iMac looks like a nice piece of kit but for me and my ageing eyes a 27" display is a little bit small. I'd like a 5K display with a 30"-plus diagonal but I'll probably settle for the Dell 31.5" 4K display as a Xmas present to myself. It helps that the display I've got my eye on has an IPS panel with a good colour gamut, the smaller 28" Dell 4K display is TN.
> "I've just upgraded to a Dell 28" 4K display - £330+VAT."
>> "So £396 then"
The net cost depends on whether Chris - or any other buyer of the same product - is VAT registered. An important consideration that seems to have escaped you.
So potentially £330. Or potentially £396.
Which I expect is precisely why Chris specified net cost + VAT in the first place. He's a helpful chap. Unlike some.
I'm still playing nicely with the Dell U2410. This uses the same IPS LCD panel as the iMac from a while ago. However, the difference is that the Dell has a CFL backlight that gives nice even coverage and the AdobeRGB colour gamut. The i-version only manages sRGB, has a very shiny front and uneven lighting. Also, the i-brightness is so high that you lose almost 2 bits per pixel when adjusting the brightness level during calibration (and the calibration still varies across the screen). All of this is necessary for things that actually go to print.
I'm watching the Dell 5k monitors with interest to see how they play out. I'm unlikely to buy the i-version because of the limitations that normally come with it. Oh and I'm not anti i-things at all. These Dells are driven from Macs.
As an aside, I'm lost for words that Windows cannot scale the UI properly on screens, given that Windows 3.0 had a screen resolution DPI setting that was expressly designed to do things like this.
"As an aside, I'm lost for words that Windows cannot scale the UI properly on screens,"
Windows scales identically to the Mac on my rMBP. The difference here is sitting between screen and chair, rather than on the disk. Windows does indeed look awful if you just switch to native retina resolution without setting it up for hi res viewing although I suspect that the person claiming to have issues was trying to make a point against the iMac rather than saying anything useful. I'd like to think anyone who actually has a 4k screen would know how to set their display properties correctly...
"The 5K display in the iMac has nearly twice as many pixels as the 4K Dell monitor you've just purchased, it's not a small step up in resolution."
Well, that's less than 40% higher resolution; is the cost less than 40% more?
The idea that anyone is editing 4K images at 1:1 and needs the extra space for toolbars etc. is classic Apple iWank. If you're editing those images, you're zoomed in; if you're watching them then you're in fullscreen mode.
I was afraid that would bring the Mac zealots out frothing and screaming :)
Yes, I get that 5 is a bigger number than 4. (Which part of "superior [...] for some purposes, at least" are you having difficulty understanding?) No, I don't think that makes it "value for money" (as the author claims) to pay 5x as much, particularly if the difference is barely detectable by the human eye. YMMV.
And (of course) Windows (since at least v3) allows you to scale the font - which is precisely what I said, doh!
It also brings out the logic zealots. Car:computer analogies fail because cars travel on the same network of roads so a BMW must contend with the same speed limits and traffic conditions as a Skoda and the monies spent on the performance specs of cars are wasted for the basic task of transit from point A to B. Also, there is generally not a correlation between the price/performance of the BMW and that of the Skoda, so your implication is that the Apple is merely a status symbol the performance of which could be met by something much less costly. The performance benchmarks indicate otherwise. The "Skoda" PC does not compare favorably to the "BMW" iMac.
I'm glad you recognize that 5 is bigger than 4 and £396 is ~1/5 of £1999. You missed the fact that the relevant numbers for the monitor are 14.7 million pixels vs 8.3 million and a comparable 5K screen for a PC costs as much as the iMac does total without also including a nice computer.
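For anyone following along, here are the raw pixel counts (plain arithmetic from the advertised resolutions):

```python
# Total pixels on each panel being compared.
imac_5k = 5120 * 2880   # 14,745,600
uhd_4k = 3840 * 2160    # 8,294,400

print(imac_5k, uhd_4k)             # 14745600 8294400
print(round(imac_5k / uhd_4k, 2))  # 1.78 - close to double
```

"Nearly twice the pixels" is accurate: the ratio is about 1.78, even though the linear resolution only goes up by a third (5120/3840).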
That's actually not an estimate, but the price that Dell announced (at a point when Apple's new iMac hadn't been announced yet, so they claimed "twice the resolution of the iMac").
However, Dell monitor and iMac cost the same in the USA ($2,500), so I would think that Dell will sell their monitor in the UK at the same price as an iMac as well (£2,000 incl. VAT).
To the guys who talked about £330 Dell monitors: Dell sells cheap stuff, and they sell good stuff. The cheap stuff isn't good, and the good stuff isn't cheap, just as you would expect. Dell's $2,500 monitor is the good stuff, just like the screen in the iMac. You just don't get the free computer with it.
>the imac is 8bit, the dell is 10bit
..... the iMac uses a Sharp panel and Dell's 5K an LG AH-IPS. Aside from the superior gamut and PPD, I'd put my money on LG having the better reliability given Sharp's recent history and the comparative frailty of IGZO....on the other hand it costs more.
...also worth checking the anomaly rate - Dell has a zero dead pixel policy, Apple has http://support.apple.com/kb/HT4044 - docs leaked from Genius Bar previously have set the 'acceptable' threshold for repair at more than a dozen dead pixels.
"...so a BMW must contend with the same speed limits and traffic conditions as a Skoda..."
On the open roads, the difference between a cheap car and a nice car becomes very real again. Like 1800+ km straight through including overnight, stopping only as required. In a VW it was torture, in spite of the nice seats. Noisy, buzzy, not nice. In a nice Mercedes E-Class, perfectly comfortable.
And there are roads, even entire provinces, where speed limits can become effectively voluntary. When *all* the trucks are doing 140 kmh in the wee hours, then you can rest assured that the revenue collectors have gone home for the night. One can really open it up and stretch its legs. And perfectly safe, due to details I can't be bothered to explain.
These are wonderful "lifetime memories" road trips.
If you're worried about value for money, then a Certified Pre-Owned nice car is the solution. About 50% of the new price and, in our case, next to no mileage on it (15,000 miles). Perfect.
Back to the iMac now...
Resale price is irrelevant.
I keep tech till it is beyond economic repair. My Jan 2000 laptop is controller now for a piece of test gear.
I'm not sure why there's an authentic IBM AT (upgraded to VGA, network and a bigger HDD, with 2 x full-size AST memory expansion cards) and an upgraded Amstrad PCW256 (2nd drive is DS 3.5", RAM is 512K, and it has a clock/2x serial/1x parallel adaptor) in my attic. I did dump the 68000-based Mac with its single 3.5" floppy drive. No-one wanted it even for free.
"...2-year-old PC and works perfectly. Ah, the Mac aficionados will claim, but the Apple display has 5K pixels. But I already need to scale text up by 25%..."
So Windows still uses physical-pixel-sized elements in 2014? The 1980s called, and want their CGA-era display toolkit back. Maybe that is something that Microsoft could fix in Windows 10, and use absolute-sized display elements.
The Dell you refer to is a TN panel with about 8.3 megapixels and a linear density of around 157 ppi - a shade below a first-generation iPhone.
The iMac is an IPS panel with about 14.7 megapixels and a linear density of around 217 ppi. A shade above the original Nexus 7.
It's a fatuous comparison.
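The pixel-density figures fall straight out of the diagonal; a quick calculation (the 28-inch and 27-inch diagonals are taken from the comments above, not measured):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Linear pixel density: pixels along the diagonal / diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 28)))  # ~157 ppi for a 28" 4K panel
print(round(ppi(5120, 2880, 27)))  # ~218 ppi for the 27" 5K iMac
```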
First up, the Q: the reviewer here had his display set to "scaling", so that his 5k screen appears the same resolution as his current 2560x1440 screen. He then says that the advantage of a 5k screen is that your 4k content can be displayed pixel for pixel. If you are scaling the screen, doesn't that mean that your 4k content is scaled down and then up again?
Secondly, the comment: the only reason why this is "value for money" is that 5k screens only exist for Apple. 4k screens on the other hand are fairly common, and you can choose whether you can accept TN (<£500) or must have IPS (<£1000). On that basis, a 5k screen isn't that value for money - for me, I'd rather have a 4k screen for content, a second 1080p screen for controls, and an extra £1000 in my pocket.
I'm not au fait with Apple's implementation but I would very much suspect that the supported applications use the scaling setting to size their UI appropriately but all content is displayed at native resolution. Applications which don't advertise themselves as supporting high dpi displays will probably display as they would at "standard" resolution but the OS will do a bitmap scale to compensate (which is I think what you are getting at?) This is certainly the way it works on Windows.
Somewhat ironically, if you are a media consumer rather than a creator I suspect a 4K panel will give you better results as upscaling 4K to 5K will no doubt lead to some scaling artifacts that wouldn't be present if viewing 4K content full screen on a 4K monitor.
Scaling in OS X renders text in high resolution at the same size as it would appear at the original resolution.
It knows it is rendering to a 4x-pixel (Retina) display. Text appears (is) crisper.
Scaling by the monitor is just hardware-based upscaling, like TVs do for SD, DVD or 720p content.
Not the same thing...
If you look at a Retina screen (or hi-res tablet) for a while, then non-Retina screens look a bit pixelly.
You could read up how the Mac Retina displays and their software work.
You set it to (officially) 2560x1440. The screen really is 5120x2880. Text, like the menu bar, is at "normal" size, just with much higher quality. And your 4K image is displayed at its full resolution in an area that is officially 2K in size, but at double the resolution.
It's just like a 4K TV where you don't get a screen twice the size, you get a screen with twice the quality.
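A tiny illustrative sketch of the point-to-pixel bookkeeping under 2x HiDPI scaling (this is just the arithmetic, not Apple's actual implementation):

```python
# Under 2x "Retina" scaling the OS exposes a logical desktop of
# 2560x1440 points but renders into a 5120x2880 pixel backing store.
SCALE = 2
logical = (2560, 1440)
physical = (logical[0] * SCALE, logical[1] * SCALE)

# A 4K frame shown pixel-for-pixel occupies only 1920x1080 points,
# leaving logical space around it for toolbars and panels.
frame_px = (3840, 2160)
frame_points = (frame_px[0] // SCALE, frame_px[1] // SCALE)

print(physical)      # (5120, 2880)
print(frame_points)  # (1920, 1080)
```

This is why a 5K panel is the smallest that can show 4K footage at 100% inside an editor: the frame only spans 1920x1080 of the 2560x1440 logical points.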
This is not a device you carry around with you, it stays at home or at work. I'd have thought that offloading completed projects onto external storage (networked or locally attached, and duplicated) would be what people did. Can anyone who does serious work in this area give us an idea of how much storage they need and of what form/location?
I don't do serious video editing by any stretch of the imagination, but if I'm recording game footage at 1080p/60fps at 50Mbps, space gets eaten very, very quickly - around 375MB per minute.
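That 375MB-per-minute figure checks out; converting a bitrate to storage is just unit conversion:

```python
# 50 Mbit/s recording: storage consumed per minute and per hour.
bitrate_mbps = 50
mb_per_minute = bitrate_mbps / 8 * 60    # bits -> bytes, then x60 s
gb_per_hour = mb_per_minute * 60 / 1000

print(mb_per_minute, gb_per_hour)  # 375.0 22.5
```

At 22.5GB per hour of footage, and with multiple takes and streams per project, it's easy to see why editors lean on external arrays.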
Yep, but that's the beauty of USB3 and Thunderbolt 2. Whereas I had no choice but to have the case opened up - I farmed that out so the issues wouldn't be mine - in order to have a 1TB SSD fitted in addition to the HDD, going forward Macs now have the necessary connections so that you just plug shit in. G.Skill and other 4-6 bay DAS systems are quite popular and offer speed and the necessary storage.
I have a measly MacMini to do most of my simple re-edits. It has a nice Fusion drive, that is pretty speedy. I also have several TB of storage attached to both USB and Thunderbolt. All the project files stay on the internal drive. I find most of the time it works nicely. If I had to upgrade, it would be to all Thunderbolt storage, then maybe a refurbed Mac Pro.
>Can anyone who does serious work in this area give us an idea of how much storage they need and of what form/location?
Short answer: Shitloads. Redundantly stored on external servers. Thunderbolt to the MacPro for editing.
My friend runs a video production company. Filmed footage is irreplaceable ( you can't return to your client and ask them to re-enact what you filmed in the first place), so all data is redundantly backed up as soon as possible - ideally as it is being shot, or at least whilst still on site.
Thunderbolt is happily fast enough to shunt it onto the editing Mac as required. There really isn't a need for large internal storage.
I cannot claim to be an expert, but I can certainly use up 60GB of disk putting together a 3-4 min video for my kids' birthdays. I may have dozens of videos and pictures I'm pulling clips out of. A single stream of video is easy, but you never edit a single stream; you're pulling lots of stuff together, and that's the issue. The various streams get put into an intermediate format for internal FCP X usage.
I'm sure that the professionals here will jump in and correct my workflow, but I thought disk was basically free and unlimited. Well it is if you have anything but a Mac :)
I actually use a Hackintosh to edit my videos anyway; I have two 512GB SSDs and very happy I am with it. In my defence I do have an iMac, an old 2007 Mac Pro and a MacBook Pro Retina, and t'other half has a MacBook Air.
I like the look of the 5K and would have seriously considered it if it allowed me to plug my MacBook in. I'm not convinced by the reasons given for not having it available. No doubt we will see a 5K Thunderbolt display shortly.
And I can't fault the rest of the hardware specs. But what's up with the crappy keyboard and mouse that only munchkins could appreciate? And a machine of this class should have a pure SSD, not a "hybrid", which experience with these in an Enterprise environment has taught me are unreliable, finicky crap.
There are plenty of other keyboards available you know. If you don't like it, change it.
The same goes with the Apple Mouse. Don't like it, change it.
I eschew modern keyboards in preference to an old Dell KB that I picked up at a Computer Fair for 5 quid. Built like a tank, like those IBM KBs.
"...horrible standard Apple ones...." Personal preference? Apple keyboards have real mechanical springs, and have a short travel distance. Those are good things. I actually prefer them over most every keyboard at this point, even though they lack Windows specific keys.
>And a machine of this class should have a pure SSD, not a "hybrid", which experience with these in an Enterprise environment has taught me are unreliable, finicky crap.
No Mac uses a hybrid drive. 'Fusion Drive' refers to the combination of a logical volume manager called CoreStorage (inspired by ZFS) baked into OS X, plus conventional SSD and HDD drives. OS X itself optimises what data is duplicated onto the SSD for performance benefits. I can see why Apple calling it 'Fusion Drive' has confused you, but a 'hybrid drive' it isn't.
>My sausage-fingered friend uses Mac keyboards for his Windows PCs. Personally, I get on with Dell keyboards. Some people swear by mechanical keyboards.
I love by my Logitech MX Darkfield mouse on a Windows PC for CAD. I don't get on with SpaceNavigators - probably because I haven't persevered. My professional Mac-using friends use tablets and styli, even when video-editing.
Each to their taste.
Your handle really does say it all, doesn't it.
If the resolution of the screen doesn't matter to you, then this isn't the right device. For you. In your use case. It's staggeringly arrogant though to assume that because YOU don't care about resolution, it can't be important to others.
As stated in the article (and about 20 comments above) - if you're editing a 4K video stream in realtime, a 5K monitor is the only way to see your footage at 100% whilst still having room for toolbars around the side. For some people, this is pretty damn important.
Unwarranted triumphalism indeed.
The problem is, it's only the right device for a staggeringly small number of users.
They went with AMD's mobile video adapter; it's a bloody laptop card. Sure, that's shiny-shiny if you need little more than a framebuffer, but that card can't push any serious polys at 4K, let alone 5K.
It's a nice screen that you can't attach to a powerful enough machine to really USE it.
I'm pretty sure someone buying a serious 4k gaming machine isn't considering an iMac for about fourteen entirely different reasons, cost being just one of them. You can build a pretty good 4k gaming machine for around £1200 these days, including monitor.
I know a few folks who are moving towards 4k film shooting - their interests are seriously piqued*.
It doesn't matter if it only sells 10,000 units a year - if Apple are making a return on them, and they're the only game in town (as they currently are for full-scale 4k editing on a normal sized desk) then they've done the job they were tasked with by their product managers.
Me? I'm happy enough at 1080p, although this talk of £400 4K monitors is intriguing....
Steven 'stuck in the past' R.
*also, it's champing at the bit. Not chomping. Champing. Champing, champing champing CHAMPING CHAMPING CHAMPING
" if you're editing a 4K video stream in realtime, a 5K monitor is the only way to see your footage at 100% whilst still having room for toolbars around the side"
You mean there are still people out there who think they are high-end users but who only have one screen? Er, gosh! I'm a cheapskate but I've been using two screens for my work since the 20th century.
So what you're saying is that were given a computer for free? After all, the cost of the computer here is $0, and you claim you paid half of that, which makes no sense whatsoever.
So you didn't really buy it, now did you? Someone gave you a workstation for free. That's very nice of them but it also makes it completely useless as a point of comparison if you want to talk about cost.
Using the Home Theatre viewing distance calculators around, you would have to sit 50-130 cm from the screen to get the benefit of the detail on this beautiful beast. I know they're not the right calculators for workstation conditions, but I can't help thinking it should be a larger screen for 5K, but then I'd need a bigger desk.
Where can I buy the screen (preferably in black), with VGA, DVI and HDMI connections.
Also should cost about 1/3rd of iMac price.
I don't need an iMac. I rarely edit video (Have ancient Adobe Première on Windows as well as various more modern Windows & Linux free tools) and Email + Web is much the same on anything.
>Where can I buy the screen (preferably in black), with VGA, DVI and HDMI connections.
>Also should cost about 1/3rd of iMac price.
You will be able to buy it from Dell for 5/6ths of the price of the iMac. Anandtech reckon it's a fair bet that it is the same panel as the iMac, too.
I doubt that your graphics card will output 5K through VGA.
Why do you want a 5K screen for "Email + Web "?
@AC - the figure of $2500 didn't come from Anandtech. It came from http://www.pcworld.com/article/2841732/why-5k-displays-matter-the-one-spec-that-tells-all.html and the figure is in accordance with every other tech site out there... mainly because that is the figure that Dell announced.
Where Anandtech made a guess was supposing that the Dell used the same panel as the iMac.
"Why do you want a 5K screen for "Email + Web "?"
Simple, more pixels means more characters on the screen, which means more panes in tmux.
Unfortunately the pcworld article doesn't quite get it. Once you have more than roughly 60 pixels per degree they will blur into each other. It may be useful for people who do desktop publishing on Macs, since the Apple implementation of hinting for TTF is severely broken, and there is a point to be made for a display approaching the resolution of print.
However, for the rest of the population, using fixed-size bitmap fonts, smaller pixels will only mean we have to use bigger fonts... which counteracts the idea of having a higher resolution.
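To put that 60 pixels-per-degree figure in context, here is what the 27-inch 5K panel (~218 ppi) delivers at a few plausible viewing distances (the distances are my assumptions, not from the article):

```python
import math

def pixels_per_degree(ppi, distance_inches):
    """Pixels covered by one degree of visual angle at that distance."""
    return 2 * distance_inches * math.tan(math.radians(0.5)) * ppi

# ~60 px/degree corresponds to 20/20 acuity (1 arc-minute per pixel).
for d in (16, 20, 24):
    print(d, round(pixels_per_degree(218, d)))  # ~61, ~76, ~91
```

So at any normal desktop distance the panel already exceeds the ~60 px/degree acuity threshold, which is the blur-into-each-other point the comment above describes.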
Right, so iFixit's teardown indicates that the components are nearly identical to previous versions (minus the 5K display), and ExtremeTech points out that for certain applications the iMac's GPU is insufficient for 5K; indeed, not even GTX 980s in SLI could make full use of it. What a pile of pants review. Expected more from you guys!
I always find it very difficult to use "value for money" and anything Apple in the same sentence -- it's almost always some bunch of faff involving comparing it to the most overpriced (non-Apple) PCs on the market (like some Sony) and then going on about "comparable" parts (adding arbitrary price onto the PC for anything that is lower-spec while ignoring specs where the Apple is lower-specced).
In this case, though, I wouldn't spend that much on a system or a monitor... but that is a pretty good price for a display alone, let alone having some kind of computer in it.
Oh.. one point though.. if everything on the new screen seems brighter than on the old one, it means your screen hasn't been calibrated (probably either one.) Time to set the gamma.
Oh, also.. since the article doesn't say what a "Fusion drive" actually is -- it's a 1TB hard drive with 128GB SSD slapped on, and some indeterminate technology* to try to shuffle more frequently used data onto the SSD.
*I call it "indeterminate technology" because the write-ups I saw say they don't even know for certain whether it's block-based or file-based, or whether it's under OS control or there's some controller hooked up to the SSD and hard disk. They haven't been able to characterise the behaviour well enough to even make a guesstimate at its algorithm: is it most-recently-used, or based on how frequently data is accessed? Does it ignore linear reads/writes in favour of trying to speed up usage that involves lots of disk seeks?
Okay, why has no one picked up on the "very respectable" benchmark score (gaming) of 5k? That's only slightly better than a gaming laptop, half as good as a midrange gaming PC and nowhere near the score you get with the same chip and a GTX 970 (17.5k score). If you have a 5K monitor (or a 4K one) you want a card that can abuse it, not one that only just beats a frikken laptop.
looks great! but few need 4 or 5k display. I will say though that the 27 imacs generally are bloody superb. Just a great, silent, powerful-enough computer which beats my 4 year old 8 core mac pro on all fronts (power, rendering video using compressor, noise, clutter-free, simple). They are great for video editing or music studios (65 tracks of 'evans stress test' on mine: 3.4ghz 8gb ram).
As for people saying they cost too much, i bought two 27's a month ago, and with a 3 year lloyds business loan for 3k, they cost 90 quid a month over 36 months and are warranteed for that 3 years of course. No brainer!
Microsoft should have seen this business model which they DID get right with xbox.
the other joy with mac is app store; simple, instant & cheap hassle free s/w purchases, not going uptown and bringin back final cut and logic in huge heavy boxes for 800 quid in a wheelbarrow... now those apps are so cheap it's ludicrous.
I groan when I have to go back to Windows for certain apps and server-access stuff.
BTW, I can edit 1080 video from an 86-quid USB 3 2TB MyBook, so don't worry about expensive Thunderbolt drives... again, I wince when I look at my 4-year-old Mac Pro with its 1000-quid eSATA RAID rack doing the same job. iMacs are just unbeatable-value machines!
"It’s certainly not the home computer that it used to be, but the iMac with Retina 5K Display will have designers, photographers and video-editors chomping at the bit."
It's champing at the bit, not chomping.
(I myself thought it was "chomping" until someone corrected me, nearly 30 years ago. He challenged me to look it up in the dictionary, and in disbelief I did, only to find he was right.
Some dictionaries now list "chomp" as an acceptable alternative to or regional variant of "champ", but that is only due to widespread repetition of the mistake — mainly by Americans. Champ at the bit remains the proper, original expression.)
For the price of this computer it has to be a tool. A tool will be bought for a specific set of uses and it will have to last for a pre-defined time, for example 10 years.
So the obvious questions to ask about this device are how easy it is to service and to keep running for the next x years. And if you find out it'll be hard to service after AppleCare runs out... well, you've bought a toy, not a tool.
You can make good all-in-one computers. HP proved that with their Z1. Sure, it's a bit more expensive and it doesn't have such a high-resolution screen yet, but it'll still be a useful tool, probably well past 10 years of service.
>5k of pixels is a lot to push through the graphics shaders, and I would have liked to know how the game they tested ran at full resolution.
They couldn't test the Windows games at full resolution. They tried, but:
"From 5k to 4k
The first thing I noticed upon booting up in Windows 8.1 is that Windows does not run at the display’s native 5120x2880 resolution. When I logged on, I was greeted by a desktop running at 3840x2160, one of a number of different resolutions commonly lumped under the "4k" banner (this particular 4K flavor is usually referred to as "Ultra HD").
Interestingly, the non-native resolution didn’t exhibit any visible scaling artifacts. The high pixel density seems to more than make up for the loss of resolution from "5K," and the display blends the 8,294,400 points yielded by 3840x2160 into the native 14,745,600 pixels quite smartly. Even sitting with my nose an inch or so away from the screen—a distance my mother assured me when I was younger would ruin my eyes—I couldn’t see any feathering or blurring around edges and lines. Type remained sharp, and everything looked crisp."
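The pixel counts in the quote above check out, and the arithmetic also explains why the non-native mode looks odd on paper: a quick back-of-the-envelope check (my own, not from the article):

```python
# Sanity check on the pixel counts quoted above.
native_5k = 5120 * 2880   # the iMac panel's native resolution
uhd_4k = 3840 * 2160      # the "Ultra HD" resolution Windows ran at

print(native_5k)          # 14745600 -- matches the article's 14,745,600
print(uhd_4k)             # 8294400  -- matches the article's 8,294,400

# The linear scale factor is 5120/3840 = 4/3, so each 4K pixel maps onto
# 4/3 of a native pixel per axis -- a non-integer ratio, which is why
# you'd normally expect scaling artifacts.
print(native_5k / uhd_4k) # ~1.78x as many native pixels as 4K pixels
```

That non-integer 4:3 ratio is exactly the situation that usually produces blurring; the reviewer's point is that at this pixel density the interpolation is simply too fine to see.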
The above article was just a preliminary gaming test, and they will be updating it:
... just based on the performance of Alien: Isolation, any fears prospective Retina iMac owners might have had that the system’s high resolution will outstrip the GPU’s ability to keep it fed appear to be unfounded.
...I’ll be playing the hell out of my Steam library on the thing over the next week before I have to send it back. Peter Bright is already spitting rage at me in the Ars staff IRC channel that I didn't benchmark with Battlefield 4 or Far Cry 3 (simple explanation: I don't own those games and don't play them, and I didn't have press Steam or Origin codes readily available), so I'll see about adding those to the mix before I drop the iMac back off at FedEx.
Like I said, desktop resolution is not the same as in-game resolution. Games go into full-screen mode at whatever resolution they want, but 1920x1080 is currently the usual resolution developers aim for.
This article doesn't say the game was running at 4K or 5K, so the quoted FPS figures may not accurately represent performance at the machine's native 5K resolution.
That article suggests dropping down to 2K to prevent lag in games, and the reviewer states that's acceptable. A lot of work happens in each pixel shader, so slowdowns at massive resolutions make perfect sense in terms of both software and hardware.
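To put numbers on that: for a fill-rate-bound (pixel-shader-bound) workload, per-frame GPU work scales roughly linearly with pixel count. A quick illustrative comparison (my own arithmetic, assuming that simplified linear model):

```python
# Approximate per-frame pixel work at common resolutions, relative to
# 1080p, assuming a mostly fill-rate-bound (shader-bound) workload.
resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
    "5K":    5120 * 2880,
}
base = resolutions["1080p"]
for name, px in resolutions.items():
    print(f"{name}: {px / base:.1f}x the pixel work of 1080p")
# 5K works out to roughly 7.1x the pixels of 1080p -- so a GPU that
# comfortably hits 60fps at 1080p can plausibly struggle at native 5K.
```

Real games aren't purely fill-rate-bound (vertex work, CPU draw-call overhead, and memory bandwidth all factor in), so treat these multipliers as an upper-bound intuition rather than a prediction.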
Ars has an article — http://arstechnica.com/apple/2014/10/the-retina-imac-and-its-5k-display-as-a-gaming-machine/2/ — in which they play the current Alien game at 4k with screenshots, benchmarks and subjective reactions.
I don't want to ruin it too much for you, but:
The 3840x1440 runs appeared visually smooth when I watched them complete, but the numbers tell a bit of a different tale. When I hopped into the game to actually play at that resolution, there was a noticeable amount of mouse lag. Indications are that a faster CPU would have helped considerably, but with the iMac, what you get is what you get.
Biting the hand that feeds IT © 1998–2020