That's quite a machine......
...but can it run NetHack?
*Runs. Like. Hell*
The last time Apple released a ‘more affordable’ iMac it turned out to be a damp squib of a thing that fully merited the oft-hurled criticism about Apple kit being over-priced and under-powered. I was, therefore, a bit worried when Apple announced a less expensive version of the 27-inch iMac With Retina 5K Display. Apple iMac …
Can't you even swap out the spinning disks for an SSD?
That's a good question, but I vaguely recall there's a problem with non-Apple SSDs in that OS X doesn't support certain commands on them (I think it was TRIM). On MacBooks, the most cost-effective disk is an SSD/spinning-rust hybrid. It would certainly deal with that 32-second boot time, which is indeed rather on the slow side.
Personally, I prefer to buy a machine that has some headroom, so I'd want SSD storage and 16GB of RAM from the outset. An SSD machine also makes absolutely *no* sound except when you give it so much work that the fans have to spin up to cope with the heat. And I *definitely* would want a full-size keyboard, which usually means getting a wired one (which is then also a good place to plug in the stubby receiver for a Logitech wireless mouse - not a fan of Apple mice either).
So, in short, it's probably not the machine for me :)
>> That's a good question, but I vaguely recall there's a problem with non-Apple SSDs in that OS X doesn't support certain commands on them (I think it was TRIM). On MacBooks, the most cost-effective disk is an SSD/spinning-rust hybrid. It would certainly deal with that 32-second boot time, which is indeed rather on the slow side <<
The TRIM issue was fixed a couple of years ago, IIRC. The start time does indicate some performance issues, but most iMac users don't turn the machine off anyway - it just goes into sleep mode with immediate return - so startup time isn't particularly relevant in the real world.
The iMac comes with two internal connectors: a SATA one for an HD (or a brave user-supplied SSD) and a sort-of proprietary one for PCIe flash memory (https://d3nevzfk7ii3be.cloudfront.net/igi/PbQAPKwGOYbMTIqh) – using "SSD" here to refer only to standard form factor devices. Apple uses both, of course, to implement its "Fusion" hybrid drive.
Unless the third-party SSD comes from Angelbird (an Austrian firm that appears to have filched Apple's secret TRIM sauce), enabling TRIM involves modifying a kext (driver) that gets replaced every OS upgrade — and interferes with upgrades.
For those (such as the gentle readers of this site) who are not frightened by a command line interface, booting into single-user (text interface) mode on a Mac with a third-party SSD will perform TRIM when you issue an "fsck -ffy" instruction. This is not such a great imposition, though, because most TRIM work occurs on SSDs when the system has been largely idle for a while.
To be fair, in my time at a Mac consultancy in London, most clients used external RAIDed drives for video editing anyway. I personally use Thunderbolt-attached SSDs over Thunderbolt 1 and it's more than quick enough.
If Thunderbolt doesn't float your boat, USB 3.0 and an SSD is also quick.
"Does anybody here upgrade their washing machines, toasters, ovens, HiFi amps, AV equipment, etc. ?"
Sorry, crap analogy. I rather expect my washing machine or oven (both of which cost less than this 'affordable' iMac) to last for at least 10 years. Sensible people buy AV equipment in modules, i.e. buying a separate amp, tape/disk/holo-disk-from-the-future reader, radio etc. and swap the individual components out as they become obsolete/break. As for toasters, I wouldn't spend more than £25 on one but I'd still expect it to last a few years.
I doubt if you will find a washing machine that will last ten years - unless you spend a lot of money. Washing machines typically give 1 year per £100 spent. A 10-year Miele will cost £1000+
Also, if you are lucky enough to get five years out of a washing machine before it dies, you have to scrap it. An iMac has a decent residual value after five years.
Yes. My current project is going to solar+battery LED outdoor lamps on the perimeter and LEDs for the overhead lamps. All the nightlights have been converted. New coffeepot next and a new (tougher) vegetable chopper. Always something!
Serious IT projects, but that's always a given.
"Can't you even swap out the spinning disks for an SSD?"
Yes, but you have to take the glass front cover and LCD out to do it. All the gubbins is behind the LCD, the only thing you can get at from the back is the RAM - https://www.ifixit.com/Guide/iMac+Intel+27-Inch+Retina+5K+Display+SSD+Replacement/30537.
It takes a special tool and a fair bit of force to unglue the glass, so it's probably best to get a specialist to do it for you. It may be cheaper to pay for Apple's SSD upgrade when you buy the machine.
There was a problem with TRIM being disabled by OSX on non-Apple SSDs. I think that's been sorted now.
When screen resolutions are quoted, be it 480, 576, 720 or 1080, doesn't that usually refer to the number of horizontal lines?
Which marketing dick decided that all of a sudden it was OK to use the number of vertical pixels to define this, started by the 4K misnomer?
Here, buy my new 6K screen, it's amazing (6000x480 pixels)
And yes, I know it wasn't an Apple idea.
Umm the number of horizontal lines and the number of vertical pixels are the same thing? There are 1080 horizontal lines of pixels in a 1920x1080 display, ergo there are 1080 pixels in a single vertical plane of alignment with one another...
I presume you mean the number of HORIZONTAL pixels in each horizontal line?
The 4K misnomer was just some shitty rounding of 3840 to the nearest thousand which, yes, was horizontal pixel count bollocks.
The television industry has traditionally quoted the number of lines (480p, 720p, 1080i) etc.
The film industry (or at least, the associated FX business) has traditionally quoted the number of columns, rounded to the nearest thousand (2k, 4k, etc).
For some reason the TV and computer industries have been switching to the convention of the film industry in recent times. Possibly because editing movies is one of the more important applications of screens like this, but mainly I suspect because the numbers are bigger this way and it sounds more impressive.
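The two conventions described above can be sketched in a few lines of Python; the resolution figures are the standard ones, but the helper function names here are made up for illustration:

```python
# Sketch of the two naming conventions: TV counts lines (height),
# film rounds the column count (width) to the nearest thousand.

def tv_name(width: int, height: int) -> str:
    """TV convention: quote the number of horizontal lines, i.e. the height."""
    return f"{height}p"

def film_name(width: int, height: int) -> str:
    """Film convention: round the width to the nearest thousand 'K'."""
    return f"{round(width / 1000)}K"

for w, h in [(1920, 1080), (3840, 2160), (5120, 2880)]:
    print(f"{w}x{h}: TV style {tv_name(w, h)}, film style {film_name(w, h)}")
```

The same 3840x2160 panel is "2160" by the TV convention but "4K" by the film one, purely by switching which axis you count - which is rather the point being made.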
I can't remember the source, but my recollection is that the difference is because in film production it's the width of the shot image that is constant, not the height (the height varies with masking). That width is set by the film stock used in shooting (16, 35, or 70mm).
Originally, shooting on 70mm film was essential for widescreen films, but as 35mm emulsions got better, and were then replaced by digital capture, it became possible to match the image quality of the much more expensive 70mm shooting system using the cheaper 35mm setups (the cost in 70mm isn't the film stock but the much better lenses needed), masking the top and bottom of the frame to get to a 1.78:1 (16:9) or 2.35:1 ("scope") ratio.
"For some reason the TV and computer industries have been switching to the convention of the film industry in recent times."
I'm pretty sure the reason is as explained in Dilbert some years ago. Bigger numbers *always* mean 'better' to the uneducated buyer. That makes it far easier to sell yet another expensive upgrade to consumers who really don't need it but like to have bigger model numbers and flashier kit than their neighbours.
It's all about making it sound fancy. That's why people were all taken in by posh-sounding names such as "FullHD" - usually not realising that such screens sometimes had *fewer* pixels than the 1920x1200 screens they were replacing. And don't get me started on standard "HD" screens. If ever there was a way of marketing 1366x768 as a good resolution...
Had a look at Apple UK for upgrade prices.
8GB 1600MHz DDR3 SDRAM - 2x4GB
16GB 1600MHz DDR3 SDRAM - 2x8GB [+ £160.00]
32GB 1600MHz DDR3 SDRAM - 4x8GB [+ £480.00] (How do they calculate that price?)
16GB 1600MHz DDR3 SDRAM - 2x8GB for £86.39 inc. VAT
So do a user memory upgrade to 32GB for little more than Apple's 16GB price, and keep the original 8GB as a spare.
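As a sanity check on that claim, here's the arithmetic, using only the prices quoted above (mid-2015 UK prices from the comments; nothing else is assumed):

```python
# RAM upgrade arithmetic, GBP, using the prices quoted above.
APPLE_16GB_UPGRADE = 160.00   # Apple: 8GB -> 16GB (2x8GB) option
APPLE_32GB_UPGRADE = 480.00   # Apple: 8GB -> 32GB (4x8GB) option
THIRD_PARTY_2X8GB  = 86.39    # third-party 2x8GB kit, inc. VAT

# DIY route: the 32GB option is 4x8GB, so the iMac has four slots.
# Two third-party 2x8GB kits fill them all, and the original 2x4GB
# sticks are freed up to keep or sell.
diy_32gb = 2 * THIRD_PARTY_2X8GB

print(f"Apple 16GB upgrade: £{APPLE_16GB_UPGRADE:.2f}")
print(f"DIY 32GB upgrade:   £{diy_32gb:.2f}")
print(f"Apple 32GB upgrade: £{APPLE_32GB_UPGRADE:.2f}")
```

£172.78 for 32GB against Apple's £160 for 16GB - "little more", as the comment says, and well under half Apple's £480 for the same 32GB.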
Usually because they have some piece of software that only runs under Windows that they need for some task.
Often there are equivalents under OS X, but they might lack functionality, not fully support the file format of shared files or the corporate bosses won't certify the OS X version and you have to run the Windows version.
A lot of business software is Windows only, although much of it is slowly moving to being cloud based (and I'm talking about bespoke software, ERP software, CRM etc. not Office), although SAP have had an OS X GUI for a few years now...
> SAP have had an OS X GUI for a few years
As well as various other UNIX-like operating systems ... however, they are inferior to the Windows SAPGui in a number of areas - process chains spring to mind - mainly because SAP built into SAPGui some "very good" process chain editor tool that requires the highly acclaimed ActiveX technology, which is only available on Windows.
So yes, most things can be done on Linux, Mac, Solaris Java-based SAPGui ... however, some things cannot.
"I'm surprised MS with its "Let's get our software on as many devices as possible" mantra hasn't planned to bring out a version of Windows 10 that will install straight on a Mac with no need for OSX to be installed."
You've been able to do that since... oh, I dunno, maybe 2007? Probably when Macs began to be based on Intel processors? I remember setting up a Mac Mini with WinXP at least 5 years ago, and it was an old Mac. It's not about Windows, but the computer's BIOS.
In a small Mac shop (seven users), one person here uses Windows. That doesn't qualify as "everybody," and of course they do it because some corporate drones bought "solutions" for timekeeping &c. that only run on Windows – despite corporate policy that all web apps must run on Windows, Macs, and Linux.
I'd prefer a Dell 5K screen with an MS Windows PC. Better performance for less outlay and lower lifetime cost. I'd expect the screen to outlast the PC, or at least for the latter to need upgrading fairly soon, especially given the embryonic state of graphics card support for 5K and the rapid fall in prices. E.g. the NVIDIA Titan X was launched a few months ago; last week the GTX 980 Ti was announced with about the same performance for half the price. Thus, with the same £2k budget you can now blow away the performance of the top-end iMac.
Dell's pro monitors are certainly something special... for colour accuracy and image quality, there's nothing in their price range to touch them, and they embarrass a couple of much more expensive competitors - particularly Apple.
I'd never buy an iMac simply because there's no way to use that display if the board inside it gives up, and it's not like the iMac range has a stellar reputation for longevity. Meanwhile, I'm still using a 14-year-old desktop LCD (Apple, as it happens) with my current PC, albeit as a secondary monitor to run mail, etc. on.
I'm just looking at the Dell UK site at the moment and I can't see anything above 3840 x 2160. As nice as they look, that's well below the resolution of this iMac. Do Dell do a monitor that matches it for resolution?
Edit: Okay, I found the UP2715K on the Dell US website. It seems to be the same spec as the iMac screen, except the price is $2,499, which is around £1,600. So, to get the very lovely monitor I have already spent the iMac price and now need to buy the PC box on top. How are Apple managing to offer what seems to be quite a good deal, or have I missed something?
how are Apple managing to offer what seems to be quite a good deal or have I missed something?
You have. The iMac screens are fairly mediocre in terms of image quality, even the Retina ones. Increasing the resolution doesn't fix the poor dynamic range and colour reproduction. If you put an iMac beside one of Apple's own standalone monitors, you can see the difference immediately (assuming that Apple hasn't cost-stripped these monitors since I last used one). But put one of Dell's professional 1440-line monitors beside that Studio Display and you'll see a better image again, and the Dell is cheaper. (there's no 5k Apple Display to compare with)
But the real saving is down the road: if you choose a separate monitor and computer, you can just swap out the computer and continue to use the monitor. Pretty much any LCD display has a working life of well over a decade. How long would you realistically get out of a non-user-upgradable PC with a middle-spec CPU in it?
This huge disparity between working lifetimes of a display and a desktop computer is why I don't like all-in-ones like the iMac range, and modern PCs are small enough now that I feel you lose more than you gain by building them into the display shell.
It's only 3ds Max that doesn't run on a Mac; everything else - Maya, Modo, Cinema 4D, LightWave, Houdini, Mudbox (stop sniggering!) - has a Mac version. All are equally capable; some more so.
You'll be telling us about CAD next. (AutoCAD, VectorWorks, MacDraft, ArchiCAD, Rhino, Cobalt, Siemens NX [as used by Apple] all run on OS X)
*Source: the actual manufacturers, unlike your searchgoogletillifindshithatbacksmeup statement...
Seems like you missed out quite a few big hitters there I'm afraid. Besides that, there's not a single significant piece of *OSX-only* 3d software in either of those lists.
BTW Max is still one of the most-used 3D packages, especially in visualisation. And AutoCAD is at least a version behind on the Mac, not to mention all the other missing Autodesk stuff.
Mac still has the edge for web and music people as far as I know though, so you are welcome to that lot if you like :)
At no point did I say that otherwise, I refuted your implied assertion that Macs aren't useful in the realm of 3D and preempted the inevitable CAD remarks. Now to take you to task on your (pointless) Wikilists;
Of the 37 listed programs, 12 aren't available for the Mac. Of those 12, only 3 are "big hitters": 3ds Max, SolidWorks and Solid Edge. I've deliberately left Softimage off that list as it has been discontinued. Yes, 3ds Max is popular. Your medal is in the post. You know what else is popular? Cinema 4D. As to which is better? Neither. It's the person operating them, dummy.
CAD has traditionally been weak on the platform, no arguments there; however, it has useful CAD software available. Your "features" bollocks is irrelevant vis-à-vis AutoCAD, as most people use it like an Etch A Sketch, and in all honesty MacDraft and Vectorworks are better options on the platform. PLM is poorly supported, but trust me, Siemens PLM is more than capable.
I deal with Autodesk on a daily basis, and let me tell you, they are the worst. They make the likes of Oracle and Adobe look saintly. Their software is buggy as shit and their support is next to useless. Pointless in fact. So it doesn't really further your argument that Wintel is better than Mac because Autodesk CAD and 3D.
BTW, visualisation is a niche of a niche. Son, I was CAD-monkeying before you were born.
What Handy Ploug said.
Generally, engineers used mainframes, moving to PCs only in the late 90s, so parametric CAD software lives on PCs. Engineering CAD software originated on $100k+ systems.
Traditionally, music and video has been strongly supported by Macs, so those guys use OSX.
'Visualisation' and 'Motion Graphics' are areas where there is some crossover twixt the two. The first CAD on Macs was of the arty, 'clay-like' variety: some workflows would want 3D graphics composited into a video, but the output was unlikely to be engineering drawings or CNC paths.
Now, in CAD itself there is a development independent of OS: the combining of 'feature-tree' and 'free form' methods in one package.
Shops might want to take engineering data and use it to create publicity videos... so there is some scope to push against the inertia (see Autodesk Fusion on OS X), but really this happens infrequently enough in the workflow that it isn't too much of a faff to boot into Windows from time to time.
In engineering and CAD, you often use what your client or customer uses. The whole 'Mac/PC' malarkey is a complete non-argument.
And then you have the new element: rent computer time off AWS - fuck it, our engineers around the world are working on the same models anyway.
I understand that they probably can't fit a full-sized Nvidia 980 or AMD R9 290X card into an all-in-one PC. But you might be better off waiting for the next refresh, as AMD will by then have greatly reduced the size of their cards through the use of High Bandwidth Memory (HBM).
On the other hand if you wait 6 months you can almost always get a much better computer...
I just can't understand why Apple don't include some form of video-in on these things? I'd take a bet that I'm not the only person out there wanting a Mac somewhere in between a Mini and a Pro in terms of spec, but needing a monitor I can still share with my Xbox and Windows laptop.
Biting the hand that feeds IT © 1998–2019