1936 posts • joined 18 Jun 2009
Re: This is going to take some getting used to (@Ian Yates)
While probably true, iTunes is still a bit too much of a feature mess to be described as merely a media manager. It's a media manager that also manages applications, handles streaming (both local and remote), performs device synchronisation — including content that inherently belongs to other applications — and has only recently stopped being a social media client.
I'm an advocate for splitting it up but I guess Apple want to be able to give Windows users who buy iPods, iPhones, etc, a single thing to install (and, for whatever reason, won't just make that thing 'a driver').
This is going to take some getting used to
Optimistically for Windows users, the interface looks like a complete rewrite, albeit one possibly motivated by someone incessantly asking "can't we make it any brighter?". The binary isn't substantially different in size on the Mac, though, so let's not get too excited.
The advertised 'albums' view seems like a non-starter to me unless you have only about six of them, but 'artists' works quite well and 'songs' provides the classic layout if you want it. Colour icons are back in the sidebar, but the bar itself is no longer essential to navigation. The new control icons at the top are a huge improvement — the silhouette-destroying circles they used to sit in are gone, making everything visually clearer. It's also good that they've made the mini-player more obvious, but it now seems to have lost its progress bar, which is a shame.
Re: Fascinating, but very sad ..
Being temporarily in the US, I was at Thanksgiving last week and heard much the same thing — having tried a couple of more local sources first, both of the people I was talking to ended up following the BBC's web coverage that day due to a combination of quality and accessibility.
Re: VCD's, etc (@RAMChYLD, others)
I'm not sure how CD Video or Video CDs match Whyfore's description of "something ... which seems to be a precursor to VCDs that look a lot like vinyl discs (or maybe they're just massive VCDs" — wouldn't those be like VCDs but exactly the same size (and in one case, exactly VCDs)?
More constructive question: does anyone else remember the Reel Magic, an MPEG-1 decoding expansion card for PCs circa 1994? Other than Video CDs, I think Return to Zork had a version that supported it, but that's about all. I once saw it being demonstrated with a standard retail copy of Top Gun as evidence that fast motion sequences weren't a problem, but the detail of the story was that somebody had spent weeks painstakingly tweaking the compression of that title. So not a fantastic sales pitch.
Re: VCD's (@Whyfore)
Are you thinking of Laserdiscs maybe? They're interesting because, despite the laser and its usual connotations, they're an analogue format — the video is always analogue and the sound was originally analogue but later could be digital, in exactly the same format as a CD. Since there was no compression and the resolution was about double that of VHS, the video quality was really very good.
You got at most 60 minutes of content per side, so you actually needed to swap discs more often than with Video CDs — though usually that just meant flipping them over, and high-end players could do that for you, often by having two read heads (like a floppy drive) rather than by physically moving the disc.
There were a bunch of weird approximately LP sized video formats in the late 70s and early 80s; for a real oddity look up the Capacitance Electronic Disc, which is grooves read by a stylus just like a record.
Re: If Commodore didn't mismanage itself to the early grave...
It feels unlikely to me that a single vendor could have maintained a market lead over a diverse array of competitors. The competing technology catches up and the competition forces the prices down. DOS was particularly far behind, making the situation look worse than usual, but I'm confident Microsoft or somebody else on the PC would have taken the crown by now.
I guess the real shame is that OS/2 wasn't ready in about 1984. IBM's focus on supporting mainly its own hardware would have reined in the PC architecture a little, and putting a proper OS between software and hardware would probably have brought us stuff like intelligent video hardware a lot earlier.
Re: Microsoft won the way they know - dishonesty and fraud.
Didn't Micrografx Mirrors [more or less] allow Windows source to be compiled into an OS/2 application? Though I'll concede it was meant to be a stopgap emulation-ish layer rather than a tool that'd actually go in and adjust your source so that you were subsequently working on a native OS/2 application.
Per Inside Macintosh Volume 3, Page 18 (ie, Apple's official documentation):
"The pixel clock rate ... is 15.6672 MHz, or about .0642 microseconds (µsec) per pixel. For each scan line 512 pixels are drawn on the screen, requiring 32.68 µsec. The horizontal blanking interval takes the time of an additional 192 pixels, or 12.25 µsec. Thus, each full scan line takes 44.93 µsec, which means the horizontal scan rate is 22.25 kilohertz.
"A full screen display consists of 342 horizontal scan lines, occupying 15367.65 µsec, or about 15.37 millisecond (msec). The vertical blanking interval takes the time of an additional 28 scan lines — 1258.17 µsec, or about 1.26 msec."
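For what it's worth, the arithmetic hangs together; a quick sketch (Python, with the clock rate and pixel counts taken straight from the quote — the variable names are mine):

```python
# Sanity check of the Inside Macintosh video timing figures quoted above.
PIXEL_CLOCK_HZ = 15_667_200            # 15.6672 MHz dot clock
US_PER_PIXEL = 1e6 / PIXEL_CLOCK_HZ    # ~0.0638 microseconds per pixel

ACTIVE_PIXELS = 512   # visible pixels per scan line
HBLANK_PIXELS = 192   # horizontal blanking, measured in pixel times
ACTIVE_LINES = 342    # visible scan lines per frame
VBLANK_LINES = 28     # vertical blanking, measured in line times

line_us = (ACTIVE_PIXELS + HBLANK_PIXELS) * US_PER_PIXEL  # ~44.93 us per line
hscan_khz = 1e3 / line_us                                 # ~22.25 kHz horizontal rate
frame_us = (ACTIVE_LINES + VBLANK_LINES) * line_us        # whole frame, incl. blanking
refresh_hz = 1e6 / frame_us                               # ~60.15 Hz refresh

print(f"{line_us:.2f} us/line, {hscan_khz:.2f} kHz, {refresh_hz:.2f} Hz")
```

The derived ~60.15 Hz is the classic Mac's slightly-off-from-60 vertical refresh rate.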
Per the GIMP FAQ: "For some industries, especially photography, 24-bit colour depths (8 bits per channel) are a real barrier to entry"
It's therefore explicitly not good enough for a lot of photographic work, per its own documentation. The good news is that the developers are fixing it, and I believe deserve credit for being upfront about the deficiency. If it were ordinary commercial software I'm sure the FAQ would disingenuously argue that nobody needs more than eight bits per channel.
Re: i'll stick with Maps thanks (@AC)
And Nokia's master stroke was ensuring that all the errors cited are present not just in Apple's app but also on the website, in the Android app, etc?
Re: Worthy but sluggish (@Steve Knox)
Nokia's app clearly eschews the built-in APIs. The inertial scrolling has the wrong inertia, double tap to zoom is painfully forced and clearly using the wrong animation curve, and all other controls are obviously custom (eg, on an iOS button you can put your finger down, drag outside, drag back inside and release and the button activates; in Nokia's app it activates only if you lift your finger without leaving the box).
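For the curious, the native behaviour I'm describing boils down to checking where the finger is at release time, not whether it ever strayed. A toy sketch (Python; the class and its names are entirely my invention, not any real API — real UIKit tracking is rather more involved):

```python
class ButtonTracker:
    """Toy model of touch tracking: a tap fires only if the touch both
    started inside the button and the finger is inside it on release."""

    def __init__(self, bounds):
        self.bounds = bounds      # (x, y, width, height)
        self.tracking = False

    def _inside(self, x, y):
        bx, by, bw, bh = self.bounds
        return bx <= x < bx + bw and by <= y < by + bh

    def touch_down(self, x, y):
        self.tracking = self._inside(x, y)

    def touch_up(self, x, y):
        fired = self.tracking and self._inside(x, y)
        self.tracking = False
        return fired              # True means the button activates

# Down inside, drag out of the box, drag back, release inside: activates.
button = ButtonTracker((0, 0, 100, 40))
button.touch_down(10, 10)
assert button.touch_up(90, 20)        # released inside, so it fires
button.touch_down(10, 10)
assert not button.touch_up(200, 20)   # released outside, so it doesn't
```

The behaviour I saw in Nokia's app would instead cancel tracking the moment the finger left the bounds.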
I think the problem is more that when compared to Apple's app the Nokia effort is slow, blurry, responds incorrectly to user input and is in some areas barely functional. When compared to Google's data, Nokia's is noticeably incomplete and often inaccurate.
I think those flaws plus the puff piece this article amounts to are leading to the negative tone.
One of Android's early selling points was that it's much more open than iOS. Yet it took Adobe two years — three from the launch of the iPhone — to build a suitable version of Flash, which it then turned around and cancelled barely more than a year later.
From that you have to conclude that even if Apple had wanted Flash on the iPhone in 2007, there's no way it could have happened. So I just don't agree with Allaire's claim that Jobs killed Flash on iOS. Whatever the real technical reasons, Apple needed an argument for why not having Flash was not a disadvantage.
Re: Talk about failure to comprehend! (@Christopher Michaelis)
Yes, until they find themselves in a country with nationalised healthcare like, ummm, pretty much all of them.
On the contrary; the Republican Party got just 25% of the Hispanic vote this year. Hence if current population growth patterns continue and neither party changes rhetoric then Texas will be a swing state by 2020.
Democrats therefore probably don't want secession. In practice none of us should want secession because keeping Texas in the union is the most likely way to bring the Republican party back towards the centre, and it's helpful to everyone when the most economically and militarily powerful nation on earth has moderate leadership.
Re: NEC 8088 clone
The V20 was also superior to the 8088 in that it had an 8080 compatibility mode, though I'm aware of exactly one application that used it — a CP/M-80 emulator for MS-DOS.
Re: Motorola 6800 inspired the MOS 6502 (@AC)
My understanding is that the 6500 was pin compatible with the 6800, since MOS Technology hoped to be able to walk up to Motorola's customers and sell the 6500 as not requiring any wider system changes. Motorola obviously had something to say about that, especially as Chuck Peddle — chief designer of the 6500 — had previously been a Motorola employee on the 6800 team, suggesting a trade secrets angle (spurious, but beginning the action was enough in itself to do the desired damage). MOS backed down and pushed the 6502, identical to the 6500 except for the pinout.
I don't think the two processors shared any internal design features; they're not semantically equivalent (different numbers and sizes of registers, different addressing modes, different ways of handling decimal arithmetic — just a different instruction set overall) and certainly aren't binary compatible.
The 6502 was important thanks to Commodore, Apple and Atari but it was an Intel 8080 that powered the Altair 8800, the genre-defining home computer, and also the 8080 that CP/M was originally defined around. And in the UK it was the Z80 — an improved 8080 from the same team, albeit as a different company — that ran the ZX80 and the ZX81, which started home computing there. There are also a raft of other notable Z80 machines, not least the Spectrum, the Colecovision, the Master System and, approximately, the GameBoy.
I guess there are various alternative strands, like the 6502 inspiring (in at least a couple of senses) the creation of the ARM, or the 68000 and, after it, the PowerPC (which, though gone from the desktop, powers the major consoles), but I disagree that you can write Intel out of the computer and video game market.
Re: "the momentum behind NFC is pretty much unstoppable"
My phone doesn't do NFC but my debit card does — indeed it's the sort of thing that's quite easy to add to a debit card compared to speakers and microphones. I've also been in several places around London that can receive NFC card payments, mostly sandwich chains and pubs where the £15 limit before you have to use chip and pin is often not a problem.
So based on my anecdotal experience, there's some momentum behind NFC. It's going into cards and becoming available in retailers.
However I've yet to experience anybody using it on a phone or any situation where I wished my phone could use it.
As a function of the number of franchises, we used to be limited to O(n) releases a year. Thanks to the pioneering work of companies like Rovio and LucasArts, it looks like we're working towards O(n^2). Thank goodness!
Re: Should be interesting (@Destroy All Monsters)
Based on the voting it seems the humourless are out in force, so I'll spell out the meaning of my previous post very slowly indeed: patenting something this obvious is contrary to established manners. Obvious patents appear to be taking hold across the industry. The idea of writing such patents is therefore innovative according to the dictionary. Furthermore there's irony in the way that pushing boundaries in one area is holding another back and in the dissonance with Apple's claims of innovation.
Re: Should be interesting
One way to fit the dictionary definition of innovative is to do something that is contrary to established manners and which establishes new customs.
So it's very possible that Apple's patents are innovative.
I think the article just means to claim that the prospective screen supplier has started tooling up to manufacture the screen, not that Apple have started filling warehouses.
The PPI makes it all sound too unlikely to be worth paying attention to in any case. I guess the real argument is that they'd triple the current PPI (which would make 489), putting nine pixels everywhere one currently is, but all that'd really do versus a doubling of the PPI is confer boasting rights, and Apple never seems to need an excuse to boast.
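For the record, the numbers (a trivial Python sketch; 163 PPI is my assumed current density):

```python
current_ppi = 163                      # assumed current panel density
results = {}
for factor in (2, 3):
    new_ppi = current_ppi * factor     # linear pixel density after scaling
    per_old_pixel = factor ** 2        # new pixels in each old pixel's area
    results[factor] = (new_ppi, per_old_pixel)
    print(f"{factor}x linear -> {new_ppi} PPI, "
          f"{per_old_pixel} pixels where one was")
```

So a tripling quadruples-and-then-some the pixel count of a mere doubling, for very little perceptible gain.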
There is a marketing budget at play here, but it's spent by Apple talking directly to consumers. El Reg then covers alternatives as a counterpoint and because people are interested. There's no reason to suspect a conspiracy.
I guess it's always possible that when those "people familiar with the company's research" say that Apple is "exploring ways" to use its own silicon in future Macs, what's actually being proposed is the consolidation of every non-CPU function into a single chip, keeping the Intel processor and moving to a [pretty much] two-chip design? They're already doing soldered RAM and SSD, the iPad designs are triple-sandwiched CPU+GPU+RAM chips, and peering at the current MacBook Pro motherboard on Google Images appears to show tens of chips across the board.
They're already using the on-board GPUs but I guess that building RAM+SSD into a single unit would be a good saving? It doesn't feel like something that's likely to come onto the market from anyone else.
I think that the most popular use for Boot Camp, Parallels, VMware, etc is probably to play games. They're still routinely released for Windows but not for the Mac. Having a quick glance at the current Amazon charts, if you exclude Windows 8 then the first thing not available for the Mac is "Honestech VHS to DVD 5.0 Deluxe" at number 24 (though, in fairness, I think not all variations of Quicken are available).
That said, per NetApplications Apple had 4% market share before it switched to Intel; more than six years later that's moved up to 7.2%. Linux has remained just below 1% across the entire period. There's basically no chance of Linux overtaking the Mac even if Apple were to make some drastically unpopular change — there seems to be a glass ceiling Linux can't break through, while the Mac has managed to endure regardless of mismanagement.
Re: Sheltered Life
While I agree that the iPad's unique selling proposition isn't related to the specifications, and disagree that it's merely slavish brand devotion, I've also found the Nexus 7 to be pretty good. The one I used was fast and felt robust and well constructed. While metal's nice to the touch, the big dent in the back of my (perfectly functional) iPad 1 does prove that it has downsides too; I'm pretty sure the slightly rubberised Nexus 7 would have survived the same drop with no lasting effect whatsoever.
Re: No surprise
People have been criticising the Windows port of iTunes, with good justification, for nearly a decade, so Apple has definitely not been listening. What surprises me is that the push back against Safari had a very quick effect — it speedily switched from Apple's custom font and window rendering to fitting in properly via Windows' native controls and text — but iTunes has clung on.
I'm going to imagine, despite having no real evidence, that the current iTunes is the last resting place of Carbon, which is probably less distantly descended from QuickTime for Windows than you'd hope. In my fantasy world the new release will sweep that aside, preferring whatever technological basis seems to work for Safari and dumping the heft. I'm happy in my world.
Ummm, I think MaFt was making the point that feature comparison charts from biased sources are usually a little ridiculous by supplying some absurd suggestions. He doesn't deserve the negative votes.
Not particularly accurately, though
Is it really taking the high ground to state that a screen larger than 720p — and with about 77% as many pixels as what Amazon calls 'HD' — is "standard definition", and to imply that the iPad has no Wi-Fi? I'm also unclear how they arrive at the claim that the iPad can't be used for viewing HD movies or TV in the sense that the Kindle can.
If I were Amazon I'd have pushed the pixel density (for tick box feature completists) and price (for normal people) issues harder, ignored the more tenuous claims and put an icon for whatever maps application Amazon ships into the screenshot as a dog whistle. Since it's a Kindle-branded product, boasting about the Amazon digital book library versus the iBooks storefront would probably also have been appropriate.
Re: Samsung more successful than Apple? Apple throwing it's toys out of the pram in the UK Court
Hasn't Samsung always been more successful than Apple? Per the Internet Samsung usurped HP to become the number one technology company when measuring by sales back in 2009.
I think the story is more interesting in terms of trends — Apple's numbers appear more or less stagnant (albeit quite healthy) while Samsung's are growing quickly.
Re: Really disappointed
Assuming we're talking just amongst Retinas, I can easily see your point of view. Take the stock 13", add the build-to-order 256GB storage upgrade to give it storage parity with the stock 15", and you're only $200 away. But for moving up you get a quad-core CPU rather than a dual core and a discrete GPU, no doubt more than making up for the 0.2GHz drop in CPU clock speed.
Relative to the price of the machine as a whole, that's a bargain.
Re: Just wish
Then the correct conclusion is that you don't know what anticompetitive is.
Microsoft were selling one kind of product, very successfully. Netscape came along and started selling a separate kind of product, very successfully. Microsoft used the money and resources from the one kind of product to force Netscape from the market. In short they used a dominant position in one market to distort competition in another. Actual damage was done to real consumers.
Apple doesn't have a dominant position to abuse. It hasn't used resources in one market to force anyone out of another. The competition for everything it does is very healthy. If you, as a consumer, don't like the way Apple is working then there are lots of other options with similar market clout. The free market is functioning.
Re: The only snag...
While that's a valid criticism, there's something of a chicken-and-egg situation backing it up — the Mac rarely gets triple-A titles at the same time as other platforms, letting it lag in GPU power. Of the current Mac App Store top 10, the [budget] ports of the Grand Theft Auto 3 series are probably the most GPU-taxing. Widening the net to the top 25 brings a couple of Call of Duty titles into consideration, but neither has a more recent initial release than 2010.
So there's empirical evidence that the GPU is more than good enough for the majority of Apple's customers, even though it's not about to attract any serious gamers.
Re: very easy to do on other hardware
Intel's solution is to use SSD as cache; Apple appears to be talking about actually locating files on the SSD _instead of_ on the hard disk. So it's not a matter of one physical address being made faster, it's a matter of data being moved from one address range to another.
Assuming Apple's comments today were more than mere marketing puff, the system presents the two things as one drive and then manages it all for you. Analysis I've now seen elsewhere suggests that 10.7's CoreStorage acts to make a single virtual address space for all drives and the OS then shuffles the physical mapping based on whatever metrics it thinks are relevant. But the software makes two hardware things look like one rather than the one hardware thing secretly being more complicated inside.
So two physical drives are merged at the same mount point?
That'd require some significant rewiring within HFS+, wouldn't it? I mean, it's obviously trivial to have, say, /Applications on the SSD and /Users on the platter, but from the announcements it sounds like they're talking about moving individual files between them while still having them appear to software to be in the same place?
Probably I'm relying on faulty information.
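To illustrate the kind of thing I mean — and this is pure speculation about the mechanism, not how CoreStorage actually works — a toy two-tier volume might promote files to the SSD by read frequency while the paths presented to software never change (Python; every name, policy and threshold here is invented):

```python
# Toy model of a two-tier volume: hot files live on the SSD, cold files
# on the hard disk, and software-visible paths are unaffected by moves.
class FusionVolume:
    def __init__(self, ssd_capacity):
        self.ssd_capacity = ssd_capacity
        self.files = {}  # path -> {"size": int, "hits": int, "tier": str}

    def write(self, path, size):
        self.files[path] = {"size": size, "hits": 0, "tier": "hdd"}
        self.rebalance()

    def read(self, path):
        self.files[path]["hits"] += 1
        self.rebalance()
        return self.files[path]["tier"]

    def rebalance(self):
        # Promote the most-read files to the SSD until it is full;
        # everything else stays on (or returns to) the hard disk.
        used = 0
        for path, meta in sorted(self.files.items(),
                                 key=lambda kv: -kv[1]["hits"]):
            if used + meta["size"] <= self.ssd_capacity:
                meta["tier"] = "ssd"
                used += meta["size"]
            else:
                meta["tier"] = "hdd"

vol = FusionVolume(ssd_capacity=100)
vol.write("/Applications/Big.app", 80)
vol.write("/Users/me/archive.zip", 80)
for _ in range(5):
    vol.read("/Applications/Big.app")
# The frequently read file ends up on the SSD; its path never changed.
```

The interesting part is exactly what the comment thread above is asking: the filesystem has to tolerate a file's physical home changing underneath a stable logical location.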
Re: Nonexistent Nexus?
Indeed, my main thoughts have been 'is $100 extra over the Nexus worth it for a 40+% bigger screen?' — I wonder if we're about to see an inversion of the usual squabbling over whether screen sizes matter above all else?
In 2011 96% of Google's income came from advertising (source: http://venturebeat.com/2012/01/29/google-advertising/). Even if Android were the only other one of Google's income streams, it would still be insignificant in comparison in terms of revenue and in any case I think that trying to close it up and squeeze more money by that route would be counterproductive. It's certainly not in itself going to lead to the sort of growth that is going to offset advertising losses.
So I'm confident that Android is safe exactly as it is.
Re: This is ridiculous (as is the usual drivel coming out Business Insider)
The full framework documentation is at http://developer.apple.com/library/ios/#documentation/DeviceInformation/Reference/AdSupport_Framework/_index.html — if an advertiser wants explicitly to post a location then the app will have to request and be approved for location updates. Otherwise all they're getting is the "alphanumeric string unique to each device, used only for serving advertisements. [...] the same value is returned to all vendors. This identifier may change—for example, if the user erases the device—so you should not cache it."
Your 'opt out' appears equivalent to the Do Not Track HTTP header, in that the advertising agent gets told that you don't want to be tracked and is then merely honour bound (or possibly legally bound, depending on your country) to obey. No technical barrier is erected. At best I guess Apple may implement some sort of vetting system for app approval.
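To illustrate the honour-system point — a deliberately crude Python sketch, with every class and field name invented — nothing technical stops a misbehaving network from ignoring the flag:

```python
import uuid

class AdRequest:
    def __init__(self, advertising_id, limit_tracking):
        self.advertising_id = advertising_id  # resettable per-device ID
        self.limit_tracking = limit_tracking  # the user's opt-out flag

class HonestAdNetwork:
    def __init__(self):
        self.profiles = {}

    def serve(self, req):
        if not req.limit_tracking:  # honours the flag...
            self.profiles.setdefault(req.advertising_id, []).append("hit")
        return "advert"

class RogueAdNetwork(HonestAdNetwork):
    def serve(self, req):           # ...but nothing enforces that
        self.profiles.setdefault(req.advertising_id, []).append("hit")
        return "advert"

req = AdRequest(str(uuid.uuid4()), limit_tracking=True)
honest, rogue = HonestAdNetwork(), RogueAdNetwork()
honest.serve(req)
rogue.serve(req)
assert req.advertising_id not in honest.profiles  # flag respected
assert req.advertising_id in rogue.profiles       # flag ignored
```

Both networks serve the advert either way; the only difference is whether a profile quietly accumulates.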
Re: I hate sites / apps that bury settings like this
I'm not sure this is entirely true, though my grounds are slender: I can never find anything in any reasonable amount of time within my iPhone's settings. The whole layout seems completely counterintuitive. As a result, I don't think we can assume malicious intent from the simple fact of a new setting being in a very strange place.
Obvious examples: why is Auto-Lock a 'general' setting but Brightness & Wallpaper a top-level setting? Why is iTunes Match under 'Music' rather than under 'iCloud'? How is asking the phone right now to check for a software update a setting at all?
They desperately need an OS X-style search bar, I think.
Re: And all Apple has to do ?
Unless they foolishly introduce an iPad Mini in the tiny gap between iPod Touch and iPad, of course. Then they've no real flexibility to drop the price of one thing without having to drop the price of the whole lot. If the choice is taking a hit across the range, from the lowest iPod up, or waiting in the expectation that Microsoft fails on its own merits, I think it's likely they'll try the latter.
What's wrong with Pages, Numbers and Keynote?
They're as good in isolation as most other word processors, spreadsheets and presentation software, and as compatible with Microsoft Office as most things that aren't Microsoft Office (ie, reasonably but less so since Microsoft started creating fonts specifically for its idiosyncratic kerning and declining to license them).
I like to imagine that Microsoft's belated entry has made a lot of people realise that Office-equivalent functionality will do, whether on iOS or Android, in much the same way that IBM-compatible became an acceptable alternative to buying IBM — especially once IBM took forever to release a 386.
Re: Decimation or decim8shun?
Per the OED, decimate has had the meaning of 'devastate' since at least 1663; every single use of the word in the British National Corpus as maintained by Oxford University uses it in that sense, though they all seem to be from the 1990s or newer so that's not a fantastic argument.
It's the idea that it's being used incorrectly that's the modern invention. It's up there with the idea that the missing possessive apostrophe on 'its' is a special case* among false beliefs that have somehow gripped the public imagination.
* it's not — check the other personal pronouns; 'one's' is the special case.
Re: Epoch making?
I always thought the approximate birth of Christ (leaving the question of divinity explicitly aside), the accession of Richard the 1st, the French Revolution and possibly even January the 1st 1970 were fairly important dates. Little did I suspect that a new laptop was going to sweep our current era aside.
I have my fingers crossed for the sake of the original author that his words have suffered an unduly literal translation.
Surely the question is: if the alleged iPad Mini is £199 then who's going to bother with an iPod Touch? The Nano and the Shuffle are already there for jogging, living in the glove compartment, etc.
Re: So, despite rest-of-world being utterly wrong (@GettinSada)
Aspect ratios aside, I'll bet the new device won't simply run existing apps at a physically smaller size.
My logic is that the result would be a user experience train wreck, Apple has put significant effort into how layouts resize under iOS 6 (struts and springs are gone; you now specify arbitrary constraints) and it would be very uncharacteristic not to arm twist the developer base into adopting the latest technologies.
Re: ABOUT TIME!!! Really? (@AC 16:37)
If implemented correctly it should be imperceptible — so much effort has gone into engineering mobile phone software to be power efficient that spare processing capacity is available quite often.
We should be grateful that Google, with a vested interest in doing it well, is stepping up before manufacturers start shovelling on their own solutions. Have a wander around PC World to see the worst possible outcome.
Re: Painting themselves into a corner
If iOS is a copy of Android because it has lifted some good ideas (the notification area being the most obvious) then Android is a copy of iOS for the same reason (eg, pinch to zoom). Neither is inherently in the wrong, and I think it's really only ever cited as an issue because Apple insists on being so litigious.
OS X isn't built on BSD. It also isn't bug ridden.
History says Windows 8 won't bury Apple. Windows 95 and 98 were light years ahead of System 7 in a lot of important areas — preemptive multitasking and memory protection sound like tedious tech wedge issues but substantially improve the user experience. Regardless, Apple survived.
The Linux distros don't make OS X old fashioned any more than the Apple Magic Track Pad or whatever it's called makes mice look old fashioned. Simply being different isn't the test.
Hackintoshes are a solution for, what, the most technical 5% of people?
Samsung and Apple phones and tablets are basically indistinguishable. The fact that developing software for them is common now, and that tech types like yourself therefore ascribe greater significance to brands, hasn't much changed how people pick their devices; nobody outside Internet forum types thinks of either as a Great Satan. Ironically, it's the people who most complain about Apple users slavishly following the company that most strongly define themselves by a brand — it just happens that they're defining themselves in opposition. In any case they're a tiny subset of society. Based on the value proposition, Apple probably deserves some segment of the market — say 10% or maybe even 15% — and will probably end up profitably maintaining that segment.
Apple has almost $100bn in cash reserves. That's the biggest cash hoard in corporate America, and more than twice that of Microsoft. Even if nobody buys another Apple product or service ever again, they're going nowhere for a very long time.
Re: ACTUALLY (@AC 17:06)
For the purposes of anecdotes, I've seen the purple fringe too, with a lot of light coming from the top of the frame but no light source in or near the shot.
Conversely, there's no green glow, the battery seems to last about twice as long as that in my 4S (though I'm comparing one after several months of development use, with the huge number of part charge cycles that result from plugging in and unplugging the device, to the other more or less straight) and if there are any other hardware complaints doing the rounds then I can't claim to have experienced them.
Re: Too right, typical Apple
@Destroy All Monsters: The 68020 is definitely 32-bit, no matter which way you cut it — 32-bit instruction set architecture, 32-bit data bus, 32-bit address bus. The 68000 was considered 16-bit at the time because of the 16-bit data bus though I'm not sure it'd be classified that way now as it had the full 32-bit ISA.
@Gaius: the MultiFinder was released in 1987 as part of System 5. In performance terms the Mac had the edge as of the Mac II (also 1987), since the Amiga stayed at ~7MHz on a 68000 until 1990, whereas Apple more than doubled that clock rate and switched to the 68020. For GUI tasks the original Mac also outpaces the Amiga because the CPU is slightly faster and the display used for the GUI — being limited to 1bpp — is a lot more compact, and hence faster to manipulate. So there's really no tenable argument about performance.
Apple weren't shy in charging very high prices and often lagged in features (like, you know, colour) but it's rewriting history to suggest that there was "no contest".