1838 posts • joined 18 Jun 2009
Fantasy World Dizzy perhaps?
The best one in my opinion, and the third in the main series. One of the very first puzzles was giving stale bread to a rat and there was a painting of Treasure Island Dizzy in the first above-ground room.
I've never noticed Linux types get upset over companies making a profit from products that use Linux — such as Android — or from the vanilla sale of Linux-based OSes, as Red Hat, SUSE and many others have done. They get upset if companies use Linux and fail to respect the GPL by withholding their modifications, but as that's a licence violation I think that's understandable.
As for Comet, it sounds like a simple contractual dispute. Comet obviously thought they had the right to manufacture those discs. It'll be interesting to see what happens.
The Mac was a game changer because...
... because it was a software platform almost from day one with no pretence of hardware compatibility from one machine to the next, and was lucky enough to become the best supported platform in the 80s and early 90s for high-profile, market-leading productivity software like Photoshop and PageMaker, at least partly because QuickDraw was very good software and a good functional match for PostScript.
The Amiga was at least five years ahead of its time and is rarely given the amount of credit it deserves but I don't think you have to take away from Apple to give to Commodore.
Incidentally, I've been reading 'Commodore: A Company on the Edge' since it was recommended by another commenter here, and part of the problem seems to be West Coast bias. History has so far been written by Silicon Valley types; Commodore were based in Pennsylvania. I think there may also be US bias at work, since the Amiga was a much bigger success for Commodore internationally than domestically.
(aside: your chronology is off. The Amiga 1000 launched in July 1985, a full year and a half after the Mac 128k, which launched in January 1984)
If I recall, didn't the disk drive have a complete additional 6502 all of its own? So not only could you write your own turbo loaders there too, but I think the time to read a whole floppy could be reduced from the something-like-ten-minutes of Commodore's code to a very reasonable less than twenty seconds?
I'd class it as a computer with some significant caveats — and in a hugely different category from the C64 — but a computer nonetheless. Besides matching the dictionary definition of a computer, it also satisfies all the criteria for being a personal computer per Wikipedia, which I'm taking as a reasonable approximation of what an average Internet user thinks a personal computer is.
I'd distinguish it from a console based on its demographics, especially its penetration into business use, and the software people are buying for it. E.g. Pages, Apple's word processor, remains the top grossing iPad application and rarely drops from the top ten applications sold by volume.
So while I agree that the C64 is the best selling device of all time in an extremely broad category, I don't think it's still the best selling computer.
Sorry, not true
I hate to have to invoke the example, but if you really want to broaden the category as far as possible then, e.g., Apple has sold more than 20m iPad 2s (based on conservative figures) - and I've no idea about the other iOS devices. They're computers both per the dictionary and in any objective terms, I think, due to the existence of an aftermarket in software that includes home and business productivity software alongside entertainment and games.
Commenters around here being what they are: I thought to check Apple's numbers because they are a modern anomaly in throwing everything behind just one model for a lengthy period. I'm explicitly not trying to say anything about the relative worth of Apple's devices.
TV user interfaces are rubbish
Look at Sky's for example — you can't see the description for anything other than the programme you're currently watching without retreating from television entirely into the full-screen menus and there's no quick way to flick through just your favourites. Navigating the guide in general feels like wading through treacle compared to almost any of the mobile phone or tablet apps that do the same thing.
What we'd ideally have though is just TVs or decoders that sit on the home network and expose their functionality by a web service, for such second-screen apps as the user cares to use. So I don't just replace the remote control with my phone, I replace everything apart from the display of the television signal. Ideally I'd like to be able to stream my recorded programmes too. Companies like Samsung, DirecTV and Tivo (including via Virgin in the UK) are essentially edging towards that but keeping the protocols private and/or proprietary.
I actually think Apple can fix television in the same way they fixed the mobile phone — by launching a joined-up solution as a premium product for a certain segment of the population and hence giving everyone an appetite for how it should be done. Then let the Android analogue come along and pull the rest of the market up to speed.
Your statistics, while mostly true, give a false picture.
Apple's share has decreased but its overall sales continue to increase. The smartphone market is just nowhere near saturation. The huge disparity in profits between Apple's App Store and the Android Marketplace suggests Apple is still acquiring a large share of the customers it wants.
The iPhone actually still is the single best-selling handset worldwide. However, Apple are no longer the largest single supplier of mobile phones.
Based on El Reg's headlines, notable lawsuits in the last week have been those launched by Apple, Samsung and Microsoft.
On the contrary, it isn't being sold, it's being given away and MAME's legal information doesn't define a redistribution — the app store download could easily be argued to be part of a redistribution, not a complete redistribution in itself. So they'd be fine if the source is offered by other means. Indeed, if the iOS code has been contributed to the main branch of the project then the app store release isn't even a redistribution.
Re: the trade mark, there's no reason to suppose MAME haven't given approval and whoever uploaded it will have provided assurances to Apple to the effect that they have approval at the point of upload.
If the MAME team haven't granted approval and say so to Apple then it will be taken down. But they won't have been wrong to allow it through in the first place — it's not Apple's place to do legal background checks on all products and in any case MAME's own terms don't make it clear that this release isn't permissible.
The PPC was pretty good stuff, for about five minutes
Hence e.g. Doom, PC, 1993: 320x200; Doom, Mac, 1993: 640x480. The problem for Apple/Motorola/IBM was that Intel have enough money to make any design problem go away. Starting from the superscalar integer/floating point pipelines of the Pentium, Intel have been able to make up whatever they lose in instruction set architecture through sheer feats of engineering, give or take the odd misstep.
Dodgy benchmarks or not, the PowerPC was pretty impressive for at least a couple of years.
Samsung isn't going to finish it
Take your blinkers off; at this point Samsung and Apple are behaving as badly as each other and neither's approach has any moral or ethical merit.
When they're necessarily available with FRAND licensing terms, technical patents don't achieve anything more than patents are meant to achieve — Samsung will be paid a fair amount. "Well we've got some IP that we can have a court compel you to pay a sensible amount for" isn't much of a comeback to "We've got some IP that we can get a court to use to ban your product".
You're thinking of DOS versions, maybe?
This litigation concerns the allegation that Microsoft acted in such a way as to reserve the Windows word processor market for itself — both Word and WordPerfect for Windows were WYSIWYG applications from day one.
WordPerfect for DOS always used a bunch of colour cues to communicate formatting but I think it arrived at that compromise by being a bit of a platform slut and compensated for it by having the best printer support in the business. Word turned up quite a bit later and DOS was the only text-mode OS it supported, meaning they could tie themselves much more closely to the hardware.
I haven't cried this much since Adric joined the cast.
Similar to the Streisand effect?
In that an attempt to suppress something ends up amplifying its ability to do damage. I continue to be unsure quite what Apple thought would be the likely outcomes of its actions.
I'm not sure that's exactly it...
... as conservatives tend strongly to believe in free markets whereas Apple explicitly curate their marketplace.
That said, I do agree with you that the quote you've copied was probably the most revealing part of the interview. Quite a lot of the commenters above don't quite seem to have bothered reading that far though, judging by the knee-jerk 'Woz is a hypocrite — look at how strict Apple is with its employees' comments.
I guess one argument could be that Apple succeeds as a company because it is somehow able to attract enough of the creatives while maintaining a strict business organisation? You know, navigating between the rhetorical poles: attracting a bunch of extremely creative people but never quite managing to pull everything together on one side, and being extremely good at money and organisation while managing to employ only routine thinkers on the other.
No real company sits entirely at either pole, of course, but Apple's trade-off does often sound unique.
There's got to be some highly technical definition of advert
The standard US half hour programme seems to be 20 or 21 minutes long. Similarly the standard hour programme seems to run for 40 to 42 minutes. Check any DVD you have. So, picking the first thing I can spot on the TV schedule, when Five show The Mentalist tonight between 9PM and 10PM, per the rules they have to find something like nine minutes of material that isn't adverts to show in between parts of the programme?
IDC already have Samsung in first place worldwide
And they're the only manufacturer that both runs a single-vendor ecosystem (in Bada) and participates in the multi-vendor ecosystems (of Android and WP7), so they're doing it by keeping a finger in every available pie. Forget what the platform advocates say - that's providing your customers with freedom of choice. Hats off to them.
It sounds to me like he's excusing it on the grounds that a court in the US found it legal after a discussion of US and UK law, and the UK courts have yet to express an opinion. So 'should be legal' is accurate only in the sense that they think it probably is legal but there's no direct authority. It would be inaccurate if you meant 'should' in the sense of 'isn't legal but ought to be'.
See e.g. Shuey v US for an occasion where not only has a US court considered an issue of common law that's equally applicable in the UK and US first (contract in that case, but whatever), but has done so sufficiently convincingly that it's often cited here in the UK with a similar authority to genuine legal precedent.
They'd just move the manufacturing to Brazil
That being where at least some of the current model iPhones are coming from. It also sounds like the owners of the mark want to do a deal, so the objective is to hurt Apple enough to get them to the table, not so much as to damage their prospects; getting a Chinese manufacturing ban (whether explicit or not) wouldn't be a smart move unless they're down to brinkmanship.
Per their FAQ:
Is there a version of StyleTap for Android-based devices?
Not at this time, but we are evaluating the technical feasibility.
Also possibly of interest, the emulator is available officially without anything else bundled in for the iPhone via the usual Jailbreak stores.
I'm not willing to pay extra for it
At least, not on top of a DVD or BluRay. However, if I could pay a little extra on a cinema ticket and get UV access then I'd probably do that, even if it's a case of redeeming now and being given access only once the home versions become available.
Same experience here, albeit just nudging into the 90s. Our culprit was a 'temporary' and mostly wooden prefab that the school had acquired secondhand and which it decided to use as a permanent classroom. Memorable aspects, other than the cold, included an exploding lightbulb and someone falling through the floor, presumably both related to the damp.
They had a proper classroom built somewhere in the mid-90s, I think.
Puzzle Bobble is surely the best of them?
Or maybe not. It's all subjective.
They have a Puzzle Bobble machine at the Musee Mecanique in San Francisco — as it turns out I can still complete the thing on a single credit. I'd be surprised if I could do anything like that on Bubble Bobble, especially with that Space Invaders level. As for Rainbow Islands, I've just never been a fan, even though we had the Spectrum version back in the day.
How prevalent is NDK software?
I'm going to go out on a limb and guess that a reasonable proportion of big name Android games are built with the native development kit for the simple reason that they can reuse C and C++ stuff that can also be used to target iOS, Windows, consoles, etc. And such titles are going to be built for ARM, which means they won't run on MIPS devices.
With that in mind, surely a tablet like this is headed for some vocal consumer disappointment? It should be the app writers' problem, because they'll have chosen to skip the normal Dalvik route and hence not to provide processor-neutral applications, but I'm not sure that subtlety is going to get across.
Even without knowing their UK price, I still think the Kindle is going to be the big thing.
Quite a few inaccuracies in there
The iPhone emails video out in the H.264 codec, which is an industry standard developed outside Apple. It uses the MOV container format, which was invented at Apple but then expanded to become the industry standard MP4. You should just be able to rename the files.
So: it's not in any sense a proprietary codec and the container format is an industry standard.
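The 'just rename the files' claim can be sketched in a couple of lines; `mov_to_mp4` is a hypothetical helper name, and the approach works only because the MP4 container was derived from QuickTime's, so many players accept either extension over the same bitstream:

```python
from pathlib import Path

def mov_to_mp4(path: Path) -> Path:
    """Rename a .mov file to .mp4; the bytes inside are untouched.

    This relies on the player accepting a QuickTime-derived container
    under the .mp4 extension, which is common but not guaranteed.
    """
    new_path = path.with_suffix(".mp4")
    path.rename(new_path)
    return new_path
```

For files that a strict player rejects, remuxing (rewrapping the same streams into a fresh MP4 container) rather than renaming is the safer route.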
Because Apple's QuickTime format has become the basis for the industry standard that powers most video, including solid-media formats like Blu-ray, and because Apple was the first company to provide a framework for video on the desktop, it's worthy of a "happy birthday" story.
I'm not sure 'nowadays' is accurate
It's been bloatware for as long as I can remember, even going back to whatever version was supplied for Windows 3.1. It's been reported that when Apple were forced to graft Carbon onto OS X, as a transition technology from the classic OS, they found an incomplete but much cleaner implementation of the usual QuickDraw/etc stuff in the Windows port of QuickTime and worked forward from that. I appreciate that the thing was meant to do a lot more than just video, but throwing large chunks of an OS's system libraries in there sounds like the real offence.
At a guess, the culprit is whoever decided that QuickTime needed to be a 'multimedia platform' rather than just a video playback tool. Comparisons with Apple's feelings about Flash are entirely appropriate.
They've fixed it on the Mac side as of QuickTime X, by the way — it's a clean break reimplementation thing that really just plays a subset of the video codecs that classic QuickTime had accumulated with none of the wider aspirations. I've no idea how they would defend what they currently ship for Windows but I doubt the defence would be very convincing.
They could be using it secretly
In that one of its modes is "raise to speak", i.e. you put the handset on your face and have a conversation with it exactly as though it were a real person, sometimes repeating yourself just like when you're talking to a real person.
That said, probably people just aren't using it. I know I wouldn't.
In agreement with Tony Smith
My Kindle, which has retroactively become a Kindle Keyboard, has come with me on a couple of trips to the US and a couple of trips to other countries in Europe and been fine. So that's eight trips through airport customs with no issues, at least as hand luggage.
Surely they'd just lob a gooey blob somewhere near it?
That's not really fair
QuickDraw made it into Carbon but essentially was marked as deprecated from the initial launch of OS X. Quartz/Core Graphics, and those things abstracted away by the various NSViews, have been the recommended API since day one and Core Animation sits on top of Quartz to provide various transforms and animations in the compositor.
The long road to Core Text is probably the thing Apple should be most embarrassed about, but that was in place by 10.4 so it won't be what Mozilla are debating.
I'd imagine the issue is more Apple's zeal for cutting support for older OSes, both for end users and through their development tools. For various reasons there's no way to be confident that a build produced with the latest Xcode will work on 10.5, even if you've set the deployment target appropriately and written code that copes with frameworks (or bits of frameworks) being unavailable, other than to test on 10.5 itself and then adjust compiler settings manually as appropriate. That's a lot of effort, either switching back and forth between machines or maintaining separate project files, because Xcode has been through a major overhaul in the interim.
Intel co-created and are pushing Thunderbolt. It's supposed to turn up on all Ultrabooks in the near future because of its value as a break-out connector that requires minimal physical space, both externally and on the motherboard.
That said, so far I think only Sony have actually put out a Thunderbolt-supporting computer that isn't a Mac, so Thunderbolt's ascension to a proper standard is far from a done deal.
Surely it's worth kicking up a fuss so that these things will be fixed by next year? It's just an unfortunate family of software bugs so it's not like there's anybody arguing the opposite case, we just need to make manufacturers aware that we care.
Maybe buy some capacitive styluses?
They seem to cost about £8 and will work with any capacitive touch screen, whether on the iPad or any of the other finger-oriented tablets.
It's a great design for teaching literacy though
The BBC cost so much because of the impressive software and hardware engineering, the massive array of interfaces around the back and a modular design to the hardware, even inside the box. As stated in the article, the BBC is also significantly faster than most of its competitors in pure CPU terms — twice as fast as the C64, for example.
So it's a fantastic machine all around for teaching computer literacy. There are lots of ways to interface to it, the internal logic isn't sealed inside a single ULA and the operating system is an actual operating system, logically divided and well written.
Its main failing in the wider market, other than price, was that the video display was far too greedy with the available RAM. The OS takes something like 3.5kb for normal use, and you often lose a bit more to the disc filing system; subtract another 20kb for any of the three most memory-hungry display modes and you're trying to fit your entire programme into something like 8kb. Compare that to the 41.25kb available for user code on the cheaper Spectrum. You could hit the BBC's CRTC directly to invent your own video mode that gives you more space (e.g. Elite reduces the width of the display, if I recall) but then you're definitely buying yourself problems when you come to do the Electron port.
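As a rough worked budget (the figures are approximate, and the filing-system workspace is a placeholder: real disc filing systems claimed varying amounts):

```python
# Approximate RAM budget for a 32kb BBC Model B in a 20kb display mode.
TOTAL_RAM_KB = 32.0
OS_WORKSPACE_KB = 3.5    # OS variables, buffers, etc.
DFS_WORKSPACE_KB = 0.5   # assumed; varies by filing system
SCREEN_KB = 20.0         # modes 0, 1 and 2 each claim 20kb

user_kb = TOTAL_RAM_KB - OS_WORKSPACE_KB - DFS_WORKSPACE_KB - SCREEN_KB
print(user_kb)  # 8.0 kb left for the user's entire programme
```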
@Danny 14, etc
The poster appeared antagonistic because of his statement that he "wouldn't be surprised (sadly) after the 'consolidated.db' fuss."
The consolidated.db was a file on iPhones that cached information for location services. It was synchronised to your computer via iTunes. Due to a bug in the first few iterations of iOS 4 it accumulated data indefinitely rather than merely caching recent data. As a result, if a malicious user had access to your computer then he could extract a history of your movements going back to whenever you started using iOS 4.
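Because it was an ordinary SQLite file, extracting the history needed nothing exotic. As a sketch only: the `WifiLocation` table and its column names here are assumptions drawn from published analyses of consolidated.db, and the real schema varied by iOS version.

```python
import sqlite3

def dump_location_history(db_path):
    """Return (timestamp, latitude, longitude) rows from the cache.

    Table and column names are assumptions based on published
    analyses of consolidated.db; the actual schema may differ.
    """
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(
            "SELECT Timestamp, Latitude, Longitude "
            "FROM WifiLocation ORDER BY Timestamp"
        ).fetchall()
    finally:
        conn.close()
```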
That information wasn't collected for any purpose and it wasn't forwarded to anyone. In other words, it's completely unlike the application in this story, the offensive part of which is that it's deliberately collecting data and forwarding it.
So to say "I wouldn't be surprised if Apple have taken a deliberate conscious decision to monitor how its customers use their phones because, you know, they made a coding error once" is so nonsensical that it could be construed as deliberate flame bait.
Probably it's just that if you don't use an iPhone then you wouldn't pay that much attention to the specifics of any particular bug — the original author was correctly aware that the iPhone had previously made it possible for third parties to monitor users in some way and had incorrectly assumed malice.
You are a troll
Apple have never blocked a web site before or in any other way filtered web content.
I'm not sure it's a very good demo
It's understandable from a technical point of view, but most of the demo doesn't actually operate: the permitted interactions amount to little more than those prompted by the blue hints. It is a little better than a video, and I'm sure they'll get a lot more attention because of the way they've done it, but I'm not sure it makes the best impression.
If the test for obsolescence is that a newer product is in development...
... then every single electronics product on the planet has been obsolete at the time of launch, including the iPhone 4S, all future iPhones and all other phones.
They don't agree with your opinion, so it's all a fix?
And Apple refuse to invite El Reg to free press conferences but are paying them for reviews?
The iPhone 4S has no known antenna problems. The iPhone 4 had antenna problems and when reviewed on this site those problems were cited and it received only 75%. See here: http://www.reghardware.com/2010/07/02/review_smartphone_apple_iphone_4/
So your allegation is trivially false on the facts. It probably also says something about your level of paranoia that you've imagined a conspiracy out of thin air.
@Giles Jones (cf: @Arctic Fox, @fandom)
The world isn't actually divided into one group of people who say only positive things about Google and negative things about Apple and another who do only the opposite. Because you heard people complain about the fixed storage size of the iPhone, and now hear people say the same thing isn't a problem on an Android device, it doesn't follow that anyone who isn't critical of Android must be a hypocrite. All it means is that some people think fixed storage is a problem and some people don't.
Don't mention the other company!
Though, seriously, I don't think the 16GB limit is much of a problem — if anything it's a win for the consumer if it removes the distinction between internal and removable storage, making app management that little bit easier. I'll bet that most people don't use their phones for watching locally stored video, and 16GB is more than sufficient for a decent amount of music and photos, with enough spare to take a few photos and videos while out and about.
It's revisionist to factor Commodore into the same market as IBM and Apple, and in any case Commodore isn't relevant to the story presented.
The article is entitled "How Apple beat IBM in Steve Jobs' first retail war". It's a story about Apple versus IBM. It's framed like that because the two big beasts in the business computing market at the time were IBM and Apple. You have to rewrite history to pretend that the home and business markets were joined in 1984 — that wouldn't happen until the death of the proprietary home computers in the late 80s and early 90s.
The article is also written from the point of view of a specialist computer retailer. Apple dominated computer retail profits, the Apple II being the first computer to produce over $1bn in revenue in a year. The home suppliers, like Commodore in the US, piled them high and sold them cheap through general retailers like Sears. They didn't invest much in their sales presence in terms of aisle ends or literature, since those costs would have to be passed on, which makes them even less relevant to this specific story.
You have more patience than I do
I switched off at Need for Speed 3. The first had a really impressive feel, with weight to the physics and a tangible, sustained tension. I guess the tie-up with US magazine Road & Track and the 3DO demographics made them aim for a mature audience.
The sequel was more of the same, probably even the same code base, but by the third they seem to have decided to go mass market. It's all tunnels through lava-filled mountains and Ridge Racer-style handling.
It's almost impossible to believe now, but the only subsequent title I can think of that comes close to the first Need for Speed for tension is the original Burnout, which thrived on 20-minute races through panoramic vistas and, despite including the nonsense of a boost bar, had things set up so that you pretty much never got to use it. The ten-metre visibility and the desire to throw 'awesome' graphics effects at the camera don't turn up until later.
Technically it was prototyped on NeXT machines. They got it all written once then rewrote the hard stuff in x86 assembler.
Then maybe even a flag waver for open source?
In that most of the weirder ports are a result of the source code release in 1997. Though it made it to pretty much every console from the SNES onwards through standard commercial channels.
Of those I've played: the PlayStation, Jaguar and Game Boy Advance versions are very good, the Saturn version is passable, the 32X and 3DO versions are pretty bad and the politest thing I can think of to say about the SNES port is that it's a technical achievement.
The PC coming into its own
I guess we must have bought our first PC somewhere around 1992 and prior to seeing Doom I don't recall anything being obviously much better than what we'd had on the ZX Spectrum that preceded it, other than loading quickly and being more capable with colours. And although it's not particularly subtle, I still think it plays better than most other games.
Quite a lot of my professional life has ended up focussed on computer graphics and, more recently, computer vision. Doom's sudden leap forward is the reason.
Not a very persuasive argument
"It performs worse with Windows, this is probably a driver issue, therefore Apple did it on purpose because they hate me and have no respect for the marketplace"
Does the same conspiracy logic apply to every supplier that produces flawed driver software? It's just that the evidence of iTunes and of Apple's direction generally would appear to make it much more likely that they're just not very good at producing code for Windows.
In what sense has the iPhone steadily ceded ground?
iPhone sales volumes have monotonically increased since its introduction. Android sales volumes have also monotonically increased, but much more quickly. Android has something like two and a half times as much share of the current smartphone market but, in terms of sales volume, not at the expense of the iPhone. I expect the same story to repeat in the tablet world, but probably with the Kindle Fire doing the work of all the Android mobiles added together.
Most of those 'many people' are wrong
If Apple won brand loyalty only through advertising and PR, and if advertising and PR were sufficient in and of themselves, then the Mac wouldn't be stuck at 5% market share worldwide; and if the difference were just consumers versus businesses then you'd expect the Mac to be a hot target for the consumer-focussed sectors of the market. However, things like games usually don't get ported and, if they do, turn up months after the marketplace buzz about the product has long since subsided.
In summary: there's clearly some distinction between Apple's mobiles and tablets and the competition beyond merely the brand name and the advertising, since both of those also apply to Apple's computers, yet the former rack up commanding market shares where the latter don't.
A story of greed and optimism?
Most 3d movies are shot in 2d and post-processed into 3d, which gives the cardboard cut-out effect that people either assume to be a limitation of the technology or take to mean they're in whatever percentage of the population the 3d effect doesn't work on. When the benefits of the technology are so underwhelming, there's no point making a special effort for it — such as paying more to sit through a darker film while wearing oversized cheap plastic glasses, or going out of your way to get it into your own home.
In addition, the majority of television watching is people putting it on in the background while they do something more interesting. About a third of most programme time is advertisements and television programmes are competently aimed at the lowest common denominator, and nowadays there's usually a laptop or a tablet nearby. So even if you have the TV on for several hours, showing programmes you enjoy, it's probably still not the main thing you're doing with your eyes.
The idea that if they all told us we really wanted 3d then obviously we'd buy it was ridiculously optimistic.