1379 posts • joined Thursday 18th June 2009 14:54 GMT
The PPC was pretty good stuff, for about five minutes
Hence e.g. Doom, PC, 1993: 320x200; Doom, Mac, 1993: 640x480. The problem for Apple/Motorola/IBM was that Intel have enough money to make any design problem go away. Starting from the superscalar integer/floating point pipelines of the Pentium, Intel have been able to make up whatever they lose in instruction set architecture through sheer feats of engineering, give or take the odd misstep.
Dodgy benchmarks or not, the PowerPC was pretty impressive for at least a couple of years.
Samsung isn't going to finish it
Take your blinkers off; at this point Samsung and Apple are behaving as badly as each other and neither's approach has any moral or ethical merit.
When they're necessarily available on FRAND licensing terms, technical patents don't achieve anything more than patents were ever meant to achieve — Samsung will be paid a fair amount. "Well, we've got some IP that we can have a court compel you to pay a sensible amount for" isn't much of a comeback to "We've got some IP that we can get a court to use to ban your product".
You're thinking of DOS versions, maybe?
This litigation concerns the allegation that Microsoft acted in such a way as to reserve the Windows word processor market for itself — both Word and WordPerfect for Windows were WYSIWYG applications from day one.
WordPerfect for DOS always used a bunch of colour cues to communicate formatting; I think it arrived at that compromise by being a bit of a platform slut, compensating by having the best printer support in the business. Word turned up quite a bit later with DOS as the only text-mode OS it supported, meaning it could tie itself much more closely to the hardware.
I haven't cried this much since Adric joined the cast.
Similar to the Streisand effect?
In that an attempt to suppress something ends up amplifying its ability to do damage. I continue to be unsure quite what Apple thought would be the likely outcomes of its actions.
I'm not sure that's exactly it...
... as conservatives tend strongly to believe in free markets whereas Apple explicitly curate their marketplace.
That said, I do agree with you that the quote you've copied was probably the most revealing part of the interview. Quite a lot of the commenters above don't quite seem to have bothered reading that far though, judging by the knee-jerk 'Woz is a hypocrite — look at how strict Apple is with its employees' comments.
I guess one argument could be that Apple succeeds as a company because it is somehow able to attract enough of the creatives while maintaining a strict business organisation? You know, navigating between the rhetorical poles of attracting a bunch of extremely creative people and never quite managing to pull everything together or being extremely good at money and organisation but managing to employ only routine thinkers.
No real company sits at either pole, of course, but Apple's particular trade-off does often sound unique.
There's got to be some highly technical definition of advert
The standard US half hour programme seems to be 20 or 21 minutes long. Similarly the standard hour programme seems to run for 40 to 42 minutes. Check any DVD you have. So, picking the first thing I can spot on the TV schedule, when Five show The Mentalist tonight between 9PM and 10PM, per the rules they have to find something like nine minutes of material that isn't adverts to show in between parts of the programme?
IDC already have Samsung in first place worldwide
And they're the only manufacturer that both runs a single-vendor ecosystem (in Bada) and participates in the multi-vendor ecosystems (of Android and WP7), so they're doing it by keeping a finger in every available pie. Forget what the platform advocates say - that's providing your customers with freedom of choice. Hats off to them.
It sounds to me like he's excusing it on the grounds that a court in the US found it legal after a discussion of US and UK law, and the UK courts have yet to express an opinion. So 'should be legal' is accurate only in the sense that they think it probably is legal but there's no direct authority. It would be inaccurate only if you meant 'should' as in 'isn't legal, but ought to be'.
See e.g. Shuey v US for an occasion where not only has a US court considered an issue of common law that's equally applicable in the UK and US first (contract in that case, but whatever), but has done so sufficiently convincingly that it's often cited here in the UK with a similar authority to genuine legal precedent.
They'd just move the manufacturing to Brazil
That being where at least some of the current-model iPhones are coming from. It also sounds like the owners of the mark want to do a deal, so the objective is to hurt Apple enough to get them to the table without damaging their prospects too much; getting a Chinese manufacturing ban (whether explicitly or not) wouldn't be a smart move unless they're down to brinkmanship.
Per their FAQ:
Is there a version of StyleTap for Android-based devices?
Not at this time, but we are evaluating the technical feasibility.
Also possibly of interest, the emulator is available officially without anything else bundled in for the iPhone via the usual Jailbreak stores.
I'm not willing to pay extra for it
At least, not on top of a DVD or BluRay. However, if I could pay a little extra on a cinema ticket and get UV access then I'd probably do that, even if it's a case of redeeming now and being given access only once the home versions become available.
Same experience here, albeit just nudging into the 90s. Our culprit was a 'temporary' and mostly wooden prefab that the school had acquired secondhand and which it decided to use as a permanent classroom. Memorable aspects, other than the cold, included an exploding lightbulb and someone falling through the floor, presumably both related to the damp.
They had a proper classroom built somewhere in the mid-90s, I think.
Puzzle Bobble is surely the best of them?
Or maybe not. It's all subjective.
They have a Puzzle Bobble machine at the Musee Mecanique in San Francisco — as it turns out I can still complete the thing on a single credit. I'd be surprised if I could do anything like that on Bubble Bobble, especially with that Space Invaders level. As for Rainbow Islands, I've just never been a fan, even though we had the Spectrum version back in the day.
How prevalent is NDK software?
I'm going to go out on a limb and guess that a reasonable proportion of big name Android games are built with the native development kit for the simple reason that they can reuse C and C++ stuff that can also be used to target iOS, Windows, consoles, etc. And such titles are going to be built for ARM, which means they won't run on MIPS devices.
With that in mind, surely a tablet like this is headed for some vocal consumer disappointment? It should be the app writers' problem because they'll have chosen not to use the normal Dalvik route and hence to provide processor neutral applications, but I'm not sure that subtlety is going to get across.
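The ABI mismatch described above can be sketched as a simple compatibility check. The helper below is purely hypothetical (it's not any real store or installer API) and just models the rule that bytecode-only apps run anywhere while NDK apps need a native library matching one of the device's ABIs:

```python
def can_install(device_abis, app_native_abis):
    """Hypothetical sketch of the compatibility rule: a pure-Dalvik app
    (no native libraries) runs on any processor, whereas an NDK app needs
    at least one native-library ABI that the device supports."""
    if not app_native_abis:
        return True  # processor-neutral: bytecode only
    return any(abi in device_abis for abi in app_native_abis)

mips_tablet = {"mips"}
arm_phone = {"armeabi", "armeabi-v7a"}

dalvik_game = set()            # no native code at all
ndk_game = {"armeabi-v7a"}     # ARM-only native libraries

print(can_install(mips_tablet, dalvik_game))  # True
print(can_install(mips_tablet, ndk_game))     # False: the disappointment above
print(can_install(arm_phone, ndk_game))       # True
```

Which is exactly the scenario in question: the Dalvik title installs happily on the MIPS tablet, the big-name NDK title doesn't.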
Even without knowing their UK price, I still think the Kindle is going to be the big thing.
Quite a few inaccuracies in there
The iPhone emails video out in the H.264 codec, which is an industry standard developed outside Apple. It uses the MOV container format, which was invented at Apple but then expanded to become the industry standard MP4. You should just be able to rename the files.
So: it's not in any sense a proprietary codec and the container format is an industry standard.
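Both container formats are built from the same 'box' structure (a 32-bit big-endian size followed by a four-character type code), which is why a straight rename usually works. A minimal sketch in Python, parsing a synthetic header rather than a real file:

```python
import struct

def read_boxes(data: bytes):
    """Parse top-level ISO BMFF / QuickTime boxes: each box begins with a
    4-byte big-endian size followed by a 4-character type code."""
    boxes = []
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        if size < 8:
            break  # malformed or extended-size box; stop for this sketch
        boxes.append((box_type.decode("ascii"), size))
        offset += size
    return boxes

# Synthetic example: a 20-byte 'ftyp' box declaring the QuickTime brand.
ftyp = struct.pack(">I4s4sI4s", 20, b"ftyp", b"qt  ", 0, b"qt  ")
print(read_boxes(ftyp))  # [('ftyp', 20)]
```

Real files can also use an extended 64-bit size (a size field of 1), which this sketch simply stops at; the point is just that MOV and MP4 share the same skeleton.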
Because Apple's QuickTime has become the basis for the industry standard that powers most video, including physical media formats like BluRay, and because Apple was the first company to provide a framework for video on the desktop, it's worthy of a "happy birthday" story.
They could be using it secretly
In that one of its modes is "raise to speak", i.e. you put the handset on your face and have a conversation with it exactly as though it were a real person, sometimes repeating yourself just like when you're talking to a real person.
That said, probably people just aren't using it. I know I wouldn't.
In agreement with Tony Smith
My Kindle, which has retroactively become a Kindle Keyboard, has come with me on a couple of trips to the US and a couple of trips to other countries in Europe and been fine. So that's eight trips through airport security with no issues, at least as hand luggage.
Surely they'd just lob a gooey blob somewhere near it?
That's not really fair
QuickDraw made it into Carbon but essentially was marked as deprecated from the initial launch of OS X. Quartz/Core Graphics, and those things abstracted away by the various NSViews, have been the recommended API since day one and Core Animation sits on top of Quartz to provide various transforms and animations in the compositor.
The long road to Core Text is probably the thing Apple should be most embarrassed about, but that was in place by 10.4 so it won't be what Mozilla are debating.
I'd imagine the issue is more Apple's zeal for cutting support for older OSs, both for end users and in their development tools. For various reasons there's no way to be confident that a build produced with the latest Xcode will work on 10.5, even if you've set the deployment target appropriately and written code that can cope with frameworks (or bits of frameworks) being unavailable, other than to test on 10.5 itself and then adjust compiler settings manually as appropriate. That's a lot of effort, either switching back and forth between machines or maintaining separate project files, because Xcode has been through a major overhaul in the interim.
Intel co-created and are pushing Thunderbolt. It's supposed to turn up on all Ultrabooks in the near future because of its value as a break-out connector that requires minimal physical space, both externally and on the motherboard.
That said, so far I think only Sony have actually put out a Thunderbolt-supporting computer that isn't a Mac, so Thunderbolt's ascension to a proper standard is far from a done deal.
I'm not sure 'nowadays' is accurate
It's been bloatware for as long as I can remember, going back to whatever version was supplied for Windows 3.1. It's been reported that when Apple were forced to graft Carbon onto OS X as a transition technology from the classic OS, they found an incomplete but much cleaner implementation of the usual QuickDraw et al in the Windows port of QuickTime and worked forward from that. I appreciate that the thing was meant to do a lot more than just play video, but throwing large chunks of an OS's system libraries in there sounds like it was the original offence.
At a guess, the culprit is whoever decided that QuickTime needed to be a 'multimedia platform' rather than just a video playback tool. Comparisons with Apple's feelings about Flash are entirely appropriate.
They've fixed it on the Mac side as of QuickTime X, by the way — it's a clean break reimplementation thing that really just plays a subset of the video codecs that classic QuickTime had accumulated with none of the wider aspirations. I've no idea how they would defend what they currently ship for Windows but I doubt the defence would be very convincing.
Surely it's worth kicking up a fuss so that these things will be fixed by next year? It's just an unfortunate family of software bugs so it's not like there's anybody arguing the opposite case, we just need to make manufacturers aware that we care.
Maybe buy some capacitive styluses?
They seem to cost about £8 and will work with any capacitive touch screen, whether on the iPad or any of the other finger-oriented tablets.
It's a great design for teaching literacy though
The BBC cost so much because of the impressive software and hardware engineering, the massive array of interfaces around the back and a modular design to the hardware, even inside the box. As stated in the article, the BBC is also significantly faster than most of its competitors in pure CPU terms — twice as fast as the C64, for example.
So it's a fantastic machine all around for teaching computer literacy. There are lots of ways to interface to it, the internal logic isn't sealed inside a single ULA and the operating system is an actual operating system, logically divided and well written.
Its main failing in the wider market, other than price, was that the video display was far too greedy with the available RAM. The OS takes something like 3.5kB for normal use, then often you lose a bit more to the disc filing system, so after subtracting another 20kB for any of the three highest-resolution display modes you're looking at trying to fit your entire program into something like 8kB. Compare that to the 41.25kB available for user code on the cheaper Spectrum. You could hit the BBC's CRTC directly to invent your own video mode that gives you more space (e.g. Elite reduces the width of the display, if I recall) but then you're definitely buying yourself problems when you come to do the Electron port.
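The arithmetic works out like this (the disc filing system overhead below is a rough guess for illustration; the real figure varied by filing system):

```python
# Rough memory budget for a 32K BBC Model B in one of the
# 20kB display modes, using the approximate figures above.
total_ram = 32.0        # KB in a Model B
os_workspace = 3.5      # KB taken by the OS in normal use
dfs_workspace = 0.5     # KB lost to the disc filing system: a guess, it varied
screen = 20.0           # KB for any of the highest-resolution display modes

user_space = total_ram - os_workspace - dfs_workspace - screen
print(user_space)       # 8.0 KB left for your entire program

# Versus the 48K Spectrum: 48KB minus a 6.75KB display file.
print(48 - 6.75)        # 41.25 KB for user code
```

Hence the temptation to reprogram the CRTC and claw some of that 20kB back.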
@Danny 14, etc
The poster appeared antagonistic because of his statement that he "wouldn't be surprised (sadly) after the 'consolidated.db' fuss."
The consolidated.db was a file on iPhones that cached information for location services. It was synchronised to your computer via iTunes. Due to a bug in the first few iterations of iOS 4 it accumulated data indefinitely rather than merely caching recent data. As a result, if a malicious user had access to your computer then he could extract a history of your movements going back to whenever you started using iOS 4.
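The difference between a well-behaved cache and the consolidated.db bug can be shown with a toy cache. The class below is illustrative only, not how iOS actually stored the data: with a bound it behaves like a cache, without one it quietly becomes a movement history.

```python
from collections import OrderedDict

class LocationCache:
    """Toy cache of location fixes keyed by timestamp. With max_entries
    set it behaves as a bounded cache that evicts old data; with
    max_entries=None it reproduces the consolidated.db-style bug of
    accumulating history indefinitely."""
    def __init__(self, max_entries=None):
        self.max_entries = max_entries
        self.entries = OrderedDict()

    def record(self, timestamp, fix):
        self.entries[timestamp] = fix
        if self.max_entries is not None:
            while len(self.entries) > self.max_entries:
                self.entries.popitem(last=False)  # evict the oldest fix

buggy = LocationCache()              # no eviction: keeps everything
fixed = LocationCache(max_entries=100)
for t in range(10_000):
    buggy.record(t, (51.5, -0.1))
    fixed.record(t, (51.5, -0.1))

print(len(buggy.entries))  # 10000: every fix ever recorded survives
print(len(fixed.entries))  # 100: only recent data survives
```

Same recording code, entirely different privacy outcome, which is why it reads as a bug rather than surveillance.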
That information wasn't collected for any purpose and it wasn't forwarded to anyone. In other words, it's completely unlike the application in this story, the offensive part of which is that it's deliberately collecting data and forwarding it.
So to say "I wouldn't be surprised if Apple have taken a deliberate conscious decision to monitor how its customers use their phones because, you know, they made a coding error once" is so nonsensical that it could be construed as deliberate flame bait.
Probably it's just that if you don't use an iPhone then you wouldn't pay that much attention to the specifics of any particular bug — the original author was correctly aware that the iPhone had previously made it possible for third parties to monitor users in some way and had incorrectly assumed malice.
I'm not sure it's a very good demo
It's understandable from a technical point of view, but most of the demo simply doesn't operate, the permitted interactions being little more than those mandated by the blue hints. It is a little better than a video, and I'm sure they'll get a lot more attention for having done it this way, but I'm not sure it makes the best impression.
If the test for obsolescence is that a newer product is in development...
... then every single electronics product on the planet has been obsolete at the time of launch, including the iPhone 4S, all future iPhones and all other phones.
They don't agree with your opinion, so it's all a fix?
And Apple refuse to invite El Reg to free press conferences but are paying them for reviews?
The iPhone 4S has no known antenna problems. The iPhone 4 had antenna problems and when reviewed on this site those problems were cited and it received only 75%. See here: http://www.reghardware.com/2010/07/02/review_smartphone_apple_iphone_4/
So your allegation is trivially false on the facts. It probably also says something about your level of paranoia that you've imagined a conspiracy out of thin air.
@Giles Jones (cf: @Arctic Fox, @fandom)
The world isn't actually divided into one group of people who say only positive things about Google and negative things about Apple and another who do only the opposite. So trying to suggest that, because you heard people complain about the fixed storage size of the iPhone and now you're hearing people say it's really not a problem on an Android, anyone who isn't critical of Android must be a hypocrite is unsupportable. All it means is that some people think fixed storage is a problem and some people don't.
Don't mention the other company!
Though, seriously, I don't think the 16GB limit is much of a problem; if anything it's a win for the consumer if it removes the distinction between internal and removable storage, making app management that little bit easier. I'll bet that most people don't use their phones for watching locally stored video, and 16GB is more than sufficient for a decent amount of music with enough spare to take a few photos and videos while out and about.
It's revisionist to factor Commodore into the same market as IBM and Apple, and in any case Commodore isn't relevant to the story presented.
The article is entitled "How Apple beat IBM in Steve Jobs' first retail war". It's a story about Apple versus IBM. It's framed like that because the two big beasts in the business computing market at the time were IBM and Apple. You have to rewrite history to pretend that the home and business markets were joined in 1984 — that wouldn't happen until the death of the proprietary home computers in the late 80s and early 90s.
The article is also written from the point of view of a computer specialist retailer. Apple dominated computer retail profits, the Apple II being the first computer to produce over $1bn in revenue in a year. The home suppliers, like Commodore in the US, piled them high and sold them cheap through general retailers like Sears. They didn't invest much in their sales presence in terms of aisle ends or literature, since those costs would have to be passed on, which makes them even more irrelevant to this specific story.
You have more patience than I do
I switched off at Need for Speed 3. The first had a really impressive feel, with weight to the physics and a tangible, sustained tension. I guess the tie-up with US magazine Road & Track and the 3DO demographics made them aim for a mature audience.
The sequel was more of the same, probably even the same code base, but by the third they seem to have decided to go mass market. It's all tunnels through lava-filled mountains and Ridge Racer-style handling.
It's almost impossible to believe now, but the only subsequent title I can think of that comes close to the first Need For Speed for tension is the original Burnout, which thrived on 20-minute races through panoramic vistas and, despite including the nonsense of a boost bar, had things set up so that you pretty much never got to use it. The ten-metre visibility and the desire to throw 'awesome' graphics effects at the camera didn't turn up until later.
Not a very persuasive argument
"It performs worse with Windows, this is probably a driver issue, therefore Apple did it on purpose because they hate me and have no respect for the marketplace"
Does the same conspiracy logic apply to every supplier that produces flawed driver software? It's just that the evidence of iTunes and of Apple's direction generally would appear to make it much more likely that they're just not very good at producing code for Windows.
Then maybe even a flag waver for open source?
In that most of the weirder ports are a result of the source code release in 1997. Though it made it to pretty much every console from the SNES onwards through standard commercial channels.
Of those I've played: the PlayStation, Jaguar and Gameboy Advance versions are very good, the Saturn version is passable, the 32X and 3DO versions are pretty bad and the politest thing I can think of to say about the SNES port is that it's a technical achievement.
The PC coming into its own
I guess we must have bought our first PC somewhere around 1992 and prior to seeing Doom I don't recall anything being obviously much better than what we'd had on the ZX Spectrum that preceded it, other than loading quickly and being more capable with colours. And although it's not particularly subtle, I still think it plays better than most other games.
Quite a lot of my professional life has ended up focussed on computer graphics and, more recently, computer vision. The sudden leap forwards of Doom is the reason.
In what sense has the iPhone steadily ceded ground?
iPhone sales volumes have monotonically increased since its introduction. Android sales volumes have also monotonically increased, but much more quickly. Android has something like two and a half times as much share of the current smartphone market but, in terms of sales volume, not at the expense of the iPhone. I expect the same story to repeat in the tablet world, but probably with the Kindle Fire doing the work of all the Android mobiles added together.
Most of those 'many people' are wrong
If Apple won brand loyalty only through advertising and PR and if advertising and PR were sufficient in and of themselves then the Mac wouldn't be stuck at 5% marketshare worldwide, and if the difference were just consumers versus businesses then you'd expect the Mac to be a hot target for the consumer-focussed sectors of the market. However, things like games usually don't get ported and, if they do, turn up months after the marketplace buzz about the product has long since subsided.
In summary: there's clearly some distinction between Apple's mobiles and tablets and the competition beyond merely the brand name and the advertising, as both of those also apply to the computers, and yet the former manage to rack up commanding market shares whereas the latter don't.
A story of greed and optimism?
Most 3D movies are shot in 2D and post-processed into 3D, which gives the cardboard cut-out effect that people either assume to be a limitation of the technology or take to mean they're in whatever percentage it is that the 3D effect doesn't work on. When the benefits of the technology are so underwhelming there's no point making a special effort for it, such as paying more to sit through a darker film while wearing oversized cheap plastic glasses, or making the effort to get it into your own home.
In addition, the majority of television watching is people putting it on in the background while they do something more interesting. About a third of most programme time is advertisements and television programmes are competently aimed at the lowest common denominator, and nowadays there's usually a laptop or a tablet nearby. So even if you have the TV on for several hours, showing programmes you enjoy, it's probably still not the main thing you're doing with your eyes.
The idea that if they all told us we really wanted 3d then obviously we'd buy it was ridiculously optimistic.
Or, for those of us without irrational prejudices...
... Shadowgun is identical between Android and iOS.
From a personal point of view, it's also shallow and boring. It's one for the self-proclaimed "real gamers" mainly.
Maybe posting an article that states that anyone reading it is either illiterate or stupid was the main source of offence?
Either that or it was an excellent idea for the amusing feedback.
Great news, but perhaps the timing isn't brilliant?
Some stories have previously linked Google's hoarding of the tablet versions of Android to a spat with Amazon, Amazon having decided to ship a tablet that takes advantage of Google's software engineering without connecting to any of Google's services, substituting Amazon services and branding, and also being the first high profile company to try to obtain a serious foothold in Android app provision (ie, so as to displace Google).
Google have released this code exactly on the Kindle Fire's launch day. It's a fantastic gift for anybody that wants to believe in an Amazon/Google spat.
Of course, it's also a fantastic gift for anyone that likes open software. So kudos to Google.
Have you considered opening a Dr Who desk?
This seems to be your third Who story in a week or so. Kudos.
Like many above, Hugh Laurie would be my default choice, especially as I think there was pretty serious talk of House ending at the end of this year, making him available. Otherwise, maybe Ben Whishaw? He seems to be pretty good at just about everything. If they could get Eccleston back, even to play the character in a completely different way, I'd be very happy indeed.
I can't think of any Americans that are suitable for the part, though I think that may just be that modern American film writing doesn't allow for a hero that's also the smartest person in the room. That'd be too elitist, right?
It's not dependent on cloud services
The [flawed, tedious] process is:
(1) pay Apple $24.99
(2) get iTunes to identify all of your music to the cloud
(3) wipe your iPod/iPhone
(4) download such of your music as you want from the cloud; download the rest at any other time. Wipe your original copies of the music if you really want, since they'll still be available from the cloud.
There's no need for ongoing connectivity to listen to your music. However, if you have any serious amount of music then it sounds like it's going to take absolutely hours to move to iTunes Match, and it will seriously impact your bandwidth and whatever data caps you may have.
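A back-of-envelope calculation, with entirely made-up but plausible figures, shows why the initial move takes hours even when most of a library matches without uploading:

```python
# Back-of-envelope upload time for moving a library to iTunes Match.
# All three figures below are assumptions; adjust for your own setup.
library_gb = 20            # library size
matched_fraction = 0.8     # tracks matched from the catalogue: no upload needed
upload_mbps = 2            # typical ADSL upstream of the era

to_upload_gb = library_gb * (1 - matched_fraction)
hours = (to_upload_gb * 8 * 1024) / (upload_mbps * 3600)
print(round(hours, 1))     # ~4.6 hours just for the unmatched 20%
```

And that's before counting the fingerprinting pass over the whole library, or what 4GB of upstream traffic does to a capped connection.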
Per wikipedia, it's "a rock musical about a fictional rock and roll band". Similarly, IMDB describe it as being the story of a "transexual punk rock girl from East Berlin [who] tours the US with her rock band".
The 6502's not so bad
You've just got to think of it as a load/store architecture, with the zero page acting like the register bank in other machines and accesses everywhere else being expensive. You end up doing most of your business logic with the two-or-three cycle instructions acting on the zero page and occasionally wander into the elaborate four-upward cycle instructions to fetch tabular data. Oh, and I guess you have to get used to the slightly weird one's complement subtraction but it ends up just being a carry inversion since all arithmetic is with carry.
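The carry-inversion point can be modelled in a few lines of Python. This is a sketch of binary-mode SBC only, ignoring decimal mode and the overflow flag:

```python
def sbc(a, operand, carry):
    """Model of 6502 SBC in binary mode: subtraction is really
    A + (operand XOR 0xFF) + C, so the carry acts as an inverted
    borrow. You set carry (SEC) before a subtraction, and it stays
    set afterwards if no borrow occurred."""
    result = a + (operand ^ 0xFF) + carry
    return result & 0xFF, 1 if result > 0xFF else 0

# SEC then SBC: 10 - 3 with carry set gives 7, carry still set (no borrow).
print(sbc(10, 3, 1))   # (7, 1)
# 3 - 10 borrows: the result wraps and the carry comes back clear.
print(sbc(3, 10, 1))   # (249, 0)
```

Once you internalise that the carry is just an upside-down borrow, multi-byte subtraction chains exactly like multi-byte addition does with ADC.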
I prefer the Z80, but I think that's just because I know them only through the home computers, and the popular 6502 machines always confused the issue with their video circuitry: the 6502's relatively poor random memory access speeds seemingly made people back away from just giving it a framebuffer in a sensible order.