1859 posts • joined 18 Jun 2009
Intel co-created and are pushing Thunderbolt. It's supposed to turn up on all Ultrabooks in the near future because of its value as a break-out connector that requires minimal physical space, both externally and on the motherboard.
That said, so far I think only Sony have actually put out a Thunderbolt-supporting computer that isn't a Mac, so Thunderbolt's ascension to a proper standard is far from a done deal.
I'm not sure 'nowadays' is accurate
It's been bloatware for as long as I can remember, even going back to whatever version was supplied for Windows 3.1. It's been reported that when Apple were forced to graft Carbon onto OS X, as a transition technology from the classic OS, they found an incomplete but much cleaner implementation of the usual QuickDraw/etc stuff in the Windows port of QuickTime and worked forward from that. I appreciate that the thing was meant to do a lot more than just play video, but throwing large chunks of an OS's system libraries in there sounds like the original offence.
At a guess, the culprit is whoever decided that QuickTime needed to be a 'multimedia platform' rather than just a video playback tool. Comparisons with Apple's feelings about Flash are entirely appropriate.
They've fixed it on the Mac side as of QuickTime X, by the way — it's a clean break reimplementation thing that really just plays a subset of the video codecs that classic QuickTime had accumulated with none of the wider aspirations. I've no idea how they would defend what they currently ship for Windows but I doubt the defence would be very convincing.
Surely it's worth kicking up a fuss so that these things will be fixed by next year? It's just an unfortunate family of software bugs so it's not like there's anybody arguing the opposite case, we just need to make manufacturers aware that we care.
Maybe buy some capacitive styluses?
They seem to cost about £8 and will work with any capacitive touch screen, whether on the iPad or any of the other finger-oriented tablets.
It's a great design for teaching literacy though
The BBC cost so much because of the impressive software and hardware engineering, the massive array of interfaces around the back and a modular design to the hardware, even inside the box. As stated in the article, the BBC is also significantly faster than most of its competitors in pure CPU terms — twice as fast as the C64, for example.
So it's a fantastic machine all around for teaching computer literacy. There are lots of ways to interface to it, the internal logic isn't sealed inside a single ULA and the operating system is an actual operating system, logically divided and well written.
Its main failing in the wider market, other than price, was that the video display was far too greedy for the available RAM. The OS takes something like 3.5 KB for normal use, then often you lose a bit more to the disk filing system, so if you subtract another 20 KB for any of the three most RAM-hungry display modes you're looking at trying to fit your entire program into something like 8 KB. Compare that to the 41.25 KB available for user code on the cheaper Spectrum. You could hit the BBC's CRTC directly to invent your own video mode that gives you more space (eg, Elite reduces the width of the display, if I recall) but then you're definitely buying yourself problems when you come to do the Electron port.
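To put rough numbers on that budget, here's a sketch using the figures above plus an assumed Acorn DFS workspace of about 2.75 KB (exact values vary with OS version and filing system):

```python
# Rough RAM budget for a 32 KB BBC Model B, using the figures from the
# comment above. The DFS workspace value is an assumption; a cassette-only
# machine would lose nothing to a disk filing system.
TOTAL_RAM_KB = 32.0
os_workspace = 3.5      # the OS claims ~3.5 KB for normal use
dfs_workspace = 2.75    # assumed disk filing system workspace
display_mode = 20.0     # modes 0-2 each take a 20 KB framebuffer

free_for_program = TOTAL_RAM_KB - os_workspace - dfs_workspace - display_mode
print(f"Left for your program: {free_for_program} KB")  # prints: Left for your program: 5.75 KB
```

With a disk system fitted you end up even tighter than the ~8 KB cited for a cassette machine, which is why hitting the CRTC to shrink the display was such a tempting trick.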
@Danny 14, etc
The poster appeared antagonistic because of his statement that he "wouldn't be surprised (sadly) after the 'consolidated.db' fuss."
The consolidated.db was a file on iPhones that cached information for location services. It was synchronised to your computer via iTunes. Due to a bug in the first few iterations of iOS 4 it accumulated data indefinitely rather than merely caching recent data. As a result, if a malicious user had access to your computer then he could extract a history of your movements going back to whenever you started using iOS 4.
That information wasn't collected for any purpose and it wasn't forwarded to anyone. In other words, it's completely unlike the application in this story, the offensive part of which is that it's deliberately collecting data and forwarding it.
So to say "I wouldn't be surprised if Apple have taken a deliberate conscious decision to monitor how its customers use their phones because, you know, they made a coding error once" is so nonsensical that it could be construed as deliberate flame bait.
Probably it's just that if you don't use an iPhone then you wouldn't pay that much attention to the specifics of any particular bug — the original author was correctly aware that the iPhone had previously made it possible for third parties to monitor users in some way and had incorrectly assumed malice.
You are a troll
Apple have never blocked a web site before or in any other way filtered web content.
I'm not sure it's a very good demo
It's understandable from a technical point of view but most of the demo doesn't actually function, the permitted interactions being little more than those prompted by the blue hints. It is a little better than a video and I'm sure they'll get a lot more attention because of the way they've done it, but I'm not sure it makes the best impression.
If the test for obsolescence is that a newer product is in development...
... then every single electronics product on the planet has been obsolete at the time of launch, including the iPhone 4S, all future iPhones and all other phones.
They don't agree with your opinion, so it's all a fix?
And Apple refuse to invite El Reg to free press conferences but are paying them for reviews?
The iPhone 4S has no known antenna problems. The iPhone 4 had antenna problems and when reviewed on this site those problems were cited and it received only 75%. See here: http://www.reghardware.com/2010/07/02/review_smartphone_apple_iphone_4/
So your allegation is trivially false on the facts. It probably also says something about your level of paranoia that you've imagined a conspiracy out of thin air.
@Giles Jones (cf: @Arctic Fox, @fandom)
The world isn't actually divided into one group of people who say only positive things about Google and negative things about Apple, and another group that do only the opposite. So to suggest that because you once heard people complain about the iPhone's fixed storage size, and are now hearing people say it's not really a problem on an Android device, anyone who isn't critical of Android must be a hypocrite is unsupportable. All it means is that some people think fixed storage is a problem and some people don't.
Don't mention the other company!
Though, seriously, I don't think the 16GB limit is much of a problem — if anything it's a win for the consumer if it removes the distinction between internal and removable storage, making app management that little bit easier. I'll bet that most people don't use their phones for watching video that they've stored locally, and 16GB is more than sufficient for a decent amount of music and photos with enough spare to take a few photos and videos while out and about.
It's revisionist to factor Commodore into the same market as IBM and Apple, and in any case Commodore isn't relevant to the story presented.
The article is entitled "How Apple beat IBM in Steve Jobs' first retail war". It's a story about Apple versus IBM. It's framed like that because the two big beasts in the business computing market at the time were IBM and Apple. You have to rewrite history to pretend that the home and business markets were joined in 1984 — that wouldn't happen until the death of the proprietary home computers in the late 80s and early 90s.
The article is also written from the point of view of a computer specialist retailer. Apple dominated computer retail profits, the Apple II being the first computer to produce over $1bn in revenue in a year. The home suppliers, like Commodore in the US, piled them high and sold them cheap through general retailers like Sears. They didn't invest much in their sales presence in terms of aisle ends or literature since those costs would have to be passed on, which makes them even further irrelevant to this specific story.
You have more patience than I do
I switched off at Need for Speed 3. The first had a really impressive feel, with weight to the physics and a tangible sustained tension. I guess the tie-up with US magazine Road & Track and the 3DO demographics made them aim for a mature audience.
The sequel was more of the same, probably even the same code base, but by the third they seem to have decided to go mass market. It's all tunnels through lava-filled mountains and Ridge Racer-style handling.
It's almost impossible to believe now but the only subsequent title I can think of that comes close to the first Need For Speed for tension is the original BurnOut, which thrived on 20-minute races through panoramic vistas and despite including the nonsense of a boost bar had things set up so that you pretty much never get to use it. The ten-metre visibility and desire to throw 'awesome' graphics effects at the camera don't turn up until later.
Technically it was prototyped on NeXT machines. They got it all written once then rewrote the hard stuff in x86 assembler.
Then maybe even a flag waver for open source?
In that most of the weirder ports are a result of the source code release in 1997. Though it made it to pretty much every console from the SNES onwards through standard commercial channels.
Of those I've played: the PlayStation, Jaguar and Gameboy Advance versions are very good, the Saturn version is passable, the 32X and 3DO versions are pretty bad and the politest thing I can think of to say about the SNES port is that it's a technical achievement.
The PC coming into its own
I guess we must have bought our first PC somewhere around 1992 and prior to seeing Doom I don't recall anything being obviously much better than what we'd had on the ZX Spectrum that preceded it, other than loading quickly and being more capable with colours. And although it's not particularly subtle, I still think it plays better than most other games.
Quite a lot of my professional life has ended up focussed on computer graphics and, more recently, computer vision. The sudden leap forwards of Doom is the reason.
Not a very persuasive argument
"It performs worse with Windows, this is probably a driver issue, therefore Apple did it on purpose because they hate me and have no respect for the marketplace"
Does the same conspiracy logic apply to every supplier that produces flawed driver software? It's just that the evidence of iTunes and of Apple's direction generally would appear to make it much more likely that they're just not very good at producing code for Windows.
In what sense has the iPhone steadily ceded ground?
iPhone sales volumes have monotonically increased since its introduction. Android sales volumes have also monotonically increased, but much more quickly. Android has something like two and a half times as much share of the current smartphone market but, in terms of sales volume, not at the expense of the iPhone. I expect the same story to repeat in the tablet world, but probably with the Kindle Fire doing the work of all the Android mobiles added together.
Most of those 'many people' are wrong
If Apple won brand loyalty only through advertising and PR and if advertising and PR were sufficient in and of themselves then the Mac wouldn't be stuck at 5% marketshare worldwide, and if the difference were just consumers versus businesses then you'd expect the Mac to be a hot target for the consumer-focussed sectors of the market. However, things like games usually don't get ported and, if they do, turn up months after the marketplace buzz about the product has long since subsided.
In summary: there's clearly some distinction between Apple's mobiles and tablets and the competition beyond merely the brand name and the advertising, as both of those also apply to the computers, yet the former manage to rack up commanding market shares whereas the latter don't.
A story of greed and optimism?
Most 3d movies are shot in 2d and post-processed into 3d, which gives the cardboard cut-out effect that people either assume to be a limitation of the technology or take to mean that they're in whatever percentage it is that the 3d effect doesn't work on. When the benefits of the technology are so underwhelming there's no point making a special effort for it — such as paying more to sit through a darker film while wearing oversize cheap plastic glasses, or making the effort to get it into your own home.
In addition, the majority of television watching is people putting it on in the background while they do something more interesting. About a third of most programme time is advertisements and television programmes are competently aimed at the lowest common denominator, and nowadays there's usually a laptop or a tablet nearby. So even if you have the TV on for several hours, showing programmes you enjoy, it's probably still not the main thing you're doing with your eyes.
The idea that if they all told us we really wanted 3d then obviously we'd buy it was ridiculously optimistic.
Actually, the Android version has about one seventh the texture detail. Either that or I'm trying to make some point about pulling random numbers out of thin air.
Or, for those of us without irrational prejudices...
... Shadowgun is identical between Android and iOS.
From a personal point of view, it's also shallow and boring. It's one for the self-proclaimed "real gamers" mainly.
Maybe posting an article that states that anyone reading it is either illiterate or stupid was the main source of offence?
Either that or it was an excellent idea for the amusing feedback.
Great news, but perhaps the timing isn't brilliant?
Some stories have previously linked Google's hoarding of the tablet versions of Android to a spat with Amazon, Amazon having decided to ship a tablet that takes advantage of Google's software engineering without connecting to any of Google's services, substituting Amazon services and branding, and also being the first high profile company to try to obtain a serious foothold in Android app provision (ie, so as to displace Google).
Google have released this code exactly on the Kindle Fire's launch day. It's a fantastic gift for anybody that wants to believe in an Amazon/Google spat.
Of course, it's also a fantastic gift for anyone that likes open software. So kudos to Google.
Have you considered opening a Dr Who desk?
This seems to be your third Who story in a week or so. Kudos.
Like many above, Hugh Laurie would be my default choice, especially as I think there was pretty serious talk of House ending at the end of this year, making him available. Otherwise, maybe Ben Whishaw? He seems to be pretty good at just about everything. If they could get Eccleston back, even to play the character in a completely different way, I'd be very happy indeed.
I can't think of any Americans that are suitable for the part, though I think that may just be that modern American film writing doesn't allow for a hero that's also the smartest person in the room. That'd be too elitist, right?
It's not dependent on cloud services
The [flawed, tedious] process is:
(1) pay Apple $24.99
(2) get iTunes to identify all of your music to the cloud
(3) wipe your iPod/iPhone
(4) download such of your music as you want from the cloud; download the rest at any other time, wipe your original copies of the music if you really want, they'll still be available from the cloud
There's no need for ongoing connectivity to listen to your music. However, if you have any serious amount of music then it sounds like it's going to take absolutely hours to move to iTunes Match, and seriously impact your bandwidth and whatever data caps you may have.
Per wikipedia, it's "a rock musical about a fictional rock and roll band". Similarly, IMDB describe it as being the story of a "transexual punk rock girl from East Berlin [who] tours the US with her rock band".
And, you know, if Jack Black's on the list...
... then being funny isn't a requirement. Since he appears twice, you could even argue that being unapologetically unfunny was an advantage.
I vote for Tommy. Killing Bono was also as good as some of the films already on the list. But I guess Hedwig and the Angry Inch is an acquired taste?
The 6502's not so bad
You've just got to think of it as a load/store architecture, with the zero page acting like the register bank in other machines and accesses everywhere else being expensive. You end up doing most of your business logic with the two-or-three cycle instructions acting on the zero page and occasionally wander into the elaborate four-upward cycle instructions to fetch tabular data. Oh, and I guess you have to get used to the slightly weird one's complement subtraction but it ends up just being a carry inversion since all arithmetic is with carry.
I prefer the Z80 but I think that's just because I know them only through the home computers, and the popular 6502 machines always confused the issue with their video circuitry, the 6502's relatively poor random memory access speeds seemingly making designers want to back away from just giving it a framebuffer in a sensible order.
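As a rough illustration of the zero-page-as-register-bank point, here's a sketch (not real 6502 code) comparing the cost of a simple add when both operands live in the zero page versus ordinary RAM, using the standard NMOS 6502 cycle counts for those addressing modes:

```python
# Base cycle counts for a few NMOS 6502 instructions by addressing mode.
CYCLES = {
    "lda_zero_page": 3,   # LDA $nn
    "lda_absolute": 4,    # LDA $nnnn
    "adc_zero_page": 3,   # ADC $nn
    "adc_absolute": 4,    # ADC $nnnn
    "sta_zero_page": 3,   # STA $nn
    "sta_absolute": 4,    # STA $nnnn
}

# a = a + b with both values held in the zero page...
zp_total = CYCLES["lda_zero_page"] + CYCLES["adc_zero_page"] + CYCLES["sta_zero_page"]
# ...versus both held elsewhere in the 64 KB address space.
abs_total = CYCLES["lda_absolute"] + CYCLES["adc_absolute"] + CYCLES["sta_absolute"]

print(zp_total, abs_total)  # prints: 9 12
```

So keeping your working set in the zero page buys roughly a 25% saving on this kind of sequence, which is why it's best treated as the machine's register bank.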
They put electrodes in your brain, and you're never the same.
To add slightly to Joe Earl's comments
Amazon's App store, and an app to access it if you want, can be installed on any Android handset unless the network has specifically taken steps to prevent it. I wasn't able to use it on a prepaid AT&T SIM but I think the 'prepaid' (ie, pay-as-you-go in American) bit was probably determinative.
The store itself is actually very good. It's curated, like Apple's, but only to ensure that apps match the functionality that they're advertised with and don't otherwise attempt to do damaging things. Developers pay for access and nominate a list price, with Amazon then acting in much the same way as they do with books. So they can use your app as a loss leader if they want but you're still getting the list price. Which gives them a great lever for attracting customers and organising the thing generally.
If you're in the US you can also preview the apps directly in your desktop web browser, via an Android emulator, which is really quite fantastic.
Because they're betting on the future?
Nokia have a market capitalisation of $25bn and are betting the house on Windows Phone 7. Subjectively speaking — and nobody can be more surprised about this than I was — it's a really nice user experience and very easy to get on board with the narrative that it's just not had a fair shot at the big time. Microsoft's royalty gathering from Android licensees has eroded WP7's cost disadvantage. Google's sudden transition to a closed source model will be a concern to Samsung and HTC, since it somewhat focusses the mind on the single component supplier problem.
If so then he did the right thing
The difference between 3.5" and 4" is so little that I don't think it's worth the fragmentation — at least when you're the only actor making the devices. I can understand the push to bigger in the Android world because the market is much more competitive and bigger numbers look better but I honestly don't think whatever competitive edge Apple have in the market as a whole is very closely connected to the screen size.
In any case, anything with worse battery life than current smartphones (including but not limited to the iPhone) would be pretty much unusable.
The article is slightly misleading
If you go to YouTube, 4OD content won't appear because of the lack of DRM on delivery. That's in contrast to devices with Flash and explains the niche that the author argues Adobe are now pursuing.
However, you can watch as much 4OD content as you like on your iOS device through the 4OD app.
Per some of the sources, it was simply a side effect of the decade long Blair v Brown spat. Brown found a lever over a crucial national policy that was unambiguously Chancellor stuff and dedicated himself to being contrary. Meanwhile the public position became entrenched of its own accord.
I'm not sure that's fair
UKIP may say some very peculiar and some outright offensive things but I think the majority of UKIP's voters don't spot that stuff and are primarily interested in smaller government by whatever means. Most people don't read manifestos, they just vote on the general message and the general message that UKIP are perceived to push is "let's remove a tier of government and a source of bureaucracy". The BNP are who you vote for if you're looking for fascism.
Since the article's author was good enough to state his bias, I'll state mine: I would never, ever vote UKIP as I think their policies are reactionary, often absurd and sometimes racist, and that they would be detrimental to the country as a whole.
It's not going to win the developers though
Barnes and Noble operate in the US only. Amazon operate worldwide, making it a reasonable bet that at some point the Kindle Fire will become available worldwide. Even for those developers that are willing to import hardware, the sales figures alone are likely to become a compelling reason to put greater effort into making your app run well on the Kindle than on the Nook.
That's even before considering how far Amazon's fork of Android may end up straying from Barnes and Noble's.
Figures seem to be about total sales revenues though
So a 99p game needs to outsell a £30 game by more than 30 times in order to swing the cited statistics in its favour.
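A quick sanity check of that multiple, as a sketch using the two price points from the comment above:

```python
budget_price = 0.99   # a typical 99p mobile game
full_price = 30.00    # a typical £30 boxed title

# How many 99p sales match one £30 sale in pure revenue terms?
break_even_multiple = full_price / budget_price
print(round(break_even_multiple, 1))  # prints: 30.3
```

So "more than 30 times" is about right: a 99p title needs roughly 30.3 sales for every one sale of a £30 title just to draw level on revenue.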
If we're playing that game
The Atari Lynx port is fantastic and runs in just 160x102.
An apocryphal story, maybe?
A story you sometimes hear is that the development of Windows 3.1 was delayed when all the programmers became addicted to the Archimedes version of Lemmings on a machine they'd imported.
As I'm unable to find any sort of source for it beyond other people who think they've heard it too, it's probably not a true story.
... Lemmings and Prince of Persia almost alone aren't enough to sustain a machine, no matter how cute its little feet.
My knee-jerk response...
... was to make some sort of joke about already having 100% of the utility of Siri on not only my iPhone 4 but on every single object I own, including hawks, horses, robes, etc.
That said, Siri isn't voice recognition. It's artificial intelligence with a voice recognition interface, just like Safari isn't a touch screen but rather a browser with a touch screen interface. Bog standard command-driven voice recognition is a long-standing feature of the iPhone, like it is of every other major platform.
I think the popular confusion shows just how little interest anybody really has in voice recognition, especially on a device they primarily use in front of people, in noisy environments. Even if it could recognise and understand everything I said perfectly, I've no interest in announcing every single thing I email or look up to everybody nearby. Besides anything else, I'm sure they'd prefer not to have to listen if given the choice.
It's about browser adoption, probably
Mobile phones have a fast turnover rate and you can be pretty confident people are using something close to the latest browser. Flash has also failed to gain any momentum for mobile sites, owing to the poor quality and patchy availability of the plug-in.
Conversely, Internet Explorer's older and extremely non-compliant versions have held the rate of development on the desktop back and will continue to account for a significant proportion of users.
You also tend to see mobile sites and desktop sites developed almost separately anyway, owing to the big difference between how they'll be consumed. Plus the disparity whereby mobile users seem to like dedicated apps (per the AIR route) but desktop types are much happier doing everything in the browser. The history of non-existent or extremely lax sandboxing on the desktop combined with certain OSs that entrust all uninstallation tasks to the software being installed probably contributes.
So, yes, the world would be better with HTML5 everywhere but as a web developer you're cutting off a significant chunk of your audience if you assume it right now. I imagine Adobe will wind down desktop Flash eventually but don't feel that the time is right.
...a qualified lawyer and a software engineer, and I largely agree. Patents are granted monopolies and, statistically speaking, monopolies correlate with periods of stagnation. Furthermore most patents are written in appalling legalese and a lot of tech patents from the relevant period appear to have been granted improperly. We are seeing serious distractions as a result.
There's insufficient evidence to indict either a majority of lawyers or a majority of patents, but plenty that significant errors have been made and are proving very expensive for the industry.
Serious reform is needed - the risk of lazy patent officers needs to be counterbalanced by an extremely cheap or free way for anyone to invalidate patents. The way to solve this problem is to enable organisations analogous to the EFF that go after problem patents. It even feels like something you could get politicians on board with.
Indeed, it's probably smart for owners of all types of phone to connect via Google Sync's Exchange route rather than IMAP since it gets you push email.
You're looking at it the wrong way around. This guy allegedly assaulted several innocent people.
I appreciate what you're saying: that because his only motivation was to help, the cumulative total of his positive effects should be weighed against the cumulative total of his negative effects. Or, in short, he made a simple honest mistake — he acted without malice so he isn't morally culpable.
I don't go along with that because I don't think it's a sustainable piece of logic. Assuming he did the alleged act, he intended to perform an assault and he performed an assault. So he performed a crime because he thought it was of social benefit.
So you're endorsing individuals deciding that they have earned the moral right for the law not to apply to them.
I see no distinction between him and, say, someone who (genuinely, not just as a lazy excuse) justifies benefit fraud on the basis that they deserve to get something back because the government bailed out the banks and they've supported themselves and paid appropriate taxes for the twenty preceding years.
It's clear from the earlier reporting that he wasn't beating up criminals, just random people that he thought looked a bit threatening.
I'm also for keeping anyone that thinks they have the absolute right to decide who is and who isn't a criminal away from children. He's a poster-boy for the David Blunkett school of Daily Mail appeasement essentially; forget habeas corpus, innocent until proven guilty, beyond reasonable doubt, etc, let's just empower some guys to do whatever they want.
OS X Lion isn't so bad
You know, unless you actually like knowing where you are in a document. Though people can finally stop complaining that there's no 'maximise' button, especially as Apple ramp up the 13" laptop production.
I don't think so
If it's an anti-competition probe it'll be looking at whether Samsung have agreed to licence these things under FRAND terms and then either declined to do so or else put up unfair barriers. So the decision will either be that Samsung did something wrong or that they didn't — this investigation sounds like it doesn't have the ability to find against Apple, and a finding that Samsung didn't act anti-competitively doesn't necessarily mean the court considering patents and the like will find against Apple.
In summary: I think it's just a precautionary thing. Someone in the office read that there's some dispute over whether Samsung are playing by the FRAND rules and thought they'd better ask for some additional facts.
I doubt this will go anywhere much.