1610 posts • joined Thursday 18th June 2009 14:54 GMT
Re: Only... (@AC)
But that's the starting salary. So you get that straight out of university.
If you've at least a year or two's experience, you're pretty much guaranteed six American figures. There aren't enough employees to fill the vacancies and only a very limited number can be imported every year. That's why you always see the big tech firms backing any lobby group in favour of immigration reform.
Re: Dear Apple, open yourselves up a little
I think you're conflating issues.
The Hour of Code is a national scheme. Participants offering a free one-hour in-store lesson include both of the companies that have the space to do so: Apple and Microsoft. The article seems more interested in the Apple angle but you can take that up with El Reg.
Neither Apple nor Microsoft allow you to develop for their ~$500 tablets without buying an additional computer. Does that mean neither of them should be allowed to take part in the scheme?
Re: Bad analogy time
They are if you modify the gun and then resell it through the original maker.
Re: Is appearing as their own work the only problem?
Yes, I think that was the problem. It remained an AppWork product even after changes. Hence the company assumed responsibility and the CEO ended up being liable.
So it's broadly similar to going after a hosting company for propagating an infringing website.
Re: On behalf of San Francisco natives everywhere...
How about Oakland West?
Re: Genius Bar (@SVV)
Yes, it's a smug, annoying name for the customer service counter.
However, somewhere in the 12 years since Apple introduced them, I think most people who are likely to hear about them will have made peace with the name.
For your next comment, maybe you could explain what this "Windows XP" thing is?
You're blaming Labour for 10bn "and counting"? I think you might have missed some relevant news stories from 2010.
Re: It was to be expected...
You've obviously forgotten the Microsoft announcements since. People didn't want discs that become tied to the specific machine or an enforced daily check-in to seek permission to use the console because, ummm, they weren't smart enough to spot the benefits and they're all just living in the past and one day they'll realise that they were wrong and, anyway, it's not Microsoft's fault if the public is so backwards.
Furthermore, capitulation was a sign of strength and evidence that the XBox is the more versatile device. So there.
That all being said, one feels there's a game of dare going on. Microsoft and Sony are letting companies like EA (I'm talking about you, Sim City) take the flak for this sort of consumer-hating practice for now, and will no doubt both integrate it back into the console once every publisher is doing it individually anyway.
Re: The ipad's big fail
You could more productively have spent that half hour installing e.g. FileBrowser (https://itunes.apple.com/us/app/filebrowser-access-files-on/id364738545) and grabbing the files from a network share.
Re: Won't happen
Apple doesn't explicitly announce its motivations, obviously, but I think the explicit constraints system it introduced in iOS 6 via NSLayoutConstraint is supposed to buy it much more flexibility in screen ratios and proportions. I suspect that via Dynamic Type in iOS 7 (you set the base font size and everything else is meant to size around it) they may even want the option of non-integer-multiple pixel densities.
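To illustrate the idea (this is a toy sketch in Python, not UIKit — the view names and solver are made up), an Auto Layout-style constraint expresses each attribute as `multiplier * other_attribute + constant`, so nothing is pinned to absolute pixel coordinates and the same rules adapt to any screen:

```python
# Toy version of an Auto Layout-style relation. Each constraint says
# target = multiplier * source + constant; nothing references absolute
# pixel positions, so the layout adapts to any screen size or density.
# All names here are invented for illustration, not real UIKit API.

def solve(constraints, known):
    """Repeatedly apply relations until every attribute has a value."""
    values = dict(known)
    changed = True
    while changed:
        changed = False
        for target, (source, multiplier, constant) in constraints.items():
            if target not in values and source in values:
                values[target] = multiplier * values[source] + constant
                changed = True
    return values

constraints = {
    "button.width":  ("screen.width", 0.5, -20),   # half the screen, minus margins
    "button.height": ("button.width", 0.25, 0),    # 4:1 aspect ratio
}

# The same constraints produce sensible layouts at different screen widths:
phone = solve(constraints, {"screen.width": 320})
tablet = solve(constraints, {"screen.width": 768})
```

Because the relations are relative rather than absolute, a new screen ratio just means feeding in different known values — which is presumably the flexibility Apple wanted to buy itself.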
Re: Seems a bit pointless (@Dave 128)
The ARM64 instruction set does benefit the consumer now, and increasingly so over time, since iOS apps are compiled to native code and there's therefore a bit of a delay while apps are rebuilt for it. None of the improvements has much to do with being 64-bit in the abstract, but 64-bit pointers help with Objective-C's traditional stance that everything lives on the heap: they allow suitably small objects to be packed directly into the pointer and passed around effectively by value. That doesn't cost addressing space because valid pointers have to be aligned anyway — Apple has just added meaning for unaligned pointers.
Apple's is also a reference-counted environment, and they've built a few bits of the count directly into the pointer. Previously the rule was that once the retain count went above 1, the runtime explicitly stored it in a hash table somewhere. I think they may actually have forfeited some addressing space for this improvement, though.
That's why El Reg has satirically coined it.
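The alignment trick is easy to demonstrate. Here's an illustrative Python sketch — not Apple's actual scheme, and the address is made up — showing how the always-zero low bits of an aligned 64-bit pointer can carry extra meaning:

```python
# Illustrative sketch (not Apple's real tagged-pointer layout): because
# valid object pointers are, say, 16-byte aligned, the low 4 bits of a
# 64-bit pointer are always zero and can be reused to carry a small tag
# (an inline value, or a few bits of retain count) at no addressing cost.

ALIGN_BITS = 4                      # 16-byte alignment -> 4 spare low bits
TAG_MASK = (1 << ALIGN_BITS) - 1

def tag_pointer(addr, tag):
    """Pack a small tag into the spare alignment bits of an aligned address."""
    assert addr & TAG_MASK == 0, "address must be 16-byte aligned"
    assert 0 <= tag <= TAG_MASK, "tag must fit in the spare bits"
    return addr | tag

def untag_pointer(value):
    """Recover the original address and the packed tag."""
    return value & ~TAG_MASK, value & TAG_MASK

p = tag_pointer(0x7fff5fbff0, 0x3)
addr, tag = untag_pointer(p)        # round-trips losslessly
```

Any pointer with a non-zero low nibble can't be a real object address, which is exactly what lets the runtime spot a tagged value and skip the heap entirely.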
Re: NOT "stolen"
... so you don't think the eagle acted dishonestly to appropriate property belonging to another with intent permanently to deprive, etc, etc?
Anecdotal survey result: this is the definition most likely to be recalled from the first year of a law degree.
Big in Japan! Big in Japan!
In Japan, where the 5S is offered universally free on a contract, Apple captured 76% of the market during the post-launch lustre month of October (source: https://twitter.com/KWP_ComTech/status/405288467141115904 ).
So regardless of the other analyses, I think it's probably the case that a huge number of people would take the Apple phone if it were a cost effective option — the problem isn't necessarily the user interface or the walled garden or even the size (though Japanese people seem to like small things much more than, say, Americans).
On the plus side, that means that Apple has a way out if it wants to take it. I guess the question is at what tipping point does a low-cost mass-market device become more profitable than a high-cost niche device? If the market decline worldwide continues and starts to bite Apple's bottom line, will it be smart enough to make the leap in time?
I think the Apple versus Microsoft figure is relevant here because there's no real landfill Windows Phone category. In its design Windows Phone is much more similar to iOS than to Android — you can't chuck it onto three-year-old hardware with a vendor-specific shell, whereas Android isn't just about the blockbusting flagships — so it would seem likely that Microsoft is preparing to beat Apple at its own game.
If I were Apple? It's time for serious price cuts to the 5C. It needs to be free on all but the stingiest of contracts. It's probably time to accept that the market seems to prefer dramatically bigger screens, too.
Re: More importantly...
Yes, but without complete dedication — they've all got second jobs policing the White Void.
Re: While I was rummaging through a box...
Grab some UEFs from stairwaytohell.com then use something like uefreader.sourceforge.net to output as audio...
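For the curious, the UEF container is simple to walk. The sketch below is Python written from memory of the format — treat the details as an assumption rather than gospel: a `UEF File!\0` magic, a two-byte version, then a series of chunks, each a 2-byte little-endian ID and a 4-byte little-endian length followed by that many bytes of data (and the whole file is often gzip-compressed):

```python
# Hedged sketch of walking a UEF tape image's chunk structure, from
# memory of the spec: 10-byte "UEF File!\0" magic, 2-byte version, then
# chunks of (2-byte LE id, 4-byte LE length, data). Files frequently
# arrive gzip-compressed, so decompress first if the gzip magic appears.
import gzip
import struct

def uef_chunks(raw):
    """Yield (chunk_id, data) pairs from a UEF image held in bytes."""
    if raw[:2] == b"\x1f\x8b":              # gzip magic: decompress first
        raw = gzip.decompress(raw)
    assert raw[:10] == b"UEF File!\x00", "not a UEF image"
    pos = 12                                # skip magic + 2-byte version
    while pos < len(raw):
        chunk_id, length = struct.unpack_from("<HI", raw, pos)
        pos += 6
        yield chunk_id, raw[pos:pos + length]
        pos += length
```

A real converter would then interpret the tape-data chunks and synthesise the 1200-baud audio, but even this much is handy for poking at what a given UEF actually contains.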
Re: Sounded really promising
The 2GB RAM was the turn-off for me. If I'm basically going to be unable to use the desktop then I might as well buy an ordinary tablet — it'll almost certainly get me a much higher pixel density for the browsing, media consumption, etc, that you're pretty much limited to anyway.
And don't tell Microsoft, but other office suites are available. Including ports of OpenOffice for Android — install AndrOpen Office and you'll actually feel like you're using a Windows desktop application.
Re: $299 vs £350
I found it unclear where the $299 claim comes from; the device is $408.60 on Amazon US (before sales tax) and £339.99 on Amazon UK (with VAT). Take the VAT off the UK price and you get £283.325 (sic). Convert that to dollars and you get $463.83.
So the pre-tax difference is that UK residents pay an extra $55.23 — about £34. Which is almost exactly 10% of the price, but nothing like as bad as $299 versus £350.
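The arithmetic above is easy to check. A quick Python sketch — the exchange rate of 1.6371 USD/GBP is the one implied by the original figures, not an official quote:

```python
# Check the price comparison above. The GBP->USD rate is back-derived
# from the figures quoted ($463.83 / £283.325), not an official rate.
us_price = 408.60          # Amazon US, before sales tax
uk_price = 339.99          # Amazon UK, including 20% VAT
rate = 1.6371              # assumed USD per GBP

uk_ex_vat = uk_price / 1.2            # strip the 20% VAT
uk_in_usd = uk_ex_vat * rate          # what UK buyers pay, in dollars
difference = uk_in_usd - us_price     # the real pre-tax gap

print(uk_ex_vat)      # about £283.33
print(difference)     # about $55 — roughly 10%, not $299 vs £350
```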
The equivalently-priced Surface isn't more flexible or less restricted than the iPad — all apps have to come from Microsoft's storefront with Microsoft's blessing.
Based on price, Apple's competitor to the Surface Pro is the MacBook Air, which again is pretty much exactly as flexible and unrestricted.
I've got a couple of small LCDs lying around...
... that should be enough to meet the demand consumers have so far shown for smart watches.
The big iPad would be interesting if they were to merge iOS into OS X (ie, make both sets of API available) but that feels exceedingly unlikely. It also suggests I've learnt nothing whatsoever from Windows 8.
Re: The irony
I think you've got a very peculiar definition of stolen. E.g. the BSD authors want other people to reuse their code in any way that it proves useful. They're probably very happy that Apple uses their code.
To put it another way: if Apple "stole" the mouse despite it being implemented widely by others earlier then it follows that Samsung did "steal" multitouch, etc from Apple. If what Samsung did was reasonable then Apple's use of the mouse was also reasonable. You can't have it both ways.
It is, at worst, hypocrisy rather than irony.
Re: Pathetic lawsuit
To be fair, Apple's claims to be an innovator are made primarily outside court. Inside court it merely establishes that it holds design patents and argues that the defendant infringed those patents — which the defendant usually has, because the patent system allows ridiculously broad design patents. Apple then throws in that the patents were copied maliciously, specifically to usurp the iPhone, because that works in its favour when damages are quantified.
You can dislike the people at Apple for the smugness of their advertising, and you can dislike them for the cynicism of their legal manoeuvres but you're disliking them for two separate reasons.
If you're anything like me you can even dislike them for those things but still rank them in the top half of the tech industry as Apple's sins have a much more diffuse effect on the market and on individual consumers than the classic villains.
Re: Sounds about right.
This is Bill & Melinda Gates' other contribution to world health.
WinAMP was a turning point for me...
... well-engineered and extremely popular as it was, its users' obsession with skins and customisation was the first thing that made me feel that I'm something different from the typical self-declared "technical" user. I just want the music to play and the thing that does that to be as invisible as possible.
It's sad to see WinAMP go though. It ushered in the modern world of music consumption.
Re: Maybe So, Look at Macintosh
I thought they became acceptable when enough of what consumers use computers for had moved into the browser that, with a native Microsoft Office also available, the software gap no longer mattered. Apple's transition to a competent OS in the years immediately preceding helped too. The switch to Intel was just the icing on the cake.
As for BlackBerry, I have to agree with the other commenters that a difference here is that Google probably isn't going to go on an anti-competitive market fixing spree. I also don't see how it matters that much to BlackBerry if people end up thinking of their OS as just a weird version of Android — from where they are now, anything that makes money is a win.
I wouldn't read too much into it — BBC 3 is evidence that plenty of people at the BBC hate the public in general.
Re: Apple Telly
While I agree with you, I wouldn't rule out an attempt by Apple. I think they might use their "retina" sales tag to try to grab some cash during the great up-sell to 4k. With a built-in AppleTV and the assumption of Wifi it'll play well with Apple's traditional sales pitch of simplicity — just plug it into the electricity.
Use their Hollywood clout to talk the networks into live streaming through a unified platform and they might even have a compelling sales pitch. For the Americans, whose networks they'd actually bother with, anyway.
It'll need to happen very soon though.
It always used to be the Microsoft dictum that they would prefer to purchase rather than to license — e.g. that's why they foisted the Helvetica-alike Arial on the world rather than using the real thing. But I guess that was back when there was virtually unbounded room for growth. They were probably more cautious about a peripheral for a video game console with known sales figures.
Re: With this device I can rule the world!
With this device I can rule 13% of the smartphone-buying world!
Re: And yet... (@Bob Vistakin)
The new iPhone launched 11 days before the end of Q3, and even then only in its first ten countries. There are a lot of reasons to think Apple's stubbornness is costing it huge amounts of market share but the Q3 year-on-year sales report isn't one of them.
My feeling is that Apple succeeds when it is technologically ahead. It slowly withers when it tries to rely on being cool. With the Mac they eventually found a way to make it sustainable but they've never managed to turn product lines around and reclaim dominance in a market they've previously lost. So I'm not optimistic for iOS, especially if the 5C is all we're getting as a price reduction.
Re: There's a difference
I think it was more the bit where if he had read it in the book then it had to happen than the getting into NY trouble, as per the stuff with River's wrist. The book said "final farewell".
So, yeah, convoluted plot contrivances concerned with arm waving and brief mentions of causality.
Between this and PNaCl it sounds like Google has two broadly similar irons in the fire. Buyer beware.
BlackBerry's problem was that it wanted to ship a modern OS in a touchscreen mobile. It didn't have the technology ready. So it threw away its existing customers in order to spend a few years developing a me too product.
Apple's problems are entirely distinct. It's comfortable servicing one segment of the market and that segment isn't growing much any more. It seems vaguely interested in other segments (per the 5C) but doesn't really seem to have much enthusiasm in pursuing them.
The two are in very different positions. It's likely that what happens to them from now on will be very different.
See also: what Windows did to Commodore and Atari versus what Windows did to Apple.
Re: You missed the step (@Andy Prough)
I think he was just comparing Nexus 4 sales to Apple sales — one supplier versus one supplier rather than every Android supplier versus the one iOS supplier.
That said, who would really expect the Nexus 4 to outsell the iPad Mini? It's an unsubsidised phone available only via mail order. That's just not how most people buy their phones. I'll bet the Nexus amounts to a tiny segment of Android sales.
In case any other US residents are curious...
My quick search has revealed rapidonline.com will ship overseas, but if you don't want the Pi then there seems to be no way to obtain the SD card. It's about $278 for the bundle containing everything after postage has been added.
Re: People they can rule out immediately :- (@AC)
I would do, but since Anonymous has been active for several years there's quite a lot of evidence about their hive mind. And since this article is about — amongst other things — one lot of them trying to change the global surveillance infrastructure by defacing a dry cleaning company, the criteria you mention would falsely exclude a lot of Anonymous members.
Or maybe you know something we don't? Maybe the dry cleaning company wasn't falsely classified based on no evidence? Maybe the Australian government does all its spy work via cleaning companies?
Re: Anonymous vs Anyonymous?
People they can rule out immediately:
• those with a decent grasp of the English language;
• those who have ever contributed anything constructive to society;
• anti-bullying campaigners and their sympathisers;
• anybody that thinks, on reflection, that joining a baying mob is a bad idea.
Re: Well, someone will have to mention the SFX in 2001...
You mean the slit-scan photography, as most famously used by Who in the Tom Baker title sequence? That's actually a fairly easy idea for most computer scientists to grasp. You know how parallax scrolling works in video games? You know how by the early 1990s they were routinely doing every single scan line individually to give apparent perspective to floors and ceilings?
Imagine you created that effect visually by cutting a horizontal slice in a piece of card and suspending that a distance above a colourful 2d image. Then light the image and make everything else pitch black. Take a single frame of film by pointing the camera at the slit and moving it directly towards or away from the surface.
The effect is that the only part of the film that's exposed is that which can see through the slit. As the camera moves, the slit moves within the frame to expose a different part of the 2d image, and you're now closer to the image so it's larger. So you get an apparent 3d transformation of the 2d image, built up as a continuous equivalent of the parallax floor.
Move the background image a little and repeat for the next frame. And again. And so on. Then you've got the pattern apparently coming towards you.
Then cut a more interesting shape than a horizontal line and add Tom Baker's face on top.
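For anyone who'd rather see the parallax-floor analogy in code, here's a toy Python model of the exposure described above. The grid sizes and the magnification ramp are invented for illustration; the point is only that each slit position within a frame is recorded at a different camera distance:

```python
# Toy model of slit-scan exposure: within one frame, the slit sweeps
# down the frame while the camera tracks towards the source image, so
# each row is sampled at a progressively larger magnification. Shifting
# the source between frames makes the pattern appear to stream past.
# Sizes and the scale ramp are made up purely for illustration.

FRAME_H, FRAME_W = 8, 8

def expose_frame(source, offset):
    """Build one frame: row y sees source row (y + offset), sampled at a
    magnification that grows as the camera closes in during the exposure."""
    frame = [[0] * FRAME_W for _ in range(FRAME_H)]
    src_h, src_w = len(source), len(source[0])
    for y in range((FRAME_H)):
        scale = 1 + y / FRAME_H          # camera is closer for later slit rows
        src_row = source[(y + offset) % src_h]
        for x in range(FRAME_W):
            # sample the source column under this frame pixel at this scale
            sx = int(x / scale) % src_w
            frame[y][x] = src_row[sx]
    return frame

# A checkerboard stands in for the colourful 2d artwork; four frames with
# a sliding offset give the "coming towards you" motion.
source = [[(r + c) % 2 for c in range(8)] for r in range(8)]
frames = [expose_frame(source, offset) for offset in range(4)]
```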
Android allows both Dalvik and native apps to coexist as first-class citizens; Dalvik is oriented towards a completely independent implementation of Java. Android sits upon the Linux kernel but in modern terms the kernel is just a tiny fragment of an operating system. The libraries, widgets, etc are just as much what people think of when they think of an OS.
Hooray for the removal of bloat from iTunes?
This story has one upside, at least.
That game Sharaz Jek is playing in Caves of Androzani looks pretty good
Some sort of side-view real time strategy game, I think, clearly for the BBC Micro.
Re: I was afraid of this (@AC)
You should have a word with Microsoft. They seem so sure that the Intel-bearing Surface can't be made cheaply that they endured a US$900m write-down halfheartedly trying to push an ARM version.
The Surface Pro 128GB list price was £900; presumably the Haswell-powered Pro 2 will be the same. Chuck Apple an extra £200 for the entry level 13" Retina MacBook Pro and they'll give you a larger, higher resolution screen and a 50% faster CPU.
Re: Only apple could say they have a
They don't say that. Only an AC could make such a ridiculous straw man argument, etc, etc, etc.
A contrarian here
I attended university at the turn of the millennium; I was a young child during the '80s and packed my teenage years entirely into the '90s.
My experience, shortened to the interesting bits: I received an obsolete micro from the classifieds somewhere in the early '90s. Left to figure things out on my own — at that stage the computer had no magazines or commercial support — I achieved some things I'm very proud of but was remarkably naive in other areas.
In the late '90s we got a PC and the Internet. So suddenly I had access to unending reams of documentation and properly technical people to discuss things with. My abilities took a huge leap forward. I progressed much faster than I probably would have if I'd continued in independent study or muddling through with a single book or two.
As a result, just as others above think the most educational environment was having limited choices and needing to figure everything out for themselves, I think the most educational environment was taking a bit of time to get the absolute fundamentals down and then being exposed to the breadth of everything available. Probably people a decade younger than me think that the best way to learn is to be dropped immediately amongst the breadth.
It'll be interesting to see what the second article advocates but too many of the commenters seem to be confusing causation and correlation so as to jump from perceiving an experience to be common to suggesting that it's a good idea.
Re: Different style today
It runs slightly contrary to the rose-tinted nostalgia of some of the other posters, but many of the worst coders I've worked with are those who start with incremental coding and debugging and proceed to the conclusion that the correct way to figure out how libraries work is by empirical investigation. Reading documentation just takes time, right? And if nobody's going to read it, why write it in the first place?
If I were asked to come up with a related rant immediately it would be about people who think that date handling is easy, so they wrote it all themselves, and mysteriously enough their code gets the length of a day wrong twice a year. But that's okay because their 200 lines of date handling "was a quicker solution" than five lines of API calls that would require you actually to have learnt about what's already provided. If you wanted a rant tomorrow? Probably something else.
In this case they didn't — it's a standard buffer overflow attack.
But if you're asking who would? Well, early-1990s Microsoft did in WMF. They closed that hole back in 2006 but you should be able to find ample reporting from then.
Standard "no problems here" comment
I have a Western Digital 'My Passport' USB 3 1TB device. It's just for Time Machine so I plug it in and ignore it. Everything is working without issue.
Naturally I haven't installed any of WD's custom software because I'm about as enthusiastic about that as I am about consumer printer drivers — one just assumes it'll inexplicably install a 2GB boot-time kernel extension in order to be able to say in a booming voice "You have 20GB of space left; visit www.westerndigital.com to buy another hard disk" every ten minutes. And probably require a network connection.