Apple's line that iPhone users who toe the line and resist any temptation to jailbreak, unlock or otherwise desecrate their smartphone are protected from threats has been called into question by new research. Swiss iPhone developer Nicolas Seriot has published research on security shortcomings that could create a mechanism for …
Is the story here really "apps can potentially do bad things"? Because if so, I'd suggest the presence of the existing app store review process probably indicates Apple are aware of this.
God save us from "security experts" trying to flog us their latest white paper.
Correct me if I'm wrong
But isn't this pretty much the same for most devices and operating systems?
I can understand why someone might pick the iPhone if they particularly wanted to do something like this on a mobile device, but Windows has a significantly larger installation base.
If you can write an app that would be downloaded and used in sufficient volumes to make this a worthwhile approach, then you'd almost certainly make more money just selling the app - without the added risk of a criminal record.
You may as well say that criminals might start buying certain mobile phones to use as clubs when mugging people. It *is* possible but the chances are reasonably slim.
Stay safe, avoid stupid apps
The most likely carriers of such evil would be apps like the fart button (not that I'm currently accusing these developers of anything more than poor taste). It's unlikely that malware developers would put the time and effort into developing something useful.
So do the world a favour and avoid the stupid apps. It'll help keep your data safe and make the world a more pleasant place for everyone around you.
So, the "hack" is to
1) trick Apple into approving a malicious app, which of course would have to make illegal calls to unapproved functions, something they already scan for in all submitted code (and if an exploitable system were revealed, scanning for that would be even easier)
2) trick a consumer into downloading said app
3) the customer has to run said app
all before the news comes out and says "don't download this, it's a virus", shortly after which Apple pulls it and sends a message to everyone who downloaded it telling them to delete it without running it (if there isn't already some system to auto-remove apps that's never been activated).
1) The way SpyPhone harvests data does not involve any illegal calls or unapproved functions.
2) You do not consider the case where a police officer / detective / husband / boss takes the iPhone for a few minutes, installs SpyPhone, collects the data and removes it.
1) The address book is unprotected with standard documented APIs (they're in the ABAddressBook framework). Geolocation requires user interaction; there are ways to override this, but doing so causes a rejection from the App Store. Email, however, is protected within documented calls (with 3.0 you get MFMailComposer, but it doesn't reveal the email account you send the mail with, much less any passwords).
Web history and the like? I'm interested in how he claims to do that through legitimate calls.
2) In order to install an app from the App Store, you have to type in your iTunes account password, and the only way to change the account is when hooked up to a computer. In other words, either the perp connects it to their computer, at which point all bets are off, or they have to know your iTunes account password, which is almost as unlikely.
3) Apple has two tools in case such a thing happens. One is that they know where you live: if you're an individual developer, they have enough information to process your credit card, right? If a company, you have to provide a LOT of company information to get authorized. Second, as a last-ditch nuke, they can push out a killfile that will remove the app. They have yet to do this, as it's overkill for anything but malware.
...Apple's existing sandboxing and review process should take care of this potential threat? The ones already in place? No sh*t, Sherlock. Nice way to get your name in the press.
A cunning plan...
Let me get this right - the malware developer first develops an application of such broad appeal to iPhone users that a significant proportion will download and use it, then invests a huge amount of time and money publicising it (because all developers have to leap this hurdle), and finally has a tiny window of opportunity to covertly extract data before Apple (or others) invariably discover what's going on and shutter the app - is that the plan?
It reminds me of Dr Evil's plan to hold the world for ransom for $1 million whilst sitting at the head of a multi-billion dollar corporation. If you have the skills and determination to put an app into the hands of a large number of users, wouldn't you make a far better living off the app itself?
Some of these security "experts" seem far too keen to discover threats.
Now try that on a Windows Phone. Go on, I dare you. Can't do it, can you? Microsoft may be painted as Evil, but at least their mobile OS kicks some serious security ass. And you can develop and install what you want for it without restriction.
Apple on the other hand has to test everything and approve it before letting you install it on THEIR phone.
"Now try that on a Windows Phone. Go on, I dare you. Can't do it, can you? Microsoft may be painted as Evil, but at least their mobile OS kicks some serious security ass. And you can develop and install what you want for it without restriction.
Apple on the other hand has to test everything and approve it before letting you install it on THEIR phone."
It's Friday afternoon, so correct me if I am wrong... Would the above come under the title of Turdspurt??
Didn't this already happen?
One of the popular games makers for the iPhone had already been discovered harvesting contact details from users' phones using an Apple API. This was despite Apple's "careful" review process.
One iPhone game maker has already used this to pimp out personal info from the iPhone, using Apple's API. Looks like the iBone has no security sandbox at all!
I'll keep my BlackBerry, thank-you-very-much.
Sigh.. going backwards
I'm surprised by the comments which seem to accept this kind of rubbish as a fact of life.
My relatively ancient* phone runs Java applets, and this allows me to use the quite excellent Opera Mini, Google Maps, Google Mail, and Shazam iD.
Because they're applets, they have to follow the Java security model. Any applet has to be given permission before it can access any of my data. For example, to access the filesystem, to access the address book, to send SMS, or to connect to the internet.
When an applet tries to access any of these features, the phone asks me whether to allow it. Unless I grant permission, it's impossible for the applet to break out of its "sandpit". Of course I can grant permission permanently, on a per applet basis, if I trust the applet.
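The ask-on-first-access model described above can be sketched in plain Java. This is a conceptual illustration only, not the real MIDP API: the class and enum names are made up, and the user's dialog answer is simulated by a pre-filled set.

```java
import java.util.EnumSet;
import java.util.Set;
import java.util.function.Supplier;

public class PermissionGate {
    // Hypothetical permission categories, mirroring the ones the commenter lists.
    enum Permission { ADDRESS_BOOK, FILESYSTEM, SEND_SMS, NETWORK }

    private final Set<Permission> granted = EnumSet.noneOf(Permission.class);
    private final Set<Permission> userApproves; // simulates the user's dialog answers

    PermissionGate(Set<Permission> userApproves) {
        this.userApproves = userApproves;
    }

    // Every sensitive call goes through this gate: the runtime, not the
    // applet, decides whether to prompt and whether to allow the action.
    <T> T access(Permission p, Supplier<T> action) {
        if (!granted.contains(p)) {
            // On a real phone a dialog appears here; we consult the simulated answer.
            if (!userApproves.contains(p)) {
                throw new SecurityException("User denied " + p);
            }
            granted.add(p); // "always allow" for this applet from now on
        }
        return action.get();
    }

    public static void main(String[] args) {
        // The user has approved network access only.
        PermissionGate gate = new PermissionGate(EnumSet.of(Permission.NETWORK));

        // Allowed: network access was approved when first requested.
        System.out.println(gate.access(Permission.NETWORK, () -> "fetched"));

        // Denied: the address-book action never runs at all.
        try {
            gate.access(Permission.ADDRESS_BOOK, () -> "contacts");
        } catch (SecurityException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The key property is that the denied action's `Supplier` is never invoked, which is what keeps an unapproved applet inside its "sandpit".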
It seems that we're going backwards when technologies like this get left to one side. With the security threats that exist these days, it seems weird that when we run applications on our computers or phones, they can do pretty much anything they like (e.g. access any of the user's files, not just the ones they need to access).
* It's a K800i, a mid-range phone launched three and a half years ago (and long discontinued). The camera is as good as the latest iPhone's, it has a real flash (not LED), it doesn't break if I drop it, and the battery can last a week... contact details, to-do list and calendars sync perfectly with the Address Book and Calendar on MacOS using Bluetooth.
I don't see why this is aimed at the iPhone, it's applicable to all platforms and OSs, Linux, Windows, MacOS, iPhone etc. etc.
If you install bad software, it'll do bad things.
There is quite possibly the exception of Android, but it's still a small exception. When you install an application on Android, it tells you what permissions it has asked for, e.g. access to the internet; if the app doesn't ask for a permission, it can't use that feature. But that doesn't stop an app pretending to do one thing while doing another. Does the iPhone do something like this? (A genuine question, not a dig; I've never used one.)
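The Android model described here is declarative: an app lists every permission it wants in its manifest, the list is shown to the user at install time, and anything not declared is simply refused. A minimal fragment (the package name is hypothetical):

```xml
<!-- AndroidManifest.xml: permissions are declared up front. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.demo">
    <!-- Without this line, any socket the app opens fails. -->
    <uses-permission android:name="android.permission.INTERNET" />
    <!-- Reading contacts must also be declared explicitly. -->
    <uses-permission android:name="android.permission.READ_CONTACTS" />
</manifest>
```

As the commenter notes, this tells you what an app *can* touch, not what it will actually do with it.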
All this boils down to is: if you don't know or trust the author or the software, be extremely wary about using it. Which, shockingly, the people who aren't replying to spam e-mails and logging into phishing sites already know.
To misquote just a little bit:
"AppStore enforces general consumer security, not business security."
Not so silly -
Despite combining usability with tight controls more effectively than its competitors, this shows the iPhone could do better, e.g. by providing an outgoing firewall and overall paranoia settings (e.g. "nobody sees my address book without my say-so"). These are relatively small OS changes. Equally, the existence of the App Store review process, app sandboxing, and app kill switches shows that the iPhone is probably the only current candidate smartphone to be a viable proposition for universal adoption. It's certainly been exposed to hacker scrutiny a couple of orders of magnitude greater than any other phone platform, for which iPhone users and Apple investors everywhere should be extremely grateful. That's a multi-year process that any competitor will need to complete before taking the iPhone crown.
Contrary to AC, it is not possible for someone with temporary physical access to install an app on an iPhone without either wiping it completely first, or knowing the user's Apple ID password. A spy app, however puny, would also be killed and pulled from the App store long before it became a significant threat to the ordinary phone user, even if it did evade the review process.
The point is, like all things Apple - a large percentage of their customers assume that anything made by Jesus and his disciples is impenetrable to everything, and that anything you install from the Apple store isn't going to bite you in the ass.
You should understand this already, else what are you doing here? And secondly, you should understand the research is squarely aimed at Apple, to say: hey, by the way guys, you should probably clean up these little problems around the edges..
Research of this nature is important because sometimes developers see things which security researchers don't, but without the whole "hey, look at me" DEF CON talks and whatnot.
The point is that people (like my boss, who should know better) spend their days installing stuff from the App Store because they assume it's safe, since it's on Apple's store. You actually wonder how many people using Apple's App Store even know the apps aren't written by Apple, to be fair; but still, I won't discuss the stupidity of the average Apple customer without an attorney present :)
Apple approved app that contains spyware
And this is an Apple approved app that is doing the spying.