Hacker brings enhanced security to jailbroken iPhones

A computer consultant is embarking where Apple has refused to go, adding a security measure known as ASLR to iPhones to make them more resistant to malware attacks. Short for address space layout randomization, ASLR has been noticeably absent from all iOS devices since their inception, making possible the types of attacks that …

COMMENTS

This topic is closed for new posts.

This dude.

I hope he gets hired by Apple and makes a bloody mint. Good show!

RE: thecakeis(not)alie

Even if he did, Apple clearly don't give a shit about protecting their userbase.

Apple's mantra is to deny there is a problem, while being the only manufacturer not protecting its customers from malicious code injection.

clearly you are not a UNIX admin

Even if they inject code, privilege escalation is still an obstacle, as is preventing code execution from within that space. Further, what's running in that space is separately secured by kernel-level (not user) permissions; the linker, for example, can only touch things in certain other places. ASLR also has its own vulnerabilities and weaknesses, and can be bypassed without too much effort; in fact, bypassing it is less difficult than getting something into the memory space to start with.

Keep in mind, they "pwned" both an iPhone and a Mac, but they still could not get any access, privileges, or data that was not already available to the user, or obtainable by connecting the device to a cable.

The iPhone hack took advantage of return-oriented programming, but in a way that ASLR would not have prevented anyway, since it used internal references rather than direct memory access (which kernel sandboxing blocked even in iOS 2). All they could get was access to the device and the SMS message database: not contacts, not files, not any other app's data. All they proved was that they could get in; additional code would have been needed to break out of the sandbox, and a further exploit to install that code. Now that the iPhone encrypts its data, and with further improvements since iOS 2, this is essentially no longer possible without physical access to the phone, tethering it to a machine, and loading custom firmware.

As for the Mac, all they really did was use a phishing scam to get a user to grant access. They could still only operate as that user, the user at the keyboard could have seen the activity, and the attacker would have had to be lying in wait for the opportunity and then operate manually, since they still could not install code or escape Safari's limited sandbox (which, as of the new release, is completely sandboxed on Macs).

ASLR sounds to many people like a cure-all for preventing hacks, but in reality it only prevents one kind of hack, and one for which many other options exist, including bypassing ASLR entirely and still getting at the data in the memory space whose address we don't know beforehand.
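
The narrowness of that protection is easy to see directly. A minimal sketch, assuming a POSIX-like system (Linux or macOS) where Python's ctypes can resolve libc symbols: each freshly launched process maps libc at a (usually) different address, and that per-run shuffle is the entirety of what ASLR provides.

```python
# Sketch: print the address at which libc's printf is mapped in two
# separate processes. With ASLR enabled the addresses normally differ
# between runs; without it they repeat, and an attacker can hard-code
# them. Assumes a POSIX-like system (Linux/macOS).
import subprocess
import sys

SNIPPET = (
    "import ctypes;"
    "libc = ctypes.CDLL(None);"
    "print(ctypes.cast(libc.printf, ctypes.c_void_p).value)"
)

def printf_address() -> int:
    """Launch a fresh interpreter and report where libc landed in it."""
    out = subprocess.run([sys.executable, "-c", SNIPPET],
                         capture_output=True, text=True, check=True)
    return int(out.stdout)

addr_a = printf_address()
addr_b = printf_address()
print(hex(addr_a), hex(addr_b))
```

On a stock Linux box the two addresses differ on almost every pair of runs; nothing about the attack surface itself changes, which is why the comment above treats ASLR as one layer among several rather than a cure-all.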

ASLR?

"impossible to know ahead of time where malicious payloads are located."

No, for two reasons.

One, it's not the payload's address that's being randomised. It's the address of useful (exploitable?) bits of OS code.

Two, the whole ASLR thing is a bit of a red herring anyway. If a legitimate user can find out where to call the code in question (the code to be exploited), then so can the malware payload. The code in question is usually part of the OS, and much of the OS is usually intended to be accessible by the user.

Moving the OS, RTLs, whatever, around a bit from time to time doesn't make the OS (including the intended exploit code) inaccessible to the user or to the malware, it just makes it a little harder to find. But far from impossible.
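
That symmetry can be demonstrated: the same symbol-resolution service the OS offers legitimate callers hands out the current, randomized address on request. A sketch using Python's ctypes as a stand-in for code already executing inside the target process, assuming a POSIX-like system:

```python
# Sketch: from inside a running process, ASLR hides little. The dynamic
# linker's ordinary symbol lookup (exposed here via ctypes, the same
# mechanism any legitimate caller uses) returns the current, randomized
# address of an exported OS routine.
import ctypes

libc = ctypes.CDLL(None)  # handle to the symbols of the running image
system_addr = ctypes.cast(libc.system, ctypes.c_void_p).value
print(f"system() is currently mapped at {system_addr:#x}")
```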

Is it worth the effort?

Was it worth the hype?

thank you

There are many workarounds for ASLR; they just require a little more code. ASLR does not prevent someone from attacking a security vulnerability and getting in; it simply means you can't access code in RAM directly, from predetermined knowledge, and where that code was randomized to is still obtainable once you are in. The pwn used on the iPhone in March would have worked whether ASLR was in place or not. ASLR is a waste of time until the randomization process is removed from the OS itself and becomes part of a hardware-level hypervisor or a more integrated system.
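
One common workaround is worth spelling out: a library slides under ASLR as a single unit, so the distance between any two of its symbols is fixed for a given build. Leak one address and every other address follows by addition. A sketch, with the leak simulated via ctypes on a POSIX-like system:

```python
# Sketch: library ASLR moves the whole library by one random offset, so
# inter-symbol distances are constant across runs. Given one leaked
# pointer (simulated here) plus an offset precomputed from the same
# build, an attacker recovers any other symbol's address without ever
# defeating the randomization itself.
import ctypes

libc = ctypes.CDLL(None)

def addr(func) -> int:
    """Current in-memory address of a libc function."""
    return ctypes.cast(func, ctypes.c_void_p).value

leaked_printf = addr(libc.printf)              # stand-in for an info leak
delta = addr(libc.system) - addr(libc.printf)  # fixed per build

computed_system = leaked_printf + delta
print(f"recovered system() at {computed_system:#x}")
```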

App sandboxing, tight permissions, and data encryption are far more successful at preventing access to data. At that point, even if attackers get in, they can't DO anything. For example, in the Pwn2Own attack, all they got was the SMS database, and essentially they did it by sending "replies" to queued messages, which anyone who picked up the phone could have done (by the way, that database has since been moved and secured equally with the others).

dyld_shared_cache

"Stefan Esser ... plans to unveil a process for jailbreaking iDevices that automatically fortifies them with ASLR. It works by reordering the contents of dyld_shared_cache, a massive file that houses the libraries."

What is to stop malware from doing the exact same thing as Esser's process, but putting the contents of dyld_shared_cache in a known order?

Marvels of modern technology

There's a particular house on the street you want to break into. You know it has yellow windows and a blue garage door, and that the house next door has green windows. No other house fits that description.

How easy is it to find the house you want? Does the owner need to worry?

Some bright spark comes along and sells all the owners a security device called ASLR. ASLR moves the house numbers around from house to house.

How easy is it to find the house you want? Does the owner need to worry?

Still convinced it's worth the effort, worth the hype?
