Good video explaining it...
Computerphile...
https://www.youtube.com/watch?v=6RNKtwAGvqc
How are we going to get to Mars if we can't get a video feed back from a barge parked just 600 miles off shore?
Hint: A 2nd smaller barge, stable for satellite feed since rockets aren't landing on it, with a ~1km fiber optic cable from one to the other. Easy.
"...the number of wrong attempts must be written somewhere non-volatile."
One of the presentation videos on CCC.de shows a hacker/cracker illuminating an individual non-volatile memory cell (one bit, a security-state flag) within a de-capped security chip, resetting that flag with light.
That's the sort of unforeseen approach that's used to get into a system with 'infallible' security. They might need to invent or discover something for the iPhone 5C. But there's always a way, unless this is the very first device with perfect security (seems unlikely).
Nothing to do with brute forcing. Seems to take days or weeks, not trillions of times the life of a universe filled with etc etc etc.
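The quoted requirement — that the wrong-attempt count live somewhere non-volatile — can be sketched in miniature. Everything here (filenames, the limit of 10) is invented for illustration; a real secure element does this in tamper-resistant hardware, not a JSON file on disk:

```python
import json
import os
import tempfile

COUNTER_FILE = "attempts.json"  # stands in for a non-volatile memory cell
MAX_ATTEMPTS = 10


def load_attempts() -> int:
    try:
        with open(COUNTER_FILE) as f:
            return json.load(f)["attempts"]
    except (FileNotFoundError, KeyError, ValueError):
        return 0


def save_attempts(n: int) -> None:
    # Write-then-rename so a power cut can't leave a half-written count.
    fd, tmp = tempfile.mkstemp(dir=".")
    with os.fdopen(fd, "w") as f:
        json.dump({"attempts": n}, f)
    os.replace(tmp, COUNTER_FILE)


def try_pin(entered: str, correct: str = "1234") -> str:
    attempts = load_attempts()
    if attempts >= MAX_ATTEMPTS:
        return "locked"
    # Record the attempt BEFORE checking, so pulling power mid-check
    # still costs the attacker a try.
    save_attempts(attempts + 1)
    if entered == correct:
        save_attempts(0)  # reset counter on success
        return "unlocked"
    return "wrong"
```

The whole point of the light-on-a-flag-bit attack above is that it resets the equivalent of `COUNTER_FILE` without going through `try_pin` at all.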
They'd certainly want to buy a box of phones to practise on, before their one-shot at the subject.
You're right. The author is bending over backwards trying to find a reason to excuse the failure. The section you quoted is simply ridiculous.
Furthermore, why was the idiot car backing up? "Ooh, there's some obstacle in my path. I've stopped too close. Now I need to back up and crash backwards into an oncoming bus."
We'd better get used to this.
Would it help if I clarified that 'eventually' is likely at least ten years out? Maybe 15-ish. Certainly less than 20.
To me the trends are crystal clear. Emulation. Virtualization. Moore's law. Etc. It all comes together in a few years into a big schmozzle where stale-thinking OS-as-religious-beliefs get utterly trampled.
Most apps will run under any of several OSes, unpacking themselves accordingly. OSes will accept apps from other OSes. OSes will themselves be apps. Apps may bring along their own OS. None of this requires more than a couple steps past where we are now. Obviously.
Cheers.
Eventually, most OSes will just run each other's apps (possibly interpretively), or each other, or layered-on copies of themselves. How about sharing hardware and running multiple OSes at once in parallel? Future CPU speeds will certainly allow almost anything.
Last time I posted this sort of prediction, I was heavily down-voted by the near-sighted.
For the next decade or two, the software faults in the top level self-driving algorithms will so dominate the self-driving fault / crash space that the computer hardware could be made from fragile vacuum tubes running wobbly Windows ME for all the difference it would make.
Eventually these concepts would rise to significance, just don't hold your breath.
And in related news, a Googly self driving car caused a crash today. Not just involved, but caused.
As written, it's clearly a plural ("2 bridges") followed by a 'was'.
It could have been written "Having two bridges was unnecessary"; but it was truncated into incorrectness.
"The idea" (offered as a counter-example) is singular. Thus perfectly non-applicable.
Anyway, by Godwin's Law, I win this debate.
No question it was A Very Bad Thing, and - yes - we should strive to reduce methane emissions as a top priority since they've got such a large multiplier over CO2. Such things must be discouraged. Etc. Etc.
But reportedly its impact is insignificant on the global scale. Mostly because it was a relatively short-term source (as opposed to endless). It doesn't begin to compete with other methane sources in the long run. Reportedly...
If you dislike paragraph 2, then please reread paragraph 1.
"...now realistic..."
CPU/GPU performance per watt improvement is a steady trend, any demarcation point is obviously arbitrary. People have been doing fanless PCs for years already, while others will insist that they will need PCs with several fans for the foreseeable future.
So I take issue with the word 'now'. It implies that this demarcation point (26 Feb 2016) is somehow silently assumed to be superior, or somehow more appropriate, to anyone else's demarcation point (past or future).
It's arbitrary.
DropBear on El Reg "...script... ...(zero pictures btw.)..."
What!??
The pictures are almost always hilarious, sometimes subtly hilarious which makes them even hilariouser.
Oftentimes they're the perfect representation of a famous meme.
One does require a sense of humour to enjoy them.
"...outdated encryption..."
The word "outdated" papers over a recurring theme in the history of cryptography.
It often refers to an encryption standard that was once believed to be secure, but was subsequently shown to be less secure than imagined.
Preempting your next thought: Almost all of the time, at least in the history of modern cryptography, the 'outdatedness' has NOTHING to do with the progress of Moore's Law and 'brute forcing'. More often it has to do with the cryptanalysts beavering away until they uncover the seemingly-inevitable subtle flaws; either in the fundamental algorithm, some particular implementation, or (in some applications) an unacceptably high risk of operator error.
Too many have the incorrect impression that the deterioration over time of an encryption standard is due only to some external process, like the weathering or erosion of a big rock. In fact, 'the rock' was typically internally-flawed from Day 0. The subtle flaws are eventually exposed by close examination, which may take several years.
The timing of the public pronouncement of 'outdatedness' often comes down to motivation (effort, speed) of the crackers, or even if they wish to keep their success a secret for a while (e.g. Churchill and the Enigma, kept Ultra Secret for decades). An encryption standard may actually be 'outdated', but hardly anyone is in on the secret.
The adjective 'outdated' tends to support muddled, naïve, wishful thinking about whatever cryptography algorithm is the present standard du jour.
Sometimes the best adjective would be 'flawed'.
Typically these very complicated steps need to be done once, by someone, and then published.
Then somebody releases the attack as a 'script'.
Then the script kiddies just 'Click-Click'.
It's a mistake to assume that every attacker needs to start from scratch.
Your post may lead some to make that mistake.
For some folks, still stuck with a low bandwidth connection (e.g. dial-up, cellular, or even 2400 bps Iridium) to the 'net, the crashes are probably caused by their connection being completely plugged up with all this 'telemetry' data.
Does setting a 'Metered Connection' flag automatically turn off all this rubbish?
It really should be fully integrated in the OS. Set one (for example) 'Dial-Up (56kbps) Connection' flag, and the OS should configure itself and installed apps to minimize traffic.
It should have been implemented decades ago.
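A sketch of how one such OS-wide flag might gate traffic. Everything here — the names, the threshold, the policy — is hypothetical; no real OS API is being described:

```python
# Hypothetical: every background uploader consults one OS-wide bandwidth
# policy before generating traffic. Names invented for illustration.
connection_kbps = 56           # pretend the user set a 'Dial-Up (56 kbps)' profile
METERED_THRESHOLD_KBPS = 1000  # assume anything under ~1 Mbps counts as metered


def is_metered(kbps: int) -> bool:
    return kbps < METERED_THRESHOLD_KBPS


def maybe_upload_telemetry(payload: bytes) -> str:
    if is_metered(connection_kbps):
        return "deferred"  # queue it until an unmetered link appears
    return "sent"


print(maybe_upload_telemetry(b"diagnostics blob"))
```

The design point is that the decision lives in one place the OS controls, rather than in every app's own settings page.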
"...rob the lithium of oxygen."
A typical 18650 Li-ion cell (super common in laptops) might be something like 3.7 volts and 2000 mA-hour. So it contains within it (when charged) about 7 watt-hours of energy, roughly. If it internally shorts out, due to contamination, damage or random what-not, and that energy is all discharged in, say, one minute of excitement due to an internal short circuit, then that's roughly 400 watts of power, over one-half horsepower, being liberated in that minute. It could be in a hard vacuum, and it would still be very exciting. It wouldn't last a minute. It would vent and/or explode. Might trigger off the neighbouring cells too.
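The back-of-envelope arithmetic above in sketch form (the cell figures are the typical values assumed in the text, not measurements):

```python
# Rough energy/power numbers for a charged 18650 Li-ion cell shorting internally.
voltage_v = 3.7        # nominal cell voltage
capacity_mah = 2000    # typical laptop-grade 18650

energy_wh = voltage_v * capacity_mah / 1000  # ~7.4 Wh stored when charged

discharge_minutes = 1                         # assume the short dumps it all in one minute
power_w = energy_wh * 60 / discharge_minutes  # average power during the event
horsepower = power_w / 745.7                  # 1 hp = 745.7 W

print(f"{energy_wh:.1f} Wh stored; ~{power_w:.0f} W (~{horsepower:.2f} hp) for that minute")
```

Note the vacuum point: none of this arithmetic involves outside oxygen, which is why the cell is exciting even in space.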
Random factoid: Lithium primary (non-rechargeable) cells as used in some avionics (INUs, ELTs, etc.) are now often LiMnO2, and they're usually certified very safe. Plus or minus the occasional smoldering ELT in a 787 parked at Heathrow.
Ref https://www.gov.uk/aaib-reports/aircraft-accident-report-2-2015-boeing-b787-8-et-aop-12-july-2013
The nickel-studded design mentioned in the item is a solution (like many) based on the assumption that the short circuit is outside the cell.
But some of the more-famous failures (cough Sony cough) have been caused by metal particles causing shorts through the insulating film INSIDE the battery.
There's next to nothing that can be done about such internal short circuits, except to invent some new-fangled self-limiting or self-extinguishing action within the cell materials.
That's ---^ where you went wrong.
I believe it's very likely possible to 'crack' their way past the phone's security, but I don't think it's "very easy".
My assumption is that there are very likely at least several subtle implementation flaws; the history of cryptography indicates that's almost a general rule.
AC (John?) "...as yet unproven assumption that Apple has made the sort of trivial errors that would indeed reduce the available keyspace."
The latest crypto and latest implementations are just the latest in a very long line.
It's called Inductive Reasoning to see the sun 'rise' in the East every morning and leap to the unproven hypothesis that it'll almost certainly continue to 'rise' in future mornings.
It's extremely unlikely that the iPhone 5C will end up in a museum as the very first perfect implementation in history.
In fact, the Feds have already identified an attack vector. It's underway and they're very likely to succeed.
There would be 'N' such attack vectors. The theory that 'N' = 0 is extremely unlikely.
"...trivial errors..."
The crackers find implementation errors that are sometimes trivial (often only in hindsight), but sometimes the errors they find are unbelievably subtle. Other times they're exploiting an inherent physical or design weakness that may have nothing directly to do with the security designers.
The attacks do not "reduce the available keyspace" (you're still stuck in that same limited thinking, sigh... seriously, please stop...). The side-channel attacks often reveal the key almost directly. The key could be a million bits and their attack would still read it out bit by bit.
Just because I can't be arsed to give you anything more than a quote from Wiki, it doesn't mean that I don't have a shelf bulging with books on the history of cryptography. Cryptographer-Hubris is a recurring theme in history. I'm here to make the world a better place by gently mocking such dangerous cryptographer-hubris.
You'd be a better person if you drop the naive faith in cryptography. Learn the endlessly recurring history. We've been through this exact same cycle so many times before.
(Unless you're a terrorist. Then, please... ...trust the crypto fully.)
AC "Even if the key hidden away in a secure chip that can't be removed or decapped without self-destructing, thus the ONLY place the encrypted memory can be read is on the actual device?"
Congratulations! You're clearly NOT a hardware cracker! Yay!
Neither am I. But I've seen how they work. Because I have Internet, a video player and *interest*.
One video I saw was cracking a 'totally secure' SmartCard. The card processor had all sorts of physical roadblocks. It took him almost four hours to get the keys out. All friggin' morning. Crikey!
Try CCC.de Media. It's a goldmine of presentations.
It'll shake your worldview to its core.
partypop69 "Whenever the device in your possession you have full control, I don't care what encryption it has, anything can be broken."
Agree.
Anyone who disagrees needs to spend some remedial education time on CCC.de Media presentations to address their missing background understanding of the real world.
What's really shocking is how quickly the hardware-in-possession crackers can crack. Days or a couple of weeks, done and dusted.
Cryptographer-hubris is dangerous. It's an attitude that needs to be stamped out.
Cryptography-Keyspace fanboyism ("...10^77 years!!") is just annoying.
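For what it's worth, those "10^77 years" style figures come straight from keyspace arithmetic, which is real enough but irrelevant when the attack reads the key out sideways. A quick sketch (the trillion-guesses-per-second rate is an assumption pulled from thin air):

```python
# Where the absurd brute-force lifetimes come from: pure keyspace arithmetic.
key_bits = 256
guesses_per_second = 1e12   # assume a generous trillion guesses per second
seconds_per_year = 3.156e7  # about 31.56 million seconds in a year

keyspace = 2 ** key_bits
years_to_exhaust = keyspace / guesses_per_second / seconds_per_year
print(f"~{years_to_exhaust:.1e} years to exhaust a {key_bits}-bit keyspace")
```

A side-channel attack that reads the key bit by bit doesn't care whether that number is 10^57 or 10^5700.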
John H Woods - "The 'serious' encryption is universally the XOR function -- No, it isn't."
At its heart, yes it is.
Advanced Encryption Standard: "...InitialRound - AddRoundKey - each byte of the state is combined with a block of the round key using bitwise XOR." "The subkey is added by combining each byte of the state with the corresponding byte of the subkey using bitwise XOR."
Note the "XOR" mentioned.
Yes, there's also some shuffling and such. But it's nearly universal that there be an XOR function at the heart of any cryptographic system, ...obviously.
You were clearly incorrect in your rebuttal. Clearly.
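To make the AddRoundKey point concrete: that step really is a bytewise XOR of the state with the round key. A minimal sketch, not real AES — the substitution and shuffling rounds are omitted entirely:

```python
# AddRoundKey from AES: combine each byte of the state with the
# corresponding byte of the round key using bitwise XOR.
def add_round_key(state: bytes, round_key: bytes) -> bytes:
    assert len(state) == len(round_key) == 16  # AES state is 16 bytes
    return bytes(s ^ k for s, k in zip(state, round_key))


state = bytes(range(16))
key = bytes([0xAA] * 16)

mixed = add_round_key(state, key)
# XOR with the same key undoes it: the property every XOR-based step relies on.
assert add_round_key(mixed, key) == state
```

The self-inverse property shown in the last line is exactly why XOR sits at the heart of these designs.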