Perhaps if they can reduce the resources required, or Moore's observation solves that problem for them, this could be the next version of BitLocker.
Microsoft researchers smash homomorphic encryption speed barrier
Microsoft researchers, in partnership with academia, have published a paper detailing how they have dramatically increased the speed of homomorphic encryption systems. With a standard encryption system, data is scrambled and then decrypted when it needs to be processed, leaving it vulnerable to theft. Homomorphic encryption, …
COMMENTS
-
-
Wednesday 10th February 2016 15:32 GMT Gary Bickford
bit locker is old hat - self encrypting drives are already here
Bitlocker is about "data at rest", while homomorphic encryption is about data in process and SSL (for example) is about data in transit. At present nearly all server-class hard drives are self-encrypting, as are most consumer drives, though it may not say so on the label. I am told that by the end of this year nearly all HDDs and SSDs of all types will be. All Apple products have been for years. What's been missing is a standardized library, which now exists (OPAL), and a widely accepted API and user/OS interface, which is now in the process of being accepted - see the Drive Trust Alliance (http://drivetrust.org iirc)
An SED drive keeps all the data encrypted all the time using an internally generated 256-bit key. Another set of keys - passwords - can be set externally. Resetting the drive's internal key effectively erases the drive, as brute-force decryption would require millions of dollars worth of CPU time at present.
-
-
Tuesday 9th February 2016 09:26 GMT Paul Crawford
So you have the key stored somewhere in the program's memory to run the operations on the encrypted data, instead of both the key and some plaintext in memory?
I guess it's a bit less likely to get slurped, but if the machine is compromised enough to allow reading arbitrary blocks of memory, isn't the key also vulnerable to this? In the conventional system I guess you could zero the memory after using it so the plaintext was short-lived (if that really is the nature of the risk it is mitigating), and be a damn sight faster.
-
-
Tuesday 9th February 2016 11:38 GMT Anonymous Coward
Huh?
"No. The whole point of homomorphic encryption is that you perform the mathematical operations on the encrypted data, but the result is the same as if you had decrypted, performed the operations and re-encrypted.The program doesn't need the key."
If you don't decrypt the data I can't see how you can do any useful operations on it other than guess its size and perhaps the entropy of the data inside or maybe move fixed position fields around (assuming the encryption process leaves byte positions unchanged).
-
-
Tuesday 9th February 2016 10:18 GMT Bc1609
The key is not stored
The entire purpose of this operation is that no access to the key is required.
The Reg article is a little unclear on this, but if you read the paper they provide an illuminating example on the first page:
Consider a hospital that would like to use a cloud service to predict the probability of readmission of a patient within the next 30 days, in order to improve the quality of care and to reduce costs. Due to ethical and legal requirements regarding the confidentiality of patient information, the hospital might be prohibited from using such a service. In this work we present a way by which the hospital can get this valuable service without sacrificing patient privacy. In the protocol we propose, the hospital encrypts the private information and sends it in encrypted form to the prediction provider, referred to as the cloud in our discussion below. The cloud is able to compute the prediction over the encrypted data records and sends back the results that the hospital can decrypt and read. The encryption scheme uses a public key for encryption and a secret key (private key) for decryption. It is important to note that the cloud does not have access to the secret key, so it cannot decrypt the data, nor can it decrypt the prediction. The only information it obtains during the process is that it did perform a prediction on behalf of the hospital.
-
Tuesday 9th February 2016 11:41 GMT Anonymous Coward
Re: The key is not stored
"The cloud is able to compute the prediction over the encrypted data records and sends back the results that the hospital can decrypt and read. "
Sorry, I want to see proof of this because it sounds like BS. You can't do data analysis on data that you don't have.
-
-
Tuesday 9th February 2016 12:35 GMT Anonymous Coward
Re: The key is not stored
boltar wrote:
> Sorry, I want to see proof of this because it sounds like BS.
> You can't do data analysis on data that you don't have.
As an illustration: Let us suppose that the analysis consists of multiplying two numbers together and checking whether they are greater than another pair of numbers multiplied. This analysis is to be provided without revealing what the numbers are.
The data owner can provide the numbers as logarithms of some 'secret' base; not knowing the 'secret' the analyser cannot discover the numbers. But, they do know how to multiply numbers expressed as logarithms (you add them) and they do know logarithms maintain order. Therefore they can complete the analysis without knowing the actual numbers.
Obviously in a truly secure system the techniques and the maths are more complex.
HTH
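To make that illustration concrete, here's a toy Python sketch of the logarithm trick. This is not a real cipher - the 'secret' base simply stands in for a key, and all the numbers are invented for the example - but it shows the analyser comparing products without ever seeing the plaintexts:

```python
import math

SECRET_BASE = 7.3  # the data owner's 'secret'; any base > 1 works here

def encode(x):
    """'Encrypt' a positive number as its logarithm in the secret base."""
    return math.log(x, SECRET_BASE)

def encoded_product(a_enc, b_enc):
    # Multiplication of plaintexts maps to addition of encodings.
    return a_enc + b_enc

# The data owner encodes the numbers; the analyser never sees them.
a, b, c, d = 6.0, 7.0, 5.0, 8.0
p = encoded_product(encode(a), encode(b))  # encodes a*b = 42
q = encoded_product(encode(c), encode(d))  # encodes c*d = 40

# Logarithms preserve order, so the analyser can compare the two
# products without knowing any of the four original numbers.
print(p > q)  # True
```

Recovering the plaintext (here, `SECRET_BASE ** p`) needs the secret base, just as decryption needs a key - which is the gap the real, much hairier maths has to close securely.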
-
Tuesday 9th February 2016 16:34 GMT Anonymous Coward
Re: The key is not stored
"The data owner can provide the numbers as logarithms of some 'secret' base; not knowing the 'secret' the analyser cannot discover the numbers. But, they do know how to multiply numbers expressed as logarithms (you add them) and they do know logarithms maintain order. Therefore they can complete the analysis without knowing the actual numbers."
Except in your example the relationship of the original numbers is maintained when converted to logarithms. However no relationship will be maintained between any properly encrypted data (and not only that, if the input has been shuffled first you won't even know where the data is) unless the encryption method specifically allows it. And a decent one won't. So either there are known holes in numerous ciphers or there's something they're not telling us.
-
Tuesday 9th February 2016 23:40 GMT Stuart Palin
Re: The key is not stored
boltar wrote:
"Except in your example the relationship of the original numbers is maintained when converted to logarithms."
Yes the ordering relationship is maintained, and there is a mapping of the multiplication operation in cleartext to the ciphertext - addition. In 'industrial strength' systems the preserved properties and mapping of operations are not so trivial - that is why the analyses are so much slower.
"However no relationship will be maintained between any properly encrypted data ..."
There is always a relationship between ciphertext and cleartext - otherwise you could not decrypt it. In homomorphic encryption some further relationships are possible (without knowing the cleartext).
"... (and not only that, if the input has been shuffled first you won't even know where the data is) ..."
So don't shuffle it - there are other ways of hiding live data, e.g. amongst sets of random data.
"... unless the encryption method specificially allows it."
Exactly! And guess what is special about homomorphic encryption.
"And a decent one won't."
Disagree - but then I am taking on trust the rigour of the maths behind homomorphic encryption (just as I take on trust the rigour of other methods of encryption). Your suggestion that it is not 'a decent encryption' does not shake my belief.
Bear in mind that homomorphic encryption has different objectives than many other forms of encryption - so it makes different tradeoffs.
"So either there are known holes in numerous ciphers or there's something they're not telling us."
Well there are known holes in numerous ciphers, the literature abounds with them - but not practically exploitable holes in decent ones; and I am sure there is plenty that they (who?) are not telling us. But neither of these points seems relevant - or are you aware of a practically exploitable hole in homomorphic encryption that they are not telling us about? What makes you believe this?
-
Wednesday 10th February 2016 09:39 GMT Anonymous Coward
Re: The key is not stored
"There is always a relationship between ciphertext and cleartext - otherwise you could not decrypt it. "
Except you need the key to reveal that relationship.
"So don't shuffle it - there are other ways of hiding live data, e.g. amongst sets of random data."
Huh? I would want to shuffle it if I was writing an encryption system, then encrypt the shuffle order with the key too.
"But neither of these points seem relevant - or are you aware of a practically exploitable hole in homomorphic encryption that they are not telling us about. What makes you believe this?"
Because I don't believe this is possible without some information unwittingly leaking out from the encrypted data due to a fault in the encryption algorithm. I'd love to see them try this method with a one time pad. I'd bet a lot of money on it that it wouldn't work.
-
Wednesday 10th February 2016 09:58 GMT Chris Harden
Re: The key is not stored
You would be surprised at the things you can do which would make it easy to break.
For example.
In your setup above you shuffle the data. OK, it seems like that would make it more secure; I agree with that.
However, you then encrypt and store the shuffle order. Now that's a problem: I assume you're using your data-encrypting key, as having more than one key to decrypt the data is a pain.
As we don't want to use security through obscurity, let's assume your encryption algorithm is published and people know how it works, or at worst can pull your systems apart and figure it out.
So you now have a known, small and finite (there is an infinite amount of data to encrypt, but only so many shuffle patterns) amount of data which is encrypted with your data encryption key.
Which means you just gave an attacker your keys.
That's the point of encryption systems like this: for mere mortals like ourselves it's usually best to trust the hardcore maths guys, because if something seems intuitive it usually means it's mathematically weak.
-
Wednesday 10th February 2016 21:56 GMT Anonymous Coward
Re: The key is not stored
boltar wrote:
"Except you need the key to reveal that relationship."
Yes, but you also need the key to reveal the cleartext with homomorphic encryption. In my original illustration I did say:
<<Obviously in a truly secure system the techniques and the maths are more complex.>>
The fact that you can understand the relationships in my illustration is the whole point of the illustration - it was attempting to bridge the gap in understanding by relating the complex 'industrial strength' encryption to an easily understood mathematical relationship (with 'toy' encryption properties).
"Huh? I would want to shuffle it if I was writing an encryption system, then encrypt the shuffle order with the key too."
Do you want to write a homomorphic encryption system? If not, your protestations are irrelevant, because that is what these people want to write. If you do, you would need to find a way to encrypt the shuffle order such that some operations can still be mapped into the cipherspace. That is why the maths of an industrial-strength system gets hairy.
"Because I don't believe this is possible without some information unwittingly leaking out from the encrypted data due to a fault in the encryption algorithm."
OK, I can't imagine anything I am going to say is going to change your mind - you need to study the literature to convince yourself. Or make a name for yourself by demonstrating a flaw in their work.
"I'd love to see them try this method with a one time pad. I'd bet a lot of money on it that it wouldn't work."
Well it depends on how you apply the one time pads and what method you want to apply. Suppose someone wanted to average a set of data. You don't want them to see the raw data, so you encrypt it with a OTP by adding the random values from the OTP to each data element in turn. You send them the encrypted data and the sum of the OTP. They calculate the average by summing the encrypted data, subtracting the OTP sum and dividing by the number of data elements. They have their average but do not know the raw data values. Ta-da!!
(This probably will not satisfy you; but you really do need to look at the more technical literature if you want greater understanding.)
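For what it's worth, that OTP-averaging protocol fits in a few lines of Python. This is a toy illustration of the idea above, not a secure implementation (the pad values and data are made up for the example):

```python
import secrets

def encrypt_with_otp(data):
    """Mask each value by adding a fresh random pad value to it."""
    pad = [secrets.randbelow(10**6) for _ in data]
    ciphertext = [x + p for x, p in zip(data, pad)]
    # The analyser receives the masked values plus only the pad's total.
    return ciphertext, sum(pad)

def average_blind(ciphertext, pad_sum):
    """The analyser sums the masked values, removes the pad total,
    and divides - never seeing any individual raw value."""
    return (sum(ciphertext) - pad_sum) / len(ciphertext)

data = [12, 18, 30, 20]
ct, pad_sum = encrypt_with_otp(data)
print(average_blind(ct, pad_sum))  # 20.0, the true average
```

Note the trade-off: the pad sum leaks exactly enough structure to permit one operation (summation) and nothing else - which is the flavour of what homomorphic schemes engineer deliberately.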
-
Wednesday 2nd March 2016 21:41 GMT Deadlock Victim
Re: The key is not stored
"Because I don't believe this is possible without some information unwittingly leaking out from the encrypted data due to a fault in the encryption algorithm"
I don't even pretend to understand homomorphic encryption, but that sir, is the logical fallacy of personal incredulity.
"The argument from incredulity is a logical fallacy that occurs when someone decides that something did not happen, because they cannot personally understand how it could happen." - Rational Wiki
-
Friday 12th August 2016 21:17 GMT Charles 9
Re: The key is not stored
"The argument from incredulity is a logical fallacy that occurs when someone decides that something did not happen, because they cannot personally understand how it could happen."
It's also called being properly paranoid. Don't trust what you can't understand, especially in a world where it's hard to trust ANYONE; you're as likely as not being taken for a ride.
-
-
-
-
-
-
Tuesday 9th February 2016 12:54 GMT GrumpenKraut
Re: The key is not stored
> Sorry, I want to see proof of this because it sounds like BS. You can't do data analysis on data that you don't have.
The encryption is a homomorphism with respect to the operations needed for the computation. Calling anything BS because you lack the brain to understand it is pretty poor.
-
-
-
-
-
-
Tuesday 9th February 2016 19:41 GMT Robert Helpmann??
Re: Snooping
... encrypted "not databases" ... could be sent for analysis by third party clouds ... whilst still keeping both the data and the results of the analysis encrypted at all times.
But by doing this, wouldn't it open the door to data capture and eventual decryption? It seems to me that such an approach would be putting a large amount of trust in encryption alone instead of a more layered approach of encryption, access controls, et cetera. How ironic it would be if too much reliance were placed on something intended to increase security, thereby allowing unauthorized access*.
*Not that anything of this nature has ever happened before.
-
-
-
Tuesday 9th February 2016 11:22 GMT Dave 126
Re: So let me get this right..
The Windows teams are expected to take input from MS's strategic business team, which itself would be trying to work out how to use Windows to maximise profits or footholds for other products and services, in a rapidly changing, competitive environment. Or something like that.
The research teams, whilst doing stuff that makes my head hurt just trying to understand it, have in some ways more simply defined tasks.
-
Tuesday 9th February 2016 13:45 GMT DropBear
Re: So let me get this right..
"Do Microsoft R&D and production actually ever talk to each other?"
Don't be silly, of course we do! Now here, take this totally free screen saver and please don't be so rude as to ask why it constantly pegs one of the cores or why it connects to the internet occasionally...
-
-
Tuesday 9th February 2016 12:05 GMT amanfromMars 1
Moving on SWIFTly ....... How about AI Quantum Leap this Year of Our Laud?
.... with Greater Remote Virtual Controls for Heavenly ProgramMining and ReProgramming of Reality Programming for Mass Control of Command‽
Methinks that be something of not inconsiderable concern to exclusive executive operating systems.
Is "Homomorphic encryption, first proposed in 1978 but only really refined in the last decade thanks to increasing computing power, allows software to analyze and modify encrypted data without decrypting it into plaintext first. The information stays encrypted while operations are performed on it – provided you have the correct key, of course." kith and kin and akin to NEUKlearer HyperRadioProActive IT?
Wired helpfully hopefully thinks so ..... :-) ..... for somebody has nabbed it off the thread for a second time. Is that a good move, Wired? Can we have our ball back, please, mister? The Great Game Goes On AIPace.
amanfromMars [1602090450] ……. clearing away the smoke and mirrors on http://www.wired.com/2016/02/its-been-20-years-since-this-man-declared-cyberspace-independence/
“Governments have learned in what might be the steepest learning curve in history that they can shape this global phenomenon called the Internet and in ways that often go beyond what they can do in the physical world, and they’re doing so at an alarming pace.” … Google general counsel David Drummond [2013]
Those steep learning curved lessons are surely replaced today by governments suddenly slowly realising they are increasingly opposed and competing unfairly and unjustly against seriously serially smarter systems of operation seamlessly ripping and pirating for both private vector and public sector use, exclusive executive SCADA administrations …….. for Creative Cyber Command and Control of Computers and Communications.
And there is nothing effective that governments, which really be those few and/or that which is presumed and assumed to act as a government, can do to alter those emergent facts in the fictions they promote and support. And to both ignore and/or deny such an evident truth is that which clearly hastens the catastrophic end to their families of follies.
Beg to differ if you will and prove the point valid and current running.
Have a nice day, y’all, and you aint really seen nothing yet, but be assured, something different is on its way to you, to be delivered in a series of flash market cached crashes.
-
Tuesday 9th February 2016 12:42 GMT AndyS
What on earth is this article on about?
It starts off coherently enough, talking about an interesting approach to encryption.
Then it jumps to something about neural networks examining photos of hand-written notes.
Then it says they managed to process these notes with a 99% accuracy.
I may be missing something, but what were they processing, what was the "correct" answer, what is the 99% figure all about in relation to data processing, and how does this relate to encryption?
Having read this article, I honestly have no idea what has been achieved, and if it's impressive or not.
-
Tuesday 9th February 2016 13:02 GMT GrumpenKraut
Re: What on earth is this article on about?
The article itself is (from what I see) about machine learning (using neural networks) with data that has been encrypted homomorphically. The way these two are combined makes it a nontrivial task to write an article about it.
[edit: the stated(!) gains in performance are impressive]
-
Tuesday 9th February 2016 13:52 GMT DropBear
Re: What on earth is this article on about?
"Having read this article, I honestly have no idea what has been achieved, and if it's impressive or not."
In other words, they can now recognize / match handwriting without ever having seen either the written sample itself or the result of the match operation. There, not so hard after all...
-
Tuesday 9th February 2016 19:52 GMT Hurn
Re: What on earth is this article on about?
Does the 99% figure relate to (what sounds like) OCR hit rate, which doesn't sound all that good, depending on the source images, or is 99% somehow related to internally guessing the values of the encrypted data, prior to doing work, which also does not sound like something to brag about?
I'm guessing there may have been a copy & paste incident, and we're actually reading content from what should be two different articles.
-
Wednesday 10th February 2016 06:21 GMT Anonymous Coward
Re: What on earth is this article on about?
It's poorly written, but yes the 99% should be the OCR accuracy rate. 95% accuracy is the usual number given for humans, so 99% is really good.
The way the system should work is that the client system encrypts the training data for the ANN. The encrypted data is then uploaded to the "insecure" server, which runs a modified version of an ANN over the encrypted data. Then, the finished computation can be sent back to the client, which can use the results safe in the knowledge that the data was never decrypted on the server.
As pointed out above, this requires some really screwy thinking to make sure the important relationships within the data aren't clobbered by the encryption.
It helps that I just saw a presentation about a hardware method where the data was only decrypted on the CPU--not memory, disk or network--just last week. In either case, the guy that owns the server has no way of determining what's running on the system.
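As a rough sketch of that client/server workflow - using a toy multiplicative blinding in place of the real lattice-based homomorphic encryption, with hypothetical feature and weight values invented for the example - the client encrypts, the server computes a neuron's weighted sum on data it cannot read, and the client decrypts the result:

```python
import secrets

def encrypt(values, key):
    # Toy blinding: scale every value by a client-held secret. The server
    # sees only the scaled numbers. (Real systems like CryptoNets use
    # lattice-based FHE; this only illustrates the workflow.)
    return [key * v for v in values]

def server_weighted_sum(weights, enc_values):
    # The server applies its plaintext model weights to blinded inputs -
    # the core operation of a neural-network layer.
    return sum(w * v for w, v in zip(weights, enc_values))

def decrypt(enc_result, key):
    return enc_result / key

key = secrets.randbelow(10**6) + 2   # client-held secret, never uploaded
pixels = [0.1, 0.5, 0.9]             # hypothetical input features
weights = [2.0, -1.0, 4.0]           # server's model weights

enc = encrypt(pixels, key)
enc_out = server_weighted_sum(weights, enc)
print(decrypt(enc_out, key))         # ~3.3, same dot product as on plaintext
```

The scaling trick only survives linear operations (and leaks ratios between values, so it is not remotely secure) - which is exactly why the genuine schemes need the "screwy thinking" mentioned above to handle the non-linear parts of a network.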
-
Wednesday 10th February 2016 06:33 GMT diodesign
Re: Re: What on earth is this article on about?
Apologies the second half wasn't clear enough - I've tidied it up. Basically, the neural network demonstrates that work can be done to encrypted data without knowing the decryption key. So you can pass encrypted scans of your handwriting to the AI, it recognizes your writing and converts it to text and sends it back encrypted, all without having to decrypt the data to analyze it.
And quickly.
C.
-
-
-
-
-
-
Wednesday 10th February 2016 10:19 GMT Jedit
"I suspect its probably the rather childish association"
To be honest, I first read the headline as "homophobic encryption speed barrier".
I think the downvotes have come because ISIS have not broken the homophobic encryption speed barrier. In fact, their homophobia is not encrypted at all. They're getting plenty of practice taking it to terminal velocity, though.
(Bomb, because we're dropping a lot of them on the gay-bashers of the Caliphate.)
-
-
-
-
Tuesday 9th February 2016 14:38 GMT Seajay#
Witchcraft
This is deep voodoo indeed and very impressive.
I don't see it actually being all that useful though.
In-house you have to: work out what calculations you need performed, collect, store and encrypt the data, including re-encrypting it with the special homomorphic encryption and send that data to a third party.
Sub-contractor: Blindly carry out number crunching
In-house: Decrypt, interpret, action, re-encrypt in your in-house format, store.
That seems like a lot of hassle for very little gain unless the number crunching itself is either very computationally intensive (which can't be the case, since you're accepting a factor-of-several-thousand increase in processing time by using homomorphic encryption) or highly secret. The only customer organisations who would care enough about privacy and have the deep pockets to afford all this can be trusted to respect patent laws and EULAs, so just let them run your super-secret algorithm on their internal servers. Most of all, there aren't really many super-secret algorithms which a business can be built on; the hard stuff, and therefore the stuff which gets paid for, is all the tedious integration.
-
-
Wednesday 10th February 2016 08:56 GMT Seajay#
Re: I don't see it actually being all that useful though.
"a way to allow a sister company, who gathered data under one set of T&Cs to allow me access for marketing/research purposes, without breaking said T&Cs, or any DPA issues."
Can it really be used for that? I think if your sister company gathered the information then you can process the information but I don't think you can use it for marketing/research. All you get after your processing is an encrypted answer which you can pass back to your sister company for them to decrypt.
-
-
-
Tuesday 9th February 2016 20:00 GMT Pirate Dave
confused
So I admit, I'm no cryptographer. That's way too much math for me and it puts me to sleep. But I'm confused by this new homomorphic thing. If you can "massage" the encrypted data until you see manipulable patterns in it, doesn't that mean your encryption isn't really that good? I mean, I thought the whole point of encryption, the gold standard, was that without the keys the data was basically jumbled streams of worthless numbers. Only with the keys can you do anything worthwhile with the data.
Sorry to sound dense, but this doesn't make sense.
-
Tuesday 9th February 2016 23:45 GMT Anonymous Coward
Some algorithms are homomorphic in nature; there's some info on it here:
https://www.schneier.com/blog/archives/2009/07/homomorphic_enc.html
http://www.americanscientist.org/issues/pub/2012/5/alice-and-bob-in-cipherspace/99999
One advantage of a homomorphic encryption is that as well as the data to be queried being encrypted so is the query itself, and the results.
With the query being encrypted you could then have a scenario whereby the person holding the encrypted data does not see what the query is, as well as the result.
An example of this could be in military operations: Country A has some data on where they will be operating in a hostile territory. Country B, an ally of Country A, is also operating in the same theatre of operations and wants to know whether Country A would be in a certain area where Country B has a planned mission.
So Country B sends an encrypted query against the encrypted data held on Country A's whereabouts, and gets an encrypted result back that can only be decrypted with a key held by Country B.
Country B gets to know whether they will be crossing paths with Country A on their route, but doesn't have access to any further data about Country A's whereabouts, as it's encrypted with a key only Country A has.
They also get to do this without giving away to Country A where they will be, as the query and result are encrypted with a key only Country B has; Country A never gets to see the question in plain text.
That's a really rubbish attempt at an explanation of a benefit of being able to perform encrypted operations on encrypted data so ignore any 'plot holes', it's been a long day and was a while ago since I sat through the IBM demo of this.