I've written about proving systems in the past, and will have more to say in the future, but today we'll talk about homomorphic encryption.

Homomorphic encryption will change the web in the same way that SSL/TLS did. I say this with quite a bit more confidence than I have in the past! If you remember the web in 1993, that comparison should mean something to you. If not, imagine the web as a magazine which could show you ads, but required calling an 800 number if you wanted to make a purchase, and contrast that with today's amazon.com.

I have given two presentations with similar titles before. But this presentation will be an almost complete rewrite, just like the last. In the time since my first article on homomorphic encryption, it's gone from a gleam in a mathematician's eye to an open source DB access library. It's a fast-moving technology, which, thankfully, becomes more practical each year.

Interestingly, as the technology for cloud security becomes more practical, the need for it becomes more pressing.

CloudDevelop 2014 is just $20, which is quite cheap as conferences go, but you can use this link to save 50%. That means that for the $10 you might have spent on lunch that day, you get the conference itself essentially for free!


In the meantime, I'll leave you with some decidedly weird Python code:

```python
> def my_factorial_less_than_20(n):
..    result = 1
..    for i in range(2, 20):
..        result *= 1 if i > n else i
..    return result

> my_factorial_less_than_20(4)
=> 24
> my_factorial_less_than_20(100)
=> 121645100408832000L
> my_factorial_less_than_20(1000)
=> 121645100408832000L
```

This looks both limited and inefficient! Is there any reason at all to write such a strange function?

In what significant way does the behavior of code written in such a style differ from more standard factorial implementations? (Hint: The answer can be found in


This, apparently, has done nothing to stop the patent land rush for any process which might conceivably utilize homomorphic encryption. Some of these are legitimate under the current rules of U.S. patents, even if you personally find those rules wrongheaded. For example, IBM has filed patent claims on Craig Gentry's breakthrough algorithms.

On the other hand, it is not difficult to find patents which don't so much describe an invention as speculate about what someone might invent in the future. It reminds me of the well-worn economics joke, "Assume a can opener..." Some of these contain amusing disclaimers, such as the following:

Currently there is no practical fully homomorphic cryptosystem, that is, there is no secure cryptosystem that allows for the homomorphic computation of additions and products without restrictions. There has been a recent contribution by Gentry that presents a cryptosystem based on ideal lattices with bootstrappable decryption, and it is been shown that it achieves a full homomorphism. Nevertheless, the authors of this method to concede that making this scheme practical remains an open problem.

Others invoke a form of rhetorical hand waving more commonly seen in freshman term papers than patent applications:

There exist well known solutions for secure computation of any function e.g. as described in [45]… The general method employed by most of these solutions is to construct a combinatorial circuit that computes the required function… It seems hard to apply these methods to complete continuous functions or represent Real numbers, since the methods inherently work over finite fields.

The “well-known” solutions, here, are both computationally impractical and covered by other patent applications. And “seems hard” is really not the right phrase to pick when you are describing a problem for which there is no known solution.

In 2004, Phillips filed a patent application for the very idea of homomorphic encryption. Never mind that no algorithm for fully homomorphic encryption existed at the time, and that Phillips did not invent the idea itself. Buried in paragraph 69 are the following weasel words:

An encryption scheme with these two properties is called a homomorphic encryption scheme. The Paillier system is one homomorphic encryption scheme, but more ones [sic] exist.

Note that Pascal Paillier does not appear to work for Phillips, that he published his encryption scheme five years prior to this patent application, and that Paillier encryption is only homomorphic with respect to addition.

Some approved patents venture into the absurd. Remember the string concatenation example I used in my first article on homomorphic encryption? Folks who follow my public speaking closely (hi, mom!) will remember that I sometimes joke about a startup selling cloud-based, homomorphically encrypted string concatenation ("Concatenatr"). It's a joke, right? You can stop laughing now; SAP actually received a patent for that one.


Moving to Better Secure the Cloud: Governance, Risk, and Compliance Management

My presentation will be on the potential business impact on the web if an efficient and fully homomorphic encryption system is invented. I'll be speaking sometime between 3:15 and 4:00 EST, for about 20 minutes. The target audience is CIOs.

Sorry for the short notice, but this came together at the last minute!


Perhaps surprisingly, it's actually very simple. (I say "surprisingly" because much of the math in technical papers on encryption is decidedly not simple, including that of Gentry's first fully homomorphic scheme, which was based on ideal lattices.)

This scheme, created by Marten van Dijk, Craig Gentry, Shai Halevi, and Vinod Vaikuntanathan, however, uses basic arithmetic operations like division, with a few uncommon, but easy to understand, variations.

The nature of homomorphic encryption also dictates a programming style which will seem unfamiliar to users of most high-level languages, but very familiar to anyone who has ever taken a digital circuits class.

What is the integer quotient of 10 / 6? Most people would probably say "1, remainder 4." But the exact answer, in the domain of real numbers, is 1.666666 (repeating), which is closer to 2 than to the previous answer of 1. So another way we could answer this question is "2, remainder -2." This quotient is closer to the exact answer, and, correspondingly, has a smaller remainder.

So which answer is correct? Arguably, both of them. As long as the following rule holds:

`dividend = divisor * quotient + remainder `

(the division rule)

...then we have a valid answer.

Which method should you use?

In fact, there are at least five different, valid algorithms for selecting a quotient and remainder for a given integer division problem.

The homomorphic encryption scheme used in the van Dijk, et al. paper and in Gentry's CACM article uses "R-division":

- Compute the real quotient `Q_R`.
- Compute the integer quotient `Q_Z` by rounding `Q_R` to the closest integer.
- Compute the remainder `R = Dividend - Divisor * Q_Z`.

This is probably different than the method you learned in elementary school, but both are valid. Importantly, it's probably different than what the integer division and remainder operators in your favorite programming language do.
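To make the difference concrete, here is a minimal Python sketch of R-division; the function name and structure are my own, not from the paper:

```python
def r_divide(dividend, divisor):
    """R-division: round the real quotient to the nearest integer,
    then derive the remainder from the division rule."""
    q_r = dividend / divisor      # real quotient
    q_z = round(q_r)              # nearest integer (note: Python rounds ties to even)
    r = dividend - divisor * q_z  # division rule: dividend = divisor * quotient + remainder
    return q_z, r

print(r_divide(10, 6))  # (2, -2)
print(10 // 6, 10 % 6)  # 1 4 -- Python's built-in floor division disagrees
```

Both answers satisfy the division rule; they simply choose the quotient differently.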

If you've ever seen a 12-hour analog clock, then you've worked with modular arithmetic. The hour hand at 10:00 a.m. points to the same position on the dial as it does at 10:00 p.m. (22:00). Another way to say this is:

`10 ≡ 22 mod 12`

This reads: "10 is congruent to 22 modulo 12." In other words, we just ignore the "extra" 12 hours in the measurement of 22:00, because 22:00 lands on the same dial position as 10:00, just one full trip around the dial later.

Naturally, we can perform arithmetic operations like addition and subtraction and compare the congruence of the result. I can say "10:00 + 5 hours means that the hour hand will point to 3 on the clock," or:

`10 + 5 ≡ 3 mod 12`
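In Python, the `%` operator reproduces this clock arithmetic for non-negative values:

```python
# Hours on a 12-hour dial, using 0 to stand in for 12
print((10 + 5) % 12)  # 3
print(22 % 12)        # 10
```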

You can use any number as the modulus. When we make analog clocks, we use 12. A modulus which is particularly interesting in computer programming is 2. Arithmetic operations mod 2 correspond to binary operations, e.g.:

```
0 + 0 ≡ 0 mod 2
0 + 1 ≡ 1 mod 2
1 + 0 ≡ 1 mod 2
1 + 1 ≡ 0 mod 2
```

From this example, you can see that addition modulo 2 is the same as the binary operation `XOR`. It turns out that subtraction is exactly the same thing (for mod 2 only!):

```
0 - 0 ≡ 0 mod 2
0 - 1 ≡ 1 mod 2
1 - 0 ≡ 1 mod 2
1 - 1 ≡ 0 mod 2
```

Multiplication modulo 2, on the other hand, corresponds to the binary operation `AND`:

```
0 * 0 ≡ 0 mod 2
0 * 1 ≡ 0 mod 2
1 * 0 ≡ 0 mod 2
1 * 1 ≡ 1 mod 2
```
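The tables above can be verified mechanically; this Python snippet exhaustively checks each identity:

```python
for a in (0, 1):
    for b in (0, 1):
        assert (a + b) % 2 == a ^ b  # addition mod 2 is XOR
        assert (a - b) % 2 == a ^ b  # so is subtraction (mod 2 only!)
        assert (a * b) % 2 == a & b  # multiplication mod 2 is AND
print("all mod-2 identities hold")
```

(Python's `%` always returns a non-negative result for a positive modulus, which is why the subtraction case works without adjustment.)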

This is all very simple. However, a word of caution is in order for anyone who has used a high-level programming language. Most languages have an operator like `mod` (in Delphi) or `%` (in C#, commonly pronounced "mod"). However, this operator does not, strictly speaking, compute modular congruence. It is more like a remainder, although, as we've seen, different people and different programming languages might choose to compute a remainder in different (but hopefully valid) ways.

Remainders and congruence are often the same. In fact, the congruence relationship and the integer division remainder for the examples above agree:

```
10 ≡ 22 mod 12
22 / 12 = 1 R 10
```

However, it's easy to show examples where this is not true:

```
22 ≡ 34 mod 12       (1)
34 / 12 = 2 R 10     (2)
-22 ≡ 2 mod 12       (3)
-22 / 12 = -1 R -10  (4)
```

(1) is probably not the most common choice; 10 would be a more common answer, as with (2). (1) is nevertheless correct as a statement of congruence. Now compare (3) with (4). The most common way to compute a modulus is to pick the smallest non-negative number. But the most common way to perform integer division is so-called "T-division", where you select the quotient closest to zero and then calculate the remainder, resulting in a remainder that takes the sign of the dividend.
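A Python sketch makes the contrast visible. Python's built-in `//` and `%` use floor division, so to see T-division we have to truncate explicitly (the helper name is mine):

```python
import math

def t_divide(dividend, divisor):
    """T-division: quotient truncated toward zero, as the C-family
    integer / and % operators do."""
    q = math.trunc(dividend / divisor)
    r = dividend - divisor * q
    return q, r

print(t_divide(-22, 12))        # (-1, -10): remainder takes the dividend's sign
print((-22) // 12, (-22) % 12)  # -2 2: Python floors, so the remainder is non-negative
```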

A homomorphic encryption scheme, in addition to the usual `Encrypt`, `Decrypt`, and `KeyGen` functions, has an `Evaluate` function which performs operations on the ciphertext, resulting in the ciphertext of the result of the function.

Obviously, this necessitates a different style of programming than that afforded by the typical programming language. In particular, code like this:

`bool a = b ? c : d;`

...(where `a`, `b`, `c`, and `d` are all `bool`s) is impossible, because the value of `b` is encrypted, so the program cannot know what to assign to `a`. If, however, we can perform binary operations on the ciphertext of `b`, then we can rewrite the statement above as:

`bool a = (b & c) | ((!b) & d);`

...or its homomorphic encryption equivalent:

`CypherText a = (b h_and c) h_or (h_not(b) h_and d);`

...where the `h_*` operators are the homomorphic equivalents of the usual Boolean operations and `a`, `b`, `c`, and `d` are now of type `CypherText`.

Note that the version with the ternary operator and the version with the `AND` and `OR` operators are not strictly the same, although their results are; the ternary operator is lazy, whereas the `AND`/`OR` version evaluates both branches completely. This complete evaluation is necessary when the values are encrypted. It also means that the function may be inefficient.

Intuitively, it's easy to see that any referentially transparent function can be reduced to such operations; this is what compilers and operating systems do under the hood anyway. I'm not going to go into any detail on how this is done; pick up an introductory digital circuits book if you are interested in the particulars.
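To see that the rewrite really does match the ternary operator, here is a plaintext sketch in Python. The `h_*` functions are hypothetical stand-ins operating on ordinary bits, purely so the Boolean algebra can be checked; a real scheme would perform them on ciphertext:

```python
# Hypothetical stand-ins for homomorphic operators, acting on plaintext bits
def h_and(a, b): return a & b
def h_not(a):    return a ^ 1
def h_or(a, b):  return h_not(h_and(h_not(a), h_not(b)))  # via De Morgan

def select(b, c, d):
    """Branch-free equivalent of the ternary expression: b ? c : d"""
    return h_or(h_and(b, c), h_and(h_not(b), d))

# Exhaustively compare against the lazy ternary version
for b in (0, 1):
    for c in (0, 1):
        for d in (0, 1):
            assert select(b, c, d) == (c if b else d)
print("select matches the ternary operator on all inputs")
```

Note that `select` always evaluates both `c` and `d`, which is exactly the complete-evaluation property described above.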

Gentry's article notes that other changes in programming style are necessary when performing operations within a homomorphic encryption scheme. For example, the size of the output of a function must be set in advance. Even if the plaintext is variable-length, the ciphertext must be of a known length, which corresponds to the number of "wires" in the circuit. The plaintext, therefore, would have "whitespace" at the end or be truncated, depending upon its size.
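A sketch of that fixed-width convention in Python (the helper is illustrative, not from the article):

```python
def to_fixed_width(plaintext, width):
    """Pad with trailing spaces, or truncate, so the circuit always
    sees exactly `width` input characters ("wires")."""
    return plaintext[:width].ljust(width)

print(repr(to_fixed_width("HELLO", 8)))       # 'HELLO   '
print(repr(to_fixed_width("HELLOWORLD", 8)))  # 'HELLOWOR'
```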

In my first post on homomorphic encryption, I mentioned that Gentry's encryption schemes can be considered fully homomorphic because they support two homomorphic operations, corresponding to Boolean `AND` and `XOR`. I'd like to go into a little bit more detail as to why that is true.

Many combinations of Boolean operations are equivalent. Perhaps the most famous equivalences are De Morgan's laws. There are a variety of techniques for converting one function using Boolean operations into an equivalent function with different operations. This is often done in order to produce a simpler or more efficient function, but can also be done in order to use a specific type of Boolean gate.

It is possible to use combinations of `NAND` or `NOR` gates, for example, to produce circuits which perform a Boolean AND, OR, NOT, etc. Hence, a cryptosystem which supported a homomorphic `NAND` operation could be considered "fully homomorphic."

Similarly, having both `AND` and `XOR` gates available allows you to produce the same outputs as all other Boolean gates: for example, `NOT p` is the same as `p XOR 1`, and we can see by De Morgan's laws that with `NOT` and `AND` we can implement `OR`.

Therefore, a cryptosystem can be considered fully homomorphic if it supports enough homomorphic operations to perform any logical Boolean operation. In particular, a cryptosystem which supports any of the following sets of homomorphic operations would qualify:

- Just `NAND`
- Just `NOR`
- Both `XOR` and `AND`
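As a quick sanity check on the `NAND` case, here is a Python sketch building the other gates out of `NAND` alone and testing every input combination:

```python
def nand(a, b): return 1 - (a & b)

# Every other gate, built from NAND only
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))       # De Morgan
def xor_(a, b): return and_(or_(a, b), nand(a, b))  # "or, but not both"

for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == 1 - a
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor_(a, b) == (a ^ b)
print("NAND alone reconstructs NOT, AND, OR, and XOR")
```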

If you understand the math above, you should now be able to follow along as I implement Gentry's algorithm for "somewhat homomorphic" encryption. As you'll see, the "somewhat homomorphic" encryption will be the first step towards a fully homomorphic cryptosystem.


Although these words may be unfamiliar to many, the subject matter is terribly important, because, like public-key encryption, which paved the way for secure transactions over the web, an efficient and fully homomorphic encryption scheme would enable new kinds of distributed computing. I'll explain more about this in a little bit.

If you have an ACM membership, or can find the magazine in a library, I recommend the article. It is far and away more readable than the technical papers on the same subject which are available on the web. The article uses a running analogy of a jeweler who doesn't quite trust her employees in order to help the reader understand the design of the system.

"Homomorphic" is an adjective which describes a property of an encryption scheme. That property, in simple terms, is the ability to perform computations on the ciphertext without decrypting it first. Because this tends to sound either baffling or miraculous the first time you hear it, let's begin with a very simple example.

The popular but wildly insecure cipher rot-13 (a special case of the Caesar cipher) is partially homomorphic, specifically with respect to the concatenation operation. Imagine we write an `Encrypt` and a `Decrypt` function using the rot-13 algorithm. The "secret key" will be 13, the number of positions each letter is shifted. Let's encrypt two words and then concatenate the ciphertext, and finally decrypt the result. In pseudocode, this is:

```
var c1 = Encrypt(13, "HELLO"); // c1 = URYYB
var c2 = Encrypt(13, "WORLD"); // c2 = JBEYQ
var c3 = Concat(c1, c2);       // c3 = URYYBJBEYQ
var p  = Decrypt(13, c3);      // p = HELLOWORLD
```

Because it was not necessary to first decrypt the two fragments of ciphertext before performing the concatenation operation, we can say that rot-13 is homomorphic with respect to concatenation. In other words, it is possible to take two pieces of ciphertext and perform an operation on them which results in the ciphertext of the concatenation of the two respective pieces of plaintext.
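The pseudocode above translates directly into runnable Python, using the standard library's rot-13 codec:

```python
import codecs

def encrypt(key, plaintext):
    assert key == 13  # rot-13 only "supports" the key 13
    return codecs.encode(plaintext, "rot13")

def decrypt(key, ciphertext):
    # Shifting by 13 twice is the identity, so decryption is the same operation
    assert key == 13
    return codecs.encode(ciphertext, "rot13")

c1 = encrypt(13, "HELLO")  # 'URYYB'
c2 = encrypt(13, "WORLD")  # 'JBEYQ'
c3 = c1 + c2               # concatenation performed on ciphertext
print(decrypt(13, c3))     # HELLOWORLD
```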


It just so happens that in this case the homomorphic concatenation (concatenating the two fragments of ciphertext) is the same operation as the non-homomorphic concatenation (concatenating two fragments of plaintext). This is not always the case. What is important is that we can perform some operation on the input ciphertext which will produce new ciphertext that, when decrypted, will produce output plaintext corresponding to a desired operation on the input plaintext.

Although rot-13 is not at all secure, it turns out that some cryptosystems widely believed to be secure have homomorphic properties. For example, "pure" and un-padded RSA is homomorphic with respect to multiplication.
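A tiny worked example of that RSA property, using the standard textbook toy parameters (p = 61, q = 53), which are of course far too small to be secure:

```python
n, e, d = 3233, 17, 2753  # n = 61 * 53; a textbook toy key, not secure

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

m1, m2 = 7, 9
c_prod = (enc(m1) * enc(m2)) % n  # multiply the ciphertexts...
print(dec(c_prod))                # 63: ...and recover the product of the plaintexts
```

This works because (m1^e * m2^e) mod n = (m1 * m2)^e mod n, so the product of two ciphertexts is the ciphertext of the product (as long as the product stays below n).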

All of this may seem academically interesting, but one might wonder why I started this post claiming that "an efficient and fully homomorphic encryption scheme would enable new kinds of distributed computing."

I think that the scenario which best makes the benefits of homomorphic encryption clear is cloud computing. It is often much more economical to buy computing resources from a cloud provider than to build a data center yourself. This is especially true if your need for computing horsepower fluctuates. But what if you want to do this computing on private data? Maybe you trust your cloud provider, but what if trust is not enough? Is there a way to get the compelling economic benefits of cloud computing while keeping your data secure?

Imagine that you have developed a web application for income tax preparation. Your application collects personal and financial information from the user, creates a tax strategy resulting in the lowest possible (legal!) income tax payment, and submits the prepared return to the IRS. From a business point of view, this application might be compelling, but it's a daunting task operationally. In the United States, most people file their taxes once a year, before April 15th. This means that your servers will see a huge spike in demand in the first quarter of the year, with relatively little demand other times. Also, potential customers might be understandably wary about uploading their personal information and financial data to the website of a new business.

Cloud computing offers an excellent answer to the first problem. Because you buy computing resources on as-needed basis, it is very easy to increase the number of available servers when tax season arrives, and reduce the number at other times. Unfortunately, it doesn't do anything for the second problem.

Hard security in the cloud is not really possible today. It's very easy to have secure transmission from a local machine to a cloud data store, and of course the stored data can be encrypted. But actually performing computations on that data in the cloud requires decrypting it first. Anyone with physical access to the machine, therefore, has access to your data.

What if it were possible for a user of your tax preparation software to upload their financial information encrypted under the IRS's public key? Then their data would be secure. You can do this today, of course, but your software would be unable to select a tax strategy for the user or prepare forms. If, however, the encryption method used was also fully homomorphic, then your software could do all of this work without first decrypting the user's information. The output of your software would be a tax return encrypted under the IRS's public key. Only the IRS could decrypt it.

So a fully homomorphic encryption scheme would seem to solve the security problem while still being compatible with a cloud deployment scenario. Unfortunately, preparing a tax return requires more than just a Concat operation. We can say that an encryption scheme is "fully homomorphic" if it supports enough homomorphic operations to implement any function we need. I'm being purposely imprecise here, because the details are a bit of a digression at this point.

Although there are a number of encryption schemes with one homomorphic operation, until recently no one was really sure whether fully homomorphic encryption was even possible. Rivest, Adleman, and Dertouzos suggested in 1978 that fully homomorphic encryption might be possible, but were unable to find a secure scheme. Another scheme, from Boneh, et al., supports two operations, but you can only perform one of them once.

When you consider the complexity of a typical programming language, two operations doesn't sound like very many. But since we are only looking at operations which mutate data (as opposed to keeping track of the current instruction pointer, etc.), it turns out that supporting the `AND` and `XOR` operations is enough to consider a scheme fully homomorphic. So it was big news when Craig Gentry managed to create such a scheme as part of his Stanford doctoral thesis. Gentry, along with several other researchers from IBM and MIT, later created yet another scheme using a different encryption technique but a similar general approach, with similar homomorphic properties.

Some would say "too big." Bruce Schneier criticized the IBM press release for implying, in his reading, that Gentry's scheme was practical for real applications, today. It isn't; the computational and data storage overheads are far too high. However, this takes nothing away from Gentry's achievement; he has shown that fully homomorphic encryption is actually possible. Indeed, Schneier concludes:

Visions of a fully homomorphic cryptosystem have been dancing in cryptographers' heads for thirty years. I never expected to see one. It will be years before a sufficient number of cryptographers examine the algorithm that we can have any confidence that the scheme is secure, but -- practicality be damned -- this is an amazing piece of work.

The second, fully homomorphic scheme proposed by Gentry, et al., and described in the CACM article is simple enough that I decided to implement it myself. I've learned quite a lot in the process. So in the near future I'll be sharing my code and some of the insights I've gained along the way.

Before I can do that, however, I need to describe some of the math involved. And I'd like to elaborate on why two operations is enough to consider a scheme "fully homomorphic."

