Leftover hash-lemma

Imagine that you have a secret key X consisting of n uniform random bits, and you would like to use this key to encrypt a message. Unfortunately, you were a bit careless with the key: you know that an adversary was able to learn the values of some t < n bits of it, but you do not know which bits. Can you still use your key, or do you have to throw it away and choose a new one? The leftover hash-lemma tells us that we can produce a key of almost n − t bits, over which the adversary has almost no knowledge. Since the adversary already knows all but n − t bits, this is almost optimal.

More precisely, the leftover hash-lemma tells us that we can extract about H_∞(X) bits (the min-entropy of X) from a random variable X such that the extracted bits are almost uniformly distributed. In other words, an adversary who has some partial knowledge about X will have almost no knowledge about the extracted value. This is why the technique is also called privacy amplification (see the privacy amplification section in Quantum Cryptography).

Extractors achieve the same result, but normally use less randomness.

The leftover hash-lemma was first stated by Russell Impagliazzo, Leonid Levin and Michael Luby, and is a very useful tool in cryptography.

Leftover hash-lemma

Let X be a random variable over 𝒳 and let m > 0. Let h : 𝒮 × 𝒳 → {0, 1}^m be a 2-universal hash function. If

m ≤ H_∞(X) − 2 log(1/ε),

then for S uniform over 𝒮 and independent of X, we have

δ((h(S, X), S), (U, S)) ≤ ε,

where U is uniform over {0, 1}^m and independent of S.
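One standard 2-universal family to which the lemma applies is multiplication by a random binary matrix over GF(2): the public seed S is a random m-by-n matrix, and h(S, X) = S·X. The following is a minimal sketch of such an extractor (the function names and parameters are illustrative, not from the original article):

```python
import secrets

def random_seed(n, m):
    """The public seed S: a random m-by-n binary matrix over GF(2),
    stored as one n-bit integer (row bitmask) per output bit."""
    return [secrets.randbits(n) for _ in range(m)]

def extract(seed, x):
    """h(S, X) = S*X over GF(2): output bit i is the parity of
    popcount(seed[i] AND x).  The family {x -> M*x} over all binary
    matrices M is 2-universal, so the leftover hash-lemma applies."""
    out = 0
    for row in seed:
        out = (out << 1) | (bin(row & x).count("1") & 1)
    return out

# Example: a 128-bit key X, some of whose bits an adversary may know.
# If H_inf(X) >= m + 2*log2(1/eps), the extracted m bits are eps-close
# to uniform even given the seed S, which can be published.
n, m = 128, 64
seed = random_seed(n, m)
x = secrets.randbits(n)        # stand-in for the partially leaked key
fresh_key = extract(seed, x)   # near-uniform m-bit replacement key
```

Note that the seed S need not be secret: the lemma guarantees closeness to uniform of the pair (h(S, X), S), i.e. even an adversary who sees S learns almost nothing about the output.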

H_∞(X) = −log max_x Pr[X = x] is the min-entropy of X, which measures the amount of randomness X has. The min-entropy is always less than or equal to the Shannon entropy. Note that max_x Pr[X = x] is the probability of correctly guessing X. (The best guess is the most probable value.) Therefore, the min-entropy measures how difficult it is to guess X.
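The two entropies can be compared on a small example distribution (a numeric sketch, not from the original article):

```python
import math

def min_entropy(probs):
    """H_inf(X) = -log2(max_x Pr[X = x]): randomness as guessing difficulty."""
    return -math.log2(max(probs))

def shannon_entropy(probs):
    """H(X) = -sum_x Pr[X = x] * log2 Pr[X = x]."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A skewed distribution: the best guess succeeds with probability 1/2.
probs = [0.5, 0.25, 0.125, 0.125]
print(min_entropy(probs))      # 1.0
print(shannon_entropy(probs))  # 1.75  (min-entropy <= Shannon entropy)
```

For the uniform distribution the two coincide; the more skewed the distribution, the further the min-entropy falls below the Shannon entropy.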

δ(X, Y) = (1/2) Σ_v |Pr[X = v] − Pr[Y = v]| is the statistical distance between X and Y.
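The definition translates directly into code; as a small sketch (with illustrative example distributions), representing each distribution as a dict from value to probability:

```python
def statistical_distance(p, q):
    """delta(X, Y) = (1/2) * sum_v |Pr[X = v] - Pr[Y = v]|,
    where p and q map each value to its probability."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(v, 0.0) - q.get(v, 0.0)) for v in support)

fair   = {"H": 0.5,  "T": 0.5}
biased = {"H": 0.75, "T": 0.25}
print(statistical_distance(fair, biased))  # 0.25
print(statistical_distance(fair, fair))    # 0.0
```

A distance of at most ε means no statistical test (however powerful) can distinguish the two distributions with advantage better than ε, which is exactly the guarantee the lemma gives for the extracted key.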

See also

*Extractor
*Universal hashing
*Min-entropy, Rényi entropy
*Information theoretic security

References

* [http://portal.acm.org/citation.cfm?coll=GUIDE&dl=GUIDE&id=45477 C. H. Bennett, G. Brassard, and J. M. Robert. "Privacy amplification by public discussion". SIAM Journal on Computing, 17(2):210-229, 1988.]

* [http://portal.acm.org/citation.cfm?coll=GUIDE&dl=GUIDE&id=73009 R. Impagliazzo, L. A. Levin, and M. Luby. "Pseudo-random generation from one-way functions". In Proceedings of the 21st Annual ACM Symposium on Theory of Computing (STOC '89), pages 12-24. ACM Press, 1989.]

* [http://ieeexplore.ieee.org/xpls/abs_all.jsp?isnumber=10153&arnumber=476316&type=ref C. Bennett, G. Brassard, C. Crépeau, and U. Maurer. "Generalized privacy amplification". IEEE Transactions on Information Theory, 41, 1995.]

* [http://portal.acm.org/citation.cfm?coll=GUIDE&dl=GUIDE&id=312213 J. Håstad, R. Impagliazzo, L. A. Levin and M. Luby. "A Pseudorandom Generator from any One-way Function". SIAM Journal on Computing, v28 n4, pp. 1364-1396, 1999.]

