ETH Zürich » Computer Science » Theory » Cryptography

Research Highlights

The following is a list of some highlights of our research (most recent first):
  • Generic equivalence of breaking RSA and factoring. A long-standing open problem in cryptography is whether breaking the RSA cryptosystem is as hard as factoring the modulus, or whether it is potentially much easier. It is proved in [AM09] that breaking RSA and factoring are equivalent in a generic model of computation: any generic algorithm for breaking RSA can be transformed into a factoring algorithm.
  • New theory of pseudo-randomness proofs. An abstract theory for pseudo-randomness proofs is presented in [MT09]. It makes it possible to simplify, unify, and generalize a large body of results in the literature.
  • Cascade encryption revisited. The security of cascade block-cipher encryption is an important and well-studied problem in theoretical cryptography with practical implications. It is well known that double encryption improves the security only marginally, leaving triple encryption as the shortest reasonable cascade. In a recent paper, Bellare and Rogaway showed that in the ideal cipher model, triple encryption is significantly more secure than single and double encryption. Three contributions are made in [GM09]: First, we point out and correct some errors in the paper of Bellare and Rogaway, re-establishing the lower bound on the security of triple encryption up to a constant factor. Second, a new lemma extending Maurer's theory of random systems allows us to rephrase their proof in this framework, making the argument more abstract and hence easier to follow. Finally, this reformulation allows us to prove a generalization of the theorem, addressing the security of longer cascades.
  • Quantum cryptography -- revisited. Quantum cryptography has been advocated as being provably secure based only on the assumption that the laws of quantum mechanics are correct. It was shown in [AKMR07] that, contrary to the established security proofs in quantum cryptography, there are provably-secure key agreement schemes which can nevertheless be broken. While a generated key is indeed provably secure, its security is potentially lost when it is used. This shows that the traditional security definition is too weak. The flaw is due to a locking property of the notion of accessible information in quantum physics: one additional (physical) bit of information can increase the accessible information by more than one bit. This problem was solved in [Ren05], where the right security definition for quantum key distribution is stated and a new framework for proving security is proposed.
  • Efficient symmetric cryptography under very weak assumptions. In contrast to the standard notion of a pseudo-random function (PRF), which is indistinguishable from a random function for a computationally-bounded adversary asking arbitrarily many queries (within its running time constraint), the new notion of an s-query weak PRF was introduced in [MT08]. Such a function need only be secure against an adversary who obtains a fixed number s (e.g. s=2) of input-output pairs and, moreover, the inputs are random (not chosen). It is shown that provably-secure encryption can be obtained from an s-query weak PRF essentially as efficiently as from a full-fledged PRF.
  • Separation of secure multi-party computation (MPC) and secure function evaluation (SFE). In secure computation, which comes in two flavors, MPC and SFE, one considers an adversary with a certain capability of corrupting the parties in various ways. In [BFHMZ08] and [HMZ08] we have characterized exactly for which adversary capabilities one can securely perform MPC and SFE, and have shown that the conditions for MPC and SFE separate, whereas previous protocols applied equally to both MPC and SFE.
  • Efficient asynchronous multi-party computation. It was shown in [HNP05] and [HNP08] that MPC protocols for asynchronous networks (like the Internet) can be made almost as efficient as their synchronous counterparts.
  • Multi-party computation with linear communication. The communication overhead of multi-party computation was reduced to a constant per participant and per multiplication, first with cryptographic security [HN05, HN06], then even with perfect security [BH08].
  • Detectable Byzantine agreement and multi-party computation. In [FGHHS02], it was shown that Byzantine agreement is possible with a higher corruption threshold, if the protocol may fail in the presence of cheating. This result was extended and generalized to multi-party computation in [FHHW03].
  • Inexistence of field-homomorphic one-way permutations. If a one-way permutation were field-homomorphic, this would allow for entirely new applications in cryptography, including secure computation with encrypted data. It was proved in [MR07] that field-homomorphic one-way permutations cannot exist.
  • Indistinguishability amplification. [MPR07] presents a new generic approach to proving upper bounds on the information-theoretic distinguishing advantage (from an ideal system) for a combined system, based on upper bounds for the component systems. For a general type of combination operation of systems, including the XOR of functions or the cascade of permutations, two amplification theorems are proved. The first is a product theorem, in the spirit of XOR-lemmas: The distinguishing advantage of the combination of two systems is at most twice the product of the individual distinguishing advantages. This bound is optimal. The second theorem states that the combination of systems is secure against some strong class of distinguishers, assuming only that the components are secure against some weaker class of distinguishers.
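In symbols, and using notation sketched here rather than quoted from the paper: if the component systems S_1 and S_2 have distinguishing advantages ε_1 and ε_2 from their ideal counterparts I_1 and I_2, and ⋆ denotes the combination operation, the product theorem can be stated schematically as:

```latex
% Product theorem (schematic): the distinguishing advantage of the
% combined system from the combined ideal system is at most twice the
% product of the component advantages.
\[
  \Delta\bigl(S_1 \star S_2,\; I_1 \star I_2\bigr)
  \;\le\; 2\,\varepsilon_1\,\varepsilon_2 .
\]
```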
  • Parallel repetition of games. In [Hol06b] a significantly simplified proof of Ran Raz' (SIAM J. Comput. 27, 1998) parallel repetition theorem for games was given.
  • Pseudo-random generators from one-way functions. In a seminal paper, Hastad, Impagliazzo, Levin, and Luby showed that pseudo-random generators exist if and only if one-way functions exist. The construction they propose to obtain a pseudo-random generator from an n-bit one-way function uses O(n^8) random bits in the input (which is the most important complexity measure of such a construction). In [Hol06a] a significantly simplified and improved reduction is presented.
  • Indifferentiability theory. In [MRH04] a generalized notion of indistinguishability was introduced which can be applied to primitives that are publicly accessible, like a hash function. The theory of indifferentiability has many applications: it can be used to prove the impossibility of realizing a random oracle as well as the security and soundness of hash function constructions. It is also the basis for domain extension of public random functions, see [MT07].
  • Abstract models of computation in cryptography. Computational security proofs in cryptography, without unproven intractability assumptions, exist today only if one restricts the computational model. For example, one can prove a lower bound on the complexity of computing discrete logarithms in a cyclic group if one considers only generic algorithms, which cannot exploit the properties of the representation of the group elements. In [Mau05] an abstract model of computation is proposed which captures such reasonable restrictions on the power of algorithms. All previously considered generic algorithms are special cases of this model, in which proofs are more elegant and simpler.
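The generic lower bound of order the square root of the group order is matched by generic algorithms such as Shanks' baby-step giant-step, which uses only group operations and equality checks. A minimal sketch with toy parameters (group and generator chosen here purely for illustration):

```python
import math

def bsgs(g, h, p, order):
    """Baby-step giant-step: return x with g^x = h (mod p), if one exists.

    Uses O(sqrt(order)) group operations, matching the generic lower
    bound for discrete logarithms up to constant factors.
    """
    m = math.isqrt(order) + 1
    baby = {pow(g, j, p): j for j in range(m)}     # baby steps: g^j
    giant = pow(g, -m, p)                          # g^(-m) mod p (Python 3.8+)
    y = h
    for i in range(m):                             # giant steps: h * g^(-m*i)
        if y in baby:
            return i * m + baby[y]
        y = y * giant % p
    return None

# Toy example in Z_101^*: recover an exponent x with 3^x = h (mod 101).
p, g = 101, 3
h = pow(g, 27, p)
x = bsgs(g, h, p, p - 1)
assert x is not None and pow(g, x, p) == h
```

The table lookup trades memory for time; a real implementation would use Pollard's rho to achieve the same running time with constant memory.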
  • New approaches to digital evidence. Digital evidence, such as digital signatures, is of crucial importance in the emerging digitally operating economy because it is easy to transmit, archive, search, and verify. Nevertheless the initial promises of the usefulness of digital signatures were too optimistic. This calls for a systematic treatment of digital evidence. In [Mau04] a foundation for reasoning about digital evidence systems and legislation is provided, thereby identifying the roles and limitations of digital evidence, in the apparently simple scenario where it should prove that an entity A agreed to a digital contract d. The approach is in sharp contrast to the current general views documented in the technical literature and in digital signature legislation. An entirely new view of the concepts of certification, time-stamping, revocation, and other trusted services, is proposed, potentially leading to new and more sound business models for trusted services.
  • Random systems and indistinguishability theory. Many aspects of cryptographic security proofs can be seen as the proof that a certain system (e.g. a block cipher) is indistinguishable from an ideal system (e.g. a random permutation), for different types of distinguishers. [Mau02] introduces the abstract concept of a random system and develops a theory of indistinguishability of random systems. Many complex security proofs (Luby-Rackoff, CBC-MAC, switching lemma, etc.) in the literature are instantiations of a general theorem on random systems.
  • Bounded-storage model (BSM) in cryptography. The basic idea of this model, proposed in [Mau92b], is that the security of a cryptographic protocol can be proven even against an adversary with infinite computing power, as long as the size of his memory is bounded. This allows for secure protocols in settings where neither classical information-theoretic security nor classical computational security is possible. In [DM02] the first full-fledged security proof for key agreement in this model was given. A natural conjecture of Ding and Rabin was that if the initial (short) key for a key agreement scheme secure in the BSM is generated by a computationally secure key agreement protocol such as Diffie-Hellman, then the overall scheme remains secure even if the adversary obtains infinite computing power after termination of the initial key agreement protocol (everlasting security). In [DM04] it was proved that this conjecture is false.
  • Resilience for free in secure multi-party computation. It was demonstrated in [HM01] that there exists a secure multi-party computation protocol resilient against cheating yet as efficient as protocols that are only secure if the players are guaranteed to follow the protocol correctly. The complexity of the basic operation in the best protocol was reduced from O(n^6) to O(n^2), where n is the number of players.
  • Generalized Byzantine agreement. Various new results on Byzantine agreement were published, introducing new models. For instance, it was shown in [FM00] that one can tolerate cheating by up to one half of the players if broadcast from any sender to two other receivers is assumed to be available as a primitive.
  • Secure MPC from any linear secret sharing scheme. In [CDM00] it was shown that a secure MPC protocol can be constructed from any linear secret-sharing scheme, a primitive that was previously believed to be much weaker than MPC.
  • Cramer-Shoup cryptosystem. This public-key cryptosystem, proposed in [CS98], was the first efficient public-key cryptosystem secure (under a concrete intractability assumption) against adaptive chosen-ciphertext attacks, the most powerful type of attack. All previous such schemes were highly inefficient.
  • General adversary structures. In [HM97] (see also [HM00]) a new adversary model in multi-party cryptography was introduced. Instead of simply counting the number of players corruptible by an adversary (and hence assuming that all players are equally hard to corrupt and independent), all subsets of potentially corruptible players can be specified. Depending on the scenario, secure MPC is possible if and only if no two (respectively three) of these sets cover the full player set. This makes it possible to tolerate strictly more cheating than in a threshold setting.
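As a toy illustration of this covering condition (player set and adversary structure invented for the example, not taken from the papers), one can check mechanically whether any two, or any three, of the corruptible sets cover the full player set:

```python
from itertools import combinations

def covers(structure, players, k):
    """True if some k sets of the adversary structure jointly cover all players."""
    return any(set().union(*c) == players for c in combinations(structure, k))

# Hypothetical example: 4 players; the adversary may corrupt {1,2}, {3}, or {4}.
players = {1, 2, 3, 4}
structure = [{1, 2}, {3}, {4}]

q2 = not covers(structure, players, 2)   # no two sets cover the player set
q3 = not covers(structure, players, 3)   # no three sets cover the player set
```

Here q2 holds but q3 fails, since the three sets together cover all four players; so this structure satisfies the weaker of the two conditions only.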
  • Diffie-Hellman and discrete logarithms. In [Mau94] and [MW98d] we solved a long-standing open complexity-theoretic problem in cryptography, namely proving that breaking the Diffie-Hellman key agreement protocol (the first proposed public-key scheme, on which public-key cryptography was founded) is as hard as computing discrete logarithms in the underlying group. It is easy to construct groups for which the equivalence holds unconditionally, while for an arbitrary group the equivalence holds under a very plausible number-theoretic conjecture.
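For reference, the Diffie-Hellman protocol itself can be sketched in a few lines. The parameters below are textbook-sized toys; a real deployment uses groups of cryptographic size (e.g. 2048-bit moduli or elliptic curves):

```python
import secrets

# Toy group Z_23^* with generator 5 (illustrative only, far too small).
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1      # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1      # Bob's secret exponent

A = pow(g, a, p)                      # Alice sends A = g^a to Bob
B = pow(g, b, p)                      # Bob sends B = g^b to Alice

k_alice = pow(B, a, p)                # Alice computes (g^b)^a
k_bob   = pow(A, b, p)                # Bob computes (g^a)^b
assert k_alice == k_bob               # both hold the shared key g^(ab)
```

The equivalence result says that an eavesdropper who computes g^(ab) from g^a and g^b is (in the stated settings) essentially forced to compute discrete logarithms.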
  • Efficient voting protocol. In [CGS97] the most efficient known voting protocol was proposed.
  • Group signature scheme. In [CS97] the first efficient group signature scheme for large groups was proposed. A group signature scheme allows members of a group (e.g. a company) to sign anonymously on behalf of the group. In case of dispute a special authority can determine the identity of the signer.
  • Revocable anonymity in payments. A new paradigm in anonymous payment schemes (like e-cash), together with a concrete protocol, was proposed in [CMS97]. If necessary, the anonymity can be revoked by a trusted party. This is based on earlier work [SPC95] where the concept of fair blind signatures was introduced.
  • Generation of prime numbers. In [Mau95a] the fastest known algorithm for generating prime numbers was proposed. The algorithm generates provable primes rather than only probable primes and is nevertheless faster than the known algorithms based on pseudo-primality tests. Several security conditions needed in public-key cryptography can be guaranteed without additional cost.
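The algorithm of [Mau95a] recursively builds larger primes from smaller ones and certifies every step with Pocklington's theorem. The following is a much-simplified sketch of that idea, not the algorithm of the paper (parameter choices and the fixed witness 2 are illustrative): if p - 1 = 2rq with q a proven prime, q² > p, and a^(p-1) ≡ 1 (mod p) with gcd(a^((p-1)/q) - 1, p) = 1 for some a, then p is provably prime.

```python
import math
import secrets

def is_prime_trial(n):
    """Deterministic trial-division primality test for small n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def provable_prime(bits):
    """Generate a provable prime of the given bit length (simplified sketch).

    Each candidate p = 2*r*q + 1 is certified by Pocklington's theorem
    using the already-proven prime q with q^2 > p.
    """
    if bits <= 24:                       # base case: prove by trial division
        while True:
            n = secrets.randbits(bits) | (1 << (bits - 1)) | 1
            if is_prime_trial(n):
                return n
    q = provable_prime(bits // 2 + 1)    # proven prime, large enough below
    k = bits - q.bit_length() - 1
    while True:
        r = secrets.randbits(k)
        p = 2 * r * q + 1
        if p.bit_length() != bits or q * q <= p:
            continue                     # wrong size or q too small: retry
        if pow(2, p - 1, p) != 1:
            continue                     # Fermat test fails: p is composite
        if math.gcd(pow(2, (p - 1) // q, p) - 1, p) == 1:
            return p                     # Pocklington's criterion: p is prime
        # witness 2 was inconclusive (rare); try another candidate
```

The paper's algorithm is considerably more refined, e.g. in how the relative size of q is sampled so that the output distribution is close to uniform over primes of the target size.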
  • Secret-key agreement by public discussion. It was demonstrated in [Mau93a] that secret-key agreement is possible by public discussion, under mild assumptions about the information available to the legitimate parties as well as to the adversary. In contrast to public-key cryptography achieving the same goal, the security of our schemes is NOT based on any computational assumptions and holds even if an adversary has unbounded computing power. In this context novel and quite fundamental information-theoretic quantities, the secret-key rate and intrinsic mutual information [MW99a], were introduced.
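Using standard notation (sketched here, not quoted from the papers), the intrinsic mutual information conditions on an arbitrarily processed version of the adversary's variable Z and upper-bounds the secret-key rate S(X;Y||Z):

```latex
% Intrinsic conditional mutual information: minimize over all channels
% P_{\bar Z | Z} by which the adversary may process its variable Z.
\[
  I(X;Y\!\downarrow\!Z) \;=\; \inf_{P_{\bar Z \mid Z}} I(X;Y \mid \bar Z),
  \qquad
  S(X;Y\,\|\,Z) \;\le\; I(X;Y\!\downarrow\!Z).
\]
```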
  • Universal statistical test. A new statistical randomness test was proposed in [Mau92a] which measures the entropy of a given bit source, universally for the large class of stationary ergodic sources. The test is now widely used in cryptographic and other applications.
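A much-simplified sketch of the test (parameter defaults chosen here for illustration; the paper recommends Q ≥ 10·2^L initialization blocks): the statistic averages the logarithm of the distance to the previous occurrence of each L-bit block, and for a perfect source it converges to a known constant (about 5.22 for L = 6), while defective sources score lower.

```python
import math

def maurer_test(bits, L=6, Q=640):
    """Simplified version of Maurer's universal statistical test.

    bits: sequence of 0/1 values; L: block length; Q: initialization blocks.
    Returns the statistic f, an entropy estimate per L-bit block.
    """
    n_blocks = len(bits) // L
    K = n_blocks - Q                    # number of evaluation blocks
    if K <= 0:
        raise ValueError("sequence too short for the chosen L and Q")
    last = {}
    for i in range(Q):                  # initialization: fill the table
        last[tuple(bits[i * L:(i + 1) * L])] = i
    total = 0.0
    for i in range(Q, n_blocks):        # evaluation: sum log2 of distances
        b = tuple(bits[i * L:(i + 1) * L])
        total += math.log2(i - last.get(b, -1))   # unseen block: distance i+1
        last[b] = i
    return total / K

# A constant source has zero entropy: every distance is 1, so f = 0.
assert maurer_test([0] * ((640 + 4000) * 6)) == 0.0
```

The full test additionally compares f against tabulated expectation and variance values to obtain a pass/fail decision at a chosen significance level.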
  • Non-interactive public-key cryptography, identity-based encryption (IBE). In 1984, Shamir proposed the concept of non-interactive public-key encryption where a user's identity can be used as his public-key and hence need not be requested. The first realization of this concept was proposed in [MY96]. Later this concept became known as identity-based encryption (IBE) and has developed into a prominent area of cryptography.
