All papers in 2005 (Page 5 of 469 results)
Fast Elliptic Curve Point Multiplication using Double-Base Chains
Among the various arithmetic operations required in implementing public-key
cryptographic algorithms, elliptic curve point multiplication has probably
received the most attention from the research community in the last decade.
Many methods for efficient and secure implementation of point
multiplication have been proposed.
The efficiency of these methods mainly depends on the representation
one uses for the scalar multiplier.
In the current work we propose an efficient algorithm based on the
so-called double-base number system.
We introduce the new concept of double-base chains which, if
manipulated with care, can significantly reduce the complexity of
scalar multiplication on elliptic curves.
In addition, we adopt some other measures to further reduce the
operation count.
Our algorithm compares favorably against classical and other
similar approaches.
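As a rough illustration of the underlying double-base idea (a simplified sketch, not the double-base chain algorithm proposed in the paper), a scalar can be greedily written as a sum of {2,3}-integers 2^a·3^b; the function names and the greedy strategy below are illustrative assumptions only.

```python
# Rough sketch: greedy decomposition of a scalar into {2,3}-integers 2^a * 3^b.
# This only illustrates the double-base number system; it is NOT the paper's
# double-base chain algorithm. Function names are hypothetical.

def largest_2_3_term(k):
    """Largest value of the form 2^a * 3^b that does not exceed k (k >= 1)."""
    best, a = 1, 0
    while 2 ** a <= k:
        b = 0
        while (2 ** a) * (3 ** b) <= k:
            best = max(best, (2 ** a) * (3 ** b))
            b += 1
        a += 1
    return best

def double_base_terms(k):
    """Greedily write k as a sum of {2,3}-integers (largest term first)."""
    terms = []
    while k > 0:
        t = largest_2_3_term(k)
        terms.append(t)
        k -= t
    return terms

if __name__ == "__main__":
    k = 314159
    terms = double_base_terms(k)
    assert sum(terms) == k
    print(terms)   # a short list of terms 2^a * 3^b summing to k
```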
N-adic Summation-Shrinking Generator. Basic properties and empirical evidences.
The need for software-flexible stream ciphers has led to several alternative proposals in the last few years. One of them is a new Pseudo-Random Number Generator (PRNG), named the N-adic Summation-Shrinking Generator (NSumSG), whose architecture is described in this paper. It uses N-1 slave summation generators working in parallel and one N-adic summation generator controlling the nonlinearity of the generator. The implementation, some properties and statistical tests of NSumSG are given.
The results of the statistical analysis show that the sequence generated by NSumSG is uniform, scalable, incompressible, has a large period, and is consistent and unpredictable. This gives reason to consider NSumSG suitable for particular cryptographic applications.
Colliding X.509 Certificates
We announce the construction of a pair of valid X.509 certificates with identical signatures.
Unconditionally Secure Constant Round Multi-Party Computation for Equality, Comparison, Bits and Exponentiation
In this paper we are interested in efficient and secure constant round
multi-party protocols which provide unconditional security against so-called honest-but-curious adversaries.
In particular, we design a novel constant round protocol that converts from shares over Z_q to shares over the integers working for all shared inputs from Z_q. Furthermore, we present a constant round protocol to securely evaluate a shared input on a public polynomial whose running time is linear in the degree of the polynomial. The proposed solution makes use of Chebyshev Polynomials. We show that the latter two protocols can be used to design efficient constant round protocols for the following natural problems:
(i) Equality: Computing shares of the bit indicating if a shared input value equals zero or not. This provides the missing building blocks for many constant round linear algebra protocols from the work of Cramer and Damgaard [CrDa01].
(ii) Comparison: Computing shares of a bit indicating which of two shared inputs is greater.
(iii) Bits: Computing shares of the binary representation of a shared input value.
(iv) Exponentiation: Computing shares of x^a mod q given shares of x, a and q.
Prior to this paper, for all the above mentioned problems, there were in general no efficient constant round protocols known providing unconditional security.
Efficient hardware for the Tate pairing calculation in characteristic three
In this paper the benefits of implementation of the Tate pairing
computation in dedicated hardware are discussed. The main
observation lies in the fact that arithmetic architectures in the
extension field are good candidates for
parallelization, leading to a similar calculation time in hardware
as for operations over the base field. Using this
approach an architecture for the hardware implementation of the
Tate pairing calculation based on a modified Duursma-Lee algorithm
is proposed.
On Efficient Key Agreement Protocols
A class of efficient key agreement protocols proposed by Boyd is examined. An attack is demonstrated on a round-optimal example protocol of this class, and a simple countermeasure is suggested. The whole class is known to be vulnerable to an attack proposed by Bauer, Berson and Feiertag. A new class of key agreement protocols without this vulnerability but having the same advantages in efficiency is identified, and a number of concrete protocols are suggested.
Last updated: 2006-08-24
On the Key Schedule of Blowfish
In this article the author shows that, for the block cipher Blowfish, the subkeys for the third and fourth rounds do not depend on the first 64 bits of the user key.
Last updated: 2005-04-05
Cryptanalysis of One Fair E-cash System
One fair e-cash system was proposed in [1].
In this paper, we show that the system is insecure.
In addition, we point out two drawbacks: the integer intervals used are
inappropriate, and one datum in the signature data is redundant.
Moreover, we give a detailed description of the technique used to
evade the challenge in the scheme, which we find of some interest.
Key Derivation and Randomness Extraction
Key derivation refers to the process by which an agreed upon large
random number, often named master secret, is used to derive keys to
encrypt and authenticate data. Practitioners and standardization
bodies have usually used the random oracle model to get key material
from a Diffie-Hellman key exchange. However, proofs in the standard model
require randomness extractors to formally extract the entropy of the
random master secret into a seed prior to deriving other keys.
This paper first deals with the protocol in which the key
derivation phase is (deliberately) omitted, and with security inaccuracies
in the analysis and design of the Internet Key Exchange
(IKE version 1) protocol, corrected in IKEv2.
They do not endanger the practical use of IKEv1, since the security
could be proved, at least, in the random oracle model.
However, in the standard model, there is not yet any formal global security
proof, but just separated analyses which do not fit together well.
The first simplification is common in the theoretical security analysis
of several key exchange protocols, whereas the key derivation phase is a
crucial step not only for theoretical reasons but also for practical purposes, and
requires careful analysis. The second problem is a gap between the
recent theoretical analysis of HMAC as a good randomness extractor
(functions keyed with public but random elements) and its practical
use in IKEv1 (the key may not be totally random, because of the lack
of clear authentication of the nonces).
Since the latter problem comes from the probabilistic property of this
extractor, we thereafter review some \textit{deterministic}
randomness extractors and suggest the \emph{'Twist-AUgmented'}
technique, a new extraction method quite well-suited for
Diffie-Hellman-like scenarios.
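For background on the extract-then-expand pattern referred to above, a minimal sketch using HMAC-SHA256 is given below; the salt, labels and lengths are illustrative assumptions, and this is neither the 'Twist-AUgmented' technique nor the IKE key derivation itself.

```python
# Minimal extract-then-expand sketch with HMAC-SHA256, in the spirit of the
# "extract the entropy into a seed, then derive keys" approach discussed above.
# Salt, labels and lengths are illustrative placeholders.
import hmac
import hashlib

def extract(salt: bytes, master_secret: bytes) -> bytes:
    """Condense a (possibly non-uniform) master secret into a short seed."""
    return hmac.new(salt, master_secret, hashlib.sha256).digest()

def expand(seed: bytes, label: bytes, length: int) -> bytes:
    """Stretch the seed into 'length' bytes of key material."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(seed, block + label + bytes([counter]), hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

# Example: derive an encryption key and a MAC key from a Diffie-Hellman secret.
shared_secret = b"\x01" * 128                      # placeholder for g^{xy}
seed = extract(b"public-random-salt", shared_secret)
enc_key = expand(seed, b"encryption", 16)
mac_key = expand(seed, b"authentication", 32)
```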
Compact E-Cash
This paper presents efficient off-line anonymous e-cash schemes
where a user can withdraw a wallet containing 2^l coins each of which she can spend unlinkably. Our first result is a scheme, secure under the strong RSA and the y-DDHI assumptions, where the complexity of the withdrawal and spend operations is O(l+k)
and the user's wallet can be stored using O(l+k) bits, where k is a security parameter.
The best previously known schemes require at least one of these complexities to
be O(2^l k).
In fact, compared to previous e-cash schemes, our whole wallet of 2^l coins
has about the same size as one coin in these schemes.
Our scheme also offers exculpability
of users, that is, the bank can prove to third parties that a user has
double-spent.
We then extend our scheme to our second result, the first e-cash scheme that provides traceable coins without a trusted third party.
That is, once a user has double spent one of the 2^l coins in her wallet, all her spendings of these coins can be traced.
We present two alternate constructions. One construction shares the same complexities with our first result but requires a strong bilinear map assumption that is only conjectured to hold on MNT curves. The second construction works on more general types of elliptic curves, but the price for this is that the complexity of the spending and of the withdrawal protocols becomes O(lk) and O(lk + k^2) bits, respectively, and wallets take O(lk) bits of storage.
All our schemes are secure in the random oracle model.
Secret sharing schemes on graphs
Given a graph G, a perfect secret sharing scheme based on G is a method to distribute a secret among the vertices of G (the participants) so that a subset of participants can recover the secret if it contains an edge of G, and otherwise obtains no information regarding the secret. The average information rate is the ratio of the size of the secret to the average size of the share a participant must remember. The information rate of G is the supremum of the information rates realizable by perfect secret sharing schemes.
We construct a graph on n vertices with small average information rate. We obtain this result by determining, up to a constant factor, the average information rate of the d-dimensional cube.
Generic Constructions of Identity-Based and Certificateless KEMs
We extend the concept of key encapsulation mechanisms to the
primitives of ID-based and certificateless encryption.
We show that the natural combination of ID-KEMs or CL-KEMs
with data encapsulation mechanisms results in encryption schemes
which are secure in a strong sense.
In addition, we give generic constructions of ID-KEMs and
CL-KEMs, as well as specific instantiations, which are provably
secure.
Tight Reductions among Strong Diffie-Hellman Assumptions
We derive some tight equivalence reductions between several Strong Diffie-Hellman (SDH) assumptions.
Deniable Authentication with RSA and Multicasting
A deniable authentication scheme using RSA is described and proven secure in the random oracle model. A countermeasure to a well-known attack on efficient deniable authentication to multiple recipients is described and proven secure.
Untraceability of Two Group Signature Schemes
A group signature scheme allows a
group member of a given group to sign messages on behalf of
the group in an anonymous fashion. In case of
a dispute, however, a designated group manager can reveal
the signer of a valid group signature. In this paper, we
show the untraceability of the two group signature schemes in [1, 5] by new and very simple attacks.
Although flaws such as forgeability,
untraceability and linkability have already been shown in [2, 7, 8, 9], we point out that our attacks are simpler.
Last updated: 2005-03-19
Comment on cryptanalysis of Tseng et al.'s authenticated encryption schemes
Recently, Xie and Yu proposed a forgery attack on the Tseng et al. authenticated encryption schemes and showed that these schemes are not secure in two cases: the specified verifier substitutes his secret key, or the signer generates the signature with these schemes for two or more specified verifiers. In addition, Xie and Yu made a small modification to the Tseng et al. schemes and claimed that the modified schemes satisfy the security requirements. However, we show that the modified schemes are still insecure.
An Approach Towards Rebalanced RSA-CRT with Short Public Exponent
Based on the Chinese Remainder Theorem (CRT), Quisquater and Couvreur proposed an RSA variant, RSA-CRT, to speed up RSA decryption. Building on RSA-CRT, Wiener suggested another RSA variant, Rebalanced RSA-CRT, to further speed up RSA-CRT decryption by shifting decryption cost to encryption cost. However, such an approach makes RSA encryption very time-consuming, because the public exponent e in Rebalanced RSA-CRT is of the same order of magnitude as φ(N). In this paper we study the following problem: does there exist any secure variant of Rebalanced RSA-CRT whose public exponent e is much shorter than φ(N)? We solve this problem by designing a variant of Rebalanced RSA-CRT with d_{p} and d_{q} of 198 bits. This variant has the public exponent e=2^511+1, so that its encryption is about 3 times faster than that of the original Rebalanced RSA-CRT.
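For context, the CRT decryption speed-up that RSA-CRT and Rebalanced RSA-CRT build on can be sketched as follows, with toy parameters; in Rebalanced RSA-CRT the exponents d_p and d_q would be chosen short (e.g. 198 bits) rather than derived from a full-size d as done here.

```python
# Sketch of RSA decryption with the Chinese Remainder Theorem (CRT), the
# speed-up that RSA-CRT and Rebalanced RSA-CRT build on. Toy parameters only.
p, q, e = 61, 53, 17
N = p * q
d = pow(e, -1, (p - 1) * (q - 1))     # full private exponent

d_p, d_q = d % (p - 1), d % (q - 1)   # CRT exponents
q_inv = pow(q, -1, p)                 # precomputed q^{-1} mod p

def decrypt_crt(c: int) -> int:
    m_p = pow(c, d_p, p)              # half-size exponentiation mod p
    m_q = pow(c, d_q, q)              # half-size exponentiation mod q
    h = (q_inv * (m_p - m_q)) % p     # Garner's recombination step
    return m_q + h * q

m = 42
c = pow(m, e, N)
assert decrypt_crt(c) == m == pow(c, d, N)
```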
Picking Virtual Pockets using Relay Attacks on Contactless Smartcard Systems
A contactless smartcard is a smartcard that can communicate with other
devices without any physical connection, using Radio-Frequency
Identifier (RFID) technology. Contactless smartcards are becoming
increasingly popular, with applications such as credit cards,
national ID, passports, and physical access control. The security of such
applications is clearly critical. A key feature of RFID-based systems
is their very short range: typical systems are designed to operate at
a range of ~10cm. In this study we show that contactless
smartcard technology is vulnerable to relay attacks: An attacker can
trick the reader into communicating with a victim smartcard that is
very far away. A ``low-tech'' attacker can build a pick-pocket system
that can remotely use a victim contactless smartcard, without the
victim's knowledge. The attack system consists of two devices, which
we call the ``ghost'' and the ``leech''. We discuss basic designs for
the attacker's equipment, and explore their possible operating
ranges. We show that the ghost can be up to 50m away from the card
reader---3 orders of magnitude higher than the nominal range. We also
show that the leech can be up to 50cm away from the victim
card. The main characteristics of the attack are: orthogonality to any
security protocol, unlimited distance between the attacker and the
victim, and low cost of the attack system.
A Note on Shor's Quantum Algorithm for Prime Factorization
It is well known that Shor [1] proposed a polynomial-time algorithm
for prime factorization using quantum computers. For a given number,
he gave an algorithm for finding the order of an element modulo that
number, rather than an algorithm for factoring it directly. The indirect
algorithm is feasible because factorization can be reduced to
finding the order of an element by using randomization [2]. It should
be stressed, however, that the order of the chosen element must be
even. Actually, this restriction can be removed in a particular
case. In this paper, we show that factoring an RSA modulus (a product
of two primes) only requires finding the order of an element, whether it is
even or not.
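For background, the classical reduction from factoring to order finding that the note builds on can be sketched as follows; brute-force order finding stands in for the quantum step, and the parameters are illustrative only.

```python
# Classical reduction from factoring N to finding the order of an element
# modulo N. Here the order is found by brute force; Shor's algorithm would
# supply it in quantum polynomial time. Toy parameters only.
from math import gcd
from random import randrange

def order(x: int, N: int) -> int:
    r, y = 1, x % N
    while y != 1:
        y = (y * x) % N
        r += 1
    return r

def factor_via_order(N: int) -> int:
    while True:
        x = randrange(2, N)
        g = gcd(x, N)
        if g > 1:
            return g                    # lucky: x already shares a factor with N
        r = order(x, N)
        if r % 2 == 0:                  # the "even order" case discussed above
            f = gcd(pow(x, r // 2, N) - 1, N)
            if 1 < f < N:
                return f

print(factor_via_order(3 * 7 * 11))     # prints a non-trivial factor of 231
```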
David Chaum's Voter Verification using Encrypted Paper Receipts
In this document, we provide an exposition of David Chaum's voter
verification method that uses encrypted paper receipts. This
document provides simply an exposition of the protocol, and does
not address any of the proofs covered in Chaum's papers.
Adversarial Model for Radio Frequency Identification
Radio Frequency Identification (RFID) systems aim to identify objects in open environments with neither physical nor visual contact. They consist of transponders inserted into objects, of readers, and usually of a database which contains information about the objects. The key point is that authorised readers must be able to identify tags without an adversary being able to trace them. Traceability is often underestimated by advocates of the technology and sometimes exaggerated by its detractors. Whatever the true picture, this problem is a reality when it blocks the deployment of this technology and some companies, faced with being boycotted, have already abandoned its use. Using cryptographic primitives to thwart the traceability issues is an approach which has been explored for several years. However, the research carried out up to now has not provided satisfactory results as no universal formalism has been defined.
In this paper, we propose an adversarial model suitable for RFID environments. We define the notions of existential and universal untraceability and we model the access to the communication channels from a set of oracles. We show that our formalisation fits the problem being considered and allows a formal analysis of the protocols in terms of traceability. We use our model on several well-known RFID protocols and we show that most of them have weaknesses and are vulnerable to traceability.
Cryptanalysis of two identification schemes based on an ID-based cryptosystem
Two identification schemes based on the Maurer-Yacobi ID-based
cryptosystem are analysed and shown to suffer from serious
security problems.
Cryptanalysis of an anonymous wireless authentication and conference key distribution scheme
In this paper we analyse an anonymous wireless authentication and
conference key distribution scheme which is also designed to
provide mobile participants with user identification privacy
during the conference call. The proposed scheme consists of three
sub-protocols: the Call Set-Up Authentication Protocol, the
Hand-Off Authentication Protocol, and the Anonymous Conference
Call Protocol. We show that the proposed scheme suffers from a
number of security vulnerabilities.
New Approaches for Deniable Authentication
Deniable Authentication protocols allow a Sender to authenticate a message for a Receiver, in a way that the Receiver cannot convince a third party that such authentication (or any authentication) ever took place.
We present two new approaches to the problem of deniable authentication. The novelty of our schemes is that they do not require the use of CCA-secure encryption (all previous known solutions did), thus showing a different generic approach to the problem of deniable authentication. These new approaches are practically relevant as they lead to more efficient protocols.
In the process we point out a subtle definitional issue for deniability. In particular we propose the notion of "forward deniability", which requires that the authentications remain deniable even if the Sender wants to later prove that she authenticated a message. We show that a simulation-based definition of deniability, where the simulation can be computationally indistinguishable from the real protocol does not imply forward deniability. Thus for deniability one needs to restrict the simulation to be perfect (or statistically close). Our new protocols satisfy this stricter requirement.
Choosing Parameter Sets for NTRUEncrypt with NAEP and SVES-3
We present, for the first time, an algorithm to choose parameter sets for NTRUEncrypt that give a desired level of security.
Note: This is an expanded version of a paper presented at CT-RSA 2005.
On the affine classification of cubic bent functions
We consider cubic boolean bent functions, each cubic monomial of which contains the same variable. We investigate canonical forms of these functions under affine transformations of variables.
In particular, we refine the affine classification of cubic bent functions of 8 variables.
An Efficient Solution to The Millionaires' Problem Based on Homomorphic Encryption
We propose a two-round protocol for solving the
Millionaires' Problem in the setting of semi-honest
parties.
Our protocol uses either multiplicative or additive
homomorphic encryptions.
Previously proposed protocols used additive or XOR
homomorphic encryption schemes only.
The computation and communication costs of our protocol
are in the same asymptotic order as those of
the other efficient protocols.
Nevertheless, since a multiplicative homomorphic encryption
scheme is more efficient in practice than an additive one,
our construction saves computation time and communication
bandwidth in practice.
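To illustrate the distinction drawn above, the multiplicative homomorphism of textbook (unpadded) RSA encryption is sketched below with toy parameters; this shows only the underlying property, not the authors' protocol.

```python
# Multiplicative homomorphism of textbook (unpadded) RSA: the product of two
# ciphertexts decrypts to the product of the plaintexts. Toy parameters only.
p, q, e = 61, 53, 17
N = p * q
d = pow(e, -1, (p - 1) * (q - 1))

enc = lambda m: pow(m, e, N)
dec = lambda c: pow(c, d, N)

m1, m2 = 6, 7
assert dec(enc(m1) * enc(m2) % N) == (m1 * m2) % N   # E(m1)*E(m2) decrypts to m1*m2
```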
Polyhedrons over Finite Abelian Groups and Their Cryptographic Applications
We use group-theoretic methods to justify the algebraic method in
cryptanalysis. The obtained results are used to investigate the
cryptographic properties of Boolean functions.
On the Security of a Group Signature Scheme with Strong Separability
A group signature scheme allows a
group member of a given group to sign messages on behalf of
the group in an anonymous and unlinkable fashion. In case of
a dispute, however, a designated group manager can reveal
the signer of a valid group signature. Many applications of
group signatures require that the group manager can be split
into a membership manager and a revocation manager. Such a
group signature scheme with strong separability was proposed
in paper [1]. Unfortunately, the scheme is insecure, as has been shown in [2][3][4]. In this paper
we show that the scheme is untraceable by a simple and direct attack. In addition, we show its universal forgeability by a
general attack which only requires choosing five random numbers.
We explain in detail the technique used to evade the challenge in
the scheme.
Unfairness of a protocol for certified delivery
Recently, Nenadić \emph{et al.} (2004) proposed the RSA-CEGD
protocol for certified delivery of e-goods. This is a relatively
complex scheme based on verifiable and recoverable encrypted
signatures (VRES) to guarantee properties such as strong fairness
and non-repudiation, among others. In this paper, we demonstrate
that this protocol cannot achieve fairness by presenting a severe attack,
and we also point out some other weaknesses.
Distinguishing Stream Ciphers with Convolutional Filters
This paper presents a new type of distinguisher for the shrinking generator and the alternating-step generator with known feedback polynomial and for the multiplexor generator. For the former the distinguisher is more efficient than existing ones and for the latter it results in a complete breakdown of security. The distinguisher is conceptually very simple and lends itself to theoretical analysis leading to reliable predictions of its probability of success.
Last updated: 2005-03-13
Cryptanalysis of improvement of digital signature with message recovery using self-certified public keys and its variants
In 2003, Tseng et al. proposed a self-certified public-key signature scheme with message recovery, which gives two advantages: one is that the signer's public key can simultaneously be authenticated while verifying the signature, and the other is that only the specified verifier can recover the message. Later, Xie and Yu proposed an attack on the Tseng et al. scheme for the cases where the specified verifier substitutes his secret key, or where two or more specified verifiers cooperatively forge the signer's signature. About the same time, Shao also proposed another insider forgery attack to break the Tseng et al. scheme. In addition, he claimed that the Tseng et al. scheme lacks the properties of non-repudiation and forward security, and he therefore proposed an improved scheme to overcome these weaknesses. In this paper, we show that Shao's improved scheme is still insecure against the insider forgery attack: a specified verifier can forge many different valid signatures on the same message for the other verifiers, who cooperatively provide their secret keys. Furthermore, we give a small modification to overcome this weakness.
Improving Secure Server Performance by Re-balancing SSL/TLS Handshakes
Much of today's distributed computing takes place in a client/server model.
Despite advances in fault tolerance -- in particular, replication and load
distribution -- server overload remains
a major problem. In the Web context, one of the main overload factors is the
direct consequence of expensive Public Key operations performed by servers
as part of each SSL handshake. Since most SSL-enabled servers use RSA,
the burden of performing many costly decryption operations can be
very detrimental to server performance. This paper examines a
promising technique for re-balancing RSA-based client/server
handshakes. This technique facilitates more favorable load distribution
by requiring clients to perform more work (as part of encryption) and
servers to perform commensurately less work, thus resulting in better
SSL throughput. Proposed techniques are based on careful adaptation of
variants of Server-Aided RSA originally constructed by
Matsumoto, et al. Experimental results demonstrate that
suggested methods (termed Client-Aided RSA) can speed up processing
by a factor of between 11 and 19, depending on the RSA key size. This represents
a considerable improvement. Furthermore, proposed techniques can be a useful
companion tool for SSL Client Puzzles in defense against DoS and DDoS attacks.
Concurrent Composition of Secure Protocols in the Timing Model
In the setting of secure multiparty computation, a set of mutually
distrustful parties wish to securely compute some joint function
of their inputs. In the stand-alone case, it has been shown that
{\em every} efficient function can be securely computed.
However, in the setting of concurrent composition, broad
impossibility results have been proven for the case of no honest
majority and no trusted setup phase. These results hold both for
the case of general composition (where a secure protocol is run
many times concurrently with arbitrary other protocols) and self
composition (where a single secure protocol is run many times
concurrently).
In this paper, we investigate the feasibility of obtaining
security in the concurrent setting, assuming that each party has a
local clock and that these clocks proceed at approximately the
same rate. We show that under this mild timing assumption, it is
possible to securely compute {\em any} multiparty functionality
under concurrent \emph{self} composition. We also show that it
is possible to securely compute {\em any} multiparty
functionality under concurrent {\em general} composition, as
long as the secure protocol is run only with protocols whose
messages are delayed by a specified amount of time. On the
negative side, we show that it is impossible to achieve security
under concurrent general composition with no restrictions
whatsoever on the network (like the aforementioned delays), even
in the timing model.
An Efficient CDH-based Signature Scheme With a Tight Security Reduction
At Eurocrypt'03, Goh and Jarecki showed that, contrary to other
signature schemes in the discrete-log setting, the EDL signature
scheme has a tight security reduction, namely to the
Computational Diffie-Hellman (CDH) problem, in the Random Oracle
(RO) model. They also remarked that EDL can be turned into an
off-line/on-line signature scheme using the technique of Shamir
and Tauman, based on chameleon hash functions.
In this paper, we propose a new signature scheme that also has a
tight security reduction to CDH but whose resulting signatures
are smaller than EDL signatures. Further, similarly to the
Schnorr signature scheme (but contrary to EDL), our signature is
naturally efficient on-line: no additional trick is needed for
the off-line phase and the verification process is unchanged.
For example, in elliptic curve groups, our scheme results in a
25% improvement on the state-of-the-art discrete-log based
schemes, with the same security level. This is, to date,
the most efficient signature scheme with a tight
security reduction in the discrete-log setting.
Flexible Framework for Secret Handshakes (Multi-Party Anonymous and Un-observable Authentication)
In a society increasingly concerned with the erosion of privacy,
privacy-preserving techniques are becoming very important.
This motivates research in cryptographic techniques offering
built-in privacy.
A secret handshake is a protocol whereby participants establish
a secure, anonymous and unobservable communication channel only
if they are members of the same group. This type of ``private"
authentication is a valuable tool in the arsenal of privacy-preserving
cryptographic techniques. Prior research focused on 2-party secret
handshakes with one-time credentials.
This paper breaks new ground on two accounts: (1) it shows how
to obtain secure and efficient secret handshakes with reusable
credentials, and (2) it represents the first treatment of group
(or {\em multi-party}) secret handshakes, thus providing a
natural extension to the secret handshake technology. An
interesting new issue encountered in multi-party secret handshakes
is the need to ensure that all parties are indeed distinct.
(This is a real challenge since the parties cannot expose their
identities.) We tackle this and other challenging issues in
constructing GCD -- a flexible framework for secret handshakes.
The proposed framework lends itself to many practical
instantiations and offers several novel and appealing features
such as self-distinction and strong anonymity with reusable
credentials. In addition to describing the motivation and
step-by-step construction of the framework, this paper provides
a thorough security analysis and illustrates two concrete
framework instantiations.
An Attack on CFB Mode Encryption As Used By OpenPGP
This paper describes an adaptive chosen-ciphertext attack on the Cipher Feedback (CFB) mode of encryption as used in OpenPGP. In most circumstances it allows an attacker to determine 16 bits of any block of plaintext with a feasible number of oracle queries for the initial
setup work and for each block. Standard CFB mode encryption does not appear to be affected by this attack; it applies to a particular variation of CFB used by OpenPGP. In particular, it exploits an ad hoc integrity-check feature in OpenPGP which was meant as a "quick check" to determine the correctness of the decrypting symmetric key.
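For reference, standard full-block CFB encryption (the unaffected baseline mentioned above) can be sketched as follows; a keyed hash stands in for the block cipher's forward function, and the OpenPGP-specific resynchronisation and quick-check behaviour that the attack exploits is not modelled.

```python
# Sketch of standard full-block CFB encryption. A keyed hash stands in for the
# block cipher's forward function (which is all CFB needs).
import hashlib

BLOCK = 16

def E(key: bytes, block: bytes) -> bytes:
    return hashlib.sha256(key + block).digest()[:BLOCK]   # stand-in, not a real cipher

def cfb_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    assert len(plaintext) % BLOCK == 0      # whole blocks only, for simplicity
    out, feedback = b"", iv
    for i in range(0, len(plaintext), BLOCK):
        keystream = E(key, feedback)
        block = bytes(a ^ b for a, b in zip(plaintext[i:i + BLOCK], keystream))
        out += block
        feedback = block                    # the ciphertext block feeds back
    return out

def cfb_decrypt(key: bytes, iv: bytes, ciphertext: bytes) -> bytes:
    out, feedback = b"", iv
    for i in range(0, len(ciphertext), BLOCK):
        keystream = E(key, feedback)
        block = ciphertext[i:i + BLOCK]
        out += bytes(a ^ b for a, b in zip(block, keystream))
        feedback = block
    return out

key, iv = b"k" * 16, b"\x00" * BLOCK
msg = b"sixteen byte msg" * 2
assert cfb_decrypt(key, iv, cfb_encrypt(key, iv, msg)) == msg
```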
On the Notion of Statistical Security in Simulatability Definitions
We investigate the definition of statistical security (i.e.,
security against unbounded adversaries) in the framework of reactive
simulatability. This framework allows one to formulate and analyze
multi-party protocols modularly by providing a composition theorem
for protocols. However, we show that the notion of statistical
security, as defined by Backes, Pfitzmann and Waidner for the
reactive simulatability framework, does not allow for secure
composition of protocols. This in particular invalidates the proof
of the composition theorem.
We give evidence that the reason for the non-composability of
statistical security is no artifact of the framework itself, but of
the particular formulation of statistical security. Therefore, we
give a modified notion of statistical security in the reactive
simulatability framework. We prove that this notion allows for
secure composition of protocols.
Since, to the best of our knowledge, no formal definition of statistical
security has been fixed for Canetti's universal composability
framework, we believe that our observations and results can also
help to avoid potential pitfalls there.
The Vector Decomposition Problem for Elliptic and Hyperelliptic Curves
The group of m-torsion points on an elliptic curve, for a prime
number m, forms a two-dimensional vector space. It was suggested
and proven by Yoshida that under certain conditions the vector
decomposition problem (VDP) on a two-dimensional vector space is
at least as hard as the computational Diffie-Hellman problem
(CDHP) on a one-dimensional subspace. In this work we show that
even though this assessment is true, it applies to the VDP for
m-torsion points on an elliptic curve only if the curve is
supersingular. But in that case the CDHP on the one-dimensional
subspace has a known sub-exponential solution. Furthermore, we
present a family of hyperelliptic curves of genus two that are
suitable for the VDP.
Weak keys of the Diffie Hellman key exchange II : Pairing based schemes on elliptic curves.
This paper develops a cryptanalysis of the pairing based Diffie Hellman (DH) key exchange schemes which have found important applications as in the tripartite exchange scheme proposed in \cite{joux}. The analysis of \emph{weak keys} of the standard DH scheme proposed in \cite{kas1} is applied to show existence of weak sessions for tripartite schemes over super-singular curves. It is shown that for such sessions the associated Bilinear Diffie Hellman Problem (BDHP) is solvable in polynomial time, without computing the private keys i.e. without solving the discrete logarithms. Similar
applications of the analysis to the Decisional Diffie-Hellman Problem (DDHP) and the Identity-Based DH scheme (IBS) are also developed. The tripartite key exchange scheme is analyzed in detail and it is shown that the number of weak keys increases in this scheme compared to the standard two-party DH scheme. It is shown that the random choice of private keys by the users, independent of each other's knowledge, is insecure in these schemes. Algorithms are
suggested for checking weakness of private keys based on an order of selection. A modified tripartite key exchange scheme is presented in which detection of weak keys is incorporated.
A model and architecture for pseudo-random generation with applications to /dev/random
We present a formal model and a simple architecture for robust pseudorandom generation that ensures resilience in the face of an
observer with partial knowledge/control of the generator's entropy source. Our model and architecture have the following properties:
1. Resilience: The generator's output looks random to an observer with no knowledge of the internal state. This holds even if that observer has complete control over data that is used to refresh the internal state.
2. Forward security: Past output of the generator looks random to an observer, even if the observer learns the internal state at a later time.
3. Backward security/Break-in recovery: Future output of the generator looks random, even to an observer with knowledge of the current state, provided that the generator is refreshed with data of sufficient entropy.
Architectures such as above were suggested before. This work differs
from previous attempts in that we present a formal model for robust
pseudo-random generation, and provide a formal proof within this model
for the security of our architecture. To our knowledge, this is the
first attempt at a rigorous model for this problem.
Our formal modeling advocates the separation of the *entropy extraction* phase from the *output generation* phase. We argue that the former is information-theoretic in nature, and could therefore rely on combinatorial and statistical tools rather than on cryptography. On the other hand, we show that the latter can be implemented using any standard (non-robust) cryptographic PRG.
We also discuss the applicability of our architecture for applications such as /dev/(u)random in Linux and pseudorandom generation on smartcards.
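A toy sketch of the advocated separation between entropy extraction and output generation, with forward security via state erasure, might look as follows; a hash function stands in for both the extractor and the PRG, and this is an illustration rather than the paper's construction.

```python
# Toy sketch of the two-phase architecture: refresh() extracts fresh entropy
# into the state, next() generates output and then irreversibly advances the
# state (forward security). A hash stands in for the extractor and the PRG.
import hashlib

class ToyGenerator:
    def __init__(self):
        self.state = bytes(32)

    def refresh(self, entropy: bytes) -> None:
        # Entropy extraction phase: fold new input into the internal state.
        self.state = hashlib.sha256(b"refresh" + self.state + entropy).digest()

    def next(self, n: int = 32) -> bytes:
        # Output generation phase: derive output, then overwrite the state so
        # that past outputs stay safe even if the state leaks later.
        out = hashlib.sha256(b"output" + self.state).digest()[:n]
        self.state = hashlib.sha256(b"advance" + self.state).digest()
        return out

g = ToyGenerator()
g.refresh(b"interrupt timings, disk seek times, ...")   # entropy source samples
print(g.next(16).hex())
```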
Improved Proxy Re-Encryption Schemes with Applications to Secure Distributed Storage
In 1998, Blaze, Bleumer, and Strauss (BBS) proposed an application called
atomic proxy re-encryption, in which a semi-trusted proxy
converts a ciphertext for Alice into a ciphertext for Bob without
seeing the underlying plaintext. We predict that fast and
secure re-encryption will become increasingly popular as a method for
managing encrypted file systems. Although it is efficiently computable, the
widespread adoption of BBS re-encryption has been hindered by
considerable security risks. Following recent work of Ivan and Dodis,
we present new re-encryption schemes that realize a stronger notion of
security and we demonstrate the usefulness of proxy re-encryption as a
method of adding access control to the SFS read-only file system.
Performance measurements of our experimental file system demonstrate
that proxy re-encryption can work effectively in practice.
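For background, the original BBS-style ElGamal proxy re-encryption idea (the starting point of this work, not the improved schemes it proposes) can be sketched with toy parameters as follows.

```python
# Toy sketch of ElGamal-style proxy re-encryption in the spirit of the BBS
# scheme: the proxy holds rk = b/a and turns a ciphertext for Alice into one
# for Bob without seeing the plaintext. Toy group parameters only.
import random

p, q, g = 23, 11, 4                 # g generates the order-q subgroup of Z_p*

a = random.randrange(1, q)          # Alice's secret key
b = random.randrange(1, q)          # Bob's secret key
pk_a, pk_b = pow(g, a, p), pow(g, b, p)

def encrypt(pk, m):
    k = random.randrange(1, q)
    return (m * pow(g, k, p) % p, pow(pk, k, p))        # (m*g^k, pk^k)

def decrypt(sk, ct):
    c1, c2 = ct
    g_k = pow(c2, pow(sk, -1, q), p)                    # recover g^k
    return c1 * pow(g_k, p - 2, p) % p                  # divide it out

rk = b * pow(a, -1, q) % q          # re-encryption key b/a, given to the proxy

def re_encrypt(ct):
    c1, c2 = ct
    return (c1, pow(c2, rk, p))     # g^{ak}  ->  g^{bk}

m = 9
assert decrypt(b, re_encrypt(encrypt(pk_a, m))) == m    # Bob can read the re-encrypted ciphertext
```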
Tag-KEM/DEM: A New Framework for Hybrid Encryption
This paper presents a novel framework for the generic construction of hybrid encryption schemes which produces more efficient schemes than the ones known before. A previous framework introduced by Shoup combines a key encapsulation mechanism (KEM) and a data encryption mechanism (DEM). While it is sufficient to require both components to be secure against chosen ciphertext attacks (CCA-secure), Kurosawa and Desmedt showed a particular example of KEM that is not CCA-secure but can be securely combined with a specific type of CCA-secure DEM to
obtain a more efficient, CCA-secure hybrid encryption scheme. There are also many other efficient hybrid encryption schemes in the literature that do not fit Shoup's framework. These facts serve as motivation to seek another framework.
The framework we propose yields more efficient hybrid schemes and, in addition, provides an insightful explanation of existing schemes that do not fit into the previous framework. Moreover, it allows immediate
conversion from a class of threshold public-key encryption to a hybrid one without considerable overhead, which may not be possible in the previous approach.
Techniques for random masking in hardware
A new technique for Boolean random masking of the logic AND operation in terms of NAND logic gates
is presented and its potential for masking arbitrary cryptographic functions is pointed out.
The new technique is much more efficient than a previously known technique, recently applied to AES.
It is also applied for masking the integer addition.
In addition, new techniques for the conversions from Boolean to arithmetic random masking and vice versa
are developed. They are hardware oriented and do not require additional random bits.
Unlike the previous, software-oriented techniques, which show a substantial difference in the complexity
of the two conversions, the proposed conversions have comparable complexity,
each being about the same as that of a single integer addition.
All the techniques proposed are in theory secure against the first-order differential
power analysis on the logic gate level.
They can be applied in hardware implementations of various cryptographic functions,
including AES, (keyed) SHA-1, IDEA, and RC6.
Analysis of Affinely Equivalent Boolean Functions
Using the Walsh transform, the autocorrelation function, decomposition,
derivation and modification of the truth table, some new invariants are obtained.
Based on invariant theory, we obtain two results: first, a general
algorithm which can be used to decide whether two Boolean functions are
affinely equivalent and to obtain the affine equivalence
relation if they are equivalent; for example, all 8-variable
homogeneous bent functions of degree 3 are classified into 2
classes. Second, a classification of the Reed-Muller code
which can be used for an almost complete
enumeration of 8-variable bent functions.
Weak keys of the Diffie Hellman key exchange I
This paper investigates the Diffie-Hellman key exchange scheme
over the group of nonzero elements of a finite field and
shows that there exist exponents satisfying certain
conditions, called the \emph{modulus conditions}, for which the
Diffie-Hellman Problem (DHP) can be solved in a polynomial number
of operations without solving the discrete logarithm problem (DLP). These special private keys of the scheme are termed
\emph{weak} and depend also on the generator of the cyclic group. More generally, triples consisting of a generator
and private keys, one of which is weak, are called \emph{weak triples}. A sample
of weak keys is computed, and it is observed that their number may not be
so insignificant as to be ignored in general. Next, an extension of the
analysis and weak triples is carried out for the Diffie-Hellman
scheme over the matrix group, and it is shown that for an
analogous class of session triples, the DHP can be solved without
solving the DLP in a polynomial number of operations in the matrix
size. A revised Diffie-Hellman assumption is stated, taking into account the above exceptions.
A Construction of Public-Key Cryptosystem Using Algebraic Coding on the Basis of Superimposition and Randomness
In this paper, we present a new class of public-key cryptosystem (PKC) using algebraic coding on the basis of superimposition and randomness. The proposed PKC is featured by a generator matrix, in a characteristic form, where the generator matrix of an algebraic code is repeatedly used along with the generator matrix of a random code, as sub-matrices. We show that this characteristic generator matrix yields the following advantages compared with the conventional schemes:
\begin{description}
\item [(i)] It realizes an abundant supply of PKCs, yielding more secure PKCs.
\item [(ii)] It realizes a fast encryption and decryption process.
\end{description}
An Improved and Efficient Countermeasure against Power Analysis Attacks
Recently new types of differential power analysis attacks (DPA)
against elliptic curve cryptosystems (ECC) and RSA systems have been
introduced. Most existing countermeasures against classical DPA
attacks are vulnerable to these new DPA attacks which include
refined power analysis attacks (RPA), zero-value point attacks
(ZPA), and doubling attacks. The new attacks are different from
classical DPA in that RPA uses a special point with a zero-value
coordinate, while ZPA uses auxiliary registers to locate a zero
value. To address these attacks, Mamiya et al. proposed a new countermeasure against RPA,
ZPA, classical DPA and SPA attacks using a basic random initial
point. Their countermeasure works well when applied to ECC, but it has
some disadvantages when applied to general exponentiation algorithms
(such as RSA and ElGamal) due to an inverse computation. This paper
presents an efficient and improved countermeasure against the above
new DPA attacks, using a random message-blinding concept
different from Mamiya's countermeasure, and shows that the proposed
countermeasure is secure against Yen's SPA-based power analysis,
which can break Coron's simple SPA countermeasure as well as
Mamiya's. The computational cost of the proposed scheme is very
low when compared to the previous methods which rely on Coron's
simple SPA countermeasure. Moreover, this scheme is a generalized
countermeasure which can be applied to ECC as well as RSA systems.
Partial Hiding in Public-Key Cryptography
This paper explores the idea of partially exposing sections of the private key in public-key cryptosystems whose security is based on the intractability of factorising large integers.
It is proposed to allow significant portions of the private key to be publicly available, reducing the amount of data which must be securely hidden.
The ``secret'' data could be XORed with an individual's biometric reading in order to maintain a high level of security, and we suggest using iris templates for this purpose.
Finally, we propose an implementation of this system for RSA, and consider the potential risks and advantages associated with such a scheme.
(De)Compositions of Cryptographic Schemes and their Applications to Protocols
The main result of this paper is that the Dolev-Yao model is a safe abstraction of the computational model for security protocols including those that combine asymmetric and symmetric encryption, signature and hashing. Moreover, message forwarding and private key transmission are allowed. To our knowledge this is the first result that deals with hash functions and the combination of these cryptographic primitives.
A key step towards this result is a general definition of correctness of cryptographic primitives, which unifies well-known correctness criteria such as IND-CPA, IND-CCA, unforgeability, etc., and a theorem that allows the correctness of a composition of two cryptographic schemes to be reduced to the correctness of each one.
The Full Abstraction of the UC Framework
We prove that security in the Universal Composability framework (UC) is equivalent to security in the probabilistic polynomial time calculus ppc. Security is defined under active and adaptive adversaries with synchronous and authenticated communication. In detail, we define an encoding from machines in UC to processes in ppc and show it is fully abstract with respect to UC-security and ppc-security, i.e., we show a protocol is UC-secure iff its encoding is ppc-secure. However, we restrict security in ppc to be quantified not over all possible contexts, but over those induced by UC-environments under encoding. This does not oversimplify security in ppc, since the threat and communication models we assume are meaningful in both practice and theory.
Collusion Resistant Broadcast Encryption With Short Ciphertexts and Private Keys
We describe two new public key broadcast encryption systems for
stateless receivers. Both systems are fully secure against any number
of colluders. In our first construction both ciphertexts and private
keys are of constant size (only two group elements), for any
subset of receivers. The public key size in this system is
linear in the total number of receivers. Our second system is a
generalization of the first that provides a tradeoff between
ciphertext size and public key size. For example, we achieve a
collusion resistant broadcast system for n users where both
ciphertexts and public keys are of size O(sqrt(n)) for any subset
of receivers. We discuss several applications of these systems.
Side Channel Attacks on Implementations of Curve-Based Cryptographic Primitives
The present survey deals with the recent research in side channel
analysis and related attacks on implementations of cryptographic
primitives. The focus is on software countermeasures for primitives
built around algebraic groups. Many countermeasures are described,
together with their extent of applicability, and their weaknesses.
Some suggestions are made, conclusions are drawn, and some directions for
future research are given. An extensive bibliography on recent
developments concludes the survey.
Narrow T-functions
T-functions were introduced by Klimov and Shamir in a series of papers during the last few years. They are of great interest for cryptography as they may provide some new building blocks which can be used to construct efficient and secure schemes, for example block ciphers, stream ciphers or hash functions.
In the present paper, we define the narrowness of a T-function and study how this property affects the strength of a T-function as a cryptographic primitive.
We define a new data structure, called a solution graph, that enables solving systems of equations given by T-functions. The efficiency of the algorithms which we propose for solution graphs depends significantly on the narrowness of the involved T-functions.
Thus the subclass of T-functions with small narrowness appears to be weak and should be avoided in cryptographic schemes.
Furthermore, we present some extensions to the methods of using solution graphs, which make it possible to apply these algorithms also to more general systems of equations, which may appear, for example, in the cryptanalysis of hash functions.
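For background, the sketch below uses Klimov and Shamir's classic single-word T-function x ↦ x + (x² ∨ 5) mod 2^n and a naive bit-by-bit solver to illustrate the triangular structure that solution graphs exploit far more efficiently; the solver here is only an illustration, not the paper's algorithm.

```python
# Klimov-Shamir's classic T-function f(x) = x + (x*x | 5) mod 2^n: output bit i
# depends only on input bits 0..i. The naive solver below fixes one bit at a
# time; it is an illustration only, not the paper's solution-graph algorithm.
N = 16
MASK = (1 << N) - 1

def f(x: int) -> int:
    return (x + (x * x | 5)) & MASK

def solve(y: int):
    """Return all x with f(x) = y, determined bit by bit."""
    candidates = [0]
    for i in range(N):
        nxt = []
        for c in candidates:
            for bit in (0, 1):
                x = c | (bit << i)
                # Bits 0..i of f(x) depend only on bits 0..i of x.
                if (f(x) ^ y) & ((1 << (i + 1)) - 1) == 0:
                    nxt.append(x)
        candidates = nxt
    return candidates

x0 = 0xBEEF & MASK
assert x0 in solve(f(x0))
```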
Hierarchical Identity Based Encryption with Constant Size Ciphertext
We present a Hierarchical Identity Based Encryption (HIBE) system
where the ciphertext consists of just three group elements and decryption
requires only two bilinear map computations,
independent of the hierarchy depth. Encryption is as efficient
as in other HIBE systems. We prove that the scheme is selective-ID secure
in the standard model and fully secure in the random oracle
model. Our system has a number of applications: it gives very
efficient forward secure public key and identity based cryptosystems (where ciphertexts are
short), it converts the NNL broadcast encryption system into an
efficient public key broadcast system, and it provides an efficient
mechanism for encrypting to the future. The system also supports
limited delegation where users can be given restricted private keys
that only allow delegation to certain descendants. Sublinear size private
keys can also be achieved at the expense of some ciphertext expansion.
A Chosen Ciphertext Attack on a Public Key Cryptosystem Based on Lyndon Words
In this paper, we present a chosen ciphertext attack against a
public key cryptosystem based on Lyndon words \cite{sm}. We show
that, provided that an adversary has access to a decryption oracle,
a key equivalent to the secret key can be constructed efficiently,
i.e. in linear time.
Comments: Insider attack on Cheng et al.'s pairing-based tripartite key agreement protocols
Recently, Cheng et al. proposed two tripartite key agreement protocols from pairings: one is certificate-based and the other is identity-based (ID-based). In this article, we show that both schemes are vulnerable to an insider impersonation attack and that the ID-based scheme even discloses the entities' private keys. Solutions to these problems are discussed.
Efficient Certificateless Public Key Encryption
In [3] Al-Riyami and Paterson introduced the notion of "Certificateless Public Key Cryptography" and presented an instantiation. In this paper, we revisit the formulation of certificateless public key encryption, construct a more efficient scheme, and then extend it to an authenticated
encryption scheme.
An Improved Elegant Method to Re-initialize Hash Chains
Hash chains are widely used in various cryptographic systems such as electronic micropayments and one-time passwords. However, hash chains suffer from the limitation that they have a finite number of links which, when used up, requires the system to re-initialize new hash chains; the system design therefore has to reduce the overhead of re-initializing hash chains. Recently, Vipul Goyal proposed an elegant one-time-signature-based method to re-initialize hash chains. In this efficient method, an infinite number of finite-length hash chains can be tied together so that hash chains can be securely re-initialized in a non-repudiable manner. Goyal's method is improved in this paper to obtain a slightly more efficient method which, more importantly, is a natural extension of the concept of conventional hash chains.
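For background, a conventional hash chain (the primitive whose re-initialization is discussed above) works as sketched below; the one-time-signature link that ties successive chains together is not shown.

```python
# Sketch of a conventional hash chain: the anchor H^n(seed) is published, and
# links are later released in reverse order and verified by re-hashing.
import hashlib

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int):
    """Return [seed, H(seed), ..., H^n(seed)]; the last value is the public anchor."""
    chain = [seed]
    for _ in range(n):
        chain.append(H(chain[-1]))
    return chain

def verify(value: bytes, anchor: bytes, max_steps: int) -> bool:
    """Accept value if hashing it at most max_steps times reaches the anchor."""
    v = value
    for _ in range(max_steps):
        v = H(v)
        if v == anchor:
            return True
    return False

chain = make_chain(b"secret-seed", 100)
anchor = chain[-1]                        # published in advance
assert verify(chain[50], anchor, 100)     # a later-released link checks out
```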
Update on SHA-1
We report on the experiments we performed in order to assess the
security of SHA-1 against the attack by Chabaud and Joux. We present some ideas for optimizations of the attack and some properties of the message expansion routine.
Finally, we show that for a reduced version of SHA-1, with 53
rounds instead of 80, it is possible to find collisions in less
than operations.
Mixing properties of triangular feedback shift registers
Uncategorized
Uncategorized
The purpose of this note is to show that Markov chains induced by non-singular triangular feedback shift registers and non-degenerate sources are rapidly mixing. The results may directly be applied to the post-processing of random generators and to stream ciphers in CFB mode.
Comments on ``Distributed Symmetric Key Management for Mobile Ad hoc Networks" from INFOCOM 2004
In IEEE INFOCOM 2004, Chan proposed a distributed key management
scheme for mobile ad hoc networks, and deduced the condition under
which the key sets distributed to the network nodes can form a
cover-free family (CFF), which is the precondition for the scheme
to work. In this paper, we indicate that the condition is incorrectly
deduced. Furthermore, we discuss whether CFFs are suitable for key
distribution in ad hoc networks.
The Misuse of RC4 in Microsoft Word and Excel
In this report, we point out a serious security flaw in Microsoft Word and Excel. The stream cipher RC4 with key length up to 128 bits is used in Microsoft Word and Excel to protect the documents. But when an encrypted document gets modified and saved, the initialization vector remains the same and thus the same keystream generated from RC4 is applied to encrypt the different versions of that document. The consequence is disastrous, since a great deal of information in the document can easily be recovered.
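The core problem, keystream reuse, can be illustrated without any RC4 internals: XORing two ciphertexts produced with the same keystream cancels the keystream and leaves the XOR of the two plaintext versions. The values below are placeholders.

```python
# Keystream reuse: two versions of a document encrypted with the same RC4 key
# and IV give ciphertexts whose XOR equals the XOR of the plaintexts, so the
# keystream cancels out entirely. The keystream and texts here are placeholders.
keystream = bytes(range(1, 33))                      # stand-in for an RC4 keystream

def xor(data: bytes, stream: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, stream))

version1 = b"Salary for Alice: 100,000 USD"          # original document
version2 = b"Salary for Alice: 250,000 USD"          # modified and re-saved
c1, c2 = xor(version1, keystream), xor(version2, keystream)

leak = xor(c1, c2)                                   # attacker XORs the two ciphertexts
assert leak == xor(version1, version2)               # keystream has cancelled out
# Bytes where the versions agree leak as zero; differing positions reveal the
# XOR of the two plaintext bytes, from which much of both versions is recovered.
```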
A Metric on the Set of Elliptic Curves over F_p
Elliptic curves over finite fields have found applications in many areas, including cryptography. In the current article we define a metric on the set of elliptic curves defined over a prime field F_p.
A sufficient condition for key-privacy
The notion of key privacy for encryption schemes was defined formally by Bellare, Boldyreva, Desai and Pointcheval in Asiacrypt 2001. This notion seems useful in settings where anonymity is important. In this short note we describe a (very simple) sufficient condition for key privacy. In a nutshell, a scheme that provides data privacy is guaranteed to provide also key privacy if the distribution of a *random encryption of a random message* is independent of the public key that is used for the encryption.
Benes and Butterfly schemes revisited
In~\cite{AV96}, W. Aiello and R. Venkatesan have shown how to
construct pseudo-random functions of 2n bits to 2n bits
from pseudo-random functions of n bits to n bits.
They claimed that their construction, called ``Benes'',
reaches the optimal bound of security against
adversaries with unlimited computing power but limited to m
queries in an adaptive chosen plaintext attack (CPA-2). However, a
complete proof of this result is not given in~\cite{AV96}, since
one of the assertions of~\cite{AV96} is wrong. Due to this, the
proof given in~\cite{AV96} is valid for most attacks, but not for
all possible chosen plaintext attacks. In this paper we will,
in a way, fix this problem since, for all $\epsilon > 0$, we will
prove CPA-2 security when $m \ll 2^{n(1-\epsilon)}$. However, we
will also see that the probability of distinguishing Benes functions
from random functions is sometimes larger than the term
given in~\cite{AV96}. One of the key ideas in
our proof will be to notice that, for a large number of variables linked
with some critical equalities, the average number of solutions may be
large while, at the same time, the probability of having at least
one such critical equality is negligible.
\textbf{Key Words}: Pseudo-random functions, unconditional
security, information-theoretic primitive, design of keyed hash
functions.
Cryptanalysis of Hiji-bij-bij (HBB)
In this paper, we show several known-plaintext attacks on the stream cipher HBB which was proposed recently at INDOCRYPT 2003. The cipher can operate either as a classical stream cipher (in the B mode) or as an asynchronous stream cipher (in the SS mode). In the case of the SS mode, we present known-plaintext attacks recovering 128-bit key with the complexity 2^66 and 256-bit key with the complexity 2^67. In the case of B mode with 256-bit key, we show a known-plaintext attack recovering the whole plaintext with the complexity 2^140. All attacks need only a small part of the plaintext to be known.
Logcrypt: Forward Security and Public Verification for Secure Audit Logs
Logcrypt provides strong cryptographic assurances that data stored by
a logging facility before a system compromise cannot be modified after
the compromise without detection. We build on prior work by showing
how log creation can be separated from log verification, and
describing several additional performance and convenience features not
previously considered.
On Obfuscating Point Functions
We study the problem of obfuscation in the context of point functions
(also known as delta functions). A point function is a Boolean
function that assumes the value 1 at exactly one point. Our main
results are as follows:
- We provide a simple construction of efficient obfuscators for
point functions for a slightly relaxed notion of obfuscation - wherein
the size of the simulator has an inverse polynomial dependency on the
distinguishing probability - which is nonetheless impossible for
general circuits. This is the first known construction of obfuscators
for a non-trivial family of functions under general computational
assumptions. Our obfuscator is based on a probabilistic hash function
constructed from a very strong one-way permutation, and does
not require any set-up assumptions. Our construction also yields
an obfuscator for point functions with multi-bit output.
- We show that such a strong one-way permutation - wherein any
polynomial-sized circuit inverts the permutation on at most a
polynomial number of inputs - can be realized using a random
permutation oracle. We prove the result by improving on the counting
argument used in [GT00]; this result may be of independent
interest. It follows that our construction yields obfuscators for
point functions in the non-programmable random permutation oracle
model (in the sense of [N02]). Furthermore, we prove that an
assumption like the one we used is necessary for our obfuscator
construction.
- Finally, we establish two impossibility results on obfuscating
point functions which indicate that the limitations on our
construction (in simulating only adversaries with single-bit output
and in using non-uniform advice in our simulator) are in some sense
inherent. The first of the two results is a consequence of a simple
characterization of functions that can be obfuscated against general
adversaries with multi-bit output as the class of functions that are
efficiently and exactly learnable using membership queries.
We stress that prior to this work, what was known about obfuscation
consisted of negative results for the general class of circuits [BGI01] and
positive results in the random oracle model [LPS04] or under
non-standard number-theoretic assumptions [C97]. This work
represents the first effort to bridge the gap between the two for a
natural class of functionalities.
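As background on the object being obfuscated, the folklore hash-based approach (in the spirit of the random-oracle constructions cited above, not this paper's construction from a strong one-way permutation) can be sketched as follows.

```python
# Folklore hash-based "obfuscation" of a point function: store only a salted
# hash of the special point and compare candidate inputs against it. This
# mirrors random-oracle-style constructions, not this paper's construction.
import hashlib
import os

def obfuscate_point(x: bytes):
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + x).digest()
    def point_function(candidate: bytes) -> bool:
        return hashlib.sha256(salt + candidate).digest() == digest
    return point_function

check = obfuscate_point(b"correct horse battery staple")
assert check(b"correct horse battery staple")
assert not check(b"wrong guess")
```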