2010 Reports :  Cryptology ePrint Archive Forum
Discussion forum for Cryptology ePrint Archive reports posted in 2010. Please put the report number in the subject.  

27-Mar-2011 06:32
wai2ha
We can always use just one surjection round in the last iteration to recover the domain $X$ by a sum block $\Sigma M_{L-1}$ (assuming the message has L blocks), whenever the previous reductions were great or
Forum: 2010 Reports
28-Jan-2011 13:08
wai2ha
One of the key questions is that when processing the last block with additional bits in a normal iterative hash function, the entropy of CV_(L-1) is only n bits, namely an n-bit domain X maps to an n-
Forum: 2010 Reports
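
(A minimal sketch of the effect discussed in the post above, using toy parameters: a 16-bit chaining value instead of a real n, and an illustrative helper name image_fraction that is not from the thread. When an n-bit domain is pushed through one random n-bit-to-n-bit function, only about 1 - 1/e, roughly 63%, of the 2^n outputs remain reachable, so the reachable set shrinks by about 0.66 bits.)

import random

def image_fraction(n_bits, trials=3):
    # Empirical fraction of an n-bit range that is hit when every point of an
    # n-bit domain is mapped independently and uniformly (a random function).
    size = 1 << n_bits
    total = 0.0
    for seed in range(trials):
        rnd = random.Random(seed)
        total += len({rnd.randrange(size) for _ in range(size)}) / size
    return total / trials

print(image_fraction(16))   # prints roughly 0.632, i.e. 1 - 1/e
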
28-Dec-2010 15:58
wai2ha
It's interesting that 2010/430 and 2010/384 both say there is a serious problem with narrow-pipe hash functions.
Forum: 2010 Reports
24-Dec-2010 17:00
wai2ha
Applications of a surjection round try to counter the conclusion (of 2010/384 and 2010/430) on narrow-pipe hash functions.
Forum: 2010 Reports
24-Dec-2010 16:49
wai2ha
A new paper, 2010/652, can counter the conclusions on narrow-pipe hash functions.
Forum: 2010 Reports
02-Dec-2010 13:06
kyqf
The author of 2010/384 concludes that a narrow-pipe hash function will lose entropy and its codomain will shrink. However, the ideal random compression function C is designated by the writ
Forum: 2010 Reports
01-Dec-2010 16:55
kyqf
If the ideal random compression function C is always chosen and kept as a surjective function, namely an onto mapping, what about the conclusion?
Forum: 2010 Reports
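
(One way to make the question above concrete, as a toy sketch only. "Surjective" is read here as a bijection on an equal-sized domain and range, i.e. a random permutation, and the variable names are illustrative. A random permutation keeps every one of the 2^n chaining values reachable, while a random function keeps only about 63% of them. A real compression function's full domain, chaining value plus message block, is of course much larger than its range, where surjectivity holds almost automatically; the effect discussed in this thread shows up when only the n-bit chaining value carries fresh entropy.)

import random

n_bits = 16
size = 1 << n_bits
rnd = random.Random(1)

# random function: each domain point mapped independently and uniformly
random_function = [rnd.randrange(size) for _ in range(size)]

# "kept surjective": here, a random permutation of the same set
random_permutation = list(range(size))
rnd.shuffle(random_permutation)

print(len(set(random_function)) / size)      # ~0.63: the codomain effectively shrinks
print(len(set(random_permutation)) / size)   # exactly 1.0: nothing is lost
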
05-Aug-2010 22:02
hugo
A few weeks ago, on 7-15-2010, in a discussion on the TLS WG mailing list regarding the applicability of the results in this paper to TLS (and practical crypto in general), I posted the following comments
Forum: 2010 Reports
04-Aug-2010 23:05
Vlastimil Klima
Please excuse our delay. We will write an updated and more accurate version of the paper very soon. Vlastimil Klima
Forum: 2010 Reports
21-Jul-2010 16:41
marshray
In many real-world situations an attacker is able to extend the length of messages with chosen text in an attempt to engineer a collision. In these cases, there may be 160 bits of entropy coming in th
Forum: 2010 Reports
21-Jul-2010 09:37
Orr
marshray: If you feed a random chaining value and a random message into SHA-1's compression function, you put in 512+160 bits of entropy. With very high probability you would get 160 bits of entropy
Forum: 2010 Reports
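
(The point above in toy numbers; the helper covered_fraction and the 16/20-bit sizes are illustrative stand-ins for a 160-bit output and 512+160 bits of input entropy. When the number of equally likely inputs is far larger than the output range, essentially every output value is hit, matching the estimate 1 - e^{-M/N}.)

import math, random

def covered_fraction(range_bits, input_bits, seed=2):
    # Fraction of a range of size 2^range_bits hit by 2^input_bits
    # independent, uniformly chosen outputs.
    N, M = 1 << range_bits, 1 << input_bits
    rnd = random.Random(seed)
    return len({rnd.randrange(N) for _ in range(M)}) / N

print(covered_fraction(16, 20))    # empirically ~1.0
print(1 - math.exp(-(1 << 4)))     # 1 - e^{-16}, the matching estimate
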
20-Jul-2010 19:22
marshray
When evaluating the effect of this phenomenon on actual hash designs, it's probably important to look inside the block structure as well. For example, SHA-1: for i from 0 to 79 // thanks Wikipedia
Forum: 2010 Reports
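
(The "for i from 0 to 79" loop written out as a runnable, single-block sketch; the padding is handled inline only for the sanity check against hashlib, and sha1_compress is just a name chosen here. The last line of the function is the feed-forward that later posts in this thread discuss: the input chaining value is added back into the round output, word by word.)

import hashlib
import struct

def rol(x, n):
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def sha1_compress(h, block):
    # One SHA-1 compression call: message schedule, the 80-round loop,
    # and the word-wise feed-forward of the input chaining value.
    w = list(struct.unpack(">16I", block))
    for i in range(16, 80):
        w.append(rol(w[i - 3] ^ w[i - 8] ^ w[i - 14] ^ w[i - 16], 1))
    a, b, c, d, e = h
    for i in range(80):                       # "for i from 0 to 79"
        if i < 20:
            f, k = (b & c) | (~b & d), 0x5A827999
        elif i < 40:
            f, k = b ^ c ^ d, 0x6ED9EBA1
        elif i < 60:
            f, k = (b & c) | (b & d) | (c & d), 0x8F1BBCDC
        else:
            f, k = b ^ c ^ d, 0xCA62C1D6
        a, b, c, d, e = (rol(a, 5) + f + e + k + w[i]) & 0xFFFFFFFF, a, rol(b, 30), c, d
    return [(x + y) & 0xFFFFFFFF for x, y in zip(h, (a, b, c, d, e))]

# Sanity check on a single padded block (messages shorter than 56 bytes only).
msg = b"abc"
block = msg + b"\x80" + b"\x00" * (55 - len(msg)) + struct.pack(">Q", 8 * len(msg))
iv = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
digest = b"".join(struct.pack(">I", x) for x in sha1_compress(iv, block))
assert digest == hashlib.sha1(msg).digest()
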
19-Jul-2010 20:17
gligoroski
Dear Marsh, From the link you sent, I quote: "I did some Monte Carlo testing and found that my intuition was mostly wrong. The entropy loss effect is also observable with the Davies-Meyer cons
Forum: 2010 Reports
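
(A toy reproduction of the Monte Carlo observation quoted above; the 16-bit size and the SHA-256-based stand-in F are assumptions made only for illustration. For a fixed message block, the map h -> h + F(h, m) behaves like a fresh random function of h, so the feed-forward does not restore the lost entropy.)

import hashlib

n_bits = 16
size = 1 << n_bits
mask = size - 1

def F(h, m):
    # Toy stand-in for a compression-function core: 16 bits squeezed out of SHA-256.
    return int.from_bytes(hashlib.sha256(h.to_bytes(2, "big") + m).digest()[:2], "big")

m = b"fixed message block"
plain = {F(h, m) for h in range(size)}                        # h -> F(h, m)
davies_meyer = {(h + F(h, m)) & mask for h in range(size)}    # h -> h + F(h, m)

print(len(plain) / size)          # ~0.63
print(len(davies_meyer) / size)   # also ~0.63
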
19-Jul-2010 19:34
marshray
Yes, but some of the possible compression functions approximate a random function more closely than others. If we have information that a given construction does not, then we may be obligated not to a
Forum: 2010 Reports
17-Jul-2010 06:37
gligoroski
marshray Wrote: > So the h_(i+1) = h_i + F(h_i, m_i) construction used by seems to be credited to Davies-Meyer. It's not
Forum: 2010 Reports
15-Jul-2010 04:47
marshray
So the h_(i+1) = h_i + F(h_i, m_i) construction used by seems to be credited to Davies-Meyer. It's not used in MD2 but is used in MD4 (RFC 1186, October 1990), MD5, SHA-1, and SHA-2. Howev
Forum: 2010 Reports
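
(The construction named above, written out as a compact toy iteration: a 64-bit chaining value and a SHA-256-based stand-in for F, both of which are illustrative choices rather than any real design.)

import hashlib

def F(h, m):
    # Stand-in for a compression-function core: 64 pseudorandom bits derived from (h, m).
    return int.from_bytes(hashlib.sha256(h.to_bytes(8, "big") + m).digest()[:8], "big")

def toy_narrow_pipe_hash(blocks, iv=0x0123456789ABCDEF):
    # Merkle-Damgard iteration with the Davies-Meyer-style feed-forward
    # h_(i+1) = h_i + F(h_i, m_i), on a 64-bit chaining value.
    h = iv
    for m in blocks:
        h = (h + F(h, m)) % (1 << 64)
    return h

print(hex(toy_narrow_pipe_hash([b"block 0", b"block 1", b"block 2"])))
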
15-Jul-2010 01:12
marshray
The model h_(i+1) = compress(h_i, m_i) may be a bit of an oversimplification. Notice that in actual SHA-256 the result of compress(h_i, m_i) is added into the h_i input chaining values
Forum: 2010 Reports
09-Jul-2010 20:47
gligoroski
I am receiving very good comments, suggestions, and corrections from Jean-Philippe and Orr, so I will soon post a corrected version of the paper. But Duchman is right: this is more theoretical work a
Forum: 2010 Reports
09-Jul-2010 16:05
Orr
The latest version I've read contains mistakes which are currently under discussion with the authors (offline). For example, the claims concerning the loss of entropy if you iterate the compres
Forum: 2010 Reports
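
(For reference, the basic effect those claims are about, reproduced with toy parameters; f is an illustrative 16-bit stand-in built from SHA-256, not the paper's model. Repeatedly applying one fixed random-looking function shrinks the reachable set further at every step, roughly following t_{k+1} = 1 - e^{-t_k}.)

import hashlib

size = 1 << 16

def f(h):
    # Fixed, random-looking function on 16-bit values (SHA-256 stand-in).
    return int.from_bytes(hashlib.sha256(h.to_bytes(2, "big")).digest()[:2], "big")

reachable = set(range(size))
for k in range(1, 6):
    reachable = {f(h) for h in reachable}
    print(k, len(reachable) / size)   # ~0.63, ~0.47, ~0.37, ~0.31, ~0.27
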
08-Jul-2010 08:24
DutchShtull
Three remarks on this paper: 1. It is a nice discovery to show that narrow-pipe hash functions cannot ever replace random oracles. From this point of view, wide-pipe hash designs have obvious advan
Forum: 2010 Reports