2012 Reports :  Cryptology ePrint Archive Forum
Discussion forum for Cryptology ePrint Archive reports posted in 2012. Please put the report number in the subject. 
2012/097: useless factorization algorithm
Posted by: djb (IP Logged)
Date: 29 February 2012 11:31

2012/097 finds a divisor p of n by computing gcd{n,x1}, gcd{n,x2}, ... for a particular pattern of random-looking integers x1,x2,...
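For concreteness, a minimal Python sketch of the gcd-scan structure described above; the actual pattern of integers x1, x2, ... from the paper is not reproduced here, and the sequence argument is a placeholder:

from math import gcd

def gcd_scan(n, xs):
    # Walk a sequence of integers x1, x2, ... and return the first
    # nontrivial divisor of n found as gcd(n, x_i), together with how
    # many gcd computations it took.  `xs` is any iterable of integers;
    # the paper prescribes its own specific pattern, not shown here.
    for i, x in enumerate(xs, start=1):
        d = gcd(n, x)
        if 1 < d < n:
            return d, i
    return None, None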

All of the numerical evidence collected in the paper is consistent with the idea that each of these integers has about 1/p chance of finding p, for a total cost of essentially p to find p. This isn't even as fast as trial division. It's a giant step backwards from the rho method, ECM, etc., which take far fewer than p operations to find p once p is moderately large.
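For comparison, a standard Pollard rho sketch (not from the paper) that finds a prime factor p of a composite n in roughly sqrt(p) multiplications, illustrating the gap with a method that needs on the order of p gcds:

from math import gcd
import random

def pollard_rho(n):
    # Pollard's rho method for a composite n: expected cost is roughly
    # sqrt(p) steps, where p is the smallest prime factor of n, rather
    # than the ~p gcd computations implied by a 1/p per-try success rate.
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n        # "tortoise" takes one step
            y = (y * y + c) % n        # "hare" takes two steps
            y = (y * y + c) % n
            d = gcd(abs(x - y), n)
        if d != n:                     # d == n: unlucky cycle, retry with new c
            return d

# Example: pollard_rho(10403) returns 101 or 103 after a few dozen steps,
# whereas scanning integers with a 1/p success chance would need ~100 gcds.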

The paper provides no reason to believe that its method is of any use. The paper claims that its "FAC(n,1)" is surprisingly small compared to p, but the number of operations in the algorithm is essentially quadratic in FAC(n,1). The "FAC(n,a)" algorithm is non-uniform, and again the paper provides no reason to think it is better than random guessing. One might as well generate the x values as AES outputs.
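To illustrate that last point, here is a sketch (not from the paper) of factoring by gcds against a generic pseudorandom stream, using SHA-256 in counter mode from the Python standard library as a stand-in for the AES outputs mentioned above; each candidate x shares a given prime factor p of n with probability about 1/p, so the expected number of gcds is about p:

import hashlib
from math import gcd

def prf_stream(n, seed=b"x-values"):
    # Pseudorandom candidate x values from a hash in counter mode.
    # SHA-256 is used here purely as a stdlib stand-in for AES; any
    # cryptographic PRF gives the same ~1/p per-candidate success rate.
    ctr = 0
    while True:
        ctr += 1
        block = hashlib.sha256(seed + ctr.to_bytes(8, "big")).digest()
        yield int.from_bytes(block, "big") % n

def factor_by_prf(n, max_tries=10**6):
    # Expected number of gcd computations before hitting a multiple of the
    # smallest prime factor p is about p -- the cost estimated above.
    for i, x in enumerate(prf_stream(n), start=1):
        if i > max_tries:
            return None
        d = gcd(n, x)
        if 1 < d < n:
            return d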

---D. J. Bernstein
Research Professor, Computer Science, University of Illinois at Chicago

Re: 2012/097: useless factorization algorithm
Posted by: dyp (IP Logged)
Date: 02 March 2012 10:27

I cannot agree with Professor Bernstein's comments on our paper 2012/097. First, the title of Bernstein's comments, "useless factorization algorithm", is itself a useless title. Before saying so, Bernstein should prove rigorously and mathematically that our algorithm is useless.

The practical performance of our Algorithm A on a single PC is not good; it is worse than many known algorithms. I know this, and I pointed it out in the Introduction of our paper. Moreover, even if an algorithm is not good in practice, it may still be proven to be the best one in theory; consider, for example, the Miller and AKS primality tests.

Bernstein said, "All of the numerical evidence collected in the paper is consistent with the idea that each of these integers has about 1/p chance of finding p, for a total cost of essentially p to find p. This isn't even as fast as trial division." This is almost true for small n, and I know it; but it is not true for all n, as shown, for example, in our Proposition 4.7. So I likewise ask Bernstein to prove it rigorously and mathematically for all n before giving his conclusion.

The coefficients in Definition 2.2 have a special structure and are not random. They are a kind of combinatorial sum (Definition 2.5) and are well-known objects in combinatorial number theory. Bernstein said that they are random; I ask him to prove this before giving his conclusion.

Bernstein said"One might as well generate the x values as AES outputs." Please Bernstein prove it rigorously mathematically !!!

Yingpu Deng


