The fact that there is no known polynomial time integer factorization algorithm is something that will never not be surprising to me. All of the related problems are polytime: primality testing is polytime, GCD is polytime. So why not integer factorization? Perhaps a polytime integer factorization algorithm exists based on GCD or on primality testing.

Before we go further, let me clarify what I mean by the integer factorization problem. In a heuristic sense, integer factorization is actually kind of easy. If I am given a uniformly random integer, there's a 1/2 chance it contains 2 as a factor. If it doesn't, there's a 1/3 chance it contains 3 (a 1/6 chance overall), and so on. Specific hard instances can be created (like for RSA). Let's define integer factorization as: given $N = pq$ where $p$ and $q$ are both prime, find $p$ and $q$. If $p$ and $q$ are close (but not too close) to $\sqrt{N}$, then this is actually quite difficult to solve.
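To see how far that heuristic alone gets you, here is a minimal sketch (the cutoff of 29 and the sample size are arbitrary choices of mine, not anything canonical):

```python
from random import getrandbits

# Most uniformly random integers have a small prime factor, so naive
# trial division succeeds on them immediately.
SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def small_factor(n):
    """Return a prime factor of n below 30, or None."""
    for p in SMALL_PRIMES:
        if n % p == 0:
            return p
    return None

trials = 10_000
hits = sum(small_factor(getrandbits(64) | (1 << 63)) is not None
           for _ in range(trials))
print(f"{hits / trials:.0%} of random 64-bit integers had a factor <= 29")
# Expected rate is 1 - prod(1 - 1/p for p in SMALL_PRIMES), about 84%.
# Hard instances like RSA moduli are built precisely to dodge this.
```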

Based on GCD

The first algorithm I came up with was based on GCD. Let $p_i$ be the $i$th prime. You can download text files of lists of all known primes. For each prime, compute $\gcd(N, p_i)$. If it returns anything but 1, then return it. Each gcd computation is done in polytime, but the domain you have to search runs from $p_1$ up to $\sqrt{N}$, which is not polynomial. (The size of the input is the number of bits of $N$.) This was an okay start, but we can change the iteration from linear to logarithmic, like binary search.
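Here is a minimal sketch of that linear scan, with a small sieve standing in for the downloaded prime lists:

```python
from math import gcd, isqrt

def primes_up_to(limit):
    """Sieve of Eratosthenes, standing in for a downloaded prime list."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, isqrt(limit) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytes(len(range(i * i, limit + 1, i)))
    return [i for i, is_prime in enumerate(sieve) if is_prime]

def factor_by_gcd_scan(N):
    """Try gcd(N, p_i) for each prime p_i up to sqrt(N). Each gcd is
    polytime, but the number of primes tried is exponential in the
    bit length of N."""
    for p in primes_up_to(isqrt(N)):
        if gcd(N, p) != 1:
            return p
    return N  # no prime <= sqrt(N) divides N, so N is prime

print(factor_by_gcd_scan(9967 * 9973))  # -> 9967
```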

First define some preprocessing: let $P_i = p_1 p_2 \cdots p_i$, the product of the first $i$ primes. Notice $P_i$ is not actually dependent on $N$, so we can just keep computing $P_i$ for larger $i$ and store them all. Now we can binary search over this space: keep bouncing around for $O(\log i)$ steps to find an $i$ such that $\gcd(N, P_i) = 1$ but $\gcd(N, P_{i+1}) \neq 1$. Then $p_{i+1}$ must divide $N$. This is polynomial, right? The worst case is $O(\log N)$ steps, and at each step we compute a gcd, which is polynomial in the size of its inputs (though note that one of those inputs is the enormous $P_i$). It also seems quite fast once the preprocessing has been done. From my experiments, it looks like it beats the python primefac library!

But there's a catch: the space is super-exponential. 16GB of space for 41-bit integers. And 41 bits is already very fast; we're talking like a hundredth of a second vs two hundredths of a second. No one cares. You can perform some space optimization, but it will be dependent on $N$, so you have to count it as part of the run time, not as something you can compute beforehand. For example, replace $P_i$ with $P_i \bmod N$. This bounds each $P_i$ by $N$, but you have to know $N$ beforehand. This gcd-searching idea could still work in some weird environment where space is not an issue. I will explore more optimizations for differently defined $P_i$.
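Here is a minimal sketch of the primorial binary search (it reuses primes_up_to from the previous sketch, and the table is scaled way down so it actually runs in a few megabytes):

```python
from math import gcd

def primorial_table(primes):
    """Preprocessing, independent of N: prefix products
    P[i] = p_1 * ... * p_{i+1}. This is what blows up in space:
    P[i] has on the order of p_{i+1} bits."""
    table, acc = [], 1
    for p in primes:
        acc *= p
        table.append(acc)
    return table

def factor_by_gcd_bisect(N, primes, table):
    """Binary search for the smallest i with gcd(N, table[i]) != 1.
    Since gcd(N, table[i-1]) == 1 there, the new prime primes[i]
    itself must divide N. O(log i) gcds instead of i of them."""
    if gcd(N, table[-1]) == 1:
        return N  # no prime in the table divides N
    lo, hi = 0, len(table) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if gcd(N, table[mid]) == 1:
            lo = mid + 1  # factor lies beyond mid
        else:
            hi = mid      # factor is at or before mid
    return primes[lo]

primes = primes_up_to(10_000)
table = primorial_table(primes)  # N-independent preprocessing
print(factor_by_gcd_bisect(9967 * 9973, primes, table))  # -> 9967

# The space optimization mentioned above would store table[i] % N
# instead, since gcd(N, P_i mod N) == gcd(N, P_i) -- but then the
# table depends on N and must count toward the run time.
```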

Based on Irreducible Polynomials

Another algorithm I tried was based on the quadratic formula. Factorization of quadratic polynomials is polytime, so why isn't factorization of $N$ polytime? Given a monic polynomial $x^2 + bx + c$, all you have to do is plug it into the quadratic formula to get the roots $\frac{-b \pm \sqrt{b^2 - 4c}}{2}$. Let's work backwards. If we want to factor $N = pq$, let's instead try to factor the polynomial $(x - p)(x - q) = x^2 - (p + q)x + pq$. If you plug that into the quadratic formula, you get the roots as $\frac{(p + q) \pm \sqrt{(p + q)^2 - 4pq}}{2}$. We know $pq$, since $pq = N$. But I am unable to efficiently determine $p + q$ given $pq$. In fact, if there were some polytime algorithm $f$ such that $f(pq) = p + q$, then you could factor integers as described above (the sketch below makes this concrete).

As an argument that such a task is impossible, consider the definition of a ring. Every definition varies some, but they all explicitly and axiomatically define multiplication and addition. If it were possible to define addition in terms of multiplication, maybe I would be able to find some definition that excludes addition. Other functions on rings do exist (like a norm) and are defined as compositions of multiplication and addition, but in a ring you shouldn't be able to define one of the two basic operations and get the other for free. If you suppose you could, then the structures we see in rings wouldn't be worth studying; they would just look like groups with worse notation. Perhaps the fact that rings are not just groups in disguise implies that a polytime integer factorization algorithm simply cannot exist.
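Here is a minimal sketch of that reduction (the oracle $f$ is hypothetical, so I just pass in $p + q$ directly; split_from_sum is my name for it):

```python
from math import isqrt

def split_from_sum(N, s):
    """Given N = p*q and s = p + q, recover p and q as the roots of
    x^2 - s*x + N via the quadratic formula. The discriminant
    (p + q)^2 - 4*p*q is the perfect square (p - q)^2."""
    d = s * s - 4 * N
    r = isqrt(d)
    assert r * r == d, "s was not actually p + q for this N"
    return (s - r) // 2, (s + r) // 2

# If some polytime oracle f satisfied f(p*q) == p + q, factoring
# would be immediate: p, q = split_from_sum(N, f(N)).
p, q = 1_000_003, 1_000_033
print(split_from_sum(p * q, p + q))  # -> (1000003, 1000033)
```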