RohanJ

Reputation: 616

Reduce time complexity of brute forcing - largest prime factor

I am writing code to find the largest prime factor of a very large number.

Problem 3 of Project Euler: What is the largest prime factor of the number 600851475143?

I coded it in C, but the long long int data type was not sufficient to hold the value.

Now, I have rewritten the code in Python. How can I reduce the time taken for execution (as it is taking a considerable amount of time)?

def isprime(b):
    # Trial division: test every candidate divisor from 2 up to b/2.
    x = 2
    while x <= b / 2:
        if b % x == 0:
            return 0
        x += 1
    return 1

def lpf(a):
    # x tracks the largest prime factor found so far.
    x = 2
    i = 2
    while i <= a / 2:
        if a % i == 0:
            if isprime(i) == 1:
                if i > x:
                    x = i
                    print(x)
        i += 1
    print("final answer", x)

z = 600851475143
lpf(z)

Upvotes: 1

Views: 747

Answers (4)

JoshG79

Reputation: 1687

First off, your two while loops only need to go up to sqrt(n), since any factor larger than that pairs with one you have already checked (you then need to test a/i for primality as well). In addition, if you find the lowest number that divides a, and the result of that division is prime, then you have found the largest prime factor.

First, correct your isprime function:

from math import sqrt

def isprime(b):
    x = 2
    sqrtb = sqrt(b)
    while x <= sqrtb:
        if b % x == 0:
            return 0
        x += 1
    return 1

Then, your lpf:

from math import sqrt

def lpf(a):
    x = 2
    i = 2
    sqrta = sqrt(a)
    while i <= sqrta:
        if a % i == 0:
            b = a // i  # integer division
            if isprime(b):
                # i is the smallest divisor, so a//i is the largest prime factor
                return b
            if isprime(i):
                x = i
                print(x)
        i += 1
    return x

Upvotes: 0

erewok

Reputation: 7835

I think you're checking too many numbers (incrementing by 1 and starting at 2 in each case). If you want to check primality by trial division, you should divide by fewer numbers: only odd numbers to start (better yet, only primes). You can iterate over odd numbers in Python the following way:

for x in range(3, some_limit, 2):
    if some_number % x == 0:
        # etc.

In addition, once you have a list of primes, you can run through that list backwards (because the question asks for the highest prime factor) and test whether any of those primes divides evenly into the number.

Lastly, people usually go up to the square-root of a number when checking trial division because anything past the square-root is not going to provide new information. Consider 100:

1 x 100
2 x 50
4 x 25
5 x 20
10 x 10
20 x 5
etc.

You can find all the important divisor information by just checking up to the square root of the number. This tip is useful both for testing primes and for testing where to start looking for a potential divisor for that huge number.
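Putting these tips together, a trial-division primality check that handles 2 separately and then tests only odd divisors up to the square root might look like this (a sketch, not code from the answer above; the `is_prime` name and the use of `math.isqrt` are my own choices):

```python
from math import isqrt

def is_prime(n):
    """Trial division: check 2 separately, then only odd divisors up to sqrt(n)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    # Only odd candidates, and only up to the integer square root.
    for x in range(3, isqrt(n) + 1, 2):
        if n % x == 0:
            return False
    return True

print(is_prime(6857))  # True: 6857 is prime
```

Compared with testing every integer up to n/2, this checks roughly sqrt(n)/2 candidates instead of n/2, which is the difference between the question's code crawling and finishing instantly.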

Upvotes: 1

lejlot

Reputation: 66825

There are many possible algorithmic speed ups. Some basic ones might be:

  • First, if you are only interested in the largest prime factor, you should check candidates starting from the largest ones, not the smallest. So instead of looping from 2 up to a/2, check from a down to 2.
  • You could load a database of primes instead of using an isprime function (there are dozens of such files on the net)
  • Also, only odd numbers can be prime (except for 2), so you can "jump" by 2 in each iteration

Your isprime checker could also be sped up: you do not have to look for divisors up to b/2; it is enough to check up to sqrt(b), which reduces the complexity from O(n) to O(sqrt(n)) (assuming that the modulo operation is constant time).
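As an illustration of how far algorithmic changes can go, here is a common alternative approach (a sketch of a standard technique, not code from this answer): repeatedly divide out each factor as it is found. Every factor found this way is automatically prime, because all smaller factors have already been removed, so no separate isprime check is needed:

```python
def largest_prime_factor(n):
    # Divide out all factors of 2 first, so the main loop can step over odd candidates only.
    while n % 2 == 0 and n > 2:
        n //= 2
    f = 3
    # Any f that divides n here is prime: all smaller factors were already divided out.
    while f * f <= n:
        while n % f == 0:
            n //= f
        f += 2
    return n  # whatever remains is the largest prime factor

print(largest_prime_factor(600851475143))  # 6857
```

Because n shrinks as factors are removed, the `f * f <= n` bound tightens as the loop runs, and the whole computation for 600851475143 takes a tiny fraction of a second.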

Upvotes: 3

Manoj Pandey

Reputation: 4666

You could use the 128-bit integer type (__int128) provided by GCC: http://gcc.gnu.org/onlinedocs/gcc/_005f_005fint128.html . This way, you can continue to use C and avoid having to optimize the Python version for speed. You could also always implement your own custom storage type to hold numbers bigger than long long in C.

Upvotes: 2
