Linear time algorithm for finding the greatest common divisor

Tag: math Author: dmy2009 Date: 2012-02-26

I have been doing some research and I have found some algorithms whose runtime is greater than O(N). I am curious if anybody is aware of a linear time algorithm for finding the greatest common divisor?

What is the meaning of N in this case? The number of (binary) digits of the largest number? Also, what is the domain? Most GCD algorithms use division somewhere, and division is O(1) for 64-bit integers, but at least O(n) for arbitrary-precision integers.
I found an O( (n log n)^2 ) algorithm. I also found another whose exact runtime I was unsure of, but I was confident it wasn't linear.
@ElianEbbing: Sorry, in this case I am only concerned with integers. N would be the largest integer.
@VanDarg - If N is just the value of the largest integer, then the runtime is O(log N) even for the Euclidean algorithm. This is much better than O(N).
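For illustration, a minimal sketch of the Euclidean algorithm the comment refers to (Python is an assumption here; the question names no language). The number of division steps grows with the number of digits of the operands, i.e. O(log N) when N is the value of the larger one:

```python
def gcd(a, b):
    """Euclidean algorithm: O(log min(a, b)) division steps."""
    while b:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # 21
```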
@ElianEbbing: If N were the value of the largest integer in an array [0..N], would the runtime then be O(N log N)? I am trying to solve a problem that involves sorting fractions in an array. If I could find a GCD in the array in O(N) time, the problem would be solved.
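If "a GCD in the array" means the GCD of all the entries, a minimal sketch (still assuming Python; the input array and its use for the fraction-sorting problem are assumptions, not stated in the question) is to fold gcd across the array. That is n - 1 GCD calls, each O(log M) in the value bound M of the entries, so roughly O(n log M) overall rather than linear in the value N:

```python
from functools import reduce
from math import gcd

def gcd_of_array(values):
    """Fold gcd across the array: n - 1 gcd calls,
    each O(log M) where M bounds the values, so O(n log M) overall."""
    return reduce(gcd, values)

print(gcd_of_array([12, 18, 30]))  # 6
```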

Best Answer

If there is, no one has found it yet. From Wikipedia:

the best known deterministic algorithm is by Chor and Goldreich, which (in the CRCW-PRAM model) can solve the problem in O(n/log n) time with n^(1+ε) processors.