Reputation: 359
I am looking at the following code
public class Solution {
    public boolean judgeSquareSum(int c) {
        for (long a = 0; a * a <= c; a++) {
            for (long b = 0; b * b <= c; b++) {
                if (a * a + b * b == c)
                    return true;
            }
        }
        return false;
    }
}
The author states that the time complexity of this code is √c. What I don't understand is how.
Suppose we are given the example c = 20: then the code runs 15 times, yet √20 ≈ 4.47.
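For reference, this is how I counted the 15 runs (a throwaway counter I added myself, not part of the original code):
public class CountRuns {
    public static void main(String[] args) {
        int c = 20;
        int runs = 0;
        for (long a = 0; a * a <= c; a++) {
            for (long b = 0; b * b <= c; b++) {
                runs++;
                if (a * a + b * b == c) {                 // 2*2 + 4*4 == 20
                    System.out.println("runs = " + runs); // prints runs = 15
                    return;
                }
            }
        }
        System.out.println("runs = " + runs);
    }
}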
Upvotes: 0
Views: 396
Reputation: 3166
I just wanted to see some actual runs. √c seems to be the best-case scenario. I suspect that happens when a = 0 and b * b = c, eliminating the need for multiple runs of the outer loop.
prints:
i = 10 sqrt = 3 runs = 8
i = 100 sqrt = 10 runs = 11
i = 1,000 sqrt = 31 runs = 351
i = 10,000 sqrt = 100 runs = 101
i = 100,000 sqrt = 316 runs = 4,121
i = 500,000 sqrt = 707 runs = 71,501
i = 1,000,000 sqrt = 1,000 runs = 1,001
i = 5,000,000 sqrt = 2,236 runs = 521,209
i = 10,000,000 sqrt = 3,162 runs = 382,721
i = 50,000,000 sqrt = 7,071 runs = 7,079,001
i = 100,000,000 sqrt = 10,000 runs = 10,001
Printed with:
public class StackOverflowTest {

    static int counter;

    public static void main(String[] args) {
        print(10);
        print(100);
        print(1000);
        print(10000);
        print(100000);
        print(500000);
        print(1000000);
        print(5000000);
        print(10000000);
        print(50000000);
        print(100000000);
    }

    static void print(int i) {
        new Solution().judgeSquareSum(i);
        String format = " i = %,12d\tsqrt = %,6d\truns = %,12d\n";
        System.out.printf(format, i, (int) Math.sqrt(i), counter);
    }

    static class Solution { // only added a counter
        public boolean judgeSquareSum(int c) {
            counter = 0;
            for (long a = 0; a * a <= c; a++) {
                for (long b = 0; b * b <= c; b++) {
                    counter++;
                    if (a * a + b * b == c)
                        return true;
                }
            }
            return false;
        }
    }
}
Upvotes: 2
Reputation: 889
This can be further optimised to:
public boolean judgeSquareSum(final int c) {
    final long rootC = (long) Math.sqrt(c);
    if (c == rootC * rootC) {
        return true;
    }
    for (long a = 0; a <= rootC; a++) {
        final long aSquared = a * a;
        final long b = (long) Math.sqrt(c - aSquared);
        if (aSquared + b * b == c) {
            return true;
        }
    }
    return false;
}
The above produces identical results to the original posting for all positive c.
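As a quick sanity check (my own throwaway harness; OptimisedSolution is just a stand-in name for a class wrapping the method above), the two versions can be compared over a small range of values:
public class CompareVersions {
    public static void main(String[] args) {
        Solution original = new Solution();                     // nested-loop version from the question
        OptimisedSolution optimised = new OptimisedSolution();  // single-loop version above
        for (int c = 1; c <= 10_000; c++) {
            if (original.judgeSquareSum(c) != optimised.judgeSquareSum(c)) {
                System.out.println("Mismatch at c = " + c);
                return;
            }
        }
        System.out.println("Identical results for 1 <= c <= 10,000");
    }
}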
Upvotes: -1
Reputation: 889
Put another way: if a² + b² = c, the limiting case is:
a² + 0² = c (and of course 0² + b² = c, which is analogous)
-> a² = c
-> a = √c
So, for any b, a <= √c.
The originally posted code checks a * a <= c on every iteration. It is considered best practice (because it's faster) to evaluate limiting values once only. As we just saw, a * a <= c can be rewritten as a <= √c, so the for-loop could be coded as follows:
final long rootC = (long) Math.sqrt(c);
for (long a = 0; a <= rootC; a++) {
    :
    :
}
...which I hope makes the √c complexity claim for the original code clear.
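For what it's worth, here is a small check of my own (not part of the original answer) confirming that both loop conditions stop at the same value of a for a few sample inputs:
public class BoundCheck {
    public static void main(String[] args) {
        int[] samples = {1, 2, 3, 10, 20, 99, 100, 1_000_000, Integer.MAX_VALUE};
        for (int c : samples) {
            long a = 0;
            while ((a + 1) * (a + 1) <= c) {  // last a accepted by a * a <= c
                a++;
            }
            long rootC = (long) Math.sqrt(c); // bound used by the rewritten loop
            System.out.println("c = " + c + ": last a = " + a + ", rootC = " + rootC);
        }
    }
}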
The 2nd for-loop is unnecessary, as b can be calculated directly. For any a:
a² + b² = c (where c is known)
-> b² = c - a²
-> b = √(c - a²)
(only integer values of b being acceptable)
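Putting the two steps together, a minimal sketch of that single-loop idea (the method name is mine, not from the original post) might be:
public static boolean judgeSquareSumSingleLoop(int c) {
    for (long a = 0; a * a <= c; a++) {
        long remainder = c - a * a;           // b² must equal this
        long b = (long) Math.sqrt(remainder); // candidate integer b
        if (b * b == remainder) {             // accept only exact integer b
            return true;
        }
    }
    return false;
}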
Upvotes: -1
Reputation: 420
The given snippet is O(n) and Ω(√n), implying that the stated time complexity is merely the best case.
[Plot of iteration counts versus c, with a logarithmized vertical axis]
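For a concrete worst case (my own example, not from the answer above): when c cannot be written as a sum of two squares, the early return never fires, so the outer and inner loops each run ⌊√c⌋ + 1 times, roughly c iterations in total. For example, with c = 3:
public class WorstCaseCount {
    public static void main(String[] args) {
        int c = 3;    // 3 is not a sum of two squares
        int runs = 0;
        for (long a = 0; a * a <= c; a++) {      // a = 0, 1
            for (long b = 0; b * b <= c; b++) {  // b = 0, 1
                runs++;                          // the early return never fires for c = 3
            }
        }
        System.out.println("runs = " + runs);    // prints runs = 4, i.e. (floor(sqrt(3)) + 1)^2
    }
}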
Upvotes: 8
Reputation: 889
Your max conditions are a * a <= c and b * b <= c. If you take the square root of both sides you get a <= √c and b <= √c. So, if you optimise your code for performance (i.e. don't calculate the maximum on every iteration!), it would look like this:
public boolean judgeSquareSum(final int c) {
    final long rootC = (long) Math.sqrt(c);
    for (long a = 0; a <= rootC; a++) {
        for (long b = 0; b <= rootC; b++) {
            if (a * a + b * b == c) {
                return true;
            }
        }
    }
    return false;
}
So now you see where the complexity claim of √c came from.
Upvotes: -5