What is the worst-case analysis of this code fragment? (Java)

sum = 0;
for (int i = 0; i < N; i++)
    for (int j = 0; j < i * i; j++)
        sum++;
I'm not entirely sure of my answer; I think the inner loop runs i^2 times and the outer loop runs N times, so the final answer would be O(N^3)?

The number of operations is sum = 0 + 1 + 4 + 9 + ... + (N-1)^2. This is because when i = 0, j is incremented 0 times; when i = 1, j is incremented once; when i = 2, j is incremented 4 times, and so on, up to i = N-1.
This sum equals (N-1)N(2N-1)/6, so the algorithm is indeed O(N^3). You can prove the formula by induction.
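As a quick sanity check (this little counting harness is my own addition, not part of the original post), you can count the increments for small N and compare them with the closed form above:

public class CountCheck {
    public static void main(String[] args) {
        for (int n = 1; n <= 6; n++) {
            long sum = 0;
            for (int i = 0; i < n; i++)
                for (int j = 0; j < i * i; j++)
                    sum++;                      // count every inner-loop step
            // sum of i^2 for i = 0 .. n-1 is (n-1)n(2n-1)/6
            long closedForm = (long) (n - 1) * n * (2 * n - 1) / 6;
            System.out.println("N=" + n + "  loop=" + sum + "  formula=" + closedForm);
        }
    }
}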

That looks right to me (asymptotically).


Analysis of Algorithms and building of Time Equation?

I'm having trouble figuring out the time equation for a couple small snippets of code.
int sum = 0;
for (int k = n; k > 0; k /= 2)
    for (int i = 0; i < k; i++)
        sum++;

int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;

int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < n; j++)
        sum++;
They're all very similar, as you can see. I'm not looking for an exact answer or anything; I'm just not really sure where to start with the inner loops. It seems like they would all run n times, but they can't all be the same, right? I'm pretty sure all of the outer loops would be log(n) and that sum++ would just be a constant (1), but I'm not really sure how the inner loops differ and how that changes the equation.
The third code snippet is the easiest to analyze. For each outer loop iteration the inner loop makes n iterations. Since the number of outer loop iterations is O(log(n)), the total number of iterations (and the complexity of the third snippet) is O(n*log(n)).
The first two code snippets have the same complexity; the outer loop just iterates in descending order in the first snippet and in ascending order in the second one. So you iterate over all powers of two smaller than n, and for each one the inner loop runs that many times. The total number of iterations is
1 + 2 + 4 + ... + 2^k
where k = log2(n). The sum of these powers of two is 2^(k+1) - 1 ≈ 2 * 2^k ≈ 2n, so the complexity in both cases is O(n).
int sum = 0;
for (int k = n; k > 0; k /= 2)
    for (int i = 0; i < k; i++)
        sum++;
n + n/2 + n/4 + n/8 + ... + 1 ≈ 2n = Θ(n)
int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;
1 + ... + n/8 + n/4 + n/2 + n ≈ 2n = Θ(n)
(Well, not exactly ending with n, n/2 etc, but within a factor of 2 of those, so doesn't matter for the complexity class.)
int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < n; j++)
        sum++;
n + n + ... + n ≈ log(n) × n = Θ(n log n)
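If it helps to see the growth rates side by side, here is a small counting harness (my own sketch, not part of the original answers; the variable names and the choice of n are mine) that runs the three snippets and prints the exact iteration counts next to the estimates above:

public class LoopCounts {
    public static void main(String[] args) {
        int n = 1 << 10;    // 1024, a power of two to keep the arithmetic tidy
        long a = 0, b = 0, c = 0;

        for (int k = n; k > 0; k /= 2)        // snippet 1: halving outer loop
            for (int i = 0; i < k; i++)
                a++;

        for (int i = 1; i < n; i *= 2)        // snippet 2: doubling outer loop, inner bounded by i
            for (int j = 0; j < i; j++)
                b++;

        for (int i = 1; i < n; i *= 2)        // snippet 3: doubling outer loop, inner bounded by n
            for (int j = 0; j < n; j++)
                c++;

        int log2n = Integer.numberOfTrailingZeros(n);   // log2(1024) = 10
        System.out.println("snippet 1: " + a + "  (2n - 1 = " + (2 * n - 1) + ")");
        System.out.println("snippet 2: " + b + "  (n - 1 = " + (n - 1) + ")");
        System.out.println("snippet 3: " + c + "  (n * log2(n) = " + (long) n * log2n + ")");
    }
}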

How would I calculate the big-O for this algorithm?

I have got this algorithm
int count = 0;
for (int i = n; i >= 1; i = i / 2) {
    for (int j = 1; j <= i; j++) {
        count++;
    }
}
Am I right in saying that the Big-O for this would be n/2?
TL;DR The time complexity is O(n).
More details
Am I right in saying that the BigO for this would be n/2?
Not exactly. Even if the count were n/2, in big-O notation you drop the constant factor, so (1/2)n would simplify to O(n).
I am not sure where that n/2 comes from, though, because the outer loop alone
for (int i = n; i >= 1; i = i / 2) {
    ...
}
runs only log2(n) times, not n/2.
And with both loops together:
int count = 0;
for (int i = n; i >= 1; i = i / 2) {
    for (int j = 1; j <= i; j++) {
        count++;
    }
}
the count would vary between N and 2N.
Let us go through the calculations:
int count = 0;
for (int i = n; i >= 1; i = i / 2) {
    for (int j = 1; j <= i; j++) {
        count++;
    }
}
The inner loop will execute N iterations, then N/2, then N/4, ... down to N/N = 1.
In other words, we have N + N/2 + N/4 + ... + N/N, which can be rewritten as N * (1 + 1/2 + 1/4 + ... + 1/2^L), with L = log2(N).
The series 1/2 + 1/4 + ... is well known to sum to at most 1, so the factor in parentheses is at most 2 and the total is O(N).
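For example, with N = 16 the inner loop runs 16 + 8 + 4 + 2 + 1 = 31 = 2·16 − 1 times, safely within the 2N bound.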
You are correct! This is basically a geometric progression with a common ratio of 2, and the number of terms is lg(n), since i is divided by 2 on each iteration of the outer loop:
1, 2, 4, ..., n
Using the formula for the sum of a geometric series, the total is 1 + 2 + 4 + ... + n = 2n - 1 = O(n).
The reason we have lg(n) terms is that i is halved each iteration, so we solve n / 2^k = 1 for the number of iterations k, which gives k = lg(n).

Why does the inner loop range make a difference in the way this problem is solved?

Write a static method printNumbers that takes an integer max as an argument and prints out all perfect numbers that are less than or equal to max.
At first, I kept getting the wrong answer because the inner loop was set to j < max before I changed it to j < i. However, I don't understand why that range would matter, because wouldn't i % j != 0 anyway, even if the range of j were to be larger?
for (int i = 1; i <= max; i++) {
    int sum = 0;
    for (int j = 1; j < i; j++) {
        if (i % j == 0) {
            sum += j;
        }
    }
    if (sum == i) {
        System.out.print(sum + " ");
    }
}
If I changed the inner loop j < max, then printNumbers(6) gives both 1 and 6, but printNumbers(500) gives only 1 and no other number.
If you set j < max in the inner loop, then as soon as j can reach i, the test i % j == 0 succeeds at j = i and adds i itself to sum, so sum can no longer equal i for a genuine perfect number (it also spuriously prints 1, whose only counted "divisor" is then itself). printNumbers(6) still showed 6 only because with max = 6 the bound j < 6 never lets j reach i = 6; with max = 500 the extra j = i term is included and every perfect number is skipped.
This is a good example of a boundary (off-by-one style) error to watch out for when coding.
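For reference, here is a minimal complete version with the corrected inner bound (the wrapper class and the demo main are my own additions; the exercise itself only specifies the printNumbers signature):

public class PerfectNumbers {
    // Prints every perfect number <= max (a number equal to the sum of its proper divisors).
    public static void printNumbers(int max) {
        for (int i = 1; i <= max; i++) {
            int sum = 0;
            for (int j = 1; j < i; j++) {   // proper divisors only: j stays strictly below i
                if (i % j == 0) {
                    sum += j;
                }
            }
            if (sum == i) {
                System.out.print(sum + " ");
            }
        }
    }

    public static void main(String[] args) {
        printNumbers(500);   // prints: 6 28 496
    }
}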

What is the complexity of this code snippet?

I have a question about the time complexity of the code below. I am guessing that the time complexity is O(n^3), but my friend told me it should be O(n^2). However, I am still not convinced. My reasoning is: the first and second for loops together cost about O(n^2 / 2), and the inner loop contributes roughly another O(n) factor, so the total is about O(n^3).
for (int i = 1; i <= len; i++) {
    for (int j = i + 1; j <= len; j++) {
        int mid = (i + j) / 2;
        for (int k = i; k <= j; k++) {
            dist[i][j] += Math.abs(A[k - 1] - A[mid - 1]);
        }
    }
}
So you need to find the time complexity of something like this:
for (int i = 1; i <= N; i++) {
    for (int j = i + 1; j <= N; j++) {
        for (int k = i; k <= j; k++) {
            // some O(1) operation
        }
    }
}
Each of the loops runs O(N) iterations, so the complexity is O(N^3). You can also write a simple test program in your language of choice (this one is in Python):
def check(N):
    s = 0
    for i in range(1, N + 1):
        for j in range(i + 1, N + 1):
            for k in range(i, j + 1):
                s += 1
    return s

print([check(i) for i in range(1, 10)])  # [0, 2, 7, 16, 30, 50, 77, 112, 156]
Then I checked for a closed form for this sequence. It is N(N - 1)(N + 4) / 6,
which is clearly O(N^3).
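(As a quick check, N = 9 gives 9 · 8 · 13 / 6 = 156, matching the last value printed above.)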

Time complexities of Java loops

How do I find the time complexities of the following loops?
1)
int i, j, k, n, mini, tmp;
for (i = 0; i < k; i++) {
    mini = i;
    for (j = i + 1; j < n; j++)
        if (a[j] < a[mini])
            mini = j;
    tmp = a[i];
    a[i] = a[mini];
    a[mini] = tmp;
}
return a[k-1];
2)
void SelectionSort(int A[], int n) {
    int i = 0;
    while (i < n - 1) {
        int j = i + 1;
        while (j < n) {
            if (A[j] < A[i])
                swap(A[j], A[i]);   // swap is assumed to exchange the two elements
            j++;
        }
        i++;
    }
}
Both are O(n^2); see notes (1) and (2) below.
In both, the outer loop runs from 0 to n (exclusive), and for each iteration the inner loop runs from i+1 to n (exclusive).
If we sum the running time of the inner loops we get:
(n - (0+1)) + (n - (1+1)) + ... + (n - ((n-1)+1))
= (n-1) + (n-2) + ... + 0
= 0 + 1 + ... + (n-1)    (*)
= n(n-1)/2
which is in O(n^2)
The equality (*) is the formula for the sum of an arithmetic progression.
As a side note, both are sorting algorithms: the first repeatedly selects the minimum (a selection-style sort), and the second is, as the function name says, selection sort.
(1) Strictly speaking, the first snippet's outer loop runs k times and its inner loop is bounded by n, so the count is O(k*n); the O(n^2) answer assumes k is on the order of n.
(2) This assumes the return a[k-1]; belongs after the closing brace of the outer loop and that its placement in the snippet is a mistake. If it is not a mistake, the outer loop runs only once and the complexity is O(n).
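Since swap is not a built-in Java operation, here is a minimal runnable version of the second snippet (a sketch; the class name, the explicit three-line swap, and the demo main are my own additions):

public class SelectionSortDemo {
    static void selectionSort(int[] a, int n) {
        int i = 0;
        while (i < n - 1) {
            int j = i + 1;
            while (j < n) {
                if (a[j] < a[i]) {        // swap whenever a smaller element is found
                    int tmp = a[j];
                    a[j] = a[i];
                    a[i] = tmp;
                }
                j++;
            }
            i++;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 7};
        selectionSort(a, a.length);
        System.out.println(java.util.Arrays.toString(a));  // [1, 2, 5, 7, 9]
    }
}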
