What is the time complexity of this loop: O(N) or O(N log N)? Also, can you explain how you deduced it?
for (int i = 1; i <= n; i *= 2) {
    for (int j = 0; j < i; j++) {
        // Statement(s) that take(s) constant time
    }
}
I have an explanation, but it feels wrong.
I think you are confused because of the statement O(n + log(n)) in the explanation you read: you assumed the outer loop runs log N times and the inner loop runs N times, so the answer should be O(N log N). That is wrong because the inner loop does not run N times; it runs only i times, as explained. When you sum those i values over all iterations of the outer loop, you get the 2*2^k - 1 expression, a geometric series, which comes out to be of order N, as given in the explanation.
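If it helps to see that sum concretely, here is a small sketch I wrote (not part of the original explanation) that just counts the inner-loop iterations and compares them with 2n:

public class DoublingLoopCount {
    public static void main(String[] args) {
        for (int n : new int[]{10, 1000, 1000000}) {
            long iterations = 0;
            for (long i = 1; i <= n; i *= 2) {        // runs about log2(n) times
                for (long j = 0; j < i; j++) {
                    iterations++;                      // stands in for the constant-time body
                }
            }
            // 1 + 2 + 4 + ... + 2^k = 2*2^k - 1, which never exceeds 2n
            System.out.println("n=" + n + "  iterations=" + iterations + "  2n=" + (2L * n));
        }
    }
}

The count stays within a factor of 2 of n, which is why the loop is O(N) rather than O(N log N).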
People of the internet,
I am studying algorithms and their complexity, and I wrote a "naive" method for counting the number of inversions in an array. At first it seemed easy, but then I started wondering whether that j = i+1 changes the second loop's complexity from O(n) in the worst case to something lower.
Here is my code, written in Java:
public static void naiveInversionCount(int[] T){
    int count = 0;
    for(int i = 0; i < T.length - 1; i++){      // O(n)
        for(int j = i + 1; j < T.length; j++){  // O(n) ???
            if(T[i] > T[j]) count++;            // O(1)
        }
    }
    System.out.println("Naive method returns : " + count);
}
Thank you very much
The outer loop runs exactly n - 1 times (i goes from 0 to n - 2).
The inner loop runs n - 1, n - 2, …, 1 times across those iterations, that is, about n/2 times on average.
And the comparison runs exactly once per inner iteration (count++ at most once).
Thus the body executes (n - 1) + (n - 2) + … + 1 = n(n - 1)/2 times, which is in O(n²). Starting the inner loop at j = i + 1 roughly halves the work, but it does not change the asymptotic class.
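If you want to see the count directly, here is a quick sketch of mine (not from the original post) that tallies how often the comparison runs and checks it against n(n-1)/2:

public class InversionComparisonCount {
    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000}) {
            int[] T = new int[n];                 // contents don't matter for counting comparisons
            long comparisons = 0;
            for (int i = 0; i < T.length - 1; i++) {
                for (int j = i + 1; j < T.length; j++) {
                    comparisons++;                // one T[i] > T[j] comparison
                }
            }
            System.out.println("n=" + n + "  comparisons=" + comparisons
                    + "  n(n-1)/2=" + ((long) n * (n - 1) / 2));
        }
    }
}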
Here is the code:
for (int i = 0; i < 60; i++) {
    for (int j = i-1; j < N; j++) {
        for (int k = 0; k+2 < N; k++) {
            System.out.println(i*j);
            System.out.println(i);
            i=i+1;
        }
    }
}
I believe it is O(N^2) since there are two N's in the for loops, but I'm not too sure.
Any help is appreciated, thanks!
It's rather because the i-loop has a fixed limit, not because there are two N's. Saying it's O(N^2) is not wrong IMO, but if we are strict, the complexity is O(N^2 * log(N)).
We can prove it more formally:
First, let's get rid of the i-loop for the analysis. The value of i is incremented within the k-loop, so when N is large, i exceeds 60 already during the first iteration and the i-loop executes just one iteration. For simplicity, let's say i is just a variable initialized with 0. Now the code is:
int i = 0;
for (int j = -1; j < N; j++) {        // j from -1 to N - 1
    for (int k = 0; k+2 < N; k++) {   // k from 0 to N - 1 - 2
        System.out.println(i*j);
        System.out.println(i);
        i=i+1;
    }
}
If we were very strict, we would also have to look at the cases where the i-loop executes more than once, but that only happens when N is small, which is not what we are interested in for big O. So let's just forget about the i-loop for simplicity.
We look at the loops first, treating the inner statements as constant-time, and then look at the statements separately.
We can see that the complexity of the loops is O(N^2).
Now the statements: the interesting ones are the printing statements. Printing a number is done digit by digit (simply speaking), so it's not constant. The number of digits of a number grows logarithmically with the number. The last statement (i = i + 1) is just constant.
Now we need to look at the numbers (roughly). The value of i grows up to N * N. The value of j grows up to N. So the first print prints a number that grows up to N * N * N. The second print prints a number that grows up to N * N. So the inner body has the complexity O(log(N^3) + log(N^2)), which is just O(3*log(N) + 2*log(N)) = O(5*log(N)). Constant factors are dropped in big O, so the complexity of the body is O(log(N)).
Combining the complexity of the loops and the complexity of the executed body yields the overall complexity: O(N^2 * log(N)).
Ask your teacher if the printing statements were supposed to be considered.
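If you want to convince yourself of the N^2 iteration count and of how large i actually gets, here is a rough sketch of mine (with the println calls replaced by a counter, and the 60-limit loop dropped as discussed above):

public class GrowthCheck {
    public static void main(String[] args) {
        for (int N : new int[]{100, 1000, 5000}) {
            long i = 0;
            long iterations = 0;
            for (int j = -1; j < N; j++) {
                for (int k = 0; k + 2 < N; k++) {
                    i = i + 1;        // i ends up roughly N * N
                    iterations++;     // how often the two println calls would have run
                }
            }
            System.out.println("N=" + N + "  iterations=" + iterations
                    + "  final i=" + i + "  N*N=" + ((long) N * N));
        }
    }
}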
The answer is O(N^2 log N).
First of all, the outer loop can be ignored, since it has a constant number of iterations and hence contributes only a constant factor. Also, the i = i + 1 statement has no effect on the iteration count, since it only manipulates the outer loop's variable.
The println(i * j) statement has a time complexity of O(bitlength(i * j)). Since i grows up to about N^2 and j up to N, this is bounded by O(bitlength(N^3)) = O(log N^3) = O(log N) (and similarly for the other println statement). Now, how often are these println statements executed?
The two inner loops are nested and both run from a constant up to something that is linear in N, so together they iterate O(N^2) times. Hence the total time complexity is O(N^2 log N).
I need to know the worst-case complexity of the following code and would be grateful if you could provide the steps for how to solve it. I was thinking the answer is n^3 log n, but I'm not sure.
int complexity(int n)
    int i,j,c
    for(i=1; i < n; i=i*3)
        for(j=n; j < n*n; j++)
            c++
    for(i=c; i > 0; i--)
        if(random(0...999) > 0)
            for(j=0; j < 3*n; j++)
                c++
        else
            for(j=2*n*n; j > 0; j--)
                c++
    return c
Let's look at the first nested loop:
for(i=1; i < n; i=i*3)
    for(j=n; j < n*n; j++)
        c++
The outer loop runs log(n) times (log base 3, but changing the base only multiplies by a constant, which does not affect the asymptotic complexity) and the inner loop runs about n^2 times, thus after this loop c = n^2 * log(n).
For the second loop:
for(i=c; i > 0; i--)
    if(random(0...999) > 0)
        for(j=0; j < 3*n; j++)
            c++
    else
        for(j=2*n*n; j > 0; j--)
            c++
In the worst case the else case always happens, so we can modify it to
for(i=c; i > 0; i--)
    for(j=2*n*n; j > 0; j--)
        c++
The outer loop runs c times, which is O(n^2 * log(n)), and on each of those iterations the inner loop increments c by 2*n^2, so c grows by about 2 * n^2 * n^2 * log(n) in total. Adding the initial value, c (and thus the overall complexity) is in O(2*n^4*log(n) + n^2 * log(n)) = O(n^4 * log(n)).
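Written out as exact counts (my own summary of the argument above, assuming c starts at 0), the first nested loop produces c_1 and the worst-case second loop multiplies it by roughly 2n^2:

c_1 = \lceil \log_3 n \rceil \, (n^2 - n) = \Theta(n^2 \log n)

c_{\text{final}} = c_1 + 2n^2 \cdot c_1 = c_1 (1 + 2n^2) = \Theta(n^4 \log n)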
I hope I'm not just doing your homework for you. In future I recommend showing your thought processes thus far, rather than just your final answer.
Let's look at this code one section at a time.
int complexity(int n)
    int i,j,c
    for(i=1; i < n; i=i*3)
        for(j=n; j < n*n; j++)
            c++
So in the outer loop, i goes from 1 to n, but each time i is tripled. This means it will finish after log_3(n) iterations. Changing the base of a log is just a constant factor, which doesn't matter in computational complexity, so we can just say log(n).
The inner loop has j going from n to n^2. O(n^2 - n) = O(n^2) since lower powers are dwarfed by higher ones (i.e. quadratic dwarfs linear).
So putting this all together, the first section has computational complexity O(n^2 logn). Now let's look at the second section.
for(i=c; i > 0; i--)
    if(random(0...999) > 0)
        for(j=0; j < 3*n; j++)
            c++
    else
        for(j=2*n*n; j > 0; j--)
            c++
return c
Now because the outer loop's initialization depends on c, we need to know what c is. c was incremented every time in the first two nested loops, so c's value is proportional to n^2 logn. Thus the outer loop will run O(n^2 logn) times.
For the inner loops, remember we are always considering the worst-case scenario. So the random number generator is a bit misleading: compute the computational complexity of both j loops, and assume the worst case always happens.
The first j loop goes from 0 to 3n, which is simply O(n). The second j loop goes from 2n^2 to 0 which is simply O(n^2). Since the second case is worse, we assume it always happens (even though this is highly improbable). So the inner loop is O(n^2).
Multiplying the two nested loops together, we get O(n^2 logn x n^2) = O(n^4 logn).
Finally, remember we have to see which of the two sections dominated. The first section was O(n^2 logn) and the second was O(n^4 logn), so we want O(n^2 logn + n^4 logn). The latter term obviously dominates, so the final answer is O(n^4 logn).
Hope this helps. Please ask if some part was unclear.
p.s. The current top answer states that c will be ~n^3/3. Because i is tripling every time, this is incorrect; it will be n^2 log_3(n). Otherwise their work is correct.
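As a rough sanity check, here is a small Java sketch I wrote (not from either answer) that forces the worst case, i.e. always takes the else branch, and compares the final value of c with 2*n^4*log_3(n):

public class WorstCaseCount {
    public static void main(String[] args) {
        for (int n : new int[]{10, 20, 40}) {
            long c = 0;
            for (long i = 1; i < n; i *= 3)             // first nested loop
                for (long j = n; j < (long) n * n; j++)
                    c++;
            long c1 = c;                                // about n^2 * log_3(n)
            for (long i = c1; i > 0; i--)               // second loop, worst case:
                for (long j = 2L * n * n; j > 0; j--)   // always take the else branch
                    c++;
            double estimate = 2.0 * Math.pow(n, 4) * (Math.log(n) / Math.log(3));
            System.out.println("n=" + n + "  c1=" + c1 + "  final c=" + c
                    + "  2*n^4*log3(n)=" + Math.round(estimate));
        }
    }
}

The final c and the estimate agree up to a modest constant factor, which is what O(n^4 * log(n)) predicts.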
The way to solve this is to work out a formula that gives c as a function of n. (We can use the number of c++ operations as a proxy for the overall number of operations.)
In this case, the random function means that you can't get an exact formula. But you can work out two formulae, one for the case where random always returns zero, and the other for the case where random always returns > zero.
How do you work out the formula / formulae? Maths!
Hint: the worst-case will be one of the two cases that I mentioned above. (Which one? The one whose complexity is worst, of course!)
I am trying to figure out a tight bound in terms of big-O for this snippet:
for(int i = 1; i <= n; i++) {
    for(int j = 1; j <= i*i; j++) {
        if (j % i == 0) {
            for(int k = 0; k < j; k++)
                sum++;
        }
    }
}
If we start with the innermost loop, it will in the worst case run k = n^2 times, which accounts for O(N^2).
The if-statement will be true every time j = m*i, where m is an integer. Since j runs from 1 to i^2, this will happen for m = {1, 2, ..., i}, which means it will be true i times, and i can be at most n, so the worst case will be m = {1, 2, ..., n} = n times.
The second loop should have a worst case of O(N^2) when i = n.
The outer loop has a worst-case complexity of O(N).
I argue that this will combine in the following way: O(N^2) for the inner loop * O(N^2) for the second loop * O(N) for the outer loop gives a worst case time complexity of O(N^5)
However, the if-statement guarantees that we will only reach the inner loop n times, not n^2. But regardless of this, we still need to go through the outer loops n * n^2 times. Does the if-test influence the worst case time complexity for the snippet?
Edit: corrected the upper bound for j to i^2, not i.
You can simplify the analysis by rewriting your loops without an if, as follows:
for(int i = 1; i <= n; i++) {
    for(int j = 1; j <= i; j++) {
        for(int k = 0; k < j*i; k++) {
            sum++;
        }
    }
}
This eliminates steps in which the conditional skips over the "payload" of the loop. The complexity of this equivalent system of loops is O(n^4).
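To see why, you can add up the iterations of the rewritten loops (my own worked sum, not part of the original answer):

\sum_{i=1}^{n} \sum_{j=1}^{i} j \cdot i = \sum_{i=1}^{n} i \cdot \frac{i(i+1)}{2} = \frac{1}{2} \sum_{i=1}^{n} (i^3 + i^2) = \Theta(n^4)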
I'll analyse your question in a more straightforward way.
We first fix i as a constant, say i = k.
Then j runs from 1 to k^2, and the if-condition is true exactly when j = c*k for c = 1, 2, ..., k.
For such a j, the innermost loop is executed j = c*k times,
so the total work for a fixed i = k is
k^2 (for the j-loop itself) + k*(1 + 2 + ... + k)
= O(k^3)
Now let k range from 1 to n: the total is on the order of 1^3 + 2^3 + ... + n^3, so the overall complexity is O(n^4).
Example1:
This one I would say is O(log(n)). My reasoning: if we choose a test n, say n = 10, then 'i' would run 0, 2, 4, 6, 8, 10, so 'i' behaves linearly, not growing but just adding 2 each time it iterates. The method is also n-dependent, meaning that as n grows so does the method, so the first loop is O(n). Then for the second loop, if we choose n = 10, after each iteration 'j' would run 2, 4, 8, 16, 32, 64, which can be thought of as 2^n, a logarithmic function, so this loop is logarithmic. Computing this: O(n)*O(log(n)) = O(log(n)).
for (i = 0; i < n; i = i + 2) {
    for (j = 1; j <= n; j = j*2) {
        System.out.println("Hi");
    }
}
Example2:
For this one, if we choose n = 10, then 'i' runs: 2, 4, 8, 16, 32, which again can be rewritten as 2^n; this is logarithmic, O(log(n)).
for (i = 1; i < n; i = i + i) {
    System.out.println("Soda");
}
Example3:
For this one, the first loop is n-dependent: the method changes as n grows. Nothing fancy with 'i', it simply adds 1, so this is an n-dependent loop of O(n) for any inputted n.
If the number is even it runs the first nested loop; if it is positive it runs the second nested loop. The first inner loop is the same complexity, O(n), and the last inner loop is (I think, despite the n*n) O(n). Computing this we get O(n)*O(n) = O(n^2) if the if-statement is true, and O(n)*O(n) if the if-statement is false, so in the worst case it is O(n^2) because we multiply the two O(n)'s.
for (i = 0; i < n; i++) {
    if (i % 2 == 0) {
        for (j = 0; j < n; j++) {
            System.out.println("Rawr");
        }
    } else {
        for (j = 0; j < n*n; j++) {
            System.out.println("Computer");
        }
    }
}
Have I made errors? Please explain.
Thank you.
Unfortunately, not much can be told from the information you are giving. Are you looking for time complexity? Space complexity? Runtime complexity?
In addition, the complexity of a method is really only useful if the method is called more than once, which doesn't happen as you present it. If you don't know how the method is called, you will not be able to determine the complexity (either one of them) accurately.