Worst case running time of the following code - Java

I am attending an online course and am stuck on the following question. What is the order of growth of the worst case running time of the following code fragment as a function of N?
int sum = 0;
for (int i = 1; i <= N*N; i++)
    for (int j = 1; j <= i; j++)
        sum++;
Can anyone explain this?

The execution of this depends on the size of N.
for (int i = 1; i <= N*N; i++)
The value of i varies from 1 to N*N, and the inner loop bound j depends on i.
So for each value of i, the inner loop is executed i times. In total, sum++ is executed 1 + 2 + ... + N^2 = N^2(N^2 + 1)/2 times, so the time complexity of this loop is O(N^4).

Put another way, sum++ runs (N*N + 1)C2 = N^2(N^2 + 1)/2 times.
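As a quick sanity check (my addition, not part of either answer), the iteration count can be compared directly against the closed form N^2(N^2 + 1)/2:
// For N = 10, N*N = 100 and the loop body runs 1 + 2 + ... + 100 = 5050 times.
long N = 10;
long count = 0;
for (long i = 1; i <= N * N; i++)
    for (long j = 1; j <= i; j++)
        count++;
System.out.println(count + " == " + N * N * (N * N + 1) / 2);   // prints 5050 == 5050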

Related

What's the big O Notation for the two for loops

I am having a hard time analyzing this piece of code:
public int example(int[] arr) {
    int n = arr.length, total = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j <= i; j++)
            total += arr[j];
    return total;
}
These two loops don't have curly braces, and I could not work out their time complexity.
I need help counting the number of operations for both of them.
It is O(n^2), since there are two layers of loops. The fact that there aren't curly braces does not matter.
The outer loop is executed O(n) times and the inner loop is executed first once, then twice, up until n times. This is an arithmetic sequence from 1 to n where the common difference is 1. Its sum is therefore
(1 + n) * (n) / 2
= (n^2 + n) / 2
= O(n^2)
That code could be rewritten with curly braces as follows:
for (int i = 0; i < n; i++) {
    for (int j = 0; j <= i; j++) {
        total += arr[j];
    }
}
The first loop is executed O(n) times.
The nested loop is executed O(n) times for each iteration of the first.
So overall it is O(n^2).
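If it helps to see the count rather than just the bound, a small illustrative tally (mine, not part of the answers) compares the number of executions of total += arr[j] with n(n+1)/2:
// For n = 10 the inner statement runs 1 + 2 + ... + 10 = 55 times.
int n = 10;
int count = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j <= i; j++)
        count++;
System.out.println(count + " == " + n * (n + 1) / 2);        // prints 55 == 55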

Time complexity of Nested loop with dependent variable and logarithmic increment

What is the time complexity of this loop: O(N) or O(N log N)?
Also, can you explain how you deduced it?
for (int i = 1; i <= n; i *= 2) {
    for (int j = 0; j < i; j++) {
        // Statement(s) that take(s) constant time
    }
}
I have an explanation, but it feels wrong.
I think you are confused by the statement O(n + log(n)): you assumed the outer loop runs log N times and the inner loop runs N times, so the answer should be O(N log N). That is wrong because the inner loop doesn't run N times; it runs only i times, as explained. When you sum i over all iterations of the outer loop, you get the 2*2^k - 1 expression, which comes out to be of order N, as given in the explanation.
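Spelled out (my own expansion of the sum referred to above): the inner loop does i units of work for i = 1, 2, 4, ..., 2^k, where 2^k <= n < 2^(k+1), so the total inner work is
1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1 = 2*2^k - 1 <= 2n
On top of that the outer loop header is evaluated about log2(n) times, which is where the O(n + log(n)) form comes from, and O(n + log(n)) = O(n).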

Time complexity O(N) of nested loops with if-statement: O(N^4)?

I am trying to figure out a tight bound in terms of big-O for this snippet:
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= i*i; j++) {
        if (j % i == 0) {
            for (int k = 0; k < j; k++)
                sum++;
        }
    }
}
If we start with the innermost loop, in the worst case it runs k = n^2 times, which accounts for O(N^2).
The if-statement is true every time j = m*i, where m is an integer. Since j runs from 1 to i^2, this happens for m = {1, 2, ..., i}, which means it is true i times; i can be at most n, so the worst case is m = {1, 2, ..., n}, i.e. n times.
The second loop has a worst case of O(N^2) when i = n.
The outer loop has a worst case complexity of O(N).
I argue that this will combine in the following way: O(N^2) for the inner loop * O(N^2) for the second loop * O(N) for the outer loop gives a worst case time complexity of O(N^5)
However, the if-statement guarantees that we will only reach the inner loop n times, not n^2. But regardless of this, we still need to go through the outer loops n * n^2 times. Does the if-test influence the worst case time complexity for the snippet?
Edit: Corrected the bound on j to i^2, not i.
You can simplify the analysis by rewriting your loops without an if, as follows:
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= i; j++) {
        for (int k = 0; k < j*i; k++) {
            sum++;
        }
    }
}
This eliminates steps in which the conditional skips over the "payload" of the loop. The complexity of this equivalent system of loops is O(n^4).
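To spell out the count (my own expansion, not part of the answer): the innermost loop contributes j*i increments of sum, so the total number of increments is
sum over i = 1..n of (sum over j = 1..i of j*i)
= sum over i = 1..n of i * i(i+1)/2
which is on the order of n^4.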
Here is a more straightforward way to analyse it.
First fix i as a constant, say i = k.
Then j runs from 1 to k^2, and the if-condition holds when j = k, 2k, 3k, ..., k^2, i.e. when j = c*k with c = 1, ..., k.
For each such j the innermost loop runs j = c*k times; every other value of j only costs the constant-time check.
So the work for a fixed i = k is
k^2 + (k + 2k + ... + k*k) = k^2 + k * k(k+1)/2
= O(k^3)
Now let k range over 1 to n, so the total complexity is O(n^4).
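As an aside (my own check, not from either answer), you can confirm empirically that the rewritten loops from the first answer do exactly the same number of sum++ operations as the original snippet:
// Both versions should end with the same count for any n.
int n = 20;
long sumOriginal = 0, sumRewritten = 0;
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i * i; j++)
        if (j % i == 0)
            for (int k = 0; k < j; k++)
                sumOriginal++;
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 0; k < j * i; k++)
            sumRewritten++;
System.out.println(sumOriginal + " == " + sumRewritten);     // the two counts match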

Showing the order of growth of the function

I have the following code
int sum = 0;
for (int i = 1; i <= n; i = i*2)
{
    for (int j = 1; j <= n; ++j)
    {
        ++sum;
    }
}
And I have the following analysis, which according to some people is wrong even though the answer is correct, but I don't understand why.
So, I do the following,
n = 1; i = 1; j is being executed 1 time
n = 2; i = 2; j is being executed 2* 1 time
n = 3; i = 4; j = 4 * 3 times
n = 4; i = 8; j = 8 * 4 times
n = 5; i = 16; j = 16 * 5 times
......
n = k; i = 2^k; j = n * 2^k times
And since 2^k is log(n)
The order of growth of this function is nlog(n) which is a linearithmic growth rate.
Am I doing something wrong in my analysis? Please let me know if I have any mistakes because it is confusing me a lot. Thank you! I want to apply this analysis to more complicated nested loops, but before I do that I need to know what I'm doing wrong.
Update: I spent a lot of time by myself trying to figure out why my analysis is wrong. Thanks to @YvesDaoust I think I'm starting to understand it a little bit more.
The body of the inner loop is executed n times.
The body of the outer loop is executed for all powers of 2 from 1 to n, and there are about log2(n) of them, hence the global complexity
O(n log n)
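To make the count concrete (this check is mine, not part of the answer), take n to be a power of two:
// Illustrative tally: ++sum executes n * (log2(n) + 1) times.
int n = 1024;                                                // 2^10
int count = 0;
for (int i = 1; i <= n; i *= 2)
    for (int j = 1; j <= n; ++j)
        count++;
int log2n = Integer.numberOfTrailingZeros(n);                // 10 for n = 1024
System.out.println(count + " == " + n * (log2n + 1));        // prints 11264 == 11264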

What are the Big O Notation for these for loops?

I am finishing up an assignment for my class, this particular section involves giving an "analysis" of the running time for several for loops. The instructor specified that he wants a Big Oh Notation answer.
I am building on the foundation that total running time is based on:
1) The cost of executing each statement
2) The frequency of execution of each statement
3) The idea of working from the inside out, specifically starting from the inner loops
I know that:
total = 0;
for (int i = 0; i < n; i++) {
    total++;
}
Answer: O(N), it's running in linear time.
for (int i = 0; i < n; ++i)
    for (int j = 0; j < n; ++j)
        total++;
Answer: O(N^2); we only care about how the running time grows as N grows.
I am confused about
for (int i = 0; i < n; ++i)
    for (j = 0; j < n * n; ++j)
        ++total;
and
for (i = 0; i < n; ++i)
    for (j = 0; j < i; ++j)
        ++total;
And last but not least, am I right in assuming from my textbook that all triple nested loops run in N^3 time?
You can analyze your algorithms using Sigma notation to count/expand the number of iterations run by the inner for loop. Call the running times T_a and T_b, where T_a covers
for (i = 0; i < n; ++i)
    for (j = 0; j < n * n; ++j)
        ++total;
and T_b:
for (i = 0; i < n; ++i)
    for (j = 0; j < i; ++j)
        ++total;
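The two sums work out as follows (the original answer presented them as images; this is a reconstruction):
T_a = sum over i = 0..n-1 of (sum over j = 0..n^2-1 of 1) = sum over i = 0..n-1 of n^2 = n^3, i.e. O(n^3)
T_b = sum over i = 0..n-1 of (sum over j = 0..i-1 of 1) = sum over i = 0..n-1 of i = n(n-1)/2, i.e. O(n^2)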
Finally, a note on your question:
"And last but not least, I am assuming from my textbook that all
Triple nested loops are running at N^3 time?"
This is not true: it depends on how the loop variable is incremented as well as how it is bounded in each loop's header. Compare e.g. with the inner loop in T_a above (bounded by n^2, but simply incremented by 1 in each iteration), or the algorithm analyzed in this answer, or, for a slightly trickier case, the single loop analyzed in this answer.
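For instance (a hypothetical example of mine, not taken from the linked answers), the following triple nested loop is not O(n^3), because the innermost counter doubles on each step:
// Runs roughly n * n * log2(n) times in total, so it is O(n^2 log n), not O(n^3).
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        for (int k = 1; k <= n; k *= 2)
            total++;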
