Analysis of algorithms and building the time equation - Java

I'm having trouble figuring out the time equation for a couple small snippets of code.
int sum = 0;
for (int k = n; k > 0; k /= 2)
    for (int i = 0; i < k; i++)
        sum++;

int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;

int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < n; j++)
        sum++;
They're all very similar, as you can see. I'm not looking for an exact answer or anything; I'm just not sure where to start with the inner loops. It seems like they would all run n times, but they can't all be the same, right? I'm pretty sure all of the outer loops are log(n) and that sum++ is just a constant (1), but I'm not sure how the inner loops differ and how that changes the equation.

The third code snippet is the easiest to analyze. For each outer loop iteration the inner loop will make 'n' iterations. Since the number of outer loop iterations is O(log(n)) the total number of iterations (and the complexity of the third snippet) is O(n*log(n)).
The first two code snippets have the same complexity; the outer loop just iterates in descending order in the first snippet and in ascending order in the second. In both you visit every power of two smaller than n and repeat the inner loop the corresponding number of times. The total number of iterations is
1 + 2 + 4 + ... + 2^k
where k = log2(n). The sum of these powers of 2 is 2^(k+1) - 1, which is less than 2*2^k = 2n. So, the complexity in both cases is O(n).
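If you want to see that bound concretely, here is a minimal sketch (the class name and the choice n = 1024 are arbitrary, just for illustration) that counts the inner-loop iterations of the first snippet and compares the count with 2n:

public class FirstSnippetCount {
    public static void main(String[] args) {
        int n = 1024;                        // arbitrary power of two for illustration
        int sum = 0;
        for (int k = n; k > 0; k /= 2)       // k = n, n/2, n/4, ..., 1
            for (int i = 0; i < k; i++)
                sum++;

        // sum = n + n/2 + ... + 1 = 2n - 1 for a power of two, so it stays below 2n
        System.out.println("sum = " + sum + ", 2n = " + 2 * n);  // prints 2047 and 2048
    }
}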

int sum = 0;
for (int k = n; k > 0; k /= 2)
    for (int i = 0; i < k; i++)
        sum++;
n + n/2 + n/4 + n/8 + ... + 1 ≈ 2n = Θ(n)
int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;
1 + ... + n/8 + n/4 + n/2 + n ≈ 2n = Θ(n)
(Well, not exactly ending with n, n/2 etc, but within a factor of 2 of those, so doesn't matter for the complexity class.)
int sum = 0;
for (int i = 1; i < n; i *= 2)
    for (int j = 0; j < n; j++)
        sum++;
n + n + ... + n ≈ log(n) × n = Θ(n log n)
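To check the Θ(n log n) count for the third snippet empirically, a small sketch (again, n = 1024 is an arbitrary power of two and the class name is made up) can compare the counter with n * log2(n):

public class ThirdSnippetCount {
    public static void main(String[] args) {
        int n = 1024;                        // arbitrary power of two for illustration
        int sum = 0;
        for (int i = 1; i < n; i *= 2)       // i = 1, 2, 4, ..., 512 -> log2(n) = 10 iterations
            for (int j = 0; j < n; j++)      // n iterations each time
                sum++;

        int log2 = 0;                        // integer log2(n), computed by repeated halving
        for (int v = n; v > 1; v /= 2)
            log2++;

        System.out.println("sum = " + sum + ", n * log2(n) = " + n * log2);  // 10240 and 10240
    }
}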

Related

What is the Time Complexity for the 3 nested loops below?

Here's the code:
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n * n; j++) {
        for (int k = 0; k < j; k++) {
            sum++;
        }
    }
}
I need to evaluate the Time complexity in Big-O notation of the nested loops above.
Is it just O(n) * O(n) * O(n) + O(1) to make O(n^3)? Or is there more to it?
The innermost loop is executed in quadratic time (not constant), hence it should be O(n) * O(n^2) * O(n^2) = O(n^5).
Here are all the costs:
Outermost loop - O(n)
The second loop - O(n^2) for each iteration of the outer loop
The innermost loop - O(n^2) for each iteration of the second loop
for (int i = 0; i < n; i++) -> runs n times.
for (int j = 0; j < n * n; j++) -> runs n² times.
for (int k = 0; k < j; k++) -> runs up to n² times (k runs up to j, which reaches n²)
n * n² * n² = n^5.
sum++ is an operation of constant runtime (1) and can therefore be ignored.
The second loop isn't O(n), it's O(n^2) by itself.
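If the O(n^5) figure seems surprising, a quick counting sketch (n = 6 is an arbitrary small value, and the class name is invented for the example) reproduces the exact count and compares it with the closed form n * n²(n² - 1)/2, which grows as Θ(n^5):

public class TripleLoopCount {
    public static void main(String[] args) {
        int n = 6;                               // small arbitrary value so the count stays tiny
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n * n; j++)
                for (int k = 0; k < j; k++)
                    count++;                     // stands in for sum++

        // Closed form: n * (n^2 * (n^2 - 1) / 2), which is Theta(n^5)
        long nSq = (long) n * n;
        long formula = n * (nSq * (nSq - 1) / 2);
        System.out.println("count = " + count + ", formula = " + formula);  // both print 3780
    }
}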

How would I calculate big-O for this algorithm?

I have got this algorithm
int count = 0;
for (int i = n; i >= 1; i = i / 2) {
    for (int j = 1; j <= i; j++) {
        count++;
    }
}
Am I right in saying that the Big-O for this would be n/2?
TL;DR The time complexity is O(n).
More details
Am I right in saying that the Big-O for this would be n/2?
No, that is not accurate; besides, in big-O notation you drop constant factors, so (1/2)n would simplify to O(n) anyway.
I am not sure where that n/2 comes from, because the outer loop alone
for (int i = n; i >= 1; i = i / 2) {
    ...
}
runs log2(n) times, not n/2 times.
And with both loops together:
int count = 0;
for (int i = n; i >= 1; i = i / 2) {
    for (int j = 1; j <= i; j++) {
        count++;
    }
}
the count would vary between N and 2N.
Let us go through the calculations:
int count = 0;
for (int i = n; i >= 1; i = i / 2) {
    for (int j = 1; j <= i; j++) {
        count++;
    }
}
The inner loop will execute N iterations, then N/2, then N/4, ... down to N/N.
In other words, we have N + N/2 + N/4 + ... + N/N, which can be rewritten as N * (1 + 1/2 + 1/4 + ... + 1/2^L), with L = log2(N).
The series 1/2 + 1/4 + ... is well known to sum to at most 1, so the factor in parentheses is at most 2 and the total simplifies to O(N).
You are correct that the total is linear! This is basically a geometric progression with a ratio of 2, and the number of terms is lg(n), since i is halved on each iteration of the outer loop:
1, 2, 4, ..., n
Using the formula for the sum of a geometric series, the total is 2n - 1, which is O(n).
The reason we have lg(n) terms is that i is divided by 2 on each iteration, so the number of iterations k satisfies n / 2^k = 1, i.e. k = lg(n).
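If it helps, here is a minimal sketch (n = 1000 and the class name are arbitrary choices) that counts the iterations and shows the total landing between n and 2n:

public class HalvingLoopCount {
    public static void main(String[] args) {
        int n = 1000;                        // arbitrary value for illustration
        int count = 0;
        for (int i = n; i >= 1; i = i / 2)   // i = n, n/2, n/4, ..., 1
            for (int j = 1; j <= i; j++)
                count++;

        // count = n + n/2 + n/4 + ... + 1 (with integer division), so n <= count < 2n
        System.out.println("count = " + count + ", n = " + n + ", 2n = " + 2 * n);  // 1994, 1000, 2000
    }
}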

How to calculate big-Oh time complexity in terms of n in each case?

Q2. Consider the following code fragments (a), (b) and (c) where n is the variable specifying data
size and C is a constant. What is the big-Oh time complexity in terms of n in each case? Show all
necessary steps.
(a)
for (int i = 0; i < n; i = i + C)
    for (int j = 0; j < 10; j++)
        Sum[i] += j * Sum[i];
(b)
for (int i = 1; i < n; i = i * C)
    for (int j = 0; j < i; j++)
        Sum[i] += j * Sum[i];
(c)
for (int i = 1; i < n; i = i * 2)
    for (int j = 0; j < n; j = j + 2)
        Sum[i] += j * Sum[i];
1. for (int i = 0; i < n; i = i + C)
2.     for (int j = 0; j < 10; j++)
3.         Sum[i] += j * Sum[i];
Line 1: int i=0, takes constant time of 1 so O(1)
i<n, takes n+1 time so O(n)
i=i+C, takes n time so O(n)
Total time: 1+(n+1)+n= 2n+2
Line 2:
int j=0, takes constant time of 1 so O(1)
j<10, loops through 10 times and takes n time so O(n)
j++, loops through 10 times and takes n time so O(n)
Total time: 1+(10+10)n= 1+20n
Line 3:
Sum[i]=Sum[i]+j*Sum[i];
Addition and multiplication take constant time of 2, plus 1 to store
or assign the value, and it loops through n times.
Total time: 3n
T(n)=(2n+2)+(1+20n)+3n= 25n+3 is O(n) right?
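Whether you count it as 25n + 3 or more coarsely, the result for fragment (a) is linear. As an empirical check, the sketch below (the value C = 4, the n values, and the class name are arbitrary choices for the example) counts how often line 3 runs and shows the count doubling whenever n doubles:

public class FragmentACount {
    public static void main(String[] args) {
        final int C = 4;                         // C is a placeholder constant, picked arbitrarily
        for (int n = 1000; n <= 8000; n *= 2) {
            int count = 0;
            for (int i = 0; i < n; i = i + C)
                for (int j = 0; j < 10; j++)
                    count++;                     // stands in for Sum[i] += j * Sum[i];
            // count = 10 * ceil(n / C), i.e. linear in n
            System.out.println("n = " + n + ", count = " + count);
        }
    }
}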

Big-O notation check understanding

I want to check my understanding of Big-O notation. If I have code:
for (int bound = 1; bound <= n; bound *= 2) {
    for (int i = 0; i < bound; i++) {
        for (int j = 0; j < n; j += 2) {
            .....Code
        }
        for (int j = 1; j < n; j *= 2) {
            ......Code
        }
    }
}
Is the Big-O notation for this N³?
Not quite. The outer loop increment is bound *= 2, so that loop is O(log n). The two inner loops (i and the first j loop) are both O(n), so when nested they're O(n²). (You can ignore the j *= 2 inner loop because it's faster than the j += 2 loop and won't significantly contribute to the program's run time.)
Put this all together and the whole program is O(log n * n²).

What is the worst case analysis of this code fragment?

sum = 0;
for (int i = 0; i < N; i++)
    for (int j = 0; j < i * i; j++)
        sum++;
I'm not entirely sure of my answer; I think the inner loop runs i^2 operations and the outer loop runs N times so the final answer would be O(N^3)?
The number of operations is sum = 0 + 1 + 4 + 9 + ... + (N-1)^2. This is because when i = 0, j will increment itself 0 times; when i = 1, j will increment itself once; when i = 2, j will increment itself 4 times, and so on.
This sum is equal to (N-1)N(2N-1)/6, so the algorithm is indeed O(N^3). You can prove this formula by induction.
That looks right to me (asymptotically).
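For a concrete check of the closed form, here is a small sketch (N = 10 and the class name are arbitrary choices) that counts the sum++ operations and evaluates (N-1)N(2N-1)/6:

public class SumOfSquaresCount {
    public static void main(String[] args) {
        int N = 10;                              // arbitrary small value for illustration
        long count = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < i * i; j++)
                count++;                         // stands in for sum++

        long formula = (long) (N - 1) * N * (2L * N - 1) / 6;  // 0^2 + 1^2 + ... + (N-1)^2
        System.out.println("count = " + count + ", formula = " + formula);  // both print 285
    }
}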
