Checking this Big O Analysis - java

I'm trying to learn Big O analysis, and I was wondering if someone could let me know if I'm doing it right with these two examples (and if I'm not, where did I go wrong?). I got the first one to be O(N^2) and the second to be O(N). My breakdown of how I got them is in the code below.
First example
public void sort(Integer[] v) {
    // O(1)
    if (v.length == 0)
        return;
    // O(N) * O(N) * O(1)
    for (int i = 0; i < v.length; ++i) {
        for (int j = i + 1; j < v.length; ++j) {
            if (v[j].compareTo(v[i]) < 0) {
                Integer temp = v[i];
                v[i] = v[j];
                v[j] = temp;
            }
        }
    }
}
Second example
public void sort(Integer[] v) {
    TreeSet<Integer> t = new TreeSet<>();
    // O(N)
    for (int i = 0; i < v.length; ++i) {
        t.add(v[i]);
    }
    int i = 0;
    // O(N)
    for (Integer value : t) {
        v[i++] = value;
    }
}
Thanks for the help!

You are right, the first is O(N^2), because you have one loop nested inside another and the length of each depends on the input v. The amount of work grows quadratically: roughly 4 units when v has length 2, roughly 64 when it has length 8 (the exact comparison count is N(N-1)/2, but the scaling is quadratic).
The second is O(N) because you iterate over your input once, and your loops don't contain any nested iteration or expensive operations. Edit: the second is actually O(N log N), see the comments on the original post.

Your first example is O(N^2); you are right.
Your second example is not O(N), so there you are not right.
It is O(N) * O(log N) + O(N):
O(N) for the first loop
O(log N) for each insert into the set
O(N) for the second loop
Finally you have O(N * log N + N); take the higher term, so the answer is O(N * log N).
Edited
By the way, Big O notation does not depend on the programming language. That might help.
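One caveat worth flagging (my addition, not from the original answers): a TreeSet discards duplicate values, so the second example silently drops repeated elements and is not a correct sort for arrays with duplicates. A PriorityQueue keeps duplicates and gives the same O(N log N) bound; a minimal sketch (class name is mine):

```java
import java.util.PriorityQueue;

public class HeapSortExample {
    // O(N log N): N inserts and N removals, each O(log N) on a binary heap.
    public static void sort(Integer[] v) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (Integer x : v) {     // N adds, O(log N) each
            heap.add(x);
        }
        for (int i = 0; i < v.length; ++i) { // N polls, O(log N) each
            v[i] = heap.poll();   // poll returns the smallest remaining element
        }
    }

    public static void main(String[] args) {
        Integer[] v = {3, 1, 3, 2};
        sort(v);
        System.out.println(java.util.Arrays.toString(v)); // [1, 2, 3, 3]
    }
}
```

Each add and poll on a binary heap is O(log N), so N of each gives O(N log N), matching the analysis above.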


Complexity in Java

I have a general question about the complexity of my code. Since I am doing some problems from Codeforces, complexity matters to me for the first time.
Problemset: https://codeforces.com/problemset/problem/1554/A
Code:
import java.util.Arrays;
import java.util.Scanner;

public class main {
    public static void main(String[] args) {
        doCalculation();
    }

    public static void doCalculation() {
        Scanner sc = new Scanner(System.in);
        long n = sc.nextLong();
        for (int i = 0; i < n; i++) {
            int numbersInRow = sc.nextInt();
            long[] numbersToRead = new long[numbersInRow];
            for (int j = 0; j < numbersInRow; j++) {
                numbersToRead[j] = sc.nextLong();
            }
            long maxMultiplication = 0;
            for (int k = 0; k < numbersInRow; k++) {
                for (int m = k + 1; m < numbersInRow; m++) {
                    // From now on: from 1 to 2; from 1 to 3; from 2 to 3
                    long[] subarray = subArray(numbersToRead, k, m);
                    long min = Arrays.stream(subarray).min().getAsLong();
                    long max = Arrays.stream(subarray).max().getAsLong();
                    long multiplicationEveryArray = min * max;
                    if (multiplicationEveryArray > maxMultiplication) {
                        maxMultiplication = multiplicationEveryArray;
                    }
                }
            }
            System.out.println(maxMultiplication);
        }
    }

    public static long[] subArray(long[] array, int beg, int end) {
        return Arrays.copyOfRange(array, beg, end + 1);
    }
}
I think copyOfRange has complexity O(n), and subArray has the same complexity.
What are some ways to improve my complexity without taking a different approach?
Usually, these problems are set up in a way that the brute-force method would take a million years to finish.
Looking at the examples, my first intuition was that you only need to look at neighboring numbers, making this an O(n) problem.
If you have f(1,2) and f(2,3), then f(1,3) will never exceed the larger of f(1,2) and f(2,3). The product of the first and last elements cannot be the max: for it to be considered, at least one of them would have to be smaller than or equal to the middle element, making that product smaller than or equal to the product of one of the "just neighbors" pairs.
You can continue by induction to reach the conclusion that f(1,n+1) will be max(f(1,2), f(2,3), ..., f(n,n+1)).
In order to get rid of the O(n^2) loop, try

for (int k = 0; k < numbersInRow - 1; k++) {
    final long product = numbersToRead[k] * numbersToRead[k + 1];
    if (product > maxMultiplication) {
        maxMultiplication = product;
    }
}
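Putting that idea into a self-contained method (the name `maxAdjacentProduct` is mine; I use `long` to avoid overflow on products of large values):

```java
public class AdjacentMax {
    // O(n): by the induction argument above, only adjacent pairs need to be
    // considered, and min * max of a two-element subarray is just their product.
    public static long maxAdjacentProduct(long[] a) {
        long best = Long.MIN_VALUE;
        for (int k = 0; k + 1 < a.length; k++) {
            best = Math.max(best, a[k] * a[k + 1]);
        }
        return best;
    }

    public static void main(String[] args) {
        // Adjacent products: 0, 0, 2, 4, 36, 72 -> 72
        System.out.println(maxAdjacentProduct(new long[]{5, 0, 2, 1, 4, 9, 8})); // 72
    }
}
```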
There are multiple problems with this code:
I think you are using too much space for this question. Why are you creating a new array every time before finding the max and min?
And that, too, inside a loop, so it will cause memory pressure later; the space used approaches O(n^2) overall.
long[] subarray = subArray(numbersToRead, k, m);
long min = Arrays.stream(subarray).min().getAsLong();
long max = Arrays.stream(subarray).max().getAsLong();
First of all, you are doing all this in 3n time (O(n) for copying, O(n) for finding the minimum, O(n) for finding the maximum).
If possible, find the maximum and minimum in a single O(n) loop, and try to avoid this function of yours:
public static long[] subArray(long[] array, int beg, int end) {
    return Arrays.copyOfRange(array, beg, end + 1);
}
Instead of creating a new array every time, try iterating over the same array.
Avoid streams unless you have some kind of function that can iterate over the original array without creating a new one. There is in fact an overload that limits the search for the minimum and maximum to a range of the array, something like:
Arrays.stream(numbersToRead, k, m + 1)
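For reference, `java.util.Arrays.stream` does have an overload taking a start index (inclusive) and an end index (exclusive), so the min and max of a range can be computed without copying. Note the slice is still scanned in O(m - k) time, so this saves allocation, not asymptotic work:

```java
import java.util.Arrays;

public class RangeMinMax {
    public static void main(String[] args) {
        long[] numbersToRead = {5, 1, 4, 2};
        int k = 1, m = 2; // inclusive bounds, as in the original loops

        // Streams the slice [k, m] of the original array without allocating a copy.
        long min = Arrays.stream(numbersToRead, k, m + 1).min().getAsLong();
        long max = Arrays.stream(numbersToRead, k, m + 1).max().getAsLong();

        System.out.println(min + " " + max); // 1 4
    }
}
```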

Asymptotic Complexity of Small Functions

I am currently learning Java and basic algorithms by myself. I am struggling with the runtime complexity of loops and some recursive functions in terms of n (e.g. O(n^2)). For example, for the following program:
public static void p(int n) {
    if (n == 0) return; // n times??
    int s = 0; // n times?
    for (int k = 1; k <= Math.min(100, n); k = k + 1) {
        // 'k = 1' is n times; 'k <= Math.min(100, n)' I am not sure on this one
        // 'k = k + 1' is n*n or n*100?
        s = s + 10; // n*n or n*100 times??
    }
    p(n - 1); // n-1 times?
    // and also, what about the overall asymptotic complexity?
    // is it just simply adding together all the above counts?
}
I commented my interpretations of the runtime of each step in the program, but I am not sure whether I am on the right track. Could someone help me please?
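One way to sanity-check this kind of intuition empirically (a sketch I am adding, not part of the question): instrument the function with a counter. Each call does at most min(100, n) loop iterations and then recurses on n - 1, so the total work is bounded by roughly 100 * n, i.e. O(n), not O(n^2):

```java
public class CountOps {
    static long iterations = 0;

    public static void p(int n) {
        if (n == 0) return;
        int s = 0;
        for (int k = 1; k <= Math.min(100, n); k = k + 1) {
            iterations++; // count every inner-loop step
            s = s + 10;
        }
        p(n - 1);
    }

    public static void main(String[] args) {
        p(1000);
        // Sum of min(100, n) for n = 1..1000: (1 + ... + 99) + 901 * 100
        // = 4950 + 90100 = 95050, which grows linearly in n.
        System.out.println(iterations); // 95050
    }
}
```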

The time complexity of an array function

I wrote a function to find the position where the target value should be inserted in the given array. We assume the array has distinct values and is sorted in ascending order. My solution must run in O(log N) time.
public static int FindPosition(int[] A, int target) {
    int a = A.length / 2;
    System.out.println(a);
    int count = a;
    for (int i = a; i < A.length && A[i] < target; i++) {
        count++;
    }
    for (int i = a; i > A.length && A[i] > target; i--) {
        count++;
    }
    return count;
}
Does this code have complexity of O(log N)?
Short answer
No.
Longer answer
With increments of 1 in your indices, you cannot expect a better bound than O(n).
If your algorithm works at all (I don't think it does), it looks like it would need O(n) steps.
Also, you say you assume that the array is sorted, but you sort it anyway. So your code is O(n log n).
What's more, trying to sort an already sorted array is the worst case for some sorting algorithms: it might even be O(n^2).
You're looking for a binary search.
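For completeness, here is what that binary search could look like (a sketch, not the asker's code repaired; the method name mirrors the question's FindPosition):

```java
public class BinarySearchPosition {
    // Returns the index at which target should be inserted into the sorted,
    // duplicate-free array A. O(log N): the search range halves each step.
    public static int findPosition(int[] A, int target) {
        int lo = 0, hi = A.length; // invariant: the answer lies in [lo, hi]
        while (lo < hi) {
            int mid = (lo + hi) >>> 1; // unsigned shift avoids int overflow
            if (A[mid] < target) {
                lo = mid + 1;
            } else {
                hi = mid;
            }
        }
        return lo;
    }

    public static void main(String[] args) {
        int[] A = {1, 3, 5, 7};
        System.out.println(findPosition(A, 4)); // 2
    }
}
```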
No, it isn't O(n log n).
public static int FindPosition(int[] A, int target) {
    /*
     * Time taken to sort these elements depends on how the sort function is
     * implemented in Java. Arrays.sort has an average time complexity of
     * O(n log n) and a worst case of O(n^2).
     */
    Arrays.sort(A);
    int a = A.length / 2;
    int count = a;
    /*
     * Time taken to run this loop = O(length of A), or O(N)
     * where N = array length.
     */
    for (int i = a; i < A.length && A[i] < target; i++) {
        count++;
    }
    /*
     * Time taken to run this loop = O(length of A), or O(N)
     * where N = array length.
     */
    for (int i = a; i > A.length && A[i] > target; i--) {
        count++;
    }
    return count;
}
Now the overall time complexity is dominated by the longest of the above three, since assignments and the rest are done in constant time.
That makes your complexity O(n^2) in the worst case.

O(n log n) Time Complexity Algorithm?

I created this algorithm to find the best trade from an array of prices. It goes through the array and finds the best day to buy, the best day to sell, and the resulting profit from the stock. I need to explain the algorithm used and how the time complexity is O(n log n), but I have a lot of trouble determining that. I was hoping someone could explain O(n log n) and relate it to my method.
Here's my method:
public static Trade bestTrade(int[] a)
{
    int lowest = a[0];
    int lowestIndex = 0;
    int highest = a[a.length - 1];
    int highestIndex = a.length - 1;
    int profit = 0;
    for (int i = 1; i < a.length; i++)
    {
        if (a[i] < lowest && i < highestIndex)
        {
            lowest = a[i];
            lowestIndex = i;
        }
    }
    for (int i = a.length - 2; i >= 0; i--)
    {
        if (a[i] > highest && i > lowestIndex)
        {
            highest = a[i];
            highestIndex = i;
        }
    }
    for (int i = 1; i < a.length; i++)
    {
        if (a[i] < lowest && i < highestIndex)
        {
            lowest = a[i];
            lowestIndex = i;
        }
    }
    if (highestIndex > lowestIndex)
    {
        profit = highest - lowest;
        return new Trade(lowestIndex, highestIndex, profit);
    }
    return new Trade(lowestIndex, highestIndex, profit);
}
This function is O(n), which is better than O(n log n).
In general you just look at the loops: since there are no nested loops, and every loop you have goes through all elements of a, the function is O(n).
The complexity is O(n), where n is the length of array a.
You loop 3 times over a, so the running time is roughly 3n, which is of the order of n: O(n).
Try finding the answer to this by yourself; it will help a lot in the future. Also, this looks like O(N); I am not sure why you are convinced that it is O(N log N).
This link might be useful,
http://pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html
O(n)
The running time is directly proportional to a.length. Each for loop runs through every day of data once. If the amount of work grew faster than the input size (nested loops, for instance), it could be O(n log n) or O(n^2). But in this case, it's pretty clearly just big O of n.
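As a footnote (my addition, not from the answers above): the same O(n) bound can be reached in a single pass by tracking the lowest price seen so far, which is the standard one-loop formulation of this problem (class name is mine):

```java
public class BestTradeOnePass {
    // One pass, O(n): for each day, the best profit from selling today is
    // today's price minus the lowest price seen on any earlier day.
    public static int maxProfit(int[] a) {
        int lowest = a[0];
        int profit = 0;
        for (int i = 1; i < a.length; i++) {
            profit = Math.max(profit, a[i] - lowest);
            lowest = Math.min(lowest, a[i]);
        }
        return profit;
    }

    public static void main(String[] args) {
        // Buy at 1, sell at 6.
        System.out.println(maxProfit(new int[]{7, 1, 5, 3, 6, 4})); // 5
    }
}
```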

Why is sorting/initializing an array not counted in Big O?

I am trying to find the most efficient answer (without using a HashMap) to the problem: Find the most frequent integer in an array.
I got answers like:
public int findPopular(int[] a) {
    if (a == null || a.length == 0)
        return 0;
    Arrays.sort(a);
    int previous = a[0];
    int popular = a[0];
    int count = 1;
    int maxCount = 1;
    for (int i = 1; i < a.length; i++) {
        if (a[i] == previous)
            count++;
        else {
            if (count > maxCount) {
                popular = a[i - 1];
                maxCount = count;
            }
            previous = a[i];
            count = 1;
        }
    }
    return count > maxCount ? a[a.length - 1] : popular;
}
and
public class Mode {
    public static int mode(final int[] n) {
        int maxKey = 0;
        int maxCounts = 0;
        int[] counts = new int[n.length];
        for (int i = 0; i < n.length; i++) {
            counts[n[i]]++;
            if (maxCounts < counts[n[i]]) {
                maxCounts = counts[n[i]];
                maxKey = n[i];
            }
        }
        return maxKey;
    }

    public static void main(String[] args) {
        int[] n = new int[] { 3, 7, 4, 1, 3, 8, 9, 3, 7, 1 };
        System.out.println(mode(n));
    }
}
The first code snippet claims to be O(n log n). However, the Arrays.sort() function alone is O(n log n) [3]. If you add the for loop, wouldn't the findPopular() function be O(n^2 * log n)? Which would simplify to O(n^2)?
The second code [2] snippet claims to be O(n). However, why do we not consider the initialization of the arrays into our calculation? The initialization of the array would take O(n) time [4], and the for loop would take O(n). So wouldn't the mode() function be O(n^2)?
If I am correct, that would mean I have yet to see an answer that is more efficient than O(n^2).
As always, thank you for the help!
Sources:
Find the most popular element in int[] array
Write a mode method in Java to find the most frequently occurring element in an array
The Running Time For Arrays.Sort Method in Java
Java: what's the big-O time of declaring an array of size n?
Edit: Well, I feel like an idiot. I'll leave this here in case someone made the same mistake I did.
When you perform two tasks one after the other, you add complexities:

Arrays.sort(a);                      // O(n log n)
for (int i = 0; i < n; i++) {        // O(n)
    System.out.println(a[i]);
}
// O(n log n + n) = O(n (log n + 1)) = O(n log n)

Only when you repeat an algorithm do you multiply:

for (int i = 0; i < n; i++) {        // O(n)
    Arrays.sort(a);                  // O(n log n), executed n times
}
// O((n log n) * n) = O(n² log n)
Code 1: you have only one for loop, so effectively your time complexity is O(n log n) + O(n), which is approximately O(n log n).
Code 2: initialization also takes O(n). So effectively, O(n) + O(n) (for the loop) is still O(n).
Note: when calculating time complexities with big-O, you only need the biggest term(s).
