Finding median in unsorted array - java

I need to take input such as 4 1 6 5 0. The 4 determines how big the array is, and the rest are the array elements. The catch is that I can't sort it first. I'm at a loss as to how to begin.

There is a chapter in MIT's Introduction to Algorithms course (http://www.catonmat.net/blog/mit-introduction-to-algorithms-part-four) dedicated to order statistics. You can find the median in O(n) expected time, O(n^2) worst case.
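
For reference, here is a minimal sketch of that idea in Java: randomized quickselect. The class and method names are my own, not from the course, and the bounds are the ones quoted above (expected O(n), worst case O(n^2)).

    import java.util.Random;

    public class Median {
        static final Random RND = new Random();

        // Randomized quickselect: returns the k-th smallest element (k is 0-based).
        // Expected O(n) time, O(n^2) worst case, and the array is never fully sorted.
        static int select(int[] a, int k) {
            int lo = 0, hi = a.length - 1;
            while (lo < hi) {
                swap(a, hi, lo + RND.nextInt(hi - lo + 1)); // random pivot to the end
                int pivot = a[hi], store = lo;
                for (int i = lo; i < hi; i++)               // Lomuto partition
                    if (a[i] < pivot) swap(a, i, store++);
                swap(a, store, hi);                         // pivot to its final spot
                if (store == k) return a[store];
                if (store < k) lo = store + 1; else hi = store - 1;
            }
            return a[lo];
        }

        static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

        public static void main(String[] args) {
            int[] a = {1, 6, 5, 0};                      // from the sample input 4 1 6 5 0
            System.out.println(select(a, a.length / 2)); // prints 5 (the upper median)
        }
    }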

I think you should sort the list first (any sorting algorithm will do) and then take the n/2 element; that's your median.
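
That suggestion in code (note the original question says sorting isn't allowed, so this only illustrates what the answer above describes):

    import java.util.Arrays;

    class SortMedian {
        public static void main(String[] args) {
            int[] a = {1, 6, 5, 0};              // elements from the sample input
            Arrays.sort(a);                      // any sorting algorithm works
            System.out.println(a[a.length / 2]); // the n/2 element: 5 (upper median)
        }
    }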

Related

Search for the different number in an array when all the other numbers are the same: can this be done in O(log n) using divide and conquer?

Let's say we have a very large array and we need to find the only different number in it; all the other numbers in the array are the same. Can we find it in O(log n) using divide and conquer, just like merge sort? Please provide an implementation.
This cannot be done in better time complexity than O(n) unless the array is special. With the constraints you have given, even if you apply an algorithm like divide and conquer, you have to visit every array element at least once.
As for "dividing the array will be O(log n) and comparing 2 elements when the array is reduced to size 2 will be O(1)": this is wrongly put. Dividing the array is not what costs O(log n). The reason something like binary search works in O(log n) is that the array is sorted, so at every step you can discard one half without even looking at its elements, thereby halving the size of the original problem.
Intuitively, you can think of it as follows: even if you keep dividing the array into halves, the tree you build has n/2 leaf nodes (assuming you compare 2 elements at each leaf). You still have to make n/2 comparisons, which leads to an asymptotic complexity of O(n).
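
To make the O(n) claim concrete, here is a hedged sketch of the straightforward linear scan (the helper name is my own; it assumes the array has at least 3 elements and exactly one outlier):

    static int findDifferent(int[] a) {
        // With exactly one outlier, at least two of the first three elements
        // must carry the common value.
        int common = (a[0] == a[1] || a[0] == a[2]) ? a[0] : a[1];
        for (int x : a)                 // one pass: O(n), unavoidable
            if (x != common) return x;
        throw new IllegalArgumentException("no different element found");
    }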

Could a modified quicksort be O(n) best case?

It's generally agreed that the best case for quicksort is O(n log n), which happens when the array is partitioned into roughly equal halves each time. It's also said that the worst case is O(n^2), for example when the array is already sorted and a naive pivot choice (first or last element) is used.
Can't we modify quicksort by keeping a boolean flag, say swap? For example, if there is no swap during the first partitioning pass, we can assume the array is already sorted and stop partitioning the data any further.
I know that the modified bubble sort uses this idea: by checking for swaps it achieves an O(n) best case rather than O(n^2). Can this method be applied to quicksort? Why or why not?
There is one mistake with your approach.
For example, suppose we have an array like this:
1 2 4 3 5 6 7 8
Our pivot element is 5. After the first pass there would be no swap (because 4 and 3 are both smaller than the pivot), but the array is NOT sorted. So you still have to keep dividing it, and that leads to n log n.
No, this won't work for quicksort. In bubble sort, if you complete a pass through the array without making any swaps, you know the entire array is sorted, because every element was compared with its immediate neighbor.
That isn't the case in quicksort. In quicksort, each element is compared to a single pivot element. If you get through an entire pass without moving anything, it only tells you that the elements are sorted with respect to the pivot (values less than the pivot are to its left, values greater are to its right), not with respect to each other.
There is also the problem that a no-swap pass happens not only on fully sorted input but also on almost-sorted arrays (like the example above), so the modified algorithm would wrongly stop early on those.
You can try harder to make your approach work, but I don't think you can break the O(n log n) boundary: there is a proof that comparison-based sorts cannot be more efficient than O(n log n) in the worst case.
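
To make that concrete, here is a small Java sketch (class and method names are my own) of a Hoare-style partition pass with a swap counter, run on the counterexample from the first answer:

    public class NoSwapDemo {
        // Hoare-style partition around a given pivot value (which must occur
        // in the range); returns the number of swaps performed.
        static int partitionCountingSwaps(int[] a, int lo, int hi, int pivot) {
            int swaps = 0, i = lo, j = hi;
            while (true) {
                while (a[i] < pivot) i++;
                while (a[j] > pivot) j--;
                if (i >= j) return swaps;
                int t = a[i]; a[i] = a[j]; a[j] = t;
                swaps++; i++; j--;
            }
        }

        public static void main(String[] args) {
            int[] a = {1, 2, 4, 3, 5, 6, 7, 8};   // NOT sorted: 4 precedes 3
            int swaps = partitionCountingSwaps(a, 0, a.length - 1, 5);
            System.out.println("swaps = " + swaps); // prints 0, yet a is unsorted
        }
    }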

What is the most inefficient sorting routine?

For an array of integers, what is the least efficient way to sort the array? The function should make progress at each step (e.g., no infinite loops). What is the runtime of that algorithm?
There can be no least efficient algorithm for anything. This is easily proved by contradiction, so long as you accept that, starting from any algorithm, another equivalent but less efficient algorithm can always be constructed.
Bogosort has an average runtime of O(n·n!), ouch.
Stupid sort is surely among the worst algorithms. It's not exactly an infinite loop, but its worst case is unbounded (O(∞), if you like) and its average is O(n·n!).
You can do it in O(n·n!) by generating all unique permutations and then checking whether the array is sorted.
The least efficient algorithm I can think of that still has a finite upper bound on its runtime is permutation sort. The idea is to generate every permutation of the input and check whether it's sorted.
The upper bound is O(n·n!) (n! permutations, O(n) to check each); the lower bound is O(n), when the array is already sorted.
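
A sketch of permutation sort under those assumptions: it steps through permutations in lexicographic order, and when the last (descending) permutation is reached, a single reversal yields the sorted order.

    public class PermutationSort {
        static boolean isSorted(int[] a) {
            for (int i = 1; i < a.length; i++)
                if (a[i - 1] > a[i]) return false;
            return true;
        }

        // Standard lexicographic next-permutation; returns false at the last one.
        static boolean nextPermutation(int[] a) {
            int i = a.length - 2;
            while (i >= 0 && a[i] >= a[i + 1]) i--;
            if (i < 0) return false;            // a is in descending order
            int j = a.length - 1;
            while (a[j] <= a[i]) j--;
            swap(a, i, j);
            reverse(a, i + 1, a.length - 1);
            return true;
        }

        static void reverse(int[] a, int l, int r) { while (l < r) swap(a, l++, r--); }
        static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

        // Try permutations until sorted: O(n) check per permutation, at most
        // n! permutations, so O(n·n!) overall; O(n) if already sorted.
        static void permutationSort(int[] a) {
            while (!isSorted(a))
                if (!nextPermutation(a))
                    reverse(a, 0, a.length - 1); // descending → ascending = sorted
        }
    }
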
Iterate over all finite sequences of integers using diagonalization.
For each sequence, test whether it's sorted.
If so, test whether its elements match the elements in your array.
This will have an upper bound (first guess: O(n^(n·m))?).
Strange question; normally we go for the fastest.
Find the highest element and move it to a second list. Repeat until the original list has only one element left. You are guaranteed O(N^2).
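
That procedure, sketched in Java (a selection-style sort; the names are my own):

    import java.util.ArrayList;
    import java.util.List;

    // Repeatedly extract the maximum into a second list: guaranteed O(N^2).
    static List<Integer> extractMaxSort(List<Integer> input) {
        List<Integer> src = new ArrayList<>(input);
        List<Integer> out = new ArrayList<>();
        while (!src.isEmpty()) {
            int maxIdx = 0;
            for (int i = 1; i < src.size(); i++)  // O(N) scan for the max
                if (src.get(i) > src.get(maxIdx)) maxIdx = i;
            out.add(0, src.remove(maxIdx));       // prepend → ascending order
        }
        return out;
    }
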
Bogosort is one of the worst sorting algorithms; all it does is shuffle. Though, from another point of view, there is a (small) probability that it sorts the array in one step :)
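
For completeness, a bogosort sketch (Fisher–Yates shuffle until sorted; the worst case is unbounded, so don't run it on anything large):

    import java.util.Random;

    class Bogosort {
        static final Random RND = new Random();

        static boolean isSorted(int[] a) {
            for (int i = 1; i < a.length; i++)
                if (a[i - 1] > a[i]) return false;
            return true;
        }

        static void bogosort(int[] a) {
            while (!isSorted(a))
                for (int i = a.length - 1; i > 0; i--) { // Fisher–Yates shuffle
                    int j = RND.nextInt(i + 1);
                    int t = a[i]; a[i] = a[j]; a[j] = t;
                }
        }
    }
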
"Worstsort" has a complexity of where factorial of n iterated m times. The paper by Miguel A. Lerma can be found here.
Bogobogosort. It's like bogosort in that it shuffles, but it creates auxiliary arrays: the first one is the same array, and each subsequent one is smaller by one element than the previous. (The auxiliary arrays can be optimized away as well.)
Its average complexity is O(N superfactorial · n). Its best case is O(N^2). Just like bogosort, its worst case is unbounded (O(∞)).

split a linked list into 2 even lists containing the smallest and largest numbers

Given a linked list of integers in random order, split it into two new linked lists such that the difference in the sum of elements of each list is maximal and the length of the lists differs by no more than 1 (in the case that the original list has an odd number of elements). I can't assume that the numbers in the list are unique.
The algorithm I thought of is to merge sort the original linked list (O(n·log n) time, O(n) space) and then use a recursive function to walk to the end of the list to determine its length, doing the splitting while the recursion unwinds. The recursive function takes O(n) time and O(n) space.
Is this the optimal solution? I can post my code if someone thinks it's relevant.
No, it's not optimal: you can find the median of a list in O(n), then put one half of the elements in one list (those smaller than or equal to the median, up to ⌊n/2⌋ of them) and the remaining ⌈n/2⌉ elements in the other list. Their sum difference is maximal, and there is no need to sort (O(n·log(n))). Everything can be done in O(n), both space and time.
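
Sketching that approach in Java (arrays rather than linked nodes for brevity; the quickselect here is the standard randomized one, so the O(n) bound is in expectation, and all names are my own):

    import java.util.Arrays;
    import java.util.Random;

    public class MaxDiffSplit {
        static final Random RND = new Random();

        // In-place quickselect: afterwards, the k smallest elements of b occupy
        // b[0..k-1] (in no particular order). Expected O(n); handles duplicates.
        static void quickselect(int[] b, int k) {
            int lo = 0, hi = b.length - 1;
            while (lo < hi) {
                swap(b, hi, lo + RND.nextInt(hi - lo + 1)); // random pivot to the end
                int store = lo;
                for (int i = lo; i < hi; i++)
                    if (b[i] < b[hi]) swap(b, i, store++);
                swap(b, store, hi);
                if (store == k) return;
                if (store < k) lo = store + 1; else hi = store - 1;
            }
        }

        static void swap(int[] b, int i, int j) { int t = b[i]; b[i] = b[j]; b[j] = t; }

        // Split so the difference of the two halves' sums is maximal.
        static int[][] maxDiffSplit(int[] a) {
            int[] b = a.clone();
            int mid = b.length / 2;
            if (mid > 0) quickselect(b, mid);
            return new int[][] {
                Arrays.copyOfRange(b, 0, mid),        // smallest ⌊n/2⌋ elements
                Arrays.copyOfRange(b, mid, b.length)  // largest ⌈n/2⌉ elements
            };
        }
    }
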
Why do you need a recursive function? While sorting the list, you can count its elements; then just split it in half. This drops the O(n) space requirement.
Even if you can't count the list length while sorting, it can still be split in O(n) time and O(1) space: start two iterators at the beginning of the list; advance the first by 2 elements at each step and the second by 1 element. When the first reaches the end of the list, cut at the second.
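
In code, that walk looks roughly like this (a minimal singly linked node type of my own, since java.util.LinkedList doesn't expose its internal nodes):

    class Node {
        int val;
        Node next;
        Node(int v) { val = v; }
    }

    class Splitter {
        // Returns the head of the second half and cuts the original list:
        // O(n) time, O(1) space. Halves differ in length by at most 1.
        static Node splitInHalf(Node head) {
            if (head == null) return null;
            Node slow = head, fast = head.next;
            while (fast != null && fast.next != null) {
                slow = slow.next;        // advances 1 step
                fast = fast.next.next;   // advances 2 steps
            }
            Node second = slow.next;     // first node of the second half
            slow.next = null;            // cut the first half
            return second;
        }
    }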

about the usage of modulus operator

This is part of the code for the quicksort algorithm, but I really don't know why it uses rand() % n. Please help, thanks.
Swap(V, 0, rand() % n) // move pivot elem to V[0]
It is used to randomize quicksort so that it achieves its expected O(n log n) time complexity on any input.
To quote from Wikipedia:
What makes random pivots a good choice? Suppose we sort the list and then divide it into four parts. The two parts in the middle will contain the best pivots; each of them is larger than at least 25% of the elements and smaller than at least 25% of the elements. If we could consistently choose an element from these two middle parts, we would only have to split the list at most 2·log2(n) times before reaching lists of size 1, yielding an O(n log n) algorithm.
Quicksort has an average time complexity of O(n log n), but its worst case is O(n^2) (for example, when the array is already sorted and the first element is always chosen as the pivot). To keep the expected running time at O(n log n) on every input, the pivot is chosen randomly: rand() % n generates a random index between 0 and n-1.
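
Putting the pieces together, here is a hedged sketch of randomized quicksort in Java; the random swap on the marked line mirrors the Swap(V, 0, rand() % n) call from the question:

    import java.util.Random;

    public class RandomizedQuickSort {
        static final Random RND = new Random();

        static void quicksort(int[] v, int lo, int hi) {
            if (lo >= hi) return;
            // The randomized step the question asks about: pick a random index
            // in [lo, hi] and move that element into the pivot slot, like
            // Swap(V, 0, rand() % n).
            swap(v, lo, lo + RND.nextInt(hi - lo + 1));
            int pivot = v[lo], store = lo;
            for (int i = lo + 1; i <= hi; i++)   // Lomuto partition
                if (v[i] < pivot) swap(v, i, ++store);
            swap(v, lo, store);                  // pivot into its final position
            quicksort(v, lo, store - 1);
            quicksort(v, store + 1, hi);
        }

        static void swap(int[] v, int i, int j) { int t = v[i]; v[i] = v[j]; v[j] = t; }
    }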
