Calculate the percentage of worst cases for quicksort

If I have n! possible orderings of an array of n distinct numbers (1 to n) as input to quicksort, how many of them can be worst cases? I understand that the worst case occurs when the array is sorted/almost sorted, or reverse sorted/almost reverse sorted.
But I need to find out what percentage of all n! orderings can be worst cases, and I can't find a mathematical way to calculate that. Is there any way to calculate it?
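One way to explore this is brute force: fix a concrete pivot rule and count, for small n, how many permutations actually hit the maximum number of comparisons. The sketch below assumes a simple first-element-pivot model (function names are mine; real implementations vary, but the worst case is the same). Under this model the count works out to 2^(n-1): at every level the pivot must be the minimum or maximum of its subarray.

```python
from itertools import permutations
from math import factorial

def comparisons(arr):
    # Comparisons made by quicksort with the first element as pivot
    # (a simple model for counting purposes).
    if len(arr) <= 1:
        return 0
    pivot, rest = arr[0], arr[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return len(rest) + comparisons(left) + comparisons(right)

def worst_case_fraction(n):
    # Count how many of the n! permutations of 1..n hit the worst
    # case, i.e. the full n*(n-1)/2 comparisons.
    worst = n * (n - 1) // 2
    hits = sum(1 for p in permutations(range(1, n + 1))
               if comparisons(list(p)) == worst)
    return hits, factorial(n)
```

For n = 3 this gives 4 out of 6 permutations; for n = 4, 8 out of 24. So under this pivot rule the fraction is 2^(n-1)/n!, which shrinks rapidly as n grows.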

Related

Quicksort, given 3 values, how can I get to 9 operations?

Well, I want to use quicksort on 3 given values (it doesn't matter which values). How can I reach the worst case, which is 9 operations?
Can anyone draw a recursion tree and show how it yields n log n and n^2 operations? I've tried to find one on the internet, but I still didn't manage to draw one properly to show that.
The worst-case complexity of quicksort depends on the chosen pivot. If the pivot chosen is the leftmost or the rightmost element, the worst case occurs in the following cases:
1) The array is already sorted in the same order.
2) The array is already sorted in reverse order.
3) All elements are the same (a special case of cases 1 and 2).
Since these cases occur frequently in practice, the pivot is often chosen randomly. Choosing the pivot randomly reduces the chance of hitting the worst case.
The analysis of the quicksort algorithm is explained in this blog post by Khan Academy.
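The random-pivot mitigation described above can be sketched like this in Python (a minimal in-place version using Lomuto partitioning; names are mine):

```python
import random

def quicksort(arr):
    """In-place quicksort with a randomly chosen pivot."""
    def partition(lo, hi):
        # Swapping a random element into the pivot slot avoids the
        # sorted / reverse-sorted worst cases described above.
        r = random.randint(lo, hi)
        arr[r], arr[hi] = arr[hi], arr[r]
        pivot = arr[hi]
        i = lo
        for j in range(lo, hi):
            if arr[j] < pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]
        return i

    def sort(lo, hi):
        if lo < hi:
            p = partition(lo, hi)
            sort(lo, p - 1)
            sort(p + 1, hi)

    sort(0, len(arr) - 1)
```

Note the worst case is still O(n^2) in principle; randomization only makes it vanishingly unlikely for any fixed input.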

Why is merge sort's worst case still n log n?

It was a question on my final I took earlier and I had no idea how to answer it.
Well it was
What is Merge sort's worst case runtime but MORE IMPORTANTLY, why?
The divide-and-conquer contributes a log(n) factor: you divide the array in half log(n) times, and each time you do, for each segment, you have to merge two sorted arrays. Merging two sorted arrays is O(n). The algorithm is just to walk up the two arrays together, advancing the one that's lagging.
The recurrence you get is r(n) = O(n) + r(ceil(n/2)) + r(floor(n/2)).
The problem is that you can't apply the Master Theorem directly because of the rounding. Hence you can either do the math or use a little hack-like solution: if your input size isn't a power of two, just "blow it up" to one. Then you can apply the Master Theorem to r(n) = O(n) + 2r(n/2). Obviously this leads to O(n log n). The merge() function itself is O(n), because in the worst case you need n - 1 comparisons.
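The merge step described above can be sketched in Python (function names are mine): walk up both sorted lists, always taking the smaller head, so merging m + k elements costs at most m + k - 1 comparisons.

```python
def merge(left, right):
    """Merge two sorted lists in O(len(left) + len(right))."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        # One comparison per appended element, until one side runs out.
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    # At most one of these extends by anything; no comparisons needed.
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Top-down merge sort: split in half, sort each, merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

Because the split is always as even as possible, the recursion depth is ceil(log2(n)) regardless of the input order, which is why the worst case stays O(n log n).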

algorithm to compare numbers within a certain distance from each other

So I have an array of numbers that look something like
1,708,234
2,802,532
11,083,432
5,098,123
5,777,111
I want to find out when two numbers are within a certain distance of each other (say 1,500,000) so I can group them into the same location and have just one UI element represent both at the zoom level I'm looking at. How would one go about doing this smartly or efficiently? I'm thinking I would just start with the first entry, loop through all the elements, and if one was close to another, flag those two and put them in a dictionary of some sort. That would be my brute-force method, but I'm thinking there has to be a better way.
I'm coding in Obj-C, btw, if that makes or breaks any design decisions.
How many numbers are we dealing with here? If it's small enough:
Sort the numbers (generally O(n log n)).
Run through each number and compare it to its bigger neighbor to see if it's within your range.
Repeat with the next neighbor, and the next, until the number is no longer within your range.
Your brute-force method there is roughly n^2/2 comparisons, i.e. O(n^2). This method brings it down to O(n + n log n) = O(n log n), dominated by the sort.
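A minimal Python sketch of the sort-then-scan idea (the grouping rule here chains neighbors: a value joins the current group if it's within the distance of the group's last member, which is one reasonable reading of the question; names are mine):

```python
def group_within(values, distance):
    """Sort, then make one linear pass, starting a new group whenever
    the gap to the previous value exceeds `distance`. O(n log n) total."""
    groups = []
    for v in sorted(values):
        if groups and v - groups[-1][-1] <= distance:
            groups[-1].append(v)
        else:
            groups.append([v])
    return groups
```

With the numbers from the question and a distance of 1,500,000 this yields three groups: {1,708,234; 2,802,532}, {5,098,123; 5,777,111}, and {11,083,432} on its own. The same logic translates directly to Obj-C with an NSArray sorted by sortedArrayUsingComparator:.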

What element of the array would be the median if the size of the array was even and not odd?

I read that it's possible to make quicksort run in O(n log n).
The algorithm says to choose the median as the pivot at each step.
but, suppose we have this array:
10 8 39 2 9 20
which value will be the median?
In math, if I remember correctly, the median is (39+2)/2 = 41/2 = 20.5.
I don't have 20.5 in my array, though.
Thanks in advance.
You can choose either of the two middle elements; asymptotically it makes no difference which one you pick as the input scales up.
We're talking about the exact wording of the description of an algorithm here, and I don't have the text you're referring to. But I think in context, by "median" they probably meant not the mathematical median of the values in the list, but rather the middle point of the list, i.e. the median INDEX, which in this case would be 3 or 4. As coffNjava says, you can take either one.
The median is actually found by sorting the array first, so in your example, the median is found by arranging the numbers as 2 8 9 10 20 39 and the median would be the mean of the two middle elements, (9+10)/2 = 9.5, which doesn't help you at all. Using the median is sort of an ideal situation, but would work if the array were at least already partially sorted, I think.
With an even numbered array, you can't find an exact pivot point, so I believe you can use either of the middle numbers. It'll throw off the efficiency a bit, but not substantially unless you always ended up sorting even arrays.
Finding the median of an unsorted set of numbers can be done in O(N) time, but it's not really necessary to find the true median for the purposes of quicksort's pivot. You just need to find a pivot that's reasonable.
As the Wikipedia entry for quicksort says:
In very early versions of quicksort, the leftmost element of the partition would often be chosen as the pivot element. Unfortunately, this causes worst-case behavior on already sorted arrays, which is a rather common use-case. The problem was easily solved by choosing either a random index for the pivot, choosing the middle index of the partition or (especially for longer partitions) choosing the median of the first, middle and last element of the partition for the pivot (as recommended by R. Sedgewick).
Finding the median of three values is much easier than finding it for the whole collection of values, and for collections that have an even number of elements, it doesn't really matter which of the two 'middle' elements you choose as the potential pivot.
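The median-of-three heuristic from the Wikipedia passage can be sketched in a few lines of Python (the function name is mine):

```python
def median_of_three(arr, lo, hi):
    """Return the index of the median of arr[lo], arr[mid], arr[hi].
    A common pivot heuristic (per Sedgewick); it avoids the worst case
    on already-sorted input without scanning the whole partition."""
    mid = (lo + hi) // 2
    # Sort the three (value, index) pairs and take the middle one.
    candidates = sorted([(arr[lo], lo), (arr[mid], mid), (arr[hi], hi)])
    return candidates[1][1]
```

For the array from the question, 10 8 39 2 9 20, the three candidates are 10, 39, and 20, so the pivot would be 20 (index 5). That's not the true median (9.5), but it's close enough to keep the partitions reasonably balanced.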

Is there any formula to calculate the no of passes that a Quick Sort algorithm will take?

While working with the quicksort algorithm, I wondered whether there is a formula or something similar for finding the number of passes that a particular set of values will take to be completely sorted in ascending order.
Is there any formula to calculate the number of passes that a quicksort algorithm will take?
Any given set of values will take a different number of operations, based on the pivot-selection method and the actual values being sorted.
So... no, unless the approximation "between O(n log n) and O(n^2)" is good enough.
The fact that one has to distinguish the average case from the worst case should be enough to show that the only way to determine the number of operations is to actually run the quicksort.
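"Just run it" can be made concrete by instrumenting a quicksort to count its partition passes. A minimal sketch, assuming a first-element pivot (names are mine; a different pivot rule gives different counts, which is exactly the point):

```python
def quicksort_passes(arr):
    """Sort a copy of arr and return how many partition passes were made.
    There is no closed form for an arbitrary input; you have to run it."""
    arr = list(arr)
    passes = 0

    def qs(lo, hi):
        nonlocal passes
        if lo >= hi:
            return
        passes += 1               # one partition pass over arr[lo..hi]
        pivot = arr[lo]
        i = lo + 1                # boundary of the "< pivot" region
        for j in range(lo + 1, hi + 1):
            if arr[j] < pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[lo], arr[i - 1] = arr[i - 1], arr[lo]  # place the pivot
        qs(lo, i - 2)
        qs(i, hi)

    qs(0, len(arr) - 1)
    return passes
```

An already-sorted input of 4 elements takes 3 passes (the degenerate chain), while a well-balanced 3-element input like [2, 1, 3] takes just 1, illustrating why no single formula covers both.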