# Sorting algorithms

## Quicksort

Quicksort is one of the fastest and simplest sorting algorithms [Hoa 62]. It works recursively, following a divide-and-conquer strategy.

### Idea

First, the sequence a to be sorted is partitioned into two parts, such that all elements of the first part b are less than or equal to all elements of the second part c (divide). Then the two parts are sorted separately by recursive application of the same procedure (conquer). Recombination of the two parts yields the sorted sequence (combine). Figure 1 illustrates this approach.

Figure 1: Quicksort(n)

The first step of the partition procedure is choosing a comparison element x. All elements of the sequence that are less than x are placed in the first part, all elements greater than x are placed in the second part. For elements equal to x, it does not matter which part they go into. In the following algorithm it may also happen that an element equal to x remains between the two parts.

```
Algorithm Partition

Input:   sequence a0, ..., an-1 with n elements

Output:  permutation of the sequence such that all elements a0, ..., aj
         are less than or equal to all elements ai, ..., an-1  (i > j)

Method:  choose the element in the middle of the sequence as comparison element x
         let i = 0 and j = n-1
         while i ≤ j
             search for the first element ai which is greater than or equal to x
             search for the last element aj which is less than or equal to x
             if i ≤ j
                 exchange ai and aj
                 let i = i+1 and j = j-1
```

After partitioning the sequence, quicksort treats the two parts recursively by the same procedure. The recursion ends whenever a part consists of at most one element.
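The partition step can be traced on a concrete example. The following standalone sketch (a hypothetical demo class, not part of the original text) runs one partition pass on a small sample sequence, choosing the middle element as comparison element:

```java
// Demonstration of one partition pass around the middle element.
import java.util.Arrays;

public class PartitionDemo {
    // Partitions the whole array around its middle element and returns it.
    public static int[] partition(int[] a) {
        int lo = 0, hi = a.length - 1;
        int x = a[lo + (hi - lo) / 2];   // comparison element
        int i = lo, j = hi;
        while (i <= j) {
            while (a[i] < x) i++;        // first element from the left that is >= x
            while (a[j] > x) j--;        // last element from the right that is <= x
            if (i <= j) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++; j--;
            }
        }
        // now a[lo..j] <= x and a[i..hi] >= x
        System.out.println(Arrays.toString(a) + "  j=" + j + "  i=" + i);
        return a;
    }

    public static void main(String[] args) {
        partition(new int[]{3, 8, 1, 5, 9, 2, 7});   // x = 5
        // prints [3, 2, 1, 5, 9, 8, 7]  j=2  i=4
    }
}
```

After the pass, the first part {3, 2, 1} contains only elements ≤ 5 and the second part {9, 8, 7} only elements ≥ 5; the element equal to x remains between the two parts, as described above.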

### Simulation

The following simulation illustrates the partitioning procedure.

(Java applet for the simulation of the partitioning procedure of quicksort)

### Program

The following Java program implements quicksort. The comparison element x is chosen as the element in the middle between lo and hi 1).

```java
public class QuickSorter
{
    private int[] a;
    private int n;

    public void sort(int[] a)
    {
        this.a = a;
        n = a.length;
        quicksort(0, n-1);
    }

    // lo is the lower index, hi is the upper index
    // of the region of array a that is to be sorted
    private void quicksort(int lo, int hi)
    {
        if (lo >= hi)    // less than two elements
            return;

        // comparison element x
        int x = a[lo + (hi-lo)/2];

        // partition
        int i = lo, j = hi;
        while (i <= j)
        {
            while (a[i] < x) i++;
            while (a[j] > x) j--;
            if (i <= j)
                exchange(i++, j--);
        }

        // recursion
        quicksort(lo, j);
        quicksort(i, hi);
    }

    private void exchange(int i, int j)
    {
        int t = a[i];
        a[i] = a[j];
        a[j] = t;
    }
}    // end class QuickSorter
```
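A quick way to convince oneself that the scheme is correct is to compare it against the library sort on random inputs. The sketch below (a hypothetical test class, not from the original text) restates the same partitioning loop as a static method for this purpose:

```java
// Self-contained check: restates the quicksort scheme above as a static method
// and compares its result against java.util.Arrays.sort on random inputs.
import java.util.Arrays;
import java.util.Random;

public class QuickSortCheck {
    static void quicksort(int[] a, int lo, int hi) {
        if (lo >= hi) return;                 // less than two elements
        int x = a[lo + (hi - lo) / 2];        // comparison element
        int i = lo, j = hi;
        while (i <= j) {
            while (a[i] < x) i++;
            while (a[j] > x) j--;
            if (i <= j) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; j--; }
        }
        quicksort(a, lo, j);
        quicksort(a, i, hi);
    }

    public static int[] sorted(int[] a) {
        quicksort(a, 0, a.length - 1);
        return a;
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int trial = 0; trial < 100; trial++) {
            int[] a = rnd.ints(rnd.nextInt(50), -100, 100).toArray();
            int[] b = a.clone();
            Arrays.sort(b);                   // reference result
            if (!Arrays.equals(sorted(a), b))
                throw new AssertionError("mismatch on trial " + trial);
        }
        System.out.println("all trials passed");
    }
}
```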

### Analysis

The best-case behavior of the quicksort algorithm occurs when in each recursion step the partitioning produces two parts of equal length. To sort n elements, the running time in this case is in Θ(n log(n)), since the recursion depth is log(n) and on each level there are n elements to be treated (Figure 2 a).
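The best-case bound can also be read off from the standard divide-and-conquer recurrence (a textbook derivation, not part of the original text), where c·n covers the partitioning work on a range of n elements:

```latex
T(n) = 2\,T(n/2) + cn, \quad T(1) = c
\;\;\Rightarrow\;\;
T(n) = cn\log_2(n) + cn \in \Theta(n\log(n))
```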

The worst case occurs when each recursion step produces an unbalanced partitioning, namely one part consisting of only one element and the other part consisting of the rest of the elements (Figure 2 c). Then the recursion depth is n-1 and quicksort runs in time Θ(n²).
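The corresponding worst-case recurrence (again a standard derivation, not from the original text) removes only one element per level, so the partitioning costs sum up quadratically:

```latex
T(n) = T(n-1) + cn
\;\;\Rightarrow\;\;
T(n) = c\sum_{k=1}^{n} k = \frac{cn(n+1)}{2} \in \Theta(n^2)
```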

In the average case, a partitioning as shown in Figure 2 b is to be expected.

Figure 2: Recursion depth of quicksort: a) best case, b) average case, c) worst case

The choice of the comparison element x determines which partitioning is achieved. Suppose that the first element of the sequence were chosen as comparison element. This would lead to the worst-case behavior of the algorithm when the sequence is initially sorted. Therefore, it is better to choose the element in the middle of the sequence as comparison element.
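This effect is easy to observe experimentally. The hypothetical demo class below (not part of the original text) counts the recursion depth on an already sorted array of 1024 elements, once with the first element and once with the middle element as comparison element:

```java
// Counts the recursion depth of quicksort on an already sorted array,
// comparing the first-element pivot with the middle-element pivot.
public class PivotDepthDemo {
    static int maxDepth;

    static void quicksort(int[] a, int lo, int hi, boolean useMiddle, int depth) {
        if (lo >= hi) return;
        maxDepth = Math.max(maxDepth, depth);
        int x = useMiddle ? a[lo + (hi - lo) / 2] : a[lo];   // comparison element
        int i = lo, j = hi;
        while (i <= j) {
            while (a[i] < x) i++;
            while (a[j] > x) j--;
            if (i <= j) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; j--; }
        }
        quicksort(a, lo, j, useMiddle, depth + 1);
        quicksort(a, i, hi, useMiddle, depth + 1);
    }

    public static int depthOnSorted(int n, boolean useMiddle) {
        int[] a = new int[n];
        for (int k = 0; k < n; k++) a[k] = k;   // already sorted input
        maxDepth = 0;
        quicksort(a, 0, n - 1, useMiddle, 1);
        return maxDepth;
    }

    public static void main(String[] args) {
        System.out.println("first element pivot:  depth " + depthOnSorted(1024, false));
        System.out.println("middle element pivot: depth " + depthOnSorted(1024, true));
    }
}
```

With the first element as comparison element the recursion degenerates to depth n-1 on sorted input, while the middle element yields a depth of about log(n).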

Even better would be to take the n/2-th greatest element of the sequence (the median). Then the optimal partitioning is achieved. Actually, it is possible to compute the median in linear time [AHU 74]. This variant of quicksort would run in time O(n log(n)) even in the worst case.

However, the beauty of quicksort lies in its simplicity. And it turns out that even in its simple form quicksort runs in O(n log(n)) on the average. Moreover, the constant hidden in the O-notation is small. Therefore, we accept the (rare) worst-case behavior of Θ(n²) in exchange.

Proposition: The time complexity of quicksort is in Θ(n log(n)) in the average case and in Θ(n²) in the worst case.

### Simulation

(Java applet for simulation of quicksort)

### Conclusions

Quicksort turns out to be the fastest sorting algorithm in practice. It has a time complexity of Θ(n log(n)) on the average. However, in the (very rare) worst case quicksort is as slow as Bubblesort, namely in Θ(n²). There are sorting algorithms with a time complexity of O(n log(n)) even in the worst case, e.g. Heapsort and Mergesort. But on the average, these algorithms are by a constant factor slower than quicksort.

It is possible to obtain a worst-case complexity of O(n log(n)) with a variant of quicksort (by choosing the median as comparison element). But this algorithm is slower than Heapsort or Mergesort by a constant factor, both on the average and in the worst case; therefore, it is not interesting in practice.

### References

- [AHU 74] A.V. Aho, J.E. Hopcroft, J.D. Ullman: The Design and Analysis of Computer Algorithms. Addison-Wesley (1974)
- [Hoa 62] C.A.R. Hoare: Quicksort. Computer Journal, Vol. 5, 1, 10-15 (1962)
- [CLRS 01] T.H. Cormen, C.E. Leiserson, R.L. Rivest, C. Stein: Introduction to Algorithms. 2nd edition, The MIT Press (2001)
- [Sed 03] R. Sedgewick: Algorithms in Java, Parts 1-4. 3rd edition, Addison-Wesley (2003)

##### Web

- [Web 1] http://www.bluffton.edu/~nesterd/java/SortingDemo.html – Simulation of quicksort and other sorting algorithms

1) The computation lo+(hi-lo)/2 instead of (lo+hi)/2 is safer, since lo+hi may cause an integer overflow when it becomes greater than 2,147,483,647.
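The overflow is easy to reproduce. The small illustration below (a hypothetical demo, not from the original text) shows both midpoint computations for index bounds whose sum exceeds the int range:

```java
// Why lo+(hi-lo)/2 is safer than (lo+hi)/2 for computing the middle index.
public class MidpointOverflow {
    public static void main(String[] args) {
        int lo = 2_000_000_000, hi = 2_100_000_000;
        // lo+hi = 4,100,000,000 exceeds Integer.MAX_VALUE and wraps around
        System.out.println((lo + hi) / 2);        // prints -97483648 (wrong)
        System.out.println(lo + (hi - lo) / 2);   // prints 2050000000 (correct)
    }
}
```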

H.W. Lang, Hochschule Flensburg, lang@hs-flensburg.de – Created: 12.09.1997, Updated: 04.06.2018