Heapsort

Class: Sorting algorithm
Data structure: Array
Time: O(n log n)
Space: O(n) total, O(1) auxiliary
Optimal: Sometimes
(Illustration: a run of the heapsort algorithm sorting an array of randomly permuted values. In the first stage of the algorithm the array elements are reordered to satisfy the heap property; before the actual sorting takes place, the heap tree structure is shown briefly for illustration.)

Heapsort is a comparison-based sorting algorithm, and is part of the selection sort family. Although somewhat slower in practice on most machines than a good implementation of quicksort, it has the advantage of a worst-case Θ(n log n) runtime. Heapsort is an in-place algorithm, but it is not a stable sort.

Overview
Heapsort inserts the input list elements into a heap data structure. The largest value (in a max-heap) or the smallest value (in a min-heap) is then extracted repeatedly until none remain, so the values emerge in sorted order. The heap's invariant is restored after each extraction, so the only cost per element is that of extraction.
During extraction, the only space required is that needed to store the heap. To achieve constant space overhead, the heap is stored in the part of the input array that has not yet been sorted.
Heapsort uses two heap operations: insertion and root deletion. Each extraction places an element in the last empty location of the array, so the remaining prefix of the array stores the elements that are still unsorted.
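As a concrete (though not in-place) illustration of this insert-then-extract idea, the following minimal Python sketch uses the standard heapq module, which provides a min-heap. It illustrates only the overview above; it is not the constant-space algorithm developed in the pseudocode section, and the function name heapsort is chosen here purely for illustration.

    import heapq

    def heapsort(iterable):
        # Build a min-heap out of all the input values, then repeatedly
        # extract the smallest remaining value; the extractions emerge
        # in ascending order.
        heap = list(iterable)
        heapq.heapify(heap)            # bottom-up heap construction, O(n)
        return [heapq.heappop(heap) for _ in range(len(heap))]

    print(heapsort([6, 5, 3, 1, 8, 7, 2, 4]))   # prints [1, 2, 3, 4, 5, 6, 7, 8]

Because it copies the input into a separate list, this sketch uses O(n) auxiliary space; the in-place version with O(1) auxiliary space is given under Pseudocode below.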
Variations
*The most important variation on the simple variant is an improvement by R. W. Floyd which gives roughly a 25% speed improvement in practice: instead of comparing the sifted element with both children at every level, the larger child is promoted using only one comparison per level on the way down, and the displaced element is then sifted back up to its final position. Moreover, it is more elegant to formulate. Heapsort's natural way of indexing works on indices from 1 up to the number of items, so the start address of the data should be shifted such that this logic can be implemented while avoiding unnecessary +/− 1 offsets in the coded algorithm.
*Ternary heapsort ["Data Structures Using Pascal", Tenenbaum & Augenstein, 1991, page 405, gives ternary heapsort as an exercise for the student: "Write a sorting routine similar to the heapsort except that it uses a ternary heap."] uses a ternary heap instead of a binary heap; that is, each element in the heap has three children (a sketch of the ternary sift-down operation appears after this list). It is more complicated to program, but performs a constant factor fewer swap and comparison operations. This is because each step in the sift operation of a ternary heap requires three comparisons and one swap, whereas in a binary heap two comparisons and one swap are required; two ternary steps therefore take less time than three binary steps while descending further, multiplying the index by a factor of 9 instead of the factor of 8 given by three binary steps. Ternary heapsort is about 12% faster than the simple variant of binary heapsort.
*The smoothsort algorithm [http://www.cs.utexas.edu/users/EWD/ewd07xx/EWD796a.PDF] [http://www.cs.utexas.edu/~EWD/transcriptions/EWD07xx/EWD796a.html] is a variation of heapsort developed by Edsger Dijkstra in 1981. Like heapsort, smoothsort's upper bound is O(n log n). The advantage of smoothsort is that it comes closer to O(n) time if the input is already sorted to some degree, whereas heapsort averages O(n log n) regardless of the initial sorted state. Due to its complexity, smoothsort is rarely used.
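As a rough illustration of the ternary variant described in the list above, the following is a minimal Python sketch of one sift-down step on a ternary max-heap. The function name sift_down_ternary and the zero-based layout, with the children of node i assumed to sit at indices 3i+1, 3i+2 and 3i+3, are choices made for this sketch and are not part of the original article.

    def sift_down_ternary(a, root, end):
        # Sift a[root] down within the ternary max-heap stored in a[0..end].
        # Children of node i are assumed to be at 3*i + 1, 3*i + 2 and 3*i + 3.
        while 3 * root + 1 <= end:
            # Pick the largest of the (up to three) children:
            # at most three comparisons per level, as described above.
            child = 3 * root + 1
            for c in (3 * root + 2, 3 * root + 3):
                if c <= end and a[c] > a[child]:
                    child = c
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]   # one swap per level
                root = child
            else:
                return

Building and deflating the ternary heap then proceeds as in the binary pseudocode given later, except that the last parent node sits at index floor((count − 2) / 3).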
Comparison with other sorts
Heapsort primarily competes with quicksort, another very efficient, general-purpose, nearly-in-place comparison-based sorting algorithm. Quicksort is typically somewhat faster, due to better cache behavior and other factors, but its worst-case running time is O(n²), which is unacceptable for large data sets and can be deliberately triggered given enough knowledge of the implementation, creating a security risk. See quicksort for a detailed discussion of this problem and possible solutions. Thus, because of the O(n log n) upper bound on heapsort's running time and the constant upper bound on its auxiliary storage, embedded systems with real-time constraints and systems concerned with security often use heapsort.
Heapsort also competes with merge sort, which has the same time bounds but requires Ω(n) auxiliary space, whereas heapsort requires only a constant amount. Heapsort also typically runs more quickly in practice on machines with small or slow data caches. On the other hand, merge sort has several advantages over heapsort:
* Like quicksort, merge sort on arrays has considerably better data cache performance, often outperforming heapsort on a modern desktop PC, because it accesses the elements in order.
* Merge sort is a stable sort.
* Merge sort parallelizes better; the most trivial way of parallelizing merge sort achieves close to linear speedup, while there is no obvious way to parallelize heapsort at all.
* Merge sort can be easily adapted to operate on linked lists and on very large lists stored on slow-to-access media such as disk storage or network-attached storage. Heapsort relies strongly on random access, and its poor locality of reference makes it very slow on media with long access times.
An interesting alternative to heapsort is introsort, which combines quicksort and heapsort to retain the advantages of both: the worst-case speed of heapsort and the average speed of quicksort.
Pseudocode
The following is the "simple" way to implement the algorithm, in pseudocode, where swap is used to exchange two elements of the array. Notice that the arrays are zero-based in this example.

 function heapSort(a, count) is
     input: an unordered array a of length count
     (first place a in max-heap order)
     heapify(a, count)
     end := count - 1
     while end > 0 do
         (swap the root (maximum value) of the heap with the last element of the heap)
         swap(a[end], a[0])
         (decrease the size of the heap by one so that the previous max value will stay in its proper placement)
         end := end - 1
         (put the heap back in max-heap order)
         siftDown(a, 0, end)

 function heapify(a, count) is
     (start is assigned the index in a of the last parent node)
     start := floor((count - 2) / 2)
     while start ≥ 0 do
         (sift down the node at index start to the proper place such that all nodes below the start index are in heap order)
         siftDown(a, start, count - 1)
         start := start - 1
     (after sifting down the root all nodes/elements are in heap order)

 function siftDown(a, start, end) is
     input: end represents the limit of how far down the heap to sift
     root := start
     while root * 2 + 1 ≤ end do          (while the root has at least one child)
         child := root * 2 + 1            (root*2+1 points to the left child)
         (if the child has a sibling and the child's value is less than its sibling's...)
         if child + 1 ≤ end and a[child] < a[child + 1] then
             child := child + 1           (... then point to the right child instead)
         if a[root] < a[child] then       (out of max-heap order)
             swap(a[root], a[child])
             root := child                (repeat to continue sifting down the child now)
         else
             return

The heapify function can be thought of as building a heap from the bottom up, successively sifting downward to establish the heap property. An alternate version (shown below) that builds the heap top-down and sifts upward is conceptually simpler to grasp. This siftUp version can be visualized as starting with an empty heap and successively inserting elements. However, it is asymptotically slower: the siftDown version is O(n), while the siftUp version is O(n log n) in the worst case. The heapsort algorithm is O(n log n) overall using either version of heapify.

 function heapify(a, count) is
     (end is assigned the index of the first (left) child of the root)
     end := 1
     while end < count
         (sift up the node at index end to the proper place such that all nodes above the end index are in heap order)
         siftUp(a, 0, end)
         end := end + 1
     (after sifting up the last node all nodes are in heap order)

 function siftUp(a, start, end) is
     input: start represents the limit of how far up the heap to sift
            end is the node to sift up
     child := end
     while child > start
         parent := floor((child - 1) / 2)
         if a[parent] < a[child] then     (out of max-heap order)
             swap(a[parent], a[child])
             child := parent              (repeat to continue sifting up the parent now)
         else
             return
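For readers who prefer running code, the following is a rough Python translation of the siftDown-based pseudocode above, using zero-based arrays as in the pseudocode; the function and variable names (heapsort, sift_down, values) are chosen for this sketch only.

    def heapsort(a):
        count = len(a)
        # First place a in max-heap order (bottom-up heapify).
        start = (count - 2) // 2            # index of the last parent node
        while start >= 0:
            sift_down(a, start, count - 1)
            start -= 1
        # Repeatedly swap the root (maximum value) with the last element
        # of the heap, shrink the heap by one, and restore heap order.
        end = count - 1
        while end > 0:
            a[end], a[0] = a[0], a[end]
            end -= 1
            sift_down(a, 0, end)

    def sift_down(a, start, end):
        # Restore max-heap order in a[start..end] by sifting a[start] down.
        root = start
        while 2 * root + 1 <= end:          # while the root has at least one child
            child = 2 * root + 1            # left child
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1                  # the right child is larger
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    values = [6, 5, 3, 1, 8, 7, 2, 4]
    heapsort(values)
    print(values)                           # prints [1, 2, 3, 4, 5, 6, 7, 8]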
References
* J. W. J. Williams. "Algorithm 232 – Heapsort", 1964, Communications of the ACM 7(6): 347–348.
* Robert W. Floyd. "Algorithm 245 – Treesort 3", 1964, Communications of the ACM 7(12): 701.
* Svante Carlsson. "Average-case results on heapsort", 1987, BIT 27(1): 2–17.
* Donald Knuth. "The Art of Computer Programming", Volume 3: "Sorting and Searching", Third Edition. Addison-Wesley, 1997. ISBN 0-201-89685-0. Pages 144–155 of section 5.2.3: Sorting by Selection.
* Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. "Introduction to Algorithms", Second Edition. MIT Press and McGraw-Hill, 2001. ISBN 0-262-03293-7. Chapters 6 and 7, respectively: Heapsort and Priority Queues.
* [http://www.cs.utexas.edu/users/EWD/ewd07xx/EWD796a.PDF A PDF of Dijkstra's original paper on Smoothsort]
* [http://cis.stvincent.edu/html/tutorials/swd/heaps/heaps.html Heaps and Heapsort Tutorial] by David Carlson, St. Vincent College
External links
* [http://tide4javascript.com/?s=Heapsort Analyze Heapsort in an online Javascript IDE]
* [http://24bytes.com/Heap-Sort.html Heapsort]
* [http://www2.hawaii.edu/~copley/665/HSApplet.html Heapsort animated]
* [http://www.nist.gov/dads/HTML/heapSort.html NIST's Dictionary of Algorithms and Data Structures: Heapsort]
* [http://www.azillionmonkeys.com/qed/sort.html Sorting revisited]
* [http://www.cs.mu.oz.au/aia/HeapSort.html Java-based demonstration of Heapsort]
* [http://vision.bc.edu/~dmartin/teaching/sorting/anim-html/heap.html A graphical demonstration and discussion of heap sort]
* [http://coderaptors.com/?HeapSort A colored graphical Java applet] which allows experimentation with initial state and shows statistics