
CSE 326 Heaps and the Priority Queue ADT


Presentation Transcript


  1. CSE 326: Heaps and the Priority Queue ADT
  David Kaplan, Dept of Computer Science & Engineering, Autumn 2001

  2. Back to Queues
  • Some applications
    • ordering CPU jobs
    • simulating events
    • picking the next search site
  • Problems?
    • short jobs should go first
    • earliest (simulated-time) events should go first
    • most promising sites should be searched first

  3. Priority Queue ADT
  • Priority Queue operations
    • create
    • destroy
    • insert
    • deleteMin
    • is_empty
  • Priority Queue property: for any two elements x and y in the queue, if x has a lower priority value than y, x will be deleted before y
  [diagram: a priority queue holding A(4), B(6), D(100), E(5), F(7); insert adds G(9), deleteMin removes C(3)]
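  For concreteness, here is a minimal sketch of this ADT as a C++ interface. The names mirror the operations above (with isEmpty standing in for is_empty) and are illustrative, not the course's actual code; create and destroy map onto the constructor and destructor of a concrete subclass.

      // Sketch of the Priority Queue ADT (illustrative names).
      // A lower priority value means "served sooner."
      template <typename Object>
      class PriorityQueue {
      public:
          virtual ~PriorityQueue() {}                // destroy
          virtual void insert(const Object& x) = 0;  // add an element
          virtual Object deleteMin() = 0;            // remove and return the minimum
          virtual bool isEmpty() const = 0;          // is_empty
      };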

  4. Applications of the Priority Q
  • Hold jobs for a printer in order of length
  • Store packets on network routers in order of urgency
  • Simulate events
  • Select symbols for compression
  • Sort numbers
  • Anything greedy

  5. Naïve Priority Q Data Structures
  • Unsorted list:
    • insert: O(1) (append anywhere)
    • deleteMin: O(n) (scan for the minimum)
  • Sorted list:
    • insert: O(n) (find the insertion point)
    • deleteMin: O(1) (take the front)
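  A minimal sketch of the unsorted-list variant in C++ (illustrative, not the course code) makes the asymmetry concrete:

      #include <algorithm>
      #include <cassert>
      #include <vector>

      // Unsorted-list priority queue: O(1) insert, O(n) deleteMin.
      class UnsortedListPQ {
          std::vector<int> items;
      public:
          void insert(int x) { items.push_back(x); }  // O(1) amortized append
          int deleteMin() {                           // O(n): scan, then erase
              assert(!items.empty());
              auto it = std::min_element(items.begin(), items.end());
              int val = *it;
              items.erase(it);
              return val;
          }
      };

  A sorted list simply flips which operation pays the O(n) cost.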

  6. Binary Search Tree Priority Q Data Structure (aka BST PQD :-)
  • insert: O(log n) on average (O(n) worst case if the tree becomes unbalanced)
  • deleteMin: O(log n) on average (walk left from the root and remove the leftmost node)
  [diagram: a BST rooted at 8 containing 2, 4, 5, 6, 7, 9, 10, 11, 12, 13, 14]

  7. Binary Heap Priority Q Data Structure
  • Heap-order property
    • parent's key is less than children's keys
    • result: minimum is always at the top
  • Structure property
    • complete tree: fringe nodes packed to the left
    • result: depth is always O(log n); next open location always known
  How do we find the minimum? It's at the root: O(1).
  [diagram: a binary heap with keys 2, 4, 5, 7, 6, 10, 8, 11, 9, 15, 14, 13 in level order]

  8. Nifty Storage Trick
  Calculations (1-based array, slot 0 unused):
  • children of node i: 2i and 2i + 1
  • parent of node i: i/2 (integer division)
  • root: index 1
  • next free slot: index size + 1
  [diagram: the heap 2 4 5 7 6 10 8 11 9 15 14 13 stored in array slots 1-12; slot 13 is the next free position]
  Note: walking the array in index order gives us level-order traversal!!!
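  These calculations are one-liners; a sketch in C++ (function names are illustrative):

      // Index arithmetic for a 1-based array heap (slot 0 unused).
      inline int leftChild(int i)  { return 2 * i; }
      inline int rightChild(int i) { return 2 * i + 1; }
      inline int parent(int i)     { return i / 2; }  // integer division
      // the root lives at index 1; the next free slot is at index size + 1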

  9. BinaryHeap::DeleteMin
  pqueue.deleteMin() ?
  [diagram: on the heap 2 4 5 7 6 10 8 11 9 15 14 13, the root 2 is removed, leaving a hole at the top to be filled]

  10. Percolate Down
  [diagram: the last element, 13, is tried at the root hole; the hole repeatedly swaps with the smaller child (first 4, then 6) until both children (15, 14) are larger, and 13 settles there. Done!]

  11. DeleteMin Code

      Object deleteMin() {
          assert(!isEmpty());
          returnVal = Heap[1];
          size--;
          newPos = percolateDown(1, Heap[size + 1]);
          Heap[newPos] = Heap[size + 1];
          return returnVal;
      }

      int percolateDown(int hole, Object val) {
          while (2 * hole <= size) {
              left = 2 * hole;
              right = left + 1;
              if (right <= size && Heap[right] < Heap[left])
                  target = right;
              else
                  target = left;
              if (Heap[target] < val) {
                  Heap[hole] = Heap[target];
                  hole = target;
              } else
                  break;
          }
          return hole;
      }

  runtime: O(log n) (the hole can travel the full depth of the tree)

  12. BinaryHeap::Insert
  pqueue.insert(3)
  [diagram: the heap 2 4 5 7 6 10 8 11 9 12 14 20 with a new hole opened at the next free position for the incoming 3]

  13. Percolate Up
  [diagram: the new hole's ancestors are 10, 5, 2; since 3 < 10 and 3 < 5, both shift down one level, and since 2 < 3 the climb stops: 3 settles as a child of the root]

  14. Insert Code

      void insert(Object o) {
          assert(!isFull());
          size++;
          newPos = percolateUp(size, o);
          Heap[newPos] = o;
      }

      int percolateUp(int hole, Object val) {
          while (hole > 1 && val < Heap[hole / 2]) {
              Heap[hole] = Heap[hole / 2];
              hole /= 2;
          }
          return hole;
      }

  runtime: O(log n) (the hole can climb the full depth of the tree)

  15. Performance of Binary Heap
  In practice: binary heaps are much simpler to code than the alternatives, with a lower constant-factor overhead.

  16. Changing Priorities
  In many applications the priority of an object in a priority queue may change over time:
  • if a job has been sitting in the printer queue for a long time, increase its priority
  • unix "renice"
  • a sysadmin may raise the priority of a critical task
  Must have some (separate) way to find the position in the queue of the object to change (e.g. a hash table), as sketched below.
  • No O(log n) find (as with BSTs). Why not? Heap order says nothing about how siblings or cousins compare, so a search cannot steer left or right; it degenerates to O(n).
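  A minimal sketch of that side lookup table, assuming every element carries a unique id (all names here are hypothetical): the heap stores id/priority pairs, and a hash table maps each id to its current array index. The crucial discipline is that every swap inside the heap must also update the map.

      #include <unordered_map>
      #include <utility>
      #include <vector>

      struct Entry { int id; int priority; };

      class IndexedHeap {
          std::vector<Entry> heap;                // 1-based storage, slot 0 unused
          std::unordered_map<int, int> position;  // id -> current index in heap

          // Called from percolateUp/percolateDown instead of a bare swap,
          // so position[] always reflects where each id lives.
          void swapNodes(int a, int b) {
              std::swap(heap[a], heap[b]);
              position[heap[a].id] = a;
              position[heap[b].id] = b;
          }
      };

  With this in place, decreaseKey(id) becomes: look up position[id] in O(1), then percolate as usual.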

  17. Other Priority Queue Operations
  • decreaseKey: given a pointer to an object in the queue, reduce its priority value
  • increaseKey: given a pointer to an object in the queue, increase its priority value
  • remove: given a pointer to an object in the queue, remove it
  • buildHeap: given a set of items, build a heap

  18. DecreaseKey, IncreaseKey, Remove

      void decreaseKey(int obj) {
          assert(size >= obj);
          temp = Heap[obj];
          newPos = percolateUp(obj, temp);
          Heap[newPos] = temp;
      }

      void increaseKey(int obj) {
          assert(size >= obj);
          temp = Heap[obj];
          newPos = percolateDown(obj, temp);
          Heap[newPos] = temp;
      }

      void remove(int obj) {
          assert(size >= obj);
          newPos = percolateUp(obj, NEG_INF_VAL);
          Heap[newPos] = NEG_INF_VAL;
          deleteMin();
      }

  Note: the changeKey functions assume that the key value has already been changed!

  19. BuildHeap (Floyd's Method)
  Given the arbitrary array 12 5 11 3 10 6 9 4 8 1 7 2: pretend it's a heap and fix the heap-order property!
  [diagram: the same array drawn as a complete binary tree, rooted at 12]
  Thank you, Floyd!
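  Floyd's method works bottom-up: every leaf is already a one-node heap, so it suffices to percolate down each internal node, last to first. A sketch built on slide 11's percolateDown, using the same assumed Heap/size globals:

      void buildHeap() {
          // nodes size/2 + 1 .. size are leaves; fix internal nodes bottom-up
          for (int i = size / 2; i >= 1; i--) {
              Object val = Heap[i];
              int newPos = percolateDown(i, val);  // slide 11's helper
              Heap[newPos] = val;
          }
      }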

  20. Build(this)Heap
  [diagram: four snapshots of Floyd's method on the tree for 12 5 11 3 10 6 9 4 8 1 7 2, as successive percolateDowns repair the lower levels]

  21. Finish Build(ing)(this)Heap
  [diagram: the finished heap, 1 3 2 4 5 6 9 12 8 10 7 11 in level order]
  Runtime? (answered on the next slide: O(n))

  22. Complexity of Build Heap
  • Note: the size of a perfect binary tree doubles (+1) with each additional layer
  • At most n/4 nodes percolate down 1 level, at most n/8 percolate down 2 levels, at most n/16 percolate down 3 levels, ...
  • Total: O(n)
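  Summing that per-level work gives the linear bound. The calculation is standard and not on the original slide:

      \sum_{d \ge 1} \frac{n}{2^{d+1}} \cdot d
          \;\le\; n \sum_{d=1}^{\infty} \frac{d}{2^{d+1}}
          \;=\; n \cdot 1 \;=\; O(n)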

  23. Thinking about Heaps
  Observations
  • finding a child/parent index is a multiply/divide by two
  • operations jump widely through the heap
  • each operation looks at only two new nodes
  • inserts are at least as common as deleteMins
  Realities
  • division and multiplication by powers of two are fast
  • looking at only one new piece of data sucks in a whole cache line
  • with huge data sets, disk accesses dominate

  24. Solution: d-Heaps
  • Each node has d children
  • Still representable by an array
  • Good choices for d:
    • optimize performance based on # of inserts/removes
    • d = 2^k for efficiency (array index calcs reduce to shifts)
    • fit one set of children in a cache line
    • fit one set of children on a memory page/disk block
  [diagram: a 3-heap with root 1 and keys 3, 7, 2, 4, 8, 5, 12, 11, 10, 6, 9, shown both as a tree and in its array layout]
  What do d-heaps remind us of???
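  The index arithmetic generalizes directly. A sketch for a 1-based d-ary layout (a common convention, assumed here rather than taken from the slide):

      // 1-based d-ary heap: node i's children occupy d*(i-1)+2 .. d*(i-1)+d+1.
      inline int firstChildD(int i, int d) { return d * (i - 1) + 2; }
      inline int parentD(int i, int d)     { return (i - 2) / d + 1; }
      // for d = 2 these reduce to the familiar 2i and i/2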

  25. Merging Heaps
  Given two heaps, merge them into one heap
  • first attempt: insert each element of the smaller heap into the larger. runtime: O(n log n)
  • second attempt: concatenate the heaps' arrays and run buildHeap. runtime: O(n)
  How about O(log n) time?

  26. Solution: Leftist Heaps
  Idea: localize all maintenance work in one small part of the heap
  Leftist heap:
  • almost all nodes are on the left
  • all the merging work is on the right

  27. Null Path Length
  The null path length (NPL) of a node is the number of nodes between it and a null in the tree:
  • npl(null) = -1
  • npl(leaf) = 0
  • npl(single-child node) = 0
  Another way of looking at it: NPL is the height of the largest complete subtree rooted at this node.
  [diagram: a tree annotated with NPL values: 2 at the root, 1s in the middle, 0s along the fringe]
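  The recursive definition translates directly into code; a sketch (the node type is illustrative):

      #include <algorithm>

      struct Node { Node* left; Node* right; };

      // npl(null) = -1; otherwise one more than the smaller child's npl
      int npl(const Node* t) {
          if (t == nullptr) return -1;
          return 1 + std::min(npl(t->left), npl(t->right));
      }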

  28. Leftist Heap Properties
  Heap-order property
  • parent's priority value ≤ children's priority values
  • ⇒ minimum element is at the root
  Leftist property
  • ∀ nodes, NPL(left subtree) ≥ NPL(right subtree)
  • ⇒ the tree is at least as "heavy" on the left as the right
  Are leftist trees complete? Balanced? Socialist?

  29. Leftist Tree Examples
  [diagram: three trees annotated with NPL values; two are leftist, one is NOT leftist because some node's right subtree has a larger NPL than its left]
  every subtree of a leftist tree is leftist, comrade!

  30. Right Path in a Leftist Tree is Short
  Theorem: If the right path has length at least r, the tree has at least 2^r - 1 nodes.
  Proof by induction:
  Basis: r = 1. The tree has at least one node: 2^1 - 1 = 1.
  Inductive step: assume the claim holds for all r' < r. The right subtree has a right path of at least r - 1 nodes, so it has at least 2^(r-1) - 1 nodes. The left subtree must also have a right path of at least r - 1 (otherwise it would contain a null path shorter than the right subtree's, violating the leftist property), so it too has at least 2^(r-1) - 1 nodes. All told, the tree has at least (2^(r-1) - 1) + (2^(r-1) - 1) + 1 = 2^r - 1 nodes. ∎
  ⇒ A leftist tree with n nodes has a right path of at most log n nodes.

  31. Merging Two Leftist Heaps
  Merge(T1, T2) returns one leftist heap containing all elements of the two (distinct) leftist heaps T1 and T2.
  [diagram: if root a of T1 is smaller than root b of T2 (a < b), a stays at the root and T2 is merged recursively into a's right subtree R1]

  32. Merge Continued
  R' = Merge(R1, T2)
  [diagram: if npl(R') > npl(L1), swap a's children so that the leftist property is restored]
  runtime: O(log n) (the recursion only walks the two right paths)

  33. Operations on Leftist Heaps
  • merge two trees of total size n: O(log n)
  • insert into a heap of size n: O(log n)
    • pretend the new node is a size-1 leftist heap
    • insert by merging the original heap with the one-node heap
  • deleteMin on a heap of size n: O(log n)
    • remove and return the root
    • merge the left and right subtrees
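  Putting slides 31-33 together in code: a sketch of the recursive merge and the merge-based insert/deleteMin (node type and field names are illustrative, not the course's actual code):

      #include <algorithm>
      #include <utility>

      struct LNode { int key; int npl; LNode *left, *right; };

      LNode* merge(LNode* a, LNode* b) {
          if (a == nullptr) return b;
          if (b == nullptr) return a;
          if (b->key < a->key) std::swap(a, b);    // a keeps the smaller root
          a->right = merge(a->right, b);           // all the work is on the right
          int nl = a->left  ? a->left->npl  : -1;
          int nr = a->right ? a->right->npl : -1;
          if (nl < nr) {                           // restore the leftist property
              std::swap(a->left, a->right);
              std::swap(nl, nr);
          }
          a->npl = nr + 1;
          return a;
      }

      LNode* insert(LNode* root, LNode* node) {    // node: a fresh size-1 heap
          return merge(root, node);
      }

      LNode* deleteMin(LNode* root) {              // caller reads root->key first
          return merge(root->left, root->right);
      }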

  34. Merge Example
  [diagram: step-by-step recursive merge of two leftist heaps; the recursion descends the right paths, repeatedly keeping the smaller root and recursing on its right subtree]

  35. Sewing Up the Example
  [diagram: the recursion unwinds, filling in each '?' NPL value and swapping children wherever the left NPL is smaller than the right]
  Done?

  36. Finally
  [diagram: the finished leftist heap, rooted at 2, with all NPL values consistent]

  37. Iterative Leftist Merge
  • Downward pass: merge the right paths
  [diagram: the right paths of the two heaps are interleaved into a single sorted right path, as in a merge of sorted lists]

  38. Iterative Leftist Merge (part deux)
  • Upward pass: fix up the leftist heap property
  [diagram: walking back up the merged right path, children are swapped wherever the leftist property is violated]
  What do we need to do leftist merge iteratively? A stack of the right-path nodes, so the upward pass can revisit them.

  39. (One More) Amortized Time
  am·or·tize: to write off an expenditure for (office equipment, for example) by prorating over a certain period.
  time: a nonspatial continuum in which events occur in apparently irreversible succession from the past through the present to the future.
  am·or·tized time: running-time limit resulting from writing off expensive runs of an algorithm over multiple cheap runs of the algorithm, usually resulting in a lower overall running time than indicated by the worst possible case.
  If M operations take total O(M log N) time, the amortized time per operation is O(log N).

  40. Skew Heaps
  • Problems with leftist heaps
    • extra storage for NPL
    • two-pass merge (with stack!)
    • extra complexity/logic to maintain and check NPL
  • Solution: skew heaps
    • blind adjusting version of leftist heaps
    • amortized time for merge, insert, and deleteMin is O(log n)
    • worst-case time for all three is O(n)
    • merge always switches children when fixing the right path
    • iterative method has only one pass

  41. Merging Two Skew Heaps
  [diagram: as in the leftist merge, the smaller root a (a < b) survives, but here a's children are swapped unconditionally after T2 is merged into the right subtree]

  42. Skew Heap Example
  [diagram: step-by-step skew merge of heaps rooted at 3 and 5; each recursive step keeps the smaller root, merges into its right subtree, and swaps its children]

  43. Skew Heap Code

      Heap* merge(Heap* heap1, Heap* heap2) {
          if (heap1 == NULL) return heap2;
          if (heap2 == NULL) return heap1;
          if (heap1->findMin() < heap2->findMin()) {
              // the smaller root wins; its children are swapped unconditionally
              temp = heap1->right;
              heap1->right = heap1->left;
              heap1->left = merge(heap2, temp);
              return heap1;
          }
          return merge(heap2, heap1);  // otherwise retry with the arguments flipped
      }

  44. Heaps o' Heaps
