
CSE 326: Data Structures Lecture #17 Priority Queues


Presentation Transcript


  1. CSE 326: Data Structures, Lecture #17: Priority Queues. Alon Halevy, Spring Quarter 2001

  2. Binary Heap Priority Queue Data Structure • Heap-order property: parent’s key is less than its children’s keys; result: the minimum is always at the top • Structure property: complete tree with fringe nodes packed to the left; result: depth is always O(log n), and the next open location is always known [Figure: binary heap containing the keys 2 4 5 7 6 10 8 11 9 12 14 20]
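  The structure property is what makes the array representation work: with the root at index 1, a node’s parent and children are found by simple arithmetic. A minimal C++ sketch of that index arithmetic (the function names are illustrative, not from the lecture):

      #include <cassert>

      // 1-based array layout of a complete binary tree:
      // Heap[1] is the root, Heap[size] is the last (rightmost fringe) node.
      inline int parentOf(int i)     { assert(i > 1); return i / 2; }
      inline int leftChildOf(int i)  { return 2 * i; }
      inline int rightChildOf(int i) { return 2 * i + 1; }

      // Example: the node at index 5 has its parent at index 2 and its
      // children at indices 10 and 11 (when those indices are <= size).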

  3. DeleteMin: pqueue.deleteMin() returns the minimum (2) and leaves a hole at the root. [Figure: heap 2 4 5 7 6 10 8 11 9 12 14 20 before the call; afterwards the 2 is removed and the root position is an empty hole (?)]

  4. Percolate Down [Figure: step-by-step percolate-down of the hole left by deleteMin; at each step the hole swaps with its smaller child until the last element (20) can be placed]

  5. Finally… [Figure: resulting heap 4 6 5 7 12 10 8 11 9 20 14]

  6. DeleteMin Code

      int percolateDown(int hole, Object val) {
        while (2*hole <= size) {
          left = 2*hole; right = left + 1;
          // pick the smaller child (the right child exists only if right <= size)
          if (right <= size && Heap[right] < Heap[left])
            target = right;
          else
            target = left;
          if (Heap[target] < val) {     // child is smaller: move it up into the hole
            Heap[hole] = Heap[target];
            hole = target;
          } else
            break;
        }
        return hole;
      }

      Object deleteMin() {
        assert(!isEmpty());
        returnVal = Heap[1];
        size--;
        newPos = percolateDown(1, Heap[size+1]);   // re-place the last element
        Heap[newPos] = Heap[size + 1];
        return returnVal;
      }

  runtime: O(log n)

  7. Insert: pqueue.insert(3) creates a hole in the next open position at the fringe. [Figure: heap 2 4 5 7 6 10 8 11 9 12 14 20 with a new empty position (?) added in the bottom row]

  8. Percolate Up [Figure: the new hole repeatedly swaps with its parent while the parent’s key is larger than 3; the 3 moves past 10 and 5 and comes to rest as the root’s right child]

  9. Insert Code

      int percolateUp(int hole, Object val) {
        while (hole > 1 && val < Heap[hole/2]) {   // parent is larger: move it down
          Heap[hole] = Heap[hole/2];
          hole /= 2;
        }
        return hole;
      }

      void insert(Object o) {
        assert(!isFull());
        size++;
        newPos = percolateUp(size, o);
        Heap[newPos] = o;
      }

  runtime: O(log n)
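  Slides 6 and 9 together describe a complete array-based binary min-heap. The following self-contained C++ sketch consolidates them, specialized to int keys; the class name MinHeap, the std::vector storage, and the demo in main are choices made here, not the lecture's code:

      #include <cassert>
      #include <iostream>
      #include <vector>

      class MinHeap {
      public:
          MinHeap() : heap(1), count(0) {}          // heap[0] unused; root at index 1

          bool isEmpty() const { return count == 0; }

          void insert(int x) {                       // percolate up from the new slot
              heap.push_back(0);
              ++count;
              int hole = count;
              while (hole > 1 && x < heap[hole / 2]) {
                  heap[hole] = heap[hole / 2];
                  hole /= 2;
              }
              heap[hole] = x;
          }

          int deleteMin() {                          // percolate the last element down
              assert(!isEmpty());
              int minVal = heap[1];
              int last = heap[count];
              heap.pop_back();
              --count;
              if (count > 0) {
                  int hole = 1;
                  while (2 * hole <= count) {
                      int target = 2 * hole;
                      if (target + 1 <= count && heap[target + 1] < heap[target])
                          ++target;                  // pick the smaller child
                      if (heap[target] < last) {
                          heap[hole] = heap[target];
                          hole = target;
                      } else break;
                  }
                  heap[hole] = last;
              }
              return minVal;
          }

      private:
          std::vector<int> heap;
          int count;
      };

      int main() {
          MinHeap pq;
          for (int x : {2, 4, 5, 7, 6, 10, 8}) pq.insert(x);
          pq.insert(3);
          while (!pq.isEmpty()) std::cout << pq.deleteMin() << ' ';  // keys in sorted order
          std::cout << '\n';
      }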

  10. Performance of Binary Heap • In practice: binary heaps are much simpler to code and have lower constant-factor overhead

  11. Changing Priorities • In many applications the priority of an object in a priority queue may change over time • if a job has been sitting in the printer queue for a long time, increase its priority • unix “renice” • Alon’s story • Must have some (separate) way of finding the position in the queue of the object to change (e.g. a hash table)
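  One common way to support that lookup is a dictionary from item identity to current array index, updated every time the heap moves an element. A rough sketch of the bookkeeping, assuming items carry an integer id (the Entry struct, the positionOf map, and the placeEntry helper are hypothetical names, not part of the course code):

      #include <unordered_map>
      #include <vector>

      // Heap entries carry an id; positionOf[id] always names the entry's current index.
      struct Entry { int id; int key; };

      std::vector<Entry> Heap(1);                 // 1-based, index 0 unused
      std::unordered_map<int, int> positionOf;    // id -> index in Heap

      // Every time the heap moves an entry into a new slot, record the move.
      void placeEntry(int index, const Entry& e) {
          Heap[index] = e;
          positionOf[e.id] = index;
      }

      // decreaseKey can then start from positionOf[id] instead of searching the array.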

  12. Other Priority Queue Operations • decreaseKey • given the position of an object in the queue, reduce its priority value • increaseKey • given the position of an object in the queue, increase its priority value • remove • given the position of an object in the queue, remove it • buildHeap • given a set of items, build a heap

  13. DecreaseKey, IncreaseKey, and Remove

      void decreaseKey(int pos, int delta) {
        temp = Heap[pos] - delta;
        newPos = percolateUp(pos, temp);
        Heap[newPos] = temp;
      }

      void increaseKey(int pos, int delta) {
        temp = Heap[pos] + delta;
        newPos = percolateDown(pos, temp);
        Heap[newPos] = temp;
      }

      void remove(int pos) {
        percolateUp(pos, NEG_INF_VAL);  // float a hole from pos up to the root,
        deleteMin();                    // then delete it as the (conceptual) minimum
      }

  14. BuildHeap: Floyd’s Method. Thank you, Floyd. Given the array 12 5 11 3 10 6 9 4 8 1 7 2, pretend it’s a heap and fix the heap-order property! Easy worst case bound: O(n log n) (n percolateDowns of O(log n) each). Easy average case bound: O(n). [Figure: the array drawn as an (unordered) complete binary tree]
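  Floyd’s method can be written directly over the array; a minimal C++ sketch assuming the same 1-based layout used on the earlier slides (buildHeap over a std::vector is an illustrative choice, not the lecture’s code):

      #include <vector>

      // Floyd's buildHeap: percolate down every non-leaf node, bottom-up.
      // heap is 1-based: heap[1..size] hold the keys, heap[0] is unused.
      void buildHeap(std::vector<int>& heap) {
          int size = static_cast<int>(heap.size()) - 1;
          for (int i = size / 2; i >= 1; --i) {      // leaves need no work
              int val = heap[i];
              int hole = i;
              while (2 * hole <= size) {
                  int target = 2 * hole;
                  if (target + 1 <= size && heap[target + 1] < heap[target])
                      ++target;                      // smaller child
                  if (heap[target] < val) {
                      heap[hole] = heap[target];
                      hole = target;
                  } else break;
              }
              heap[hole] = val;
          }
      }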

  15. Build(this)Heap [Figure: snapshots of Floyd’s method repairing the tree for 12 5 11 3 10 6 9 4 8 1 7 2, percolating down each internal node from the bottom up]

  16. Finally… [Figure: resulting heap 1 3 2 4 5 6 9 12 8 10 7 11] runtime?

  17. Complexity of Build Heap • Note: the size of a perfect binary tree doubles (+1) with each additional layer • At most n/4 nodes percolate down 1 level, at most n/8 percolate down 2 levels, at most n/16 percolate down 3 levels, … so the total work is O(n)
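  Written out, the per-level counts form a weighted geometric series (a standard derivation consistent with the slide, shown here in LaTeX):

      \sum_{d \ge 1} d \cdot \frac{n}{2^{d+1}}
        = \frac{n}{4}\cdot 1 + \frac{n}{8}\cdot 2 + \frac{n}{16}\cdot 3 + \cdots
        = \frac{n}{2} \sum_{d \ge 1} \frac{d}{2^{d}}
        = \frac{n}{2}\cdot 2 = n,\quad\text{which is } O(n).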

  18. Thinking about Heaps • Observations • finding a child/parent index is a multiply/divide by two • operations jump widely through the heap • each operation looks at only two new nodes • inserts are at least as common as deleteMins • Realities • division and multiplication by powers of two are fast • looking at only one new piece of data is a terrible use of a cache line • with huge data sets, disk accesses dominate

  19. Solution: d-Heaps • Each node has d children • Still representable by array • Good choices for d: • optimize performance based on # of inserts/removes • choose a power of two for efficiency • fit one set of children on a memory page/disk block [Figure: a 3-heap with root 1, and its array layout 1 3 7 2 4 8 5 12 11 10 6 9] What do d-heaps remind us of???
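  With a 1-based array, the child/parent arithmetic generalizes directly from the binary case; a small C++ sketch, with d fixed at compile time (function names are illustrative, not from the lecture):

      // Index arithmetic for a d-ary heap stored 1-based in an array.
      // With d = 2 these reduce to the familiar 2*i, 2*i + 1, and i/2.
      constexpr int d = 4;

      inline int firstChild(int i) { return d * (i - 1) + 2; }  // children occupy a block of d slots
      inline int lastChild(int i)  { return d * i + 1; }
      inline int parent(int i)     { return (i - 2) / d + 1; }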

  20. New Operation: Merge Given two heaps, merge them into one heap • first attempt: insert each element of the smaller heap into the larger. runtime: O(n log n) • second attempt: concatenate the heaps’ arrays and run buildHeap. runtime: O(n) How about O(log n) time?
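  The second attempt is easy to express with the C++ standard library; a hedged sketch using std::make_heap with std::greater to obtain a min-heap (a choice made here, not the lecture’s array code):

      #include <algorithm>
      #include <functional>
      #include <vector>

      // Merge two min-heaps (stored as arrays) by concatenating and rebuilding: O(n).
      std::vector<int> mergeByBuildHeap(std::vector<int> a, const std::vector<int>& b) {
          a.insert(a.end(), b.begin(), b.end());                    // concatenate
          std::make_heap(a.begin(), a.end(), std::greater<int>());  // Floyd's buildHeap
          return a;
      }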

  21. Idea: Hang a New Tree [Figure: the two heaps (roots 1 and 2) are hung as the children of a new root position, leaving a hole (?) at the top] Note: we just gave up the nice structural property of binary heaps! Now, just percolate down! [Figure: after percolating the hole down, the combined tree is a heap again]

  22. Problem? [Figure: merging this way can produce a long, degenerate chain: 1 2 4 12 15 15 18] Need some other kind of balance condition…

  23. Leftist Heaps • Idea: make it so that all the work you have to do in maintaining a heap is confined to one small part of the tree • Leftist heap: • almost all nodes are on the left • all the merging work is on the right

  24. Random Definition: Null Path Length • the null path length (npl) of a node is the number of nodes between it and a null in the tree • npl(null) = -1 • npl(leaf) = 0 • npl(single-child node) = 0 [Figure: an example tree with every node labeled by its npl, from 2 at the root down to 0 at the leaves]
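  Equivalently, npl has a one-line recursive definition, which is also how implementations typically compute or maintain it; a minimal C++ sketch (the Node struct is an assumed type, not the lecture’s):

      #include <algorithm>

      struct Node {
          int key;
          Node* left  = nullptr;
          Node* right = nullptr;
      };

      // npl(null) = -1; otherwise one more than the smaller child's npl.
      int npl(const Node* n) {
          if (n == nullptr) return -1;
          return 1 + std::min(npl(n->left), npl(n->right));
      }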

  25. Leftist Heap Properties • Heap-order property • parent’s priority value is ≤ children’s priority values • result: minimum element is at the root • Leftist property • null path length of the left subtree is ≥ npl of the right subtree • result: tree is at least as “heavy” on the left as the right Are leftist trees complete? Balanced?

  26. Leftist tree examples [Figure: three trees labeled with npl values; one is NOT leftist (a node whose right npl exceeds its left npl), the other two are leftist] every subtree of a leftist tree is leftist, comrade!

  27. Right Path in a Leftist Tree is Short • If the right path has length at least r, the tree has at least 2^r - 1 nodes • Proof by induction. Basis: r = 1. The tree has at least one node: 2^1 - 1 = 1. Inductive step: assume the claim holds for r’ < r. The right subtree has a right path of at least r - 1 nodes, so it has at least 2^(r-1) - 1 nodes. The left subtree must also have a right path of at least r - 1 (otherwise there would be a null path of length r - 3 on the left, less than the right subtree’s, violating the leftist property), so it too has at least 2^(r-1) - 1 nodes. All told, there are at least (2^(r-1) - 1) + (2^(r-1) - 1) + 1 = 2^r - 1 nodes • So a leftist tree with n nodes has a right path of at most log(n+1), i.e. O(log n), nodes [Figure: example leftist tree with npl labels]

  28. Merging Two Leftist Heaps • merge(T1,T2) returns one leftist heap containing all elements of the two (distinct) leftist heaps T1 and T2 [Figure: with a < b, T1’s root a keeps its left subtree L1, and its right subtree becomes the recursive merge of R1 with all of T2]

  29. Merge Continued [Figure: let R’ = Merge(R1, T2); if npl(R’) > npl(L1), swap the children so R’ becomes the left child] constant work at each merge step; the recursion traverses only the right path of each tree; total work = O(log n)
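  A compact recursive version of that merge, as a C++ sketch (the LNode type and the stored npl field are assumptions consistent with the slides, not the course’s exact code):

      #include <algorithm>

      struct LNode {
          int key;
          int npl = 0;                      // null path length, maintained on the way back up
          LNode* left  = nullptr;
          LNode* right = nullptr;
      };

      int nplOf(const LNode* n) { return n ? n->npl : -1; }

      // Merge two leftist heaps; only right paths are traversed, so O(log n).
      LNode* merge(LNode* t1, LNode* t2) {
          if (!t1) return t2;
          if (!t2) return t1;
          if (t2->key < t1->key) std::swap(t1, t2);   // keep the smaller root on top
          t1->right = merge(t1->right, t2);           // recursive merge down the right path
          if (nplOf(t1->right) > nplOf(t1->left))     // restore the leftist property
              std::swap(t1->left, t1->right);
          t1->npl = nplOf(t1->right) + 1;
          return t1;
      }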

  30. Operations on Leftist Heaps • merge with two trees of total size n: O(log n) • insert with heap size n: O(log n) • pretend the new node is a size-1 leftist heap • insert by merging the original heap with the one-node heap • deleteMin with heap size n: O(log n) • remove and return the root • merge the left and right subtrees
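  On top of that merge, insert and deleteMin are one-liners; a sketch using the same assumed LNode type (illustrative only, not the lecture’s code):

      // insert: merge the existing heap with a freshly allocated one-node heap.
      LNode* insert(LNode* heap, int key) {
          return merge(heap, new LNode{key});
      }

      // deleteMin: the minimum is the root; merging its subtrees re-forms the heap.
      LNode* deleteMin(LNode* heap, int& minOut) {
          minOut = heap->key;               // assumes heap != nullptr
          LNode* rest = merge(heap->left, heap->right);
          delete heap;
          return rest;
      }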

  31. Example [Figure: trace of merging two leftist heaps; each level shows the pending recursive merge call down the right paths, with npl labels on the nodes]

  32. Sewing Up the Example [Figure: unwinding the recursion, each node’s npl is recomputed and children are swapped wherever the leftist property is violated] Done?

  33. Finally… [Figure: the finished leftist heap, with root 2 and npl labels on every node]

  34. Skew Heaps • Problems with leftist heaps • extra storage for npl • two-pass merge (with stack!) • extra complexity/logic to maintain and check npl • Solution: skew heaps • a blindly adjusting version of leftist heaps • amortized time for merge, insert, and deleteMin is O(log n) • worst case time for all three is O(n) • merge always switches children when fixing the right path • the iterative method has only one pass What do skew heaps remind us of?

  35. Merging Two Skew Heaps [Figure: with a < b, the root a’s children are swapped: the old left subtree L1 becomes the new right subtree, and the recursive merge of R1 with T2 becomes the new left subtree] Notice the old switcheroo!

  36. Example [Figure: trace of merging two skew heaps (roots 3 and 5), swapping children at each step of the right-path recursion]

  37. Skew Heap Code

      heap merge(heap1, heap2) {
        if (heap1 == NULL) return heap2;
        if (heap2 == NULL) return heap1;
        if (heap1.findMin() < heap2.findMin()) {
          // the old switcheroo: swap children while merging down the right path
          temp = heap1.right;
          heap1.right = heap1.left;
          heap1.left = merge(heap2, temp);
          return heap1;
        } else {
          return merge(heap2, heap1);
        }
      }

  38. Comparing Heaps: Binary Heaps, d-Heaps, Binomial Queues, Leftist Heaps, Skew Heaps
