
Algorithm & Application

Algorithm : A step-by-step procedure for solving a problem

Prof. Hyunchul Shin

shin@hanyang.ac.kr

Hanyang University

Foundations of Algorithms
  • Richard Neapolitan and Kumarss Naimipour
  • 3rd Edition. Jones and Bartlett Computer Science, 2004
  • Time : CPU cycles
  • Storage: memory
  • Instance: Each specific assignment of values to parameters
Problem

Is the number x in the list S of n numbers?

The answer is yes if x is in S and no if it is not.

(ex) S={10,7,11,5,13,8} , n=6 , and x=5 .

Solution “yes”

Algorithm : search( S, n, x )
{
    for ( i = 1; i <= n; i++ )
        if ( S[i] == x ) return "yes";
    return "no";
}   /* cf. text p. 5 */
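For reference, a compilable C++ sketch of this sequential search; the function name seqsearch, the 0-based indexing, and the use of std::vector are choices made here, not taken from the text.

#include <iostream>
#include <string>
#include <vector>

// Sequential search: returns "yes" if x occurs in S, "no" otherwise.
std::string seqsearch(const std::vector<int>& S, int x) {
    for (int v : S)                  // examine the items one by one
        if (v == x) return "yes";
    return "no";
}

int main() {
    std::vector<int> S = {10, 7, 11, 5, 13, 8};   // n = 6
    std::cout << seqsearch(S, 5) << "\n";         // prints "yes"
}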

Exchange Sort

Problem : Sort n keys in nondecreasing order

Inputs : n, S[1],…,S[n]

Outputs : Sorted keys in the array S.

Algorithm: Exchange Sort
{
    for ( i = 1; i <= n; i++ )
        for ( j = i+1; j <= n; j++ )
            if ( S[j] < S[i] )
                exchange S[i] and S[j];
}
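A minimal C++ rendering of the same exchange sort, assuming 0-based indices rather than the 1..n pseudocode above:

#include <algorithm>   // std::swap
#include <iostream>
#include <vector>

// Exchange sort: after pass i, S[i] holds the i-th smallest key.
void exchangesort(std::vector<int>& S) {
    int n = S.size();
    for (int i = 0; i < n - 1; i++)
        for (int j = i + 1; j < n; j++)
            if (S[j] < S[i])
                std::swap(S[i], S[j]);
}

int main() {
    std::vector<int> S = {4, 3, 1, 5};
    exchangesort(S);
    for (int k : S) std::cout << k << ' ';   // prints 1 3 4 5
    std::cout << '\n';
}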

Algorithm Exchange Sort

Algorithm: Exchange Sort        (ex) n=4, S=[4 3 1 5]
{
    for ( i = 1; i <= n; i++ )
        for ( j = i+1; j <= n; j++ )
            if ( S[j] < S[i] )
                exchange S[i] and S[j];
}

Homework

Show i , j , S , for exchange sort of

S=[ 3 8 5 9 7].

Due 1 week

Trace for S = [4 3 1 5]:
i=1 : [3 4 1 5] (j=2), [1 4 3 5] (j=3)
i=2 : [1 3 4 5] (j=3)
i=3 : no exchange; final S = [1 3 4 5]

Matrix Multiplication

C(n×n) = A(n×n) · B(n×n)

c_ij = Σ(k=1..n) a_ik · b_kj , for 1 <= i <= n, 1 <= j <= n.

Algorithm { /* Matrix multiplication */
    for ( i = 1; i <= n; i++ )
        for ( j = 1; j <= n; j++ ) {
            C[i][j] = 0;
            for ( k = 1; k <= n; k++ )
                C[i][j] = C[i][j] + A[i][k] × B[k][j];
        }
}
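A runnable C++ sketch of this O(n³) multiplication; the Matrix alias and the 2×2 test data are illustrative assumptions.

#include <iostream>
#include <vector>

using Matrix = std::vector<std::vector<int>>;

// Standard O(n^3) matrix multiplication: C = A * B for n x n matrices.
Matrix multiply(const Matrix& A, const Matrix& B) {
    int n = A.size();
    Matrix C(n, std::vector<int>(n, 0));
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            for (int k = 0; k < n; k++)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

int main() {
    Matrix A = {{1, 2}, {3, 4}}, B = {{5, 6}, {7, 8}};
    for (const auto& row : multiply(A, B)) {
        for (int v : row) std::cout << v << ' ';
        std::cout << '\n';                    // prints 19 22 / 43 50
    }
}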

Fibonacci Sequence

f_0 = 0
f_1 = 1
f_n = f_(n-1) + f_(n-2) , for n >= 2.

(ex)
f_2 = f_1 + f_0 = 1 + 0 = 1
f_3 = f_2 + f_1 = 1 + 1 = 2
f_4 = f_3 + f_2 = 2 + 1 = 3
f_5 = f_4 + f_3 = 3 + 2 = 5

Fibonacci (Recursive)

int fib (int n)
{   /* divide-and-conquer : chap. 2 */
    if (n <= 1) return n;
    else return ( fib(n-1) + fib(n-2) );
}

(ex) fib(5) computation

Fibonacci (Iterative)

int fib_iter (int n)
{   /* dynamic programming : chap. 3 */
    index i;
    int f[0..n];
    f[0] = 0;
    if (n > 0) {
        f[1] = 1;
        for ( i = 2; i <= n; i++ )
            f[i] = f[i-1] + f[i-2];
    }
    return f[n];
}

Complexity (cf. text p. 16)

fib(100) would take about 13 days, whereas fib_iter(100) takes about 101 ns (n + 1 = 101 terms, at roughly 1 ns per term).
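To see the gap concretely, a small C++ sketch with both versions side by side; the function names follow the slides, everything else is illustrative.

#include <iostream>
#include <vector>

// Divide-and-conquer version: exponential number of recursive calls.
long long fib(int n) {
    if (n <= 1) return n;
    return fib(n - 1) + fib(n - 2);
}

// Dynamic-programming version: n + 1 array entries, linear time.
long long fib_iter(int n) {
    std::vector<long long> f(n + 1);
    f[0] = 0;
    if (n > 0) {
        f[1] = 1;
        for (int i = 2; i <= n; i++)
            f[i] = f[i - 1] + f[i - 2];
    }
    return f[n];
}

int main() {
    std::cout << fib(30) << ' ' << fib_iter(30) << '\n';   // both print 832040
}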

Complexity: Exchange Sort

Algorithm: Exchange Sort
{
    for ( i = 1; i <= n; i++ )
        for ( j = i+1; j <= n; j++ )
            if ( S[j] < S[i] )
                exchange S[i] and S[j];
}

Basic operation: Comparison of S[j] with S[i]

Input size: n, the number of items to be sorted.

Complexity: the number of basic operations

T(n) = (n-1) + (n-2) + (n-3) + … + 1
     = (n-1)·n / 2
     ∈ O(n²)

Complexity: Matrix Multiplication

Algorithm { /* Matrix multiplication */
    for ( i = 1; i <= n; i++ )
        for ( j = 1; j <= n; j++ ) {
            C[i][j] = 0;
            for ( k = 1; k <= n; k++ )
                C[i][j] = C[i][j] + A[i][k] × B[k][j];
        }
}

Basic operation: multiplication (innermost for loop)

Input size: n, #rows and #columns

Complexity:

T(n) = n × n × n = n³ ∈ O(n³)

Memory Complexity

Analysis of algorithm efficiency in terms of memory.

Time complexity is usually used.

Memory complexity is occasionally useful.

Order : Big O
  • Definition

For a given complexity function f(n), O(f(n)) is the set of complexity functions g(n) for which there exists some positive real constant c and some nonnegative integer N such that for all n ≥ N,

g(n) ≤ c · f(n).

(ex)

T1(n) = (n-1)·n/2 ∈ O(n²)

T2(n) = n³ ∈ O(n³)

T3(n) = 10000·n² + 1000·n ∈ O(n²)

(cf. p29)

Divide and Conquer

Top-Down Approach (p47)

Divide the problem into subproblems

Conquer subproblems

Obtain the solution from the solutions of subproblems

Binary search

Problem: Is x in the sorted array S of size n ?

Inputs: Sorted array S, a key x.

Outputs: Location of x in S

(0 if x is not in S)

Binary Search

locationout = location(1, n);

index location (index low, index high)
{
    index mid;
    if (low > high) return 0;
    else {
        mid = ⌊(low + high) / 2⌋;
        if (x == S[mid]) return mid;
        else if (x < S[mid])
            return location(low, mid-1);
        else return location(mid+1, high);
    }
}
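A hedged C++ sketch of the same recursive binary search; it passes the array and key as parameters instead of using the globals S and x assumed by the pseudocode, and returns -1 rather than 0 for "not found" because of the 0-based indexing.

#include <iostream>
#include <vector>

// Recursive binary search on a sorted vector S (0-based indices).
int location(const std::vector<int>& S, int x, int low, int high) {
    if (low > high) return -1;            // x is not in S[low..high]
    int mid = (low + high) / 2;
    if (x == S[mid]) return mid;
    if (x < S[mid])  return location(S, x, low, mid - 1);
    return location(S, x, mid + 1, high);
}

int main() {
    std::vector<int> S = {5, 7, 8, 10, 11, 13};              // sorted
    std::cout << location(S, 11, 0, S.size() - 1) << '\n';   // prints 4
}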

Worst-Case Complexity: Binary Search

locationout = location(1, n);

index location (index low, index high)
{
    index mid;
    if (low > high) return 0;
    else {
        mid = ⌊(low + high) / 2⌋;
        if (x == S[mid]) return mid;
        else if (x < S[mid])
            return location(low, mid-1);
        else return location(mid+1, high);
    }
}

Basic operation: Comparison of x with S[mid]

Input size: n (#items in the array S)

W(n)=W(n/2) + 1

[Figure: tree of recursive calls, one comparison per call level]

Complexity: Binary Search

W(n) = W(n/2) + 1 , for n > 1, n a power of 2
W(1) = 1

It appears that W(n) = log n + 1.

(Induction base)
For n = 1, W(1) = 1 = log 1 + 1.

(Induction hypothesis)
Assume that W(n) = log n + 1.

(Induction step)
Show that W(2n) = log(2n) + 1:
W(2n) = W(n) + 1 = (log n + 1) + 1 = log n + log 2 + 1 = log(2n) + 1.

Quick Sort

Sort by dividing the array into two partitions using a pivot item
(ex: the first item).

void quicksort (index low, index high)
{
    index pivot;                      /* index of the pivot item */
    if (high > low) {
        partition(low, high, pivot);
        quicksort(low, pivot - 1);
        quicksort(pivot + 1, high);
    }
}
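The partition routine is only named above; the sketch below supplies one plausible C++ version (first item as pivot, as in the example) so the whole procedure runs end to end. It is an illustration, not the text's exact partition algorithm.

#include <algorithm>   // std::swap
#include <iostream>
#include <vector>

// Partition S[low..high] around S[low] and return the pivot's final position.
int partition(std::vector<int>& S, int low, int high) {
    int pivotitem = S[low];
    int j = low;                               // end of the "< pivot" region
    for (int i = low + 1; i <= high; i++)
        if (S[i] < pivotitem)
            std::swap(S[++j], S[i]);
    std::swap(S[low], S[j]);                   // put the pivot in its final place
    return j;
}

void quicksort(std::vector<int>& S, int low, int high) {
    if (high > low) {
        int pivot = partition(S, low, high);
        quicksort(S, low, pivot - 1);
        quicksort(S, pivot + 1, high);
    }
}

int main() {
    std::vector<int> S = {20, 15, 25, 22, 11, 20, 30, 27};
    quicksort(S, 0, S.size() - 1);
    for (int k : S) std::cout << k << ' ';     // 11 15 20 20 22 25 27 30
    std::cout << '\n';
}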

Homework

Given 20 15 25 22 11 20 30 27 (n=8)

  • Mergesort as in Fig 2.2 P54
  • Quicksort as in Fig 2.3 P61
  • Partition as in Table 2.2 P62

Due 1 week

Worst-case complexity: Quick sort
  • Worst case: when the array is already sorted
  • Time to partition: T_p(n) = n - 1
  • Time to sort the left subarray = T(0)
  • Time to sort the right subarray = T(n-1)
  • Quick Sort

T(n) = T(0) + T(n-1) + (n-1) , for n > 0
T(n) = T(n-1) + (n-1) , since T(0) = 0
T(n) = n(n-1)/2 ∈ O(n²)

  • Average-case complexity: O(n log n)
Dynamic Programming (Bottom-up)
  • Dynamic programming
    • Establish a recursive property
    • Solve in bottom-up fashion by solving smaller instances first
  • (ex): Fibonacci (Iterative)
  • Divide-and-conquer
    • Divide a problem into smaller instances
    • Solve these smaller instances (blindly)
    • Examples:
      • Fibonacci (Recursive): Instances are related
      • Merge sort: Instances are unrelated
Binomial Coefficient
  • C(n, k) = n! / ( k! (n-k)! ) , for 0 <= k <= n
  • Frequently, n! is too large to compute directly
  • Recursive property: C(n, k) = C(n-1, k-1) + C(n-1, k) for 0 < k < n, with C(n, 0) = C(n, n) = 1
Binomial coefficients: Divide-and-conquer
  • Algorithm (divide-and-conquer)

/* Inefficient */
int bin (int n, int k)
{
    if ( k == 0 || n == k )
        return 1;
    else
        return bin(n-1, k-1) + bin(n-1, k);
}

Binomial coefficients

Figure 3.1: The array B used to compute the binomial coefficient.

Complexity : O(n·k)


Example 3.1: Compute B[4][2] ( = C(4, 2) ).

Compute row 0:

{This is done only to mimic the algorithm exactly.}

  {The value B [0] [0] is not needed in a later computation.}

B[0][0] = 1

Compute row 1:
B[1][0] = 1    B[1][1] = 1

Compute row 2:
B[2][0] = 1    B[2][1] = B[1][0] + B[1][1] = 1 + 1 = 2    B[2][2] = 1

Compute row 3:
B[3][0] = 1    B[3][1] = B[2][0] + B[2][1] = 1 + 2 = 3    B[3][2] = B[2][1] + B[2][2] = 2 + 1 = 3

Compute row 4:
B[4][0] = 1    B[4][1] = B[3][0] + B[3][1] = 1 + 3 = 4    B[4][2] = B[3][1] + B[3][2] = 3 + 3 = 6

Binomial Coefficient: Dynamic Programming
  • Establish a recursive property
  • Solve in bottom up fashion

Algorithm:

int bin2 (int n, int k)
{
    index i, j;
    int B[0..n][0..k];
    for (i = 0; i <= n; i++)
        for (j = 0; j <= minimum(i, k); j++)
            if (j == 0 || j == i)
                B[i][j] = 1;
            else
                B[i][j] = B[i-1][j-1] + B[i-1][j];
    return B[n][k];
}

HOMEWORK
  • Use dynamic programming approach to compute B[5][3].
  • Draw diagram like figure 3.1 (Page 94)
  • Due in 1 week
Binary Search Tree
  • Definition: For a given node n,
    • Each node contains one key
    • Key (node in the left subtree of n) <= Key (n)
    • Key(n) <= Key(node in the right subtree of n)
  • Optimality (minimum average search time) depends on the probability of searching for each key
Binary Search Tree
  • Depth(n): # edges in the unique path from the root to n.
  • (Depth=level)
  • Search time = depth(key) + 1
  • The root has a depth of 0.
Binary Search Algorithm

struct nodetype {
    keytype key;
    nodetype* left;
    nodetype* right;
};

typedef nodetype* node_pointer;

void search (node_pointer tree, keytype keyin, node_pointer& p)
{   // assumes keyin is present in the tree
    bool found = false;
    p = tree;
    while (!found)
        if (p->key == keyin) found = true;
        else if (keyin < p->key) p = p->left;
        else p = p->right;
}

Greedy Approach

Start with an empty set and add items to the set

until the set represents a solution.

Each iteration consists of the following

components:

A selection procedure

A feasibility check

A solution check

Spanning Tree

A connected subgraph that contains all the vertices and

is a tree.

Graph

G=(V , E)

Where V is a finite set of vertices

and E is a set of edges

(pairs of vertices in V).

(ex)

V = {v1, v2, v3, v4, v5}

E = {(v1, v2), (v1, v3), (v2, v3), (v2, v4), (v3, v4), (v3, v5), (v4, v5)}

Prim’s Algorithm

Figure 4.4: A weighted graph (in the upper-left corner) and the steps in Prim's algorithm for that graph. The vertices in Y and the edges in F are shaded at each step.

Prim’s Algorithm

F = Ø;
for (i = 2; i <= n; i++) {            // Initialize
    nearest[i] = 1;                   // v1 is the nearest tree vertex
    distance[i] = W[1][i];            // distance is the edge weight
}
repeat (n - 1 times) {                // Add all n - 1 vertices
    min = ∞;
    for (i = 2; i <= n; i++)
        if (0 ≤ distance[i] < min) {
            min = distance[i];
            vnear = i;
        }
    e = edge connecting vnear and nearest[vnear];
    F = F ∪ {e};                      // add e to F
    distance[vnear] = -1;
    for (i = 2; i <= n; i++)          // update distances
        if (W[i][vnear] < distance[i]) {
            distance[i] = W[i][vnear];
            nearest[i] = vnear;
        }
}
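A compact, runnable C++ version of this O(n²) form of Prim's algorithm. The adjacency-matrix representation with a large INF constant for missing edges, the 0-based vertex numbering, and the sample graph are assumptions made here.

#include <iostream>
#include <utility>
#include <vector>

const int INF = 1000000000;   // stands in for "no edge"

// Prim's algorithm, O(n^2): returns the edges (u, v) of a minimum spanning
// tree of the weighted graph given by the n x n adjacency matrix W.
std::vector<std::pair<int,int>> prim(const std::vector<std::vector<int>>& W) {
    int n = W.size();
    std::vector<int> nearest(n, 0), distance(n);
    std::vector<std::pair<int,int>> F;
    for (int i = 1; i < n; i++) distance[i] = W[0][i];     // start from vertex 0
    for (int step = 0; step < n - 1; step++) {
        int min = INF + 1, vnear = -1;
        for (int i = 1; i < n; i++)                        // pick the closest vertex
            if (distance[i] >= 0 && distance[i] < min) { min = distance[i]; vnear = i; }
        F.push_back({nearest[vnear], vnear});              // add the edge to the tree
        distance[vnear] = -1;                              // mark vnear as in the tree
        for (int i = 1; i < n; i++)                        // update distances
            if (W[i][vnear] < distance[i]) {
                distance[i] = W[i][vnear];
                nearest[i] = vnear;
            }
    }
    return F;
}

int main() {
    std::vector<std::vector<int>> W = {
        {0, 1, 3, INF, INF},
        {1, 0, 3, 6, INF},
        {3, 3, 0, 4, 2},
        {INF, 6, 4, 0, 5},
        {INF, INF, 2, 5, 0}};
    for (auto [u, v] : prim(W))
        std::cout << "(" << u + 1 << "," << v + 1 << ") ";  // print 1-based vertices
    std::cout << '\n';
}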

Prim’s Spanning Tree

Complexity : O(n²)

(n-1) iterations of the repeat loop, and (n-1) iterations in each of the two inner for loops, so
T(n) = 2(n-1)(n-1) ∈ Θ(n²)

Theorem
Prim's algorithm always produces a minimum spanning tree.

Dijkstra’s Shortest Paths

Figure 4.8: A weighted, directed graph (in the upper-left corner) and the steps in Dijkstra's algorithm for that graph. The vertices in Y and the edges in F are shaded in color at each step.

Dijkstra’s Algorithm

F = Ø;
for (i = 2; i <= n; i++) {            // Initialize
    touch[i] = 1;                     // v1 is the last vertex on the current shortest path from v1
    length[i] = W[1][i];
}
repeat (n - 1 times) {
    min = ∞;
    for (i = 2; i <= n; i++)
        if (0 ≤ length[i] < min) {
            min = length[i];
            vnear = i;
        }
    e = edge from touch[vnear] to vnear;
    F = F ∪ {e};                      // add e to F
    for (i = 2; i <= n; i++)
        if (length[vnear] + W[vnear][i] < length[i]) {
            length[i] = length[vnear] + W[vnear][i];
            touch[i] = vnear;
        }
    length[vnear] = -1;
}

Complexity

Prim's and Dijkstra's : O(n²)
Heap implementation : O(m log n)
Fibonacci heap implementation : O(m + n log n)

1. Find a minimum spanning tree for the following graph.
2. Find the shortest paths from v4 to all the other vertices.

Scheduling

Minimizing the total time (waiting + service) in the system.

(ex) Three jobs : t1 = 5, t2 = 10, t3 = 4.

Schedule        Total Time in the System
[1, 2, 3]       5 + (5+10) + (5+10+4) = 39
[1, 3, 2]       5 + (5+4) + (5+4+10) = 33
[2, 1, 3]       10 + (10+5) + (10+5+4) = 44
. . .
[3, 1, 2]       4 + (4+5) + (4+5+10) = 32

There are 3! possible schedules.

Optimal Scheduling for Total Time

Smallest service time first:
// Sort the jobs in nondecreasing order of service time.
// Schedule them in sorted order.

Complexity (sorting) : W(n) ∈ O(n log n)

Schedule with Deadlines

Schedule to maximize the total profit.

Each job takes one unit of time to finish.

(ex)

Job    Deadline    Profit
 1        2          30
 2        1          35
 3        2          25
 4        1          40

[1, 3] : total profit = 30 + 25 = 55
[2, 1] : total profit = 35 + 30 = 65
[4, 1] : total profit = 40 + 30 = 70  (optimal)

Is highest profit first optimal?

Schedule with Deadlines

Both profit and deadline must be considered.

Problem : Maximize the total profit.
Input : n jobs with deadline[1..n], sorted by profit in nonincreasing order.
Output : An optimal sequence J for the jobs.

Algorithm Schedule ( O(n²) )
J = [1];
for (i = 2; i <= n; i++) {
    K = J with job i added according to nondecreasing values of deadline;
    if (K is feasible) J = K;
}
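A C++ sketch of this greedy schedule, using the four-job example from the previous slide. The feasibility test simply rescans the whole sequence, which stays within the O(n²) bound; names such as Job and feasible are chosen here, not taken from the text.

#include <algorithm>
#include <iostream>
#include <vector>

struct Job { int id, deadline, profit; };

// A sequence is feasible if the i-th job (1-based) can finish by its deadline.
bool feasible(const std::vector<Job>& seq) {
    for (int i = 0; i < (int)seq.size(); i++)
        if (seq[i].deadline < i + 1) return false;
    return true;
}

// Greedy schedule with deadlines: consider jobs in nonincreasing profit order,
// keep a job if it can be inserted (in nondecreasing deadline order) feasibly.
std::vector<Job> schedule(std::vector<Job> jobs) {
    std::sort(jobs.begin(), jobs.end(),
              [](const Job& a, const Job& b) { return a.profit > b.profit; });
    std::vector<Job> J;
    for (const Job& job : jobs) {
        std::vector<Job> K = J;
        K.insert(std::upper_bound(K.begin(), K.end(), job,
                 [](const Job& a, const Job& b) { return a.deadline < b.deadline; }),
                 job);                       // insert by nondecreasing deadline
        if (feasible(K)) J = K;
    }
    return J;
}

int main() {
    std::vector<Job> jobs = {{1, 2, 30}, {2, 1, 35}, {3, 2, 25}, {4, 1, 40}};
    int total = 0;
    for (const Job& j : schedule(jobs)) { std::cout << j.id << ' '; total += j.profit; }
    std::cout << " total profit = " << total << '\n';   // 4 1  total profit = 70
}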

Suppose we have the jobs in Example 4.4. Recall that they had the following deadlines:

Job    Deadline    Profit
 1        3          40
 2        1          35
 3        1          30
 4        3          25
 5        1          20
 6        3          15
 7        2          10

Algorithm 4.4 does the following:
1. J is set to [1].
2. K is set to [2, 1] and is determined to be feasible. J is set to [2, 1] because K is feasible.
3. K is set to [2, 3, 1] and is rejected because it is not feasible.
4. K is set to [2, 1, 4] and is determined to be feasible. J is set to [2, 1, 4] because K is feasible.
5. K is set to [2, 5, 1, 4] and is rejected because it is not feasible.
6. K is set to [2, 1, 6, 4] and is rejected because it is not feasible.
7. K is set to [2, 7, 1, 4] and is rejected because it is not feasible.

The final value of J is [2, 1, 4].

Homework : Scheduling

1. Schedule to minimize the total time.

Job    Service time
 1          7
 2          3
 3         10
 4          5

2. Schedule with deadlines for maximum profit.

Job    Deadline    Profit
 1        2          30
 2        1          35
 3        2          25
 4        1          40
 5        3          50

Huffman Code

Variable-length binary code for data compression

Prefix code : No codeword constitutes the beginning of another codeword.

(ex) If 01 is the codeword for ‘a’, then 011 cannot be a codeword (for ‘b’).

Prefix Code

Figure 4.10: The binary character code for Code C2 in Example 4.7 appears in (a), while the one for Code C3 (Huffman) appears in (b).

Variable Length (Prefix) Code

Bits(C1)=16(3)+5(3)+12(3)+17(3)+10(3)+25(3)=255

Bits(C2)=16(2)+5(5)+12(4)+17(3)+10(5)+25(1)=231

Bits(C3)=16(2)+5(4)+12(3)+17(2)+10(4)+25(2)=212

Huffman Code (Optimal)

Figure 4.11: Given the file whose frequencies are shown in Table 4.1, this shows the state of the subtrees, constructed by Huffman's algorithm, after each pass through the for-i loop. The first tree is the state before the loop is entered.

Huffman Algorithm

Priority queue : the element with the highest priority (lowest frequency) is removed first.

for (i = 1; i <= n-1; i++) {
    remove(PQ, p);
    remove(PQ, q);
    r = new nodetype;
    r->left = p;
    r->right = q;
    r->frequency = p->frequency + q->frequency;
    insert(PQ, r);
}
remove(PQ, r);
return r;

Priority queue (heap) initialization : O(n)
Each heap operation : O(log n)
Huffman algorithm complexity : O(n log n)
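As an illustration, the same construction in C++ using std::priority_queue. The letter-to-frequency assignment (a=16, b=5, c=12, d=17, e=10, f=25) is assumed from the bit counts above; the rest mirrors the loop on this slide.

#include <iostream>
#include <queue>
#include <string>
#include <vector>

struct Node {
    int frequency;
    char symbol;                        // meaningful only for leaves
    Node *left = nullptr, *right = nullptr;
};

// Order the priority queue so that the LOWEST frequency has the highest priority.
struct ByFrequency {
    bool operator()(const Node* p, const Node* q) const {
        return p->frequency > q->frequency;
    }
};

// Build a Huffman tree from (symbol, frequency) pairs and return its root.
Node* huffman(const std::vector<std::pair<char,int>>& freqs) {
    std::priority_queue<Node*, std::vector<Node*>, ByFrequency> PQ;
    for (auto [c, f] : freqs) PQ.push(new Node{f, c});
    for (size_t i = 1; i < freqs.size(); i++) {   // n - 1 merges
        Node* p = PQ.top(); PQ.pop();
        Node* q = PQ.top(); PQ.pop();
        PQ.push(new Node{p->frequency + q->frequency, 0, p, q});
    }
    Node* r = PQ.top(); PQ.pop();
    return r;
}

// Print the codeword of every leaf (left edge = 0, right edge = 1).
void printCodes(const Node* r, const std::string& code) {
    if (!r->left && !r->right) { std::cout << r->symbol << ": " << code << '\n'; return; }
    printCodes(r->left, code + "0");
    printCodes(r->right, code + "1");
}

int main() {
    // Letter assignment assumed here: a=16, b=5, c=12, d=17, e=10, f=25.
    Node* root = huffman({{'a',16},{'b',5},{'c',12},{'d',17},{'e',10},{'f',25}});
    printCodes(root, "");
}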

Knapsack Problem

Let S = {item 1, item 2, …, item n}
w_i = weight of item i
p_i = profit of item i
W = maximum weight the knapsack can hold

Determine a subset A of S such that Σ(i∈A) p_i is maximized, subject to Σ(i∈A) w_i ≤ W.

(ex)  item 1 : $50, 5 kg    ($50/5 = 10)
      item 2 : $60, 10 kg   ($60/10 = 6)
      item 3 : $140, 20 kg  ($140/20 = 7)
      W = 30 kg

Example :0-1 Knapsack

Figure 4.13: A greedy solution and an optimal solution to the 0-1 Knapsack problem.

Dynamic Programming : 0-1 Knapsack

For i > 0 and w > 0, let P[i][w] be the optimal profit obtained when choosing items only from the first i items under the restriction that the total weight cannot exceed w.

Maximum profit = P[n][W]

P[n][W] can be computed from a 2-D array P with rows 0 to n and columns 0 to W:

P[0][w] = 0
P[i][0] = 0
P[i][w] = max( P[i-1][w], p_i + P[i-1][w - w_i] )  if w_i <= w
P[i][w] = P[i-1][w]                                 if w_i > w
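For reference, a bottom-up C++ sketch that fills the entire (n+1) × (W+1) table; the slides instead evaluate only the entries that are actually needed, as the next two slides show.

#include <algorithm>
#include <iostream>
#include <vector>

// 0-1 knapsack by dynamic programming: P[i][w] = best profit using
// items 1..i with total weight at most w.
int knapsack(const std::vector<int>& wgt, const std::vector<int>& prof, int W) {
    int n = wgt.size();
    std::vector<std::vector<int>> P(n + 1, std::vector<int>(W + 1, 0));
    for (int i = 1; i <= n; i++)
        for (int w = 0; w <= W; w++) {
            P[i][w] = P[i - 1][w];                        // leave item i out
            if (wgt[i - 1] <= w)                          // or take item i
                P[i][w] = std::max(P[i][w], prof[i - 1] + P[i - 1][w - wgt[i - 1]]);
        }
    return P[n][W];
}

int main() {
    // Example from the slides: $50/5 kg, $60/10 kg, $140/20 kg, W = 30 kg.
    std::cout << knapsack({5, 10, 20}, {50, 60, 140}, 30) << '\n';   // prints 200
}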

Example :Dynamic Prog.(knapsack)

(ex)  item 1 : $50, 5 kg    ($50/5 = 10)
      item 2 : $60, 10 kg   ($60/10 = 6)
      item 3 : $140, 20 kg  ($140/20 = 7)
      W = 30 kg

Entries needed to evaluate P[3][30]:

P[3][30]
  (w3 = 20)
P[2][30]            P[2][10]
  (w2 = 10)           (w2 = 10)
P[1][30]  P[1][20]  P[1][10]  P[1][0]
  $50       $50       $50       $0

Example : Dynamic Prog.

(ex)  item 1 : $50, 5 kg    ($50/5 = 10)
      item 2 : $60, 10 kg   ($60/10 = 6)
      item 3 : $140, 20 kg  ($140/20 = 7)
      W = 30 kg

P[3][30] = $200
  (p3 = $140)
P[2][30] = $110     P[2][10] = $60
  (p2 = $60)          (p2 = $60)
P[1][30]  P[1][20]  P[1][10]  P[1][0]
  $50       $50       $50       $0

Complexity : Dynamic Prog.(Knapsack)

In the (n-i)-th row, 2^i entries are computed, so the total number of entries is

1 + 2 + 2² + … + 2^(n-1) = 2ⁿ - 1

Complexity (worst case) : O(2ⁿ)

Backtracking

◆ Path finding in a maze
  ● If a dead end is reached, pursue another path.
  ● If a sign were positioned near the beginning of each dead-end path, the time savings could be enormous.

◆ Backtracking
  ● After determining that a node can lead to nothing but dead ends, we go back (backtrack) to the parent node and proceed with the search on the next child.
  ● Pruning the nonpromising subtrees.

4 Queens Problem

Figure 5.5: The actual chessboard positions that are tried when backtracking is used to solve the instance of the n-Queens problem in which n = 4. Each nonpromising position is marked with a cross.

n-Queens Problem

void queens (index i)
{
    index j;
    if (promising(i))
        if (i == n)
            cout << col[1] through col[n];
        else
            for (j = 1; j <= n; j++) {    // See if the queen in the
                col[i + 1] = j;           // (i + 1)st row can be positioned
                queens(i + 1);            // in each of the n columns.
            }
}

bool promising (index i)
{
    index k;
    bool switch;
    k = 1;
    switch = true;                        // Check whether any queen threatens
    while (k < i && switch) {             // the queen in the i-th row.
        if (col[i] == col[k] || abs(col[i] - col[k]) == i - k)
            switch = false;
        k++;
    }
    return switch;
}
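A self-contained C++ version of the same backtracking search for n = 4; it keeps the global n and col of the pseudocode, while the output statement and the rest of the scaffolding are choices made here.

#include <cstdlib>   // abs
#include <iostream>
#include <vector>

int n;                       // board size
std::vector<int> col;        // col[i] = column of the queen in row i (1-based)

// True if the queen in row i threatens none of the queens in rows 1..i-1.
bool promising(int i) {
    for (int k = 1; k < i; k++)
        if (col[i] == col[k] || std::abs(col[i] - col[k]) == i - k)
            return false;    // same column or same diagonal
    return true;
}

// Backtracking: try every column for the queen in row i + 1.
void queens(int i) {
    if (!promising(i)) return;            // prune the nonpromising subtree
    if (i == n) {
        for (int r = 1; r <= n; r++) std::cout << col[r] << ' ';
        std::cout << '\n';
        return;
    }
    for (int j = 1; j <= n; j++) {
        col[i + 1] = j;
        queens(i + 1);
    }
}

int main() {
    n = 4;
    col.assign(n + 1, 0);
    queens(0);               // prints the two solutions: 2 4 1 3 and 3 1 4 2
}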

Branch-and-Bound

◆ Exponential-time complexity in the worst case
  ● Dynamic programming
  ● Backtracking

◆ Branch-and-bound algorithm
  ● An improvement on the backtracking algorithm
  ● No limit on the way the tree is traversed (best-first or breadth-first)
  ● Used only for optimization problems
    (a bound determines whether a node is promising)

Branch-and-Bound

◆ A node is nonpromising if its (upper) bound is less than or equal to maxprofit (the value of the best solution found up to that point).

(ex) 0-1 Knapsack

weight (profit) : the total weight (total profit) of the items included up to the node.

Promising? (The bound must be computed to decide.)

Items are sorted by p_i / w_i (nonincreasing).

Branch-and-Bound:0-1 Knapsack

◆ Promising?

If a node is at level i, and the node at level k is the one whose weight would bring the total weight above W, then

totweight = weight + Σ(j=i+1..k-1) w_j

bound = ( profit + Σ(j=i+1..k-1) p_j ) + ( W - totweight ) · p_k / w_k

◆ Nonpromising if
   bound <= maxprofit, or
   weight >= W
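A small C++ sketch of this bound computation (items assumed already sorted by p_i / w_i; the 0-based indexing and parameter names are choices made here). With the example data on the next slide (W = 16), the formula gives 40 + 30 + (16 - 7) · (50/10) = $115 for the root.

#include <iostream>
#include <vector>

// Upper bound on the profit obtainable from a node that has already decided
// about items 0..i-1 and accumulated the given weight and profit.
// Items are sorted by p/w; W is the knapsack capacity.
double bound(int i, int weight, int profit,
             const std::vector<int>& w, const std::vector<int>& p, int W) {
    if (weight >= W) return 0;                  // nonpromising: already too heavy
    double result = profit;
    int totweight = weight;
    size_t k = i;                               // grab whole items while they fit
    while (k < w.size() && totweight + w[k] <= W) {
        totweight += w[k];
        result += p[k];
        k++;
    }
    if (k < w.size())                           // fill the rest with a fraction of item k
        result += (W - totweight) * (double)p[k] / w[k];
    return result;
}

int main() {
    // Example data, already ordered by p/w; W = 16.
    std::vector<int> p = {40, 30, 50, 10}, w = {2, 5, 10, 5};
    std::cout << bound(0, 0, 0, w, p, 16) << '\n';   // root bound: prints 115
}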

B&B Example

0-1 Knapsack problem (W = 16)

Items ordered according to p_i / w_i :

 i     p_i    w_i    p_i / w_i
 1     $40     2       $20
 2     $30     5        $6
 3     $50    10        $5
 4     $10     5        $2

B&B Knapsack : Breadth-First (W = 16)

Figure 6.2: The pruned state space tree produced using breadth-first search with branch-and-bound pruning in Example 6.1. Stored at each node from top to bottom are the total profit of the items stolen up to that node, their total weight, and the bound on the total profit that could be obtained by expanding beyond the node. The node shaded in color is the one at which an optimal solution is found.

B&B Knapsack: Best-First

Figure 6.3: The pruned state space tree produced using best-first search with branch-and-bound pruning in Example 6.2. Stored at each node from top to bottom are the total profit of the items stolen up to the node, their total weight, and the bound on the total profit that could be obtained by expanding beyond the node. The node shaded in color is the one at which an optimal solution is found.

Problem Solving Approaches

◆ Behavioral approach
  ● Relationship between a stimulus (input) and a response (output).
  ● Without speculating about the intervening process.

◆ Information-processing approach
  ● Based on the process that intervenes between input and output and leads to a desired goal from an initial state.
  ● Thinking to achieve a desired goal.

Rubinstein & Firstenberg, Patterns of Problem Solving, Prentice Hall, 1995

Model of Memory

[Diagram: sensory register (0.1-0.5 sec) → short-term memory → long-term memory, with forgetting from each stage]

◆ Sensory register
  ● Important information is passed to higher-order systems.
  ● The rest quickly fades.

◆ Short-term memory (working memory)
  ● Limited capacity (bottleneck)
  ● About 7±2 unrelated items (e.g., a 7-digit phone number)

◆ Long-term memory
  ● A network of interconnecting ideas, concepts, and facts.

Short-Term Memory (STM)

◆ Limited working memory
  ● 3 × 4 is easy.
  ● 5 + 3 × 144 is hard, since STM cannot retain all the subcalculations.
  ● You cannot remember a long sentence.

◆ Information in STM is replaced by competing information.

◆ Information is either transferred to LTM or lost.

Long-Term Memory(LTM)

◆ A network of interconnecting ideas
  ● Learning new information means integrating that information within the structure.
  ● The richer the cognitive structure already set up in LTM, the easier it is to learn new information.
  ● A familiar topic is easy to learn.

◆ Multiple relationships among the pieces of information that are stored (=> creative thinking)
  ● The richness and complexity lead to the easiest types of retrieval from memory.
  ● LTM cannot "fill up".

Forgetting

◆ Two theories of forgetting

● Changes during storage causing the information to decay.

● Failure to retrieve the information.

◆ Effective forgetting to update memories

● Need to know where we parked the car today (not yesterday).

◆ Difficult or impossible

● When a friend tells you a secret and adds

“forget I ever said anything”.

Heap & Priority Queue

◆ Priority Queue:

The highest priority element is always removed first. A PQ can be implemented as a linked list, but more efficiently as a heap.

◆Heap:

A heap is an essentially complete binary tree such that

  • The values come from an ordered set.
  • Heap property is satisfied.

Value(parent node) >= Value(child node)

A Heap

Essentially complete binary tree (of depth d)

  • Complete binary tree down to a depth of d-1
  • Nodes with depth d are as far to the left as possible

Heap property: Value(parent) >= Value(child)

A heap

siftdown
  • Input tree: heap property except the root
  • Output: a heap

void siftdown (heap& H)              // H starts out having the heap property
{                                    // for all nodes except the root.
    node parent, largerchild;        // H ends up a heap.
    parent = root of H;
    largerchild = parent's child containing the larger key;
    while (key at parent is smaller than key at largerchild) {
        exchange key at parent and key at largerchild;
        parent = largerchild;
        largerchild = parent's child containing the larger key;
    }
}

siftdown
  • Procedure siftdown sifts 6 down until the heap property is restored.
Remove Root
  • Remove the key at the root and restore the heap property.

keytype root (heap& H)
{
    keytype keyout;
    keyout = key at the root;
    move the key at the bottom node to the root;   // Bottom node is the
    delete the bottom node;                        // far-right leaf.
    siftdown(H);                                   // Restore the heap property.
    return keyout;
}

  • Given a heap of n keys, place the keys in a sorted array S.

void removekeys (int n, heap H, keytype S[])       // O(n log n)
{
    index i;
    for (i = n; i >= 1; i--)
        S[i] = root(H);
}

Make Heap
  • Transform all subtrees whose roots have depth d-i into heaps, for i = 1, 2, …, d.  Complexity: O(n)

void makeheap (int n, heap& H)          // H ends up a heap.
{
    index i;
    heap Hsub;                          // Hsub ends up a heap.
    for (i = d - 1; i >= 0; i--)        // Tree has depth d.
        for (all subtrees Hsub whose roots have depth i)
            siftdown(Hsub);
}

Make Heap
  • Using siftdown to make a heap from an essentially complete binary tree. After the steps shown, the right subtree, whose root has depth d-2, must be made into a heap, and finally the entire tree must be made into a heap.
Make Heap
  • More with depth d-2
  • Depth d-3
Heapsort

void heapsort (int n,
               heap H,          // H ends up a heap.
               keytype S[])
{
    makeheap(n, H);
    removekeys(n, H, S);
}

[Figure: a heap and the array representation of the heap]
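An array-based C++ sketch of the same three steps (siftdown, makeheap, removekeys), using 0-based indices where the children of node i are 2i+1 and 2i+2; this is an illustrative implementation, not the book's exact code.

#include <algorithm>   // std::swap
#include <iostream>
#include <vector>

// Sift the key at position parent down until the heap property holds
// for the first `size` elements of H.
void siftdown(std::vector<int>& H, int parent, int size) {
    while (true) {
        int larger = 2 * parent + 1;                    // left child
        if (larger >= size) return;                     // no children
        if (larger + 1 < size && H[larger + 1] > H[larger])
            larger++;                                   // right child is larger
        if (H[parent] >= H[larger]) return;             // heap property holds
        std::swap(H[parent], H[larger]);
        parent = larger;
    }
}

// Turn the whole array into a heap, working bottom-up: O(n).
void makeheap(std::vector<int>& H) {
    for (int i = (int)H.size() / 2 - 1; i >= 0; i--)
        siftdown(H, i, H.size());
}

// Repeatedly remove the root (the maximum) and place it at the end: O(n log n).
void removekeys(std::vector<int>& H) {
    for (int last = (int)H.size() - 1; last > 0; last--) {
        std::swap(H[0], H[last]);    // move the current maximum to its final slot
        siftdown(H, 0, last);        // restore the heap on the remaining keys
    }
}

void heapsort(std::vector<int>& H) {
    makeheap(H);
    removekeys(H);
}

int main() {
    std::vector<int> S = {20, 15, 25, 22, 11, 30, 27};
    heapsort(S);
    for (int k : S) std::cout << k << ' ';   // 11 15 20 22 25 27 30
    std::cout << '\n';
}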
