
CS420 lecture one Problems, algorithms, decidability, tractability








  1. CS420 lecture one: Problems, algorithms, decidability, tractability

  2. Problems, solutions, algorithms • In this course we study questions such as: • given a problem, how do we find an (efficient) algorithm solving it? • how do we measure the complexity (time, space requirements) of an algorithm? • Problem: set of instances of a question • is 2 a prime?, ..., is 4 a prime?, is 2^100000 a prime? • Solution: set of answers • yes, ..., no, no

  3. Algorithm • Problem: question + input(s) • Algorithm: effective procedure • mapping input to output • effective: unambiguous, executable • Turing defined it as: "like a Turing machine" • program = effective procedure • the size of the problem: an integer n, • # inputs (for sorting problem) • #digits of input (for prime problem) • sometimes more than one integer

  4. Is there an algorithm for each problem? No • the problem needs to be effectively specified: "how many angels can dance on the head of a pin?" is not effective • even if it is effectively specified, there is not always an algorithm providing an answer

  5. Ulam's problem f(n): if (n==1) return 1 else if (odd(n)) return f(3*n+1) else return f(n/2) Try a few inputs, does f(n) always stop?

  6. Ulam's problem f(n): if (n==1) return 1 else if (odd(n)) return f(3*n+1) else return f(n/2) f(1) f(2), f(4), f(8), f(2^n) .... f(3) f(5) f(7) f(9)

  7. Ulam's problem f(n): if (n==1) return 1 else if (odd(n)) return f(3*n+1) else return f(n/2) - Nobody has ever found an n for which f does not stop - Nobody has ever found a proof that f stops for every n (so we do not know an algorithm deciding this)
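The slide's f can be tried out in Python (this sketch is an addition to the transcript, not part of the original slides). An iterative version with a step cap avoids unbounded recursion, since whether f stops for every n is exactly the open question (the Collatz conjecture).

```python
def ulam_steps(n, max_steps=10_000):
    """Run the slide's f(n) iteratively; return the number of steps
    needed to reach 1, or None if max_steps is exceeded.
    Termination for all n is the open Collatz conjecture."""
    steps = 0
    while n != 1:
        if steps >= max_steps:
            return None          # give up: we cannot prove it stops
        n = 3 * n + 1 if n % 2 == 1 else n // 2
        steps += 1
    return steps
```

For example, ulam_steps(27) takes 111 steps to reach 1; no input has ever been found for which the process fails to stop.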

  8. Halting problem is undecidable • Given a program P and an input x, will P(x) stop? We can prove: the halting problem cannot be solved, i.e., there is no algorithm Halt(P,x) that for any program P and input x decides whether P(x) stops.

  9. Undecidability A problem P(x) is undecidable if there is no algorithm solving P for every legal x

  10. Halting problem is undecidable Assume there is a program P(P1,D) that, for any P1 and D, outputs Yes (if P1(D) halts) or No (if P1(D) loops). Then construct Q'(P2): if (P(P2,P2)) loop else halt. Now execute Q'(Q'): if P(Q',Q') says Q'(Q') halts, it loops; if P(Q',Q') says Q'(Q') loops, it halts. CONTRADICTION -> P does not exist!!

  11. Verification/equivalence undecidable • Given a specification S and a program P, there is no algorithm that decides whether P executes according to S • given two programs P1 and P2, there is no algorithm that decides whether for all x: P1(x) = P2(x) • Does this mean we should not build program verifiers?

  12. Intractability • Suppose we have a program, • does it execute in a reasonable time? • E.g., towers of Hanoi: three pegs, one with n smaller and smaller disks; move them (1 disk at a time) to another peg without ever placing a larger disk on a smaller one. f(n) = # moves for a tower of size n. Monk: before a tower of Hanoi of size 100 is moved, the world will have vanished

  13. Example: Tower of Hanoi, move all disks to third peg without ever placing a larger disk on a smaller one.

  14. Example: Tower of Hanoi, move all disks to third peg without ever placing a larger disk on a smaller one.

  15. Example: Tower of Hanoi, move all disks to third peg without ever placing a larger disk on a smaller one. f(n) = f(n-1) + …

  16. Example: Tower of Hanoi, move all disks to third peg without ever placing a larger disk on a smaller one. f(n) = f(n-1) + 1 + … Recursive Algorithms

  17. Example: Tower of Hanoi, move all disks to third peg without ever placing a larger disk on a smaller one. f(n) = f(n - 1) + 1 + f(n - 1) f(n) = 2f(n - 1) + 1, f(1) = 1

  18. f(n): #moves • f(n) = 2f(n-1) + 1, f(1) = 1 • f(1) = 1, f(2) = 3, f(3) = 7, f(4) = 15 • f(n) = 2^n - 1 How can you show that? • Can you write an iterative Hanoi algorithm?
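The recurrence corresponds directly to a recursive program. This Python sketch (the peg names are illustrative, not from the slides) returns the list of moves; its length satisfies f(n) = 2^n - 1.

```python
def hanoi(n, src="A", dst="C", via="B", moves=None):
    """Recursive Tower of Hanoi: return the list of (from, to) moves
    that transfers n disks from src to dst using via as a spare."""
    if moves is None:
        moves = []
    if n == 1:
        moves.append((src, dst))
    else:
        hanoi(n - 1, src, via, dst, moves)   # move pile n-1 out of the way
        moves.append((src, dst))             # move the largest disk
        hanoi(n - 1, via, dst, src, moves)   # move pile n-1 back on top
    return moves
```

len(hanoi(4)) is 15, matching f(4) = 2^4 - 1 from the slide.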

  19. Was the monk right? • 2^100 moves, say 1 per second..... How many years???

  20. Was the monk right? • 2^100 moves, say 1 per second..... 2^100 seconds ~ 10^30 seconds ~ 10^25 days ~ 3·10^22 years more than the age of the universe
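The monk's estimate can be checked in a few lines of Python:

```python
seconds = 2 ** 100            # one move per second
days = seconds // 86_400      # 86 400 seconds per day
years = days // 365
print(f"{seconds:.2e} s ~ {days:.2e} days ~ {years:.2e} years")
```

This prints roughly 1.27e+30 seconds, 1.47e+25 days, and 4.02e+22 years: on the order of 10^22 years, vastly more than the universe's age.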

  21. Is there a better algorithm?

  22. Is there a better algorithm? • Pile(n-1) must be off peg1 and on one other peg before disk n can be moved to its destination • so (inductively) all moves are necessary

  23. Algorithm complexity • Measures in units of time and space • Linear Search X in dictionary D i=1; while (not at end and X!= D[i]) {i++} • We don't know if X is in D, and we don't know where it is, so we can only give worst or average time bounds • We don't know the time for atomic actions, so we only determine Orders of Magnitude

  24. time complexity • In the worst case we search all of D, so the loop body is executed n times • In the average case the loop body is executed about n/2 times. Why?

  25. time complexity • In the worst case we search all of D, so the loop body is executed n times • In the average case the loop body is executed about n/2 times. In average case analysis we sum the products of the probability of each outcome and the cost of that outcome; here (1/n)·1 + (1/n)·2 + ... + (1/n)·n = (n+1)/2 ≈ n/2
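A quick empirical check of the average case (a sketch added to the transcript; it uses 0-based indexing, unlike the slide's loop): averaging linear search's comparison count over all n equally likely targets gives exactly (n+1)/2.

```python
def linear_search(D, x):
    """Return (index, comparisons) for x in list D, or (None, comparisons)."""
    comps = 0
    for i, d in enumerate(D):
        comps += 1               # one comparison per loop iteration
        if d == x:
            return i, comps
    return None, comps

n = 100
D = list(range(n))
# average comparisons over all n equally likely targets
avg = sum(linear_search(D, x)[1] for x in D) / n
```

With n = 100 the average is 50.5 = (n+1)/2, confirming the slide's ~n/2.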

  26. Units of time • 1 microsecond ? • 1 machine instruction? • #code fragments that take constant time?

  27. Units of time • 1 microsecond ? no, too specific and machine dependent • 1 machine instruction? no, still too specific • #code fragments that take constant time? yes

  28. unit of space • bit? • int?

  29. unit of space • bit? very detailed but sometimes necessary • int? nicer, but dangerous: we can code a whole program or array (or disk) in one arbitrary int, so we have to be careful with space analysis (take value ranges into account when needed)

  30. Orders of magnitude • If an algorithm with input size n takes less than c·n^2 steps to execute, we say it is an order n squared algorithm, notation O(n^2) • In general g(n) = O(f(n)) if there is a constant c such that g(n) <= c·f(n) for all but a finite number of values of n. In other words, there is an n0 such that g(n) <= c·f(n) for all n > n0
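To make the definition concrete (the function g below is a made-up example, not from the slides): g(n) = 3n^2 + 5n is O(n^2), because with c = 4 the inequality g(n) <= c·n^2 holds for every n >= 5, even though it fails for small n.

```python
def g(n):
    return 3 * n * n + 5 * n

c, n0 = 4, 5
# g(n) <= c * n^2 for all n >= n0 (3n^2 + 5n <= 4n^2 iff n >= 5);
# checked here for n up to 10^4
ok = all(g(n) <= c * n * n for n in range(n0, 10_000))
```

The finitely many exceptions below n0 (e.g. g(4) = 68 > 64) are exactly what the "all but a finite number" clause allows.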

  31. Worst / average case complexity • Worst case: maximal number of steps / bits / words an algorithm needs for inputs of size n • Average case: the expected number of steps / bits / words an algorithm needs: the sum, over all inputs x of size n, of Pr(x) · cost(x)

  32. Is there a better algorithm? • Is there a better algorithm than linear search? • What does that mean?

  33. Is there a better algorithm? • Is there a better algorithm than linear search? • What does that mean? 2*better? better order of magnitude? better worst case? better average case?

  34. Is there a better algorithm? • Is there a better algorithm than linear search? • What does that mean? 2*better? NO better order of magnitude? better worst case? better average case? usually better worst case

  35. Binary Search • Because dictionaries are sorted BS(x,lo,hi): if (lo>hi) not found else { m = (lo+hi)/2; if (x==D[m]) found else if (x<D[m]) BS(x,lo,m-1) else BS(x,m+1,hi) }

  36. Binary Search • Because dictionaries are sorted BS(x,lo,hi): if (lo>hi) not found else { m = (lo+hi)/2; if (x==D[m]) found else if (x<D[m]) BS(x,lo,m-1) else BS(x,m+1,hi) } why m-1 and m+1?
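The pseudocode translates directly into Python (a sketch; "found"/"not found" become an index or None). It also answers the slide's question: D[m] has already been compared, so m-1 and m+1 exclude it from the next call, which guarantees the range shrinks and the recursion terminates.

```python
def binary_search(D, x, lo=0, hi=None):
    """Return the index of x in the sorted list D, or None if absent."""
    if hi is None:
        hi = len(D) - 1
    if lo > hi:
        return None                              # empty range: not found
    m = (lo + hi) // 2
    if D[m] == x:
        return m                                 # found
    if x < D[m]:
        return binary_search(D, x, lo, m - 1)    # left half, m excluded
    return binary_search(D, x, m + 1, hi)        # right half, m excluded
```

Each call halves the remaining range, giving the O(log(n)) worst case derived on the next slide.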

  37. Binary search worst case time • Every 'chop' takes half of the remaining possibilities away • The number of times we can divide n>0 by 2 until it gets to 0: log2(n) so, worst case O(log(n)) why?

  38. Is there an even better algorithm? • uhhh...

  39. Is there an even better algorithm? • now we have to consider all possible algorithms • Lower bound of a problem • given a set of "allowed steps" • taking constant time • lower bound is minimal worst case complexity of any algorithm, using only the allowed steps, solving it

  40. Searching in an ordered array • allowed steps: • comparison to /assignment of array element • arithmetic on index • some action on found or not found eg NOT if (isElement(x,D)) ... why NOT?

  41. Event / decision trees • Given these steps each algorithm is associated with a set of event or decision trees: • comparison is an internal node with two children • (not) found is a leaf • other ops are branches leading to new decisions or leaves • For some input a path through the tree to a particular leaf is taken

  42. event tree for linear search [Figure: event tree for linear search. A chain of compare nodes; each compare has a 'found' leaf (with its action) and a branch to the next compare, ending in a final 'not found' leaf.]

  43. event tree for binary search [Figure: event tree for binary search. A balanced binary tree of compare nodes; each compare has a 'found' leaf (with its action) and branches to further compares, with 'not found' leaves at the bottom.]

  44. Height of the event tree • The event trees for any search are binary trees. Why? • Any tree associated with the search has at least n+1 leaves (n found, 1 not found) • Theorem: A binary tree with n leaves has a height of at least log(n). Check out some cases. • There is at least one path from root to a leaf of length at least log(n), so any search algorithm takes at least O(log(n)) time for some of its inputs. (In the next lecture we will use big Omega for lower bounds.)

  45. Lower bound for search • Lower bound for search in an ordered array is O(log(n)) • There is no asymptotically better search than binary search • But what if I know more, eg search in an array with a fixed range of values...

  46. Upper and lower bounds • Upper bound of a problem: • the lowest worst case complexity of any known algorithm solving the problem. • eg search: O(log(n)) binary search • Lower bound of a problem: • the maximum number of steps we can prove any algorithm solving it will need to execute • eg any search algorithm takes at least O(log(n)) steps

  47. Closed problems, algorithmic gaps • Closed problem: lower bound = upper bound, eg search in an ordered array • Algorithmic gap: lower bound < upper bound, eg NP Complete problems

  48. NP Complete problems • NPC: a certain class of problems with algorithmic gaps: upper bound exponential, lower bound polynomial. Trial and error type algorithms are the only ones we have so far to find an exact solution

  49. Some NPC problems • TSP: Travelling Salesman given cities c1,c2,...,cn and distances between all of these, find a minimal tour connecting all cities. • SAT: Satisfiability given a boolean expression E with boolean variables x1,x2,...,xn, determine a truth assignment to all xi making E true
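A brute-force TSP sketch illustrates the exponential upper bound (the distance matrix below is a made-up example, not from the slides): trying all (n-1)! tours is exactly the trial-and-error behaviour the previous slide mentions.

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustively try every tour starting and ending at city 0.
    dist is an n x n distance matrix; runs in O(n!) time."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

# a small hypothetical symmetric distance matrix for 4 cities
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
```

Even at one tour per nanosecond, n = 25 cities already means 24! ≈ 6·10^23 tours, which is why only small instances can be solved this way.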

  50. Back tracking • Back tracking searches (walks) a state space; at each choice point it guesses a choice. • In a leaf (no further choices): if a solution is found, OK; else go back to the last choice point and pick another move. • A Non Deterministic algorithm is one with the ability to select the sequence of right guesses (avoiding back tracking)
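A backtracking sketch for SAT ties the two slides together (the clause encoding is an assumption made here: a clause is a list of signed integers, with i meaning x_i and -i meaning not x_i). Each recursive call is a choice point; a violated clause triggers the "go back and pick another move" step.

```python
def violated(clause, assign):
    """True if every literal in the clause is assigned and false."""
    return all(abs(l) in assign and assign[abs(l)] != (l > 0)
               for l in clause)

def solve_sat(clauses, n_vars, assign=None):
    """Backtracking SAT: return a satisfying {var: bool} dict or None."""
    if assign is None:
        assign = {}
    if any(violated(c, assign) for c in clauses):
        return None                       # conflict: back track
    if len(assign) == n_vars:
        return dict(assign)               # leaf: solution found
    var = len(assign) + 1
    for guess in (True, False):           # choice point: guess a value
        assign[var] = guess
        result = solve_sat(clauses, n_vars, assign)
        if result is not None:
            return result
        del assign[var]                   # undo and try the other guess
    return None
```

In the worst case this still tries all 2^n assignments; a nondeterministic machine would take the satisfying branch at every choice point without ever backtracking.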
