
Theory Of Computation


Presentation Transcript


  1. Theory Of Computation Dr. Adam P. Anthony Lectures 25 and 26

  2. Overview • Computer Science: do we need computers? • Computation Theory • Functions • Turing Machines • Universal Programming Languages • The Halting problem

  3. Computer Science and Computers Computer science is no more about computers than astronomy is about telescopes. • Edsger W. Dijkstra

  4. Computability • Insight: the computation is separate, in concept, from the computer • A computer, then, is just some object that can carry out the computation • Humans • Brain, often supplemented by pencil, paper • Charles Babbage • Difference engine, analytical engine • Controlled using clockwork-type components • ENIAC • Controlled using vacuum tubes • Intel 8080 • Controlled using micro-transistors

  5. Strength of Computers • Can a simple calculator help you find your way around Cleveland? • How about a (dumb) phone? • Aside from making calls • How about a smart phone? • How about a laptop? • How about a desktop? • Which of these count as computers?

  6. Specific-Purpose vs. General Computers • Some ‘computers’ are designed only to achieve a limited number of specific tasks, and to do that either at high speed or at a low cost: • Digital Phones (Cell and otherwise) • Encryption chips too! • Various scientific measuring devices • Others are considered General Purpose Computers • Anything that can be computed, can be done on one of these machines

  7. Theory of Computation • The theory of computation aims to answer the following questions: • What is a general purpose computer? • What problems can I solve with a general purpose computer? • Is this specific computer general purpose? • Given a general purpose computer, how difficult will it be to solve a specific problem?

  8. Alan Turing: “The Father of Computer Science” • Successful mathematician • Cryptographer • Helped build some early (classified) computation devices • Many ideas predated the first computers • Turing Machine • Computability • Helped define what is possible on a computer, and what is not

  9. Computable Functions • A function is a mapping of inputs to outputs • Sum(2, 2) = 4 • FeetToCentimeters(500) = 15,240 • Sort([1,3,2,3,6,9,7,8,0]) = [0,1,2,3,3,6,7,8,9] • Father(Bill Smith) = Edward Smith • Some functions are computable • Given the input, an algorithmic process can always be applied to get an exact answer for the output • A general purpose computer can compute any computable function, and no others
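A function in this sense is just a fixed input-to-output mapping. A minimal Python sketch of the feet-to-centimeters example from this slide (the name feet_to_centimeters is illustrative):

    def feet_to_centimeters(feet):
        # one foot is exactly 30.48 centimeters
        return feet * 30.48

    print(feet_to_centimeters(500))   # 15240.0, matching the value above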

  10. Turing Machines • Control Unit: The actual machine • Tape: infinitely long memory • Read/Write Head: Used to read information from the tape, erase information, write new information • Reads one character at a time • Moves left/right one position at a time • State: Description of the current situation, based on the tape values (the machine begins in the START state)

  11. How a Turing Machine Works • Each new Turing Machine has an alphabet of characters that it understands, and a set of states that help it make decisions • Given the current state, the character under the read/write head, and a program of execution, the Control Unit decides to: • Stop running (HALT state) • Write over the current character • Move one space left/right • Change states

  12. A More Complex Machine • Alphabet: {0, 1, *} • A single binary number is represented as *101010* • States: ADD, RETURN, CARRY, OVERFLOW, HALT • Program to increase a positive binary integer by 1 (one such program is sketched below)
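The program table for this machine did not survive transcription. Below is a hedged sketch of one standard increment machine over the alphabet and states listed above, written as a small Python simulator; the exact transition table is an assumption (a textbook-style construction), not necessarily the table from the original slide.

    # Each rule maps (state, symbol) to (symbol to write, head move, next state),
    # where the move is -1 for left, +1 for right, and 0 for stay put.
    RULES = {
        ('START', '*'):    ('*', -1, 'ADD'),
        ('ADD', '0'):      ('1', +1, 'RETURN'),
        ('ADD', '1'):      ('0', -1, 'CARRY'),
        ('ADD', '*'):      ('*', +1, 'HALT'),
        ('CARRY', '0'):    ('1', +1, 'RETURN'),
        ('CARRY', '1'):    ('0', -1, 'CARRY'),
        ('CARRY', '*'):    ('1', -1, 'OVERFLOW'),
        ('OVERFLOW', '*'): ('*', +1, 'RETURN'),
        ('RETURN', '0'):   ('0', +1, 'RETURN'),
        ('RETURN', '1'):   ('1', +1, 'RETURN'),
        ('RETURN', '*'):   ('*',  0, 'HALT'),
    }

    def increment(tape_string):
        tape = dict(enumerate(tape_string))   # sparse tape; unwritten cells read as '*'
        head = len(tape_string) - 1           # start on the rightmost asterisk
        state = 'START'
        while state != 'HALT':
            write, move, state = RULES[(state, tape.get(head, '*'))]
            tape[head] = write
            head += move
        return ''.join(tape[i] for i in sorted(tape))

    print(increment('*101*'))   # *110*  (5 + 1 = 6)
    print(increment('*11*'))    # *100*  (3 + 1 = 4, exercising the OVERFLOW state)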

  13. Useful Turing Machine Facts • Multiple Turing machines are no more powerful (though possibly faster) than a single Turing Machine • Any Turing Machine can 'simulate' another Turing Machine • Result: We can use unambiguous complex commands in the control unit's program! • Command: "Move 5 spaces to the left" • Turing Machine reads: "Execute the Turing Machine routine that moves 5 spaces to the left" • Strictly speaking, one should first demonstrate that the sub-program is itself computable

  14. Church-Turing Thesis Any function that is computable at all can be computed using a Turing Machine, and therefore using any other general purpose computer (i.e., the computable functions are exactly those a Turing Machine can compute) SO WHAT??? Who Cares?

  15. Impact of Church-Turing Thesis • If Power = ‘number of functions I can compute,’ then a Turing machine is the most powerful computer imaginable • Or, at least, it ties with any other computer • It computes ALL of the computable functions! • If a Turing machine can’t solve a problem, then neither can a real computer, no matter how ‘powerful’ it is

  16. Modern Computers As Turing Machines • Control Unit = Processor • Alphabet = {0,1} • States = Op Codes • Read/Write Head = BUS • Programs = Software • Tape = RAM • Infinite????? • No, but for most purposes it is long enough to solve the problem • Tape = External Storage • Only limited by the number of natural resources we can obtain from the entire universe (so, probably infinite!)

  17. Bare-Bones Computer Language • Programming languages usually market their 'features' • Meant to make programming easier • Bare-Bones Language: • Only includes features that are 100% necessary to be equivalent to a Turing machine: • Variables: hold whole numbers, stored in binary • clear statement: set a variable = 0 (clear X;) • incr statement: increase a variable by 1 (incr X;) • decr statement: decrease a variable by 1 (decr X;) • while/end: continue execution until a variable = 0 • while X not 0 do; ... end;

  18. Group Work! • Can you use Bare-Bones to: • Set the variable Z = 4? • Add X + Y = Z? (use one variable each for X,Y,Z) • Copy the value of X into Y?
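One hedged sketch of possible answers, written here as Python that obeys the Bare-Bones restrictions (only clear, incr, decr, and a while-not-zero loop); the matching Bare-Bones statements appear in the comments, and the helper variable TEMP is an extra name introduced for the copy exercise.

    # Set Z = 4
    Z = 0                              # clear Z;
    Z += 1; Z += 1; Z += 1; Z += 1     # incr Z;  (four times)

    # Add X + Y into Z (this version empties X and Y as it runs)
    X, Y = 3, 5
    Z = 0                  # clear Z;
    while X != 0:          # while X not 0 do;
        X -= 1             #   decr X;
        Z += 1             #   incr Z;
                           # end;
    while Y != 0:          # while Y not 0 do;
        Y -= 1             #   decr Y;
        Z += 1             #   incr Z;
                           # end;
    print(Z)               # 8

    # Copy X into Y without losing X, using the helper TEMP
    X, Y, TEMP = 5, 0, 0
    Y = 0                  # clear Y;
    TEMP = 0               # clear TEMP;
    while X != 0:          # while X not 0 do;
        X -= 1             #   decr X;
        Y += 1             #   incr Y;
        TEMP += 1          #   incr TEMP;
                           # end;
    while TEMP != 0:       # while TEMP not 0 do;
        TEMP -= 1          #   decr TEMP;
        X += 1             #   incr X;  (restores X)
                           # end;
    print(X, Y)            # 5 5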

  19. About the Bare-Bones Language • Computer scientists have proven that any computer that can execute the Bare-Bones language is equivalent in power to a Turing Machine • Heaven forbid! • Useful conclusion: • Any programming language that does at least as much as Bare-Bones (hopefully more!) will also be Turing equivalent • The extra features are just for convenience

  20. Where We’ve Been • Computers are just tools for completing computations • Theory of computation: what is possible/impossible for all computers? What is computable? • Turing Machine: imaginary ‘all powerful’ computer • Church-Turing thesis states no computer can do better • Modern computers are equivalent to Turing Machines • Any algorithm we implement on a computer is computable

  21. Where We're Headed • It'd be nice to know, before we start, whether a problem is noncomputable • Halting problem as an example • Even if a problem is computable, it would be nice to know in advance whether it is easy or hard to solve • Even if we can solve a problem, it would be nice to know how long it will take to solve • Save effort in solving complex problems • Take advantage of complexity

  22. The Halting Problem • Some problems can't be solved. • Consider: Given the source code for any computer program, can you analyze the code and decide whether it will ever stop running?

  23. The Halting Problem Does this program halt?
    int X = 3
    while (X > 0)
        X = X - 1

  24. The Halting Problem Does this program halt?
    int X = 3
    repeat
        X = X + 1
    until X = 0

  25. The Halting Problem How about this program?
    virtual void estimate_sigmas() {
        sigmas = std::vector< std::vector<Matrix> >(num_clusters);
        for (int i = 0; i < num_clusters; i++) {
            sigmas[i] = std::vector<Matrix>(num_clusters);
        }
        for (int i = 0; i < num_clusters; i++) {
            for (int j = 0; j < num_clusters; j++) {
                sigmas[i][j] = zero_matrix<double>(proper_size, proper_size);
            }
        }
        Vector temp_v(proper_size);
        edge_iterator ebg, end;
        int c_i, c_j;
        for (tie(ebg, end) = edges(data); ebg != end; ++ebg) {
            if (data[*ebg].type == edge_type && data[*ebg].exists) {
                c_i = data[source(*ebg, data)].clustering();
                c_j = data[target(*ebg, data)].clustering();
                temp_v = get_edge_vector(*ebg) - ic_means[c_i][c_j];
                sigmas[c_i][c_j] = sigmas[c_i][c_j]
                    + outer_prod(temp_v, trans(temp_v)) / observed_edge_prob(c_i, c_j);
            }
        }
    }

  26. Computability Explained • Sometimes, we can work out answers for simple, example inputs of hard problems, but: • What algorithm did you use to decide for the first two programs? • Can you generalize it to the third? • To prove something is not computable, we'll use the following strategy: • Assume that there is an algorithm that can solve the problem all the time • Show that, regardless of how the algorithm works, there is at least one case where the algorithm will fail • This contradicts part 1, which claimed it 'always works'

  27. The Halting Problem Is Not Computable (PROOF!) • Assume, for the sake of contradiction, that there exists a computer program that, given any other computer program as input, can tell us if it stops: • STOPS(Program) • Make a new program: • Opposite(X): • If STOPS(X), then run forever. • Otherwise, stop! • Opposite is a program, which itself accepts program code as input. What happens when we try to run Opposite(Opposite)? • If STOPS(Opposite), then Opposite will run forever • Otherwise, stop!
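The same argument in code form: a hedged Python sketch in which stops() is the hypothetical analyzer we are assuming (for contradiction) to exist; it is not a real, implementable function.

    def stops(program_source):
        ...   # hypothetical: returns True if the program would halt, False otherwise

    def opposite(program_source):
        if stops(program_source):
            while True:      # the analyzed program halts, so do the opposite: loop forever
                pass
        else:
            return           # the analyzed program runs forever, so halt immediately

    # Feeding opposite its own source code forces the contradiction:
    # if stops says "halts", opposite loops forever; if stops says
    # "runs forever", opposite halts. Either way, stops answered wrong.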

  28. Halting Problem Implications • Existence proof: since there exists at least one program for which we cannot compute whether it halts, there may be (and probably are) others • If there's one problem that seems computable but is not, then there are others • Look up Wang Tiles for an interesting example! • Program Analysis: "Does my program compute X?" • Any place in the code where X is computed, add a HALT command • This changes the question to "Does my program ever halt?", which we just showed is undecidable

  29. Algorithmic Complexity • Algorithmic Complexity refers to how many resources (time and memory) a computer will need to solve a problem • How long will it take to process all the data? • How much space (Memory) will we need? • If we use more space, will it take less time? • Are some problems harder to solve than others? • Can we figure that out before we try to solve them? • How can we take advantage of complexity?

  30. Problems Vs. Solutions • It's one thing to take a specific algorithm and say it's complex (or not):
    PROCEDURE Add(X, Y):
        sum = X + Y
        RETURN sum
• Because solving the problem, and doing so efficiently, are two different things:
    PROCEDURE BadAdd(X, Y):
        Z = 1000000000
        sum = 0
        REPEAT
            Z = Z - 1
        UNTIL Z = 0
        sum = X + Y
        RETURN sum
• To say that a PROBLEM is difficult, you need to prove that there are no easy ways to solve it

  31. Run-Time Complexity • Looked at briefly in chapter 5 • Principal method for analyzing algorithm complexity: • How many steps does it take to complete the entire algorithm? • Steps are often based on the size of the input:
    PROCEDURE add-all(L):
        sum = 0
        count = 0
        WHILE count < length(L):
            sum = sum + L[count]
            count = count + 1
        RETURN sum
• How many steps to add a list with 10 numbers? 1000 numbers?
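To make the step-counting question concrete, here is a hedged Python version of add-all that also tallies its loop iterations; the step counter is an addition for illustration, not part of the original procedure.

    def add_all(L):
        total = 0
        steps = 0
        for value in L:      # one pass over the list
            total += value
            steps += 1
        return total, steps

    print(add_all(list(range(10))))     # (45, 10): one loop step per element
    print(add_all(list(range(1000))))   # (499500, 1000): 100x the data, 100x the steps (linear in N)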

  32. Space Complexity • Many algorithms only need enough space to hold the input data • The add-all(L) procedure above only needs enough space to store L • Others, because the problem is more difficult, use supplementary data • EX: Binary Search Trees • Still others use extra memory to be faster • EX: Dynamic programming on the Fibonacci sequence (sketched below)
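The Fibonacci example is a standard illustration of trading space for time; a hedged Python sketch comparing a plain recursive version with one that remembers (memoizes) earlier results:

    import functools

    def fib_slow(n):
        # recomputes the same subproblems over and over: exponential time, almost no extra memory
        return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

    @functools.lru_cache(maxsize=None)
    def fib_fast(n):
        # stores every result it computes: roughly linear time, linear extra memory
        return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

    print(fib_fast(100))   # 354224848179261915075, effectively instant
    # fib_slow(100) computes the same answer but would take an astronomically long time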

  33. Complexity Classes • The most commonly occurring algorithmic runtimes are (in increasing order) constant, log(N), N, N*log(N), N^2, N^3, and a^N • A polynomial problem is any problem for which the best known algorithm for solving it has time complexity that is no worse than a polynomial function f(N) = N^d, where d can be any fixed number and N is the size of the input • All problems that we can solve with an exact solution in a reasonable amount of time are in the polynomial class • Problems outside this class are referred to as intractable • For short, we refer to the entire set of polynomial problems as the set P
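A hedged sketch that prints how these growth rates compare at a modest input size, to show why the exponential rate is the one singled out as intractable; the particular choices N = 50 and a = 2 are just for illustration.

    import math

    N, a = 50, 2
    rates = {
        'constant': 1,
        'log(N)':   math.log2(N),
        'N':        N,
        'N*log(N)': N * math.log2(N),
        'N^2':      N ** 2,
        'N^3':      N ** 3,
        'a^N':      a ** N,
    }
    for name, value in rates.items():
        print(f'{name:10} {value:>22,.0f}')
    # a^N already exceeds a quadrillion at N = 50, while N^3 is only 125,000.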

  34. Complexity Classes, continued. • A nondeterministic machine is a theoretical machine that just knows how to solve a problem, no matter how hard it may be • A Nondeterministic Polynomial problem is a problem for which the best known algorithm has a polynomial runtime, but only when executed on a nondeterministic machine • Another intuition: these are problems for which finding a solution is hard, but checking a proposed solution for correctness is easy • We refer to the set of problems in this domain as NP • NP DOES NOT MEAN "NOT POLYNOMIAL"

  35. NP-Completeness • In general, the class NP consists of problems that are difficult but useful • Traveling Salesman example: the best known exact algorithms take exponential time • Within a single class, some problems are harder than others • In P, it is harder to sort a list than it is to add two numbers • In the class NP, we identify a set of problems that are the most difficult to solve in the entire set • Called NP-Complete problems • The speed at which we can solve these problems determines how fast we can solve the lesser problems

  36. P vs. NP • All problems that are in the set P must also be in the set NP • Why? • Big, unknown question in Computer Science: • DOES P = NP??? • What does it mean if P = NP? • One approach: Find a polynomial solution for an NP-Complete problem • Thousands have tried, all have failed • Most people believe, but cannot prove, that P ≠ NP
