
Theory of Computing



  1. Theory of Computing By, R.Karunamoorthi, AP / CSE

  2. A simple “computer” [Figure: a battery, a switch, and a light bulb; two states, on and off, connected by transition f, starting in off] input: switch output: light bulb actions: f for “flip switch” states: on, off The bulb is on if and only if there was an odd number of flips

  3. Another “computer” [Figure: a battery, a bulb, and two switches; four states with transitions labeled 1 and 2, starting with both switches off] inputs: switches 1 and 2 actions: 1 for “flip switch 1”, 2 for “flip switch 2” states: on, off The bulb is on if and only if both switches were flipped an odd number of times
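The two-switch device can be read directly as a finite automaton with four states, one per combination of switch parities. A minimal simulation sketch (not part of the slides) in Python:

```python
# Sketch of the two-switch "computer" as a finite automaton.
# State = (parity of flips of switch 1, parity of flips of switch 2);
# the bulb is on iff both parities are odd.

def bulb_on(actions):
    """actions is a string over {'1', '2'}; returns True if the bulb ends up on."""
    odd1 = odd2 = False           # start state: no switch has been flipped
    for a in actions:
        if a == '1':
            odd1 = not odd1       # flip switch 1
        elif a == '2':
            odd2 = not odd2       # flip switch 2
    return odd1 and odd2          # on iff both flipped an odd number of times

print(bulb_on("12"))   # True: each switch flipped once
print(bulb_on("112"))  # False: switch 1 flipped an even number of times
```

Because only the two parities matter, four states suffice no matter how long the input is, which is what makes the device a finite automaton.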

  4. A design problem [Figure: a battery, a bulb, and five switches] Can you design a circuit where the light is on if and only if all the switches were flipped exactly the same number of times? • Such devices are difficult to reason about, because they can be designed in infinitely many ways • So they are represented as abstract computational devices, or automata, to answer such questions

  5. Some devices

  6. Why Study Automata Theory? Finite automata are a useful model for important kinds of hardware and software: • Software for designing and checking digital circuits • Lexical analyzers of compilers • Finding words and patterns in large bodies of text, e.g., in web pages • Verification of systems with a finite number of states, e.g., communication protocols

  7. Why Study Automata Theory? (2) The study of finite automata and formal languages is intimately connected. Methods for specifying formal languages are very important in many areas of CS, e.g.: • Context-free grammars are very useful when designing software that processes data with recursive structure, like the parser in a compiler • Regular expressions are very useful for specifying lexical aspects of programming languages and search patterns Automata are essential for the study of the limits of computation. Two issues: What can a computer do at all? (Decidability) What can a computer do efficiently? (Intractability)

  8. Applications Pattern recognition • Programming languages • Supervisory control • Computer-aided verification • Communication protocols • Compilers • Circuits • Quantum computing … all draw on theoretical computer science: automata theory, formal languages, computability, complexity …

  9. Computation [Figure: a computer modeled as a CPU attached to memory]

  10. [Figure: the memory refined into input memory, output memory, program memory, and temporary memory, all connected to the CPU]

  11.-14. Example: [Animation frames: the CPU computes step by step, reading the input and program memories, using temporary memory as scratch space, and writing to output memory]

  15. Automaton [Figure: the same architecture viewed abstractly as an automaton, with input, program, temporary, and output memory]

  16. Different Kinds of Automata Automata are distinguished by their temporary memory • Finite automata: no temporary memory • Pushdown automata: a stack • Turing machines: random-access memory

  17. Finite Automaton [Figure: a finite automaton with input and output memory but no temporary memory] Example: vending machines (small computing power)
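The vending-machine example can be sketched as a finite automaton whose states record the money inserted so far. The coin values (5 and 10) and the price (15) below are illustrative assumptions, not taken from the slides:

```python
# Hedged sketch of a vending machine as a finite automaton.
# Coin denominations and price are assumptions for illustration only.
# States track money inserted so far, capped at the price -- a finite
# set of states, so a finite automaton suffices.

PRICE = 15

def vend(coins):
    """Return True if inserting the coins (values 5 or 10) reaches the price exactly."""
    state = 0                      # money inserted so far
    for c in coins:
        if c not in (5, 10):
            raise ValueError("unsupported coin")
        if state + c > PRICE:      # this sketch rejects overpayment
            return False
        state += c
    return state == PRICE          # accepting state: exact price reached

print(vend([5, 5, 5]))   # True
print(vend([10, 5]))     # True
print(vend([10, 10]))    # False (overpays)
```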

  18. Pushdown Automaton [Figure: a pushdown automaton with input and output memory plus a stack (push, pop) as temporary memory] Example: compilers for programming languages (medium computing power)

  19. Turing Machine [Figure: a Turing machine with input and output memory plus random-access memory as temporary memory] Examples: any algorithm (highest computing power)

  20. Power of Automata [Figure: finite automata, pushdown automata, and Turing machines ordered from less power to more power] Each step up the hierarchy can solve more computational problems.

  21. CHOMSKY HIERARCHY OF GRAMMARS

  22. Context-Free Languages • A language class larger than the class of regular languages • Supports a natural, recursive notation called “context-free grammar” • Applications: • Parse trees, compilers • XML [Figure: the regular languages (FA/RE) contained inside the context-free languages (PDA/CFG)]

  23. An Example • A palindrome is a word that reads identically from both ends • E.g., madam, redivider, malayalam, 010010010 • Let L = { w | w is a binary palindrome } • Is L regular? • No. • Proof: • Let w = 0^N 1 0^N (where N is the pumping-lemma constant) • By the pumping lemma, w can be rewritten as xyz such that xy^k z is also in L (for any k ≥ 0) • But |xy| ≤ N and y ≠ ε • ==> y consists only of 0s from the leading block • ==> xy^k z is NOT in L for k = 0 • ==> Contradiction
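The argument can be checked mechanically for a sample value of N (the choice N = 7 below is arbitrary): every legal split of w = 0^N 1 0^N puts y inside the leading block of 0s, and pumping down with k = 0 destroys the palindrome. An illustration of the proof, not a proof itself:

```python
# For w = 0^N 1 0^N, every split w = xyz with |xy| <= N and y nonempty
# forces y to be all 0s, and pumping down (k = 0) breaks the palindrome.

def is_palindrome(s):
    return s == s[::-1]

N = 7                                  # arbitrary sample value
w = "0" * N + "1" + "0" * N
assert is_palindrome(w)

for i in range(N + 1):                 # x = w[:i]
    for j in range(i + 1, N + 1):      # y = w[i:j], so 1 <= |y| and |xy| <= N
        x, y, z = w[:i], w[i:j], w[j:]
        assert set(y) == {"0"}         # y falls inside the leading 0-block
        assert not is_palindrome(x + z)  # k = 0 leaves fewer leading than trailing 0s

print("no valid split survives pumping down")
```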

  24. But the language of palindromes… is a CFL, because it supports recursive substitution (in the form of a CFG) • This is because we can construct a “grammar” like this: • A → ε • A → 0 • A → 1 • A → 0A0 • A → 1A1 Same as: A → 0A0 | 1A1 | 0 | 1 | ε Here A is a variable (non-terminal), 0 and 1 are terminals, and each rule is a production. How does this grammar work?

  25. How does the CFG for palindromes work? An input string belongs to the language (i.e., is accepted) iff it can be generated by the CFG • Example: w = 01110 • G can generate w as follows: • A => 0A0 • => 01A10 • => 01110 G: A → 0A0 | 1A1 | 0 | 1 | ε Generating a string from a grammar: pick a sequence of productions that allows us to generate the string; at every step, substitute one variable with the body of one of its productions.
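A membership test for the palindrome grammar can be sketched by turning the productions directly into a recursion, one clause per production:

```python
# Minimal sketch of membership testing for the palindrome grammar
# G: A -> 0A0 | 1A1 | 0 | 1 | epsilon, written as a direct recursion
# rather than a general CFG parser.

def in_lang(w):
    """True iff w is derivable from A, i.e. w is a binary palindrome."""
    if w in ("", "0", "1"):                    # A -> epsilon | 0 | 1
        return True
    if len(w) >= 2 and w[0] == w[-1] and w[0] in "01":
        return in_lang(w[1:-1])                # A -> 0A0 | 1A1
    return False

print(in_lang("01110"))  # True: A => 0A0 => 01A10 => 01110
print(in_lang("0111"))   # False
```

Each successful recursive call corresponds to one derivation step, so the call trace mirrors the derivation on the slide.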

  26. Context-Free Grammar: Definition • A context-free grammar is G = (V, T, P, S), where: • V: set of variables or non-terminals • T: set of terminals (the alphabet; ε denotes the empty string) • P: set of productions, each of the form A → α1 | α2 | … • where each αi is an arbitrary string of variables and terminals • S: the start variable CFG for the language of binary palindromes: • G = ({A}, {0,1}, P, A) • P: A → 0A0 | 1A1 | 0 | 1 | ε

  27. Examples • Parenthesis matching in code • Syntax checking • In scenarios where there is a general need for: • Matching a symbol with another symbol, or • Matching a count of one symbol with that of another symbol, or • Recursively substituting one symbol with a string of other symbols

  28. Example #2 • Language of balanced parentheses, e.g., ()(((())))((()))…. • CFG? G: S → (S) | SS | ε How would you “interpret” the string “(((()))()())” using this grammar?
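One way to “interpret” such strings programmatically is via the equivalent counter characterization of the language of S → (S) | SS | ε (every prefix has at least as many '(' as ')', and the totals agree), which is exactly what a pushdown automaton tracks with its stack. A sketch:

```python
# Sketch: recognizing the balanced-parenthesis language of S -> (S) | SS | eps.
# Instead of parsing with the grammar, this uses the equivalent counter
# characterization, which a pushdown automaton implements with its stack.

def balanced(s):
    depth = 0
    for ch in s:
        depth += 1 if ch == "(" else -1   # push on '(', pop on ')'
        if depth < 0:                     # a ')' with no matching '('
            return False
    return depth == 0                     # everything opened was closed

print(balanced("(((()))()())"))  # True: the slide's example string
print(balanced("(()"))           # False
```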

  29. Example #3 • A grammar for L = {0^m 1^n | m ≥ n} • CFG? G: S → 0S1 | A A → 0A | ε How would you interpret the string “00000111” using this grammar?
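Membership in L = {0^m 1^n | m ≥ n} can be sketched as a direct check mirroring the grammar: S → 0S1 pairs each 1 with a 0, and A → 0A | ε generates the surplus 0s.

```python
# Sketch of membership in L = { 0^m 1^n | m >= n }.

import re

def in_L(w):
    m = re.fullmatch(r"(0*)(1*)", w)      # shape: a block of 0s then a block of 1s
    return bool(m) and len(m.group(1)) >= len(m.group(2))

print(in_L("00000111"))  # True:  m = 5, n = 3
print(in_L("0011"))      # True:  m = n = 2
print(in_L("011"))       # False: m = 1 < n = 2
print(in_L("10"))        # False: wrong shape
```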

  30. Example #4 A program containing if-then(-else) statements: if Condition then Statement else Statement (or) if Condition then Statement CFG? S → iCtSeS | iCtS | a C → b (here i, t, e abbreviate if, then, else; a stands for a statement and b for a condition)

  31. More examples • L1 = {0^n | n ≥ 0} • L2 = {0^n | n ≥ 1} • L3 = {0^i 1^j 2^k | i = j or j = k, where i, j, k ≥ 0} • L4 = {0^i 1^j 2^k | i = j or i = k, where i, j, k ≥ 1}

  32. String membership How to tell if a string belongs to the language defined by a CFG? • Derivation • Head to body • Recursive inference • Body to head Both are equivalent forms. Example: • w = 01110 • Is w a palindrome? G: A → 0A0 | 1A1 | 0 | 1 | ε A => 0A0 => 01A10 => 01110

  33. Recursive inference G: A → 0A0 | 1A1 | 0 | 1 | ε • Body to head: for w = 01110, first infer that 1 is generated by A, then that 111 is (via A → 1A1), then that 01110 is (via A → 0A0) [Figure: nested occurrences of A over the string 01110]

  34. Simple Expressions… • We can write a CFG for accepting simple expressions • G = (V, T, P, S) • V = {E, F} • T = {0, 1, a, b, +, *, (, )} • S = E • P: • E ==> E+E | E*E | (E) | F • F ==> aF | bF | 0F | 1F | a | b | 0 | 1 • F generates identifiers and constants • E generates expressions like E + E, E * E, (E), or id
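As given, the grammar is left-recursive (E → E+E), so a naive recursive-descent recognizer would loop forever. The sketch below uses an equivalent rewriting, E → T ((+|*) T)*, T → (E) | F, which accepts the same strings; the tokenizer and helper names are illustrative, not from the slides:

```python
# Hedged sketch of a recognizer for the expression grammar. Uses an
# equivalent non-left-recursive form: E -> T { (+|*) T }, T -> (E) | F,
# F -> a nonempty string over {a, b, 0, 1} (identifiers and constants).

import re

TOKEN = re.compile(r"[ab01]+|[+*()]")

def accepts(s):
    toks = TOKEN.findall(s)
    if "".join(toks) != s:                 # reject unrecognized characters
        return False
    pos = 0

    def term():                            # T -> (E) | F
        nonlocal pos
        if pos < len(toks) and toks[pos] == "(":
            pos += 1
            if not expr() or pos >= len(toks) or toks[pos] != ")":
                return False
            pos += 1
            return True
        if pos < len(toks) and re.fullmatch(r"[ab01]+", toks[pos]):
            pos += 1                       # F: identifier or constant
            return True
        return False

    def expr():                            # E -> T { (+|*) T }
        nonlocal pos
        if not term():
            return False
        while pos < len(toks) and toks[pos] in "+*":
            pos += 1
            if not term():
                return False
        return True

    return expr() and pos == len(toks)

print(accepts("a*(ab+10)"))  # True: the slide's example
print(accepts("a+*b"))       # False
```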

  35. Generalization of derivation • Derivation is head ==> body • A ==> X (A derives X in a single step) • A ==>*G X (A derives X in multiple steps) • Transitivity: IF A ==>*G B and B ==>*G C, THEN A ==>*G C

  36. Context-Free Language • The language of a CFG, G=(V,T,P,S), denoted by L(G), is the set of terminal strings that have a derivation from the start variable S. • L(G) = { w in T* | S ==>*G w }

  37. Left-most & Right-most Derivation Styles G: E => E+E | E*E | (E) | F F => aF | bF | 0F | 1F | ε Derive the string a*(ab+10) from G: E =*=>G a*(ab+10) Left-most derivation (always substitute the leftmost variable): • E • ==> E * E • ==> F * E • ==> aF * E • ==> a * E • ==> a * (E) • ==> a * (E + E) • ==> a * (F + E) • ==> a * (aF + E) • ==> a * (abF + E) • ==> a * (ab + E) • ==> a * (ab + F) • ==> a * (ab + 1F) • ==> a * (ab + 10F) • ==> a * (ab + 10) Right-most derivation (always substitute the rightmost variable): • E • ==> E * E • ==> E * (E) • ==> E * (E + E) • ==> E * (E + F) • ==> E * (E + 1F) • ==> E * (E + 10F) • ==> E * (E + 10) • ==> E * (F + 10) • ==> E * (aF + 10) • ==> E * (abF + 10) • ==> E * (ab + 10) • ==> F * (ab + 10) • ==> aF * (ab + 10) • ==> a * (ab + 10)

  38. Leftmost vs. Rightmost derivations Q1) For every leftmost derivation, there is a rightmost derivation, and vice versa. True or False? True - we will use parse trees to prove this Q2) Does every word generated by a CFG have a leftmost and a rightmost derivation? Yes - easy to prove (reverse direction) Q3) Could there be words which have more than one leftmost (or rightmost) derivation? Yes - depending on the grammar

  39. Gpal: A => 0A0 | 1A1 | 0 | 1 | ε Language - CFG & CFL • Theorem: A string w in (0+1)* is in L(Gpal) if and only if w is a palindrome. • Proof: use induction • on string length for the IF part (if w is a palindrome, then w is in L(Gpal)): • |w| ≤ 1: then w is ε, 0, or 1, each of which is in L(Gpal) by a basis production • |w| ≥ 2: since w is a palindrome, it begins and ends with the same symbol, so w = 0x0 or w = 1x1 where x is a shorter palindrome; by induction x is in L(Gpal), hence so is w • on the length of the derivation for the ONLY-IF part (if w is in L(Gpal), then w is a palindrome): • a one-step derivation yields ε, 0, or 1, all palindromes • a longer derivation starts with A → 0A0 or A → 1A1, so w = 0x0 or w = 1x1 where x is a palindrome by induction; then, e.g., w^r = (0x0)^r = 0x^r0 = 0x0 = w

  40. Sentential form • Strings derived from the start symbol are sentential forms: if S =*=> α, then α is a sentential form • If S =*=lm> α, then α is a left-sentential form • If S =*=rm> α, then α is a right-sentential form

  41. Parse Trees • Each derivation in a CFG can be represented using a parse tree: • Each internal node is labeled by a variable in V • Each leaf is a terminal symbol (or ε) • For a production A → X1X2…Xk, an internal node labeled A has k children, labeled X1, X2, …, Xk from left to right • Usual tree terminology applies: descendant, ancestor • Yield: the concatenation of the leaves from left to right [Figure: for the production A → X1..Xi..Xk, a node A with children X1 … Xi … Xk]
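The parse-tree terminology can be sketched with a small data structure: a node is a (label, children) pair, and the yield concatenates the leaves left to right. The tree below encodes the palindrome derivation A => 0A0 => 01A10 => 01110:

```python
# Minimal sketch of a parse tree as nested (label, children) tuples and
# of its yield. An epsilon leaf would simply carry the label "".

def tree_yield(node):
    label, children = node
    if not children:              # a leaf contributes its own symbol
        return label
    return "".join(tree_yield(c) for c in children)

# Parse tree for A => 0A0 => 01A10 => 01110 under A -> 0A0 | 1A1 | 0 | 1 | eps
tree = ("A", [("0", []),
              ("A", [("1", []),
                     ("A", [("1", [])]),
                     ("1", [])]),
              ("0", [])])

print(tree_yield(tree))  # 01110
```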

  42. Examples [Figure: parse tree for 0110 under G: A => 0A0 | 1A1 | 0 | 1 | ε, built bottom-up by recursive inference] [Figure: parse tree for a + 1 under G: E => E+E | E*E | (E) | F, F => aF | bF | 0F | 1F | 0 | 1 | a | b, built top-down by derivation]

  43. Parse Trees, Derivations, and Recursive Inferences How a CFG works with respect to a string w: if w meets the condition at the body, then it meets the condition at the head. Production: A → X1..Xi..Xk [Figure: node A with children X1 … Xi … Xk] Recursive inference, derivation, parse tree, left-most derivation, and right-most derivation are all equivalent ways of establishing membership.

  44. Interchangeability of different CFG representations • Parse tree ==> left-most derivation • DFS, visiting children left to right • Parse tree ==> right-most derivation • DFS, visiting children right to left • So a parse tree determines both, making left-most and right-most derivations interconvertible • Derivation ==> recursive inference • Reverse the order of productions • Recursive inference ==> parse trees • Bottom-up construction of the parse tree

  45. G: A => 0A0 | 1A1 | 0 | 1 | ε • Recursive inference ==> parse trees • bottom-up traversal of the parse tree • Basis • Induction (2 steps) • Parse tree to derivation (left-most, via DFS) • A ==> X1X2…XiXi+1…Xk • i) A =lm=> w1w2…wiXi+1…Xk • ii) Xi ==> α1 ==> α2 … ==> wi [Figure: parse tree alongside the recursive inference and the derivation A => 0A0 => 01A10 => 01110]
