Presentation Transcript


  1. PART III Turing Machines and Effective Computability

  2. PART III Chapter 1 Turing Machines

  3. Turing machines • the most powerful automata (more powerful than FAs and PDAs) • invented by Turing in 1936 • can compute any function normally considered computable • Church-Turing Thesis: • Anything (function, problem, set, etc.) that is (thought to be) computable is computable by a Turing machine (i.e., Turing-computable). • Other equivalent formalisms: • Post systems (string rewriting systems) • PSGs (phrase-structure grammars): on strings • μ-recursive functions: on numbers • λ-calculus, combinatory logic: on λ-terms • C, BASIC, Pascal, Java, … : on strings

  4. Informal description of a Turing machine 1. Finite automata (DFAs, NFAs, etc.): • limited input tape: one-way, read-only • no working memory • finite control store (program) 2. PDAs: • limited input tape: one-way, read-only • one additional stack as working memory • finite control store (program) 3. Turing machines (TMs): • a semi-infinite tape storing the input and supplying additional working storage • finite control store (program) • can read/write and move two ways (left and right), depending on the program state and the symbol scanned.

  5. Turing machines and LBAs 4. Linear-bounded automata (LBAs): special TMs • the tape is of the same size as the input length (i.e., no additional memory is supplied beyond the cells used to store the input) • can read/write and move left/right depending on the program state and the symbol scanned. • Primitive instructions of a TM (like +, -, *, etc. in C or BASIC): 1. L, R // move the tape head left or right 2. a ∈ Γ // write the symbol a ∈ Γ at the currently scanned position. Which instruction to execute depends on the precondition: 1. the current state and 2. the symbol currently scanned by the tape head.

  6. The model of a Turing machine [Diagram omitted in transcript.] Memory is a one-dimensional tape with a left end and no right end; the input x = x1 x2 x3 … xn occupies the leftmost cells, and the rest of the tape serves as additional working memory. A read/write, movable tape head connects the tape to the control store (program), which holds the current state, the initial state, an accept final state and a reject final state. Permitted actions: 1. write 2. move left/right, depending on the scanned symbol and the current state.

  7. The structure of a TM instruction: • An instruction of a TM is a tuple (q, a, p, d) ∈ Q × Γ × Q × (Γ ∪ {L,R}) where • q is the current state • a is the symbol scanned by the tape head • (q, a) defines a precondition that the machine may encounter • (p, d) specifies the actions to be performed by the TM once the machine is in a condition matching the precondition (i.e., the symbol scanned by the tape head is ‘a’ and the machine is in state q) • p is the next state that the TM will enter • d is the action to be performed: • d = b ∈ Γ means “write the symbol b to the tape cell currently scanned by the tape head”. • d = R (or L) means “move the tape head one tape cell to the right (or left, respectively)”. • A deterministic TM program δ is simply a set of TM instructions (or, more formally, a function δ: Q × Γ --> Q × (Γ ∪ {L,R})).
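
As a concrete illustration (not part of the slides' formalism), a deterministic program δ can be encoded as a lookup table keyed by preconditions; the state names and the '_' blank below are assumptions made for the example only.

```python
# A TM instruction (q, a, p, d): in state q scanning a, enter state p and perform d,
# where d is either a tape symbol to write or one of the moves 'L', 'R'.
# A deterministic program is a function from preconditions (q, a) to actions (p, d),
# which a Python dict captures directly.  All names here are hypothetical.
delta = {
    ('scan', 'a'): ('scan', 'R'),    # keep moving right over a's
    ('scan', 'b'): ('scan', 'R'),    # ... and over b's
    ('scan', '_'): ('accept', '_'),  # at the first blank, write a blank and accept
}

def lookup(state, symbol):
    """Return the (next_state, action) pair whose precondition matches, if any."""
    return delta.get((state, symbol))
```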

  8. Formal definition of a standard TM (STM) • A deterministic 1-tape Turing machine (STM) is a 9-tuple M = (Q, Σ, Γ, [, ␣, δ, s, t, r) where • Q : a finite set of (program) states, with a role like labels in traditional programs • Γ : the tape alphabet • Σ ⊆ Γ : the input alphabet • [ ∈ Γ − Σ : the left end-of-tape mark • ␣ ∈ Γ − Σ : the blank tape symbol • s ∈ Q : the initial state • t ∈ Q : the accept state • r ∈ Q, r ≠ t : the reject state, and • δ: (Q − {t,r}) × Γ --> Q × (Γ ∪ {L,R}) is a total transition function with the restriction: if δ(p, [) = (q, d) then d = R, i.e., the STM cannot overwrite the left-end mark and never moves off the tape to the left.
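
Under the assumption that '[' stands for the left endmarker and '_' for the blank, the 9-tuple can be sketched as a small Python structure; the field names are illustrative, not part of the definition.

```python
from dataclasses import dataclass

@dataclass
class STM:
    """A deterministic 1-tape TM; field names are a convenience, not the slides' notation."""
    Q: frozenset        # program states
    Sigma: frozenset    # input alphabet
    Gamma: frozenset    # tape alphabet: Sigma plus at least '[' and the blank '_'
    delta: dict         # (Q - {t, r}) x Gamma -> Q x (Gamma | {'L', 'R'})
    s: str              # start state
    t: str              # accept state
    r: str              # reject state

    def respects_left_end(self) -> bool:
        # The endmarker restriction: on reading '[', the head may only move right,
        # so the machine never overwrites '[' and never falls off the left end.
        return all(d == 'R' for (q, a), (p, d) in self.delta.items() if a == '[')
```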

  9. Configurations and acceptance • Issue: how to define configurations like those defined for FAs and PDAs? • At any time t0 the TM M’s tape contains a semi-infinite string of the form Tape(t0) = [ y1 y2 … ym ␣ ␣ ␣ … (with ym ≠ ␣) • Let ␣^ω denote the semi-infinite string ␣ ␣ ␣ … Note: although the tape content is an infinite string, it has a finite canonical representation y ␣^ω, where y = [ y1 … ym (with ym ≠ ␣). A configuration of the TM M is a global state giving a snapshot of all relevant information about M’s computation at some instant in time.

  10. Formal definition of a configuration Def: a cfg of an STM M is an element of CF_M =def Q × { [y | y ∈ (Γ − {[})* } × N // N = {0,1,2,…} // When the machine M is at cfg (p, z, n), it means M is 1. in state p, 2. the tape head is pointing to position n, and 3. the tape content is z (followed by blanks). Obviously a cfg gives us sufficient information to continue the execution of the machine. Def: 1. [Initial configuration:] Given an input x and an STM M, the initial configuration of M on input x is the triple (s, [x, 0). 2. If cfg1 = (p, y, n), then cfg1 is an accept configuration if p = t (the accept state), and cfg1 is a reject cfg if p = r (the reject state). cfg1 is a halting cfg if it is an accept or reject cfg.
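
A configuration is just such a triple. A minimal sketch, reusing the '['/'_' conventions assumed above (the names Config and initial_config are hypothetical):

```python
from typing import NamedTuple

class Config(NamedTuple):
    state: str   # current program state p
    tape: str    # finite canonical tape contents z, starting with '['
    pos: int     # head position n (position 0 holds the left endmarker)

def initial_config(x: str, start_state: str = 's') -> Config:
    """The initial configuration (s, [x, 0) of a machine on input x."""
    return Config(start_state, '[' + x, 0)
```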

  11. One-step and multi-step TM computations • The one-step Turing computation relation (|--M) is defined as follows: • |--M ⊆ CF_M × CF_M such that 0. (p, z, n) |--M (q, s_n^b(z), n) if δ(p, z_n) = (q, b) where b ∈ Γ 1. (p, z, n) |--M (q, z, n−1) if δ(p, z_n) = (q, L) 2. (p, z, n) |--M (q, z, n+1) if δ(p, z_n) = (q, R) • where s_n^b(z) is the string resulting from z by replacing its n-th symbol with ‘b’. • ex: s_4^b([baaacabc) = [baabcabc • s_6^b([baa ␣^ω) = [baa␣␣b ␣^ω • |--M is defined to be the set of all pairs of configurations satisfying one of the above three rules. Notes: 1. if C = (p, z, n) |--M (q, y, m) then n ≥ 0 and m ≥ 0 (why?) 2. |--M is a function [from nonhalting cfgs to cfgs] (i.e., if C |--M D and C |--M E then D = E). 3. define |--^n M and |--*M (the reflexive and transitive closure of |--M) as usual.
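
The three rules translate directly into code. A sketch of one step, assuming δ is encoded as a dict keyed by (state, symbol) and that cells beyond the written portion of the tape read as the blank '_':

```python
def step(cfg, delta, blank='_'):
    """One step of |--M: apply delta to (state, scanned symbol) and return the next cfg."""
    p, z, n = cfg
    z = z.ljust(n + 1, blank)              # cells beyond the written part hold blanks
    q, d = delta[(p, z[n])]
    if d == 'L':                           # rule 1: move the head one cell to the left
        return (q, z, n - 1)
    if d == 'R':                           # rule 2: move the head one cell to the right
        return (q, z, n + 1)
    return (q, z[:n] + d + z[n + 1:], n)   # rule 0: write d, i.e. the substitution s_n^d(z)
```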

  12. Accepting and rejecting of a TM on inputs • x ∈ Σ* is said to be accepted by an STM M if icfg_M(x) =def (s, [x, 0) |--*M (t, y, n) for some y and n • i.e., there is a finite computation (s, [x, 0) = C0 |--M C1 |--M … |--M Ck = (t, y, n) starting from the initial configuration and ending at an accept configuration. • x is said to be rejected by an STM M if (s, [x, 0) |--*M (r, y, n) for some y and n • i.e., there is a finite computation • (s, [x, 0) = C0 |--M C1 |--M … |--M Ck = (r, y, n) • starting from the initial configuration and ending at a reject configuration. Notes: 1. It is impossible for x to be both accepted and rejected by an STM. (why?) 2. It is possible that x is neither accepted nor rejected. (why?)
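
Putting the pieces together, a hedged sketch of a driver that iterates |--M from the initial configuration (s, [x, 0): since M may loop, the driver takes a step bound and reports no verdict when the bound is exhausted, which is exactly why acceptance by a TM is in general only semidecided. The default state names s, t, r and the '_' blank are assumptions.

```python
def run(delta, x, s='s', t='t', r='r', blank='_', max_steps=10_000):
    """Iterate |--M from (s, [x, 0); return 'accept', 'reject', or None if no verdict yet."""
    state, tape, pos = s, '[' + x, 0                  # the initial configuration
    for _ in range(max_steps):
        if state == t:
            return 'accept'
        if state == r:
            return 'reject'
        tape = tape.ljust(pos + 1, blank)             # blanks beyond the written part
        state, action = delta[(state, tape[pos])]
        if action == 'R':
            pos += 1
        elif action == 'L':
            pos -= 1
        else:                                         # write `action` under the head
            tape = tape[:pos] + action + tape[pos + 1:]
    return None                                       # bound exhausted: M may be looping on x
```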

  13. Languages accepted by an STM Def: 1. M is said to halt on input x if M either accepts x or rejects x. 2. M is said to loop on x if it does not halt on x. 3. A TM is said to be total if it halts on all inputs. 4. The language accepted by a TM M is L(M) =def { x ∈ Σ* | x is accepted by M, i.e., (s, [x, 0) |--*M (t, -, -) } 5. If L = L(M) for some STM M ==> L is said to be recursively enumerable (r.e.) 6. If L = L(M) for some total STM M ==> L is said to be recursive 7. If ~L =def Σ* − L = L(M) for some STM M (or total STM M) ==> L is said to be co-r.e. (or co-recursive, respectively)

  14. Some examples Ex1: Find an STM to accept L1 = { w # w | w ∈ {a,b}* }. Note: L1 is not a CFL. The STM has tape alphabet Γ = {a, b, #, -, ␣, [} and behaves as follows: on input z ∈ {a,b,#}*: 1. if z is not of the form {a,b}* # {a,b}* => goto Reject 2. move left until ‘[‘ is encountered, then move right 3. while I/P = ‘-’ move right 4. if I/P = ‘a’ then 4.1 write ‘-’; move right until # is encountered; move right 4.2 while I/P = ‘-’ move right 4.3 case (I/P) of { ‘a’ : (write ‘-’; goto 2); o/w: goto Reject } 5. if I/P = ‘b’ then … // like 4.1 ~ 4.3 6. if I/P = ‘#’ then // all symbols left of # have been compared 6.1 move right 6.2 while I/P = ‘-’ move right 6.3 case (I/P) of { ‘␣’: goto Accept; o/w: goto Reject }
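
As an illustration only, the marking strategy above can be paraphrased in ordinary Python (this mimics the machine's discipline of overwriting compared symbols with '-', it is not itself a Turing machine; the function name is hypothetical):

```python
def accepts_w_hash_w(z: str) -> bool:
    """Mimic the STM's strategy for L1 = { w#w : w in {a,b}* } by marking cells with '-'."""
    if z.count('#') != 1 or any(c not in 'ab#' for c in z):
        return False                                    # step 1: wrong shape -> reject
    tape = list(z)
    hash_pos = tape.index('#')
    left, right = 0, hash_pos + 1
    while True:
        while left < hash_pos and tape[left] == '-':    # step 3: skip already-marked cells
            left += 1
        if left == hash_pos:                            # step 6: left side exhausted
            return all(c == '-' for c in tape[hash_pos + 1:])
        while right < len(tape) and tape[right] == '-': # step 4.2: skip marked cells
            right += 1
        if right == len(tape) or tape[right] != tape[left]:
            return False                                # mismatch, or right side too short
        tape[left] = tape[right] = '-'                  # steps 4.1/4.3: mark the matched pair
```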

  15. More detail of the STM Step 1 can be accomplished as follows: 1.1 while (~# ∧ ~␣) R; // or, equivalently, while (a ∨ b ∨ [) R if ␣ => Reject // no # found in the input if # => R; 1.2 while (~# ∧ ~␣) R; if ␣ => goto Accept [or goto 2 if regarded as a subroutine] if # => goto Reject; // more than one # found Step 1 requires only two states:
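
A guess at how those two states could be written out in the dictionary encoding used earlier; the state names p and q, the '_' blank, and the use of t / r for accept / reject are assumptions, not taken from the lost table.

```python
# Step 1 of the w#w machine as delta entries: state 'p' scans for the first '#',
# state 'q' checks that no second '#' follows.  't' / 'r' denote accept / reject.
step1 = {
    ('p', '['): ('p', 'R'), ('p', 'a'): ('p', 'R'), ('p', 'b'): ('p', 'R'),
    ('p', '_'): ('r', '_'),          # blank reached before any '#': reject
    ('p', '#'): ('q', 'R'),          # first '#' found, keep scanning to the right
    ('q', 'a'): ('q', 'R'), ('q', 'b'): ('q', 'R'),
    ('q', '#'): ('r', '#'),          # a second '#': reject
    ('q', '_'): ('t', '_'),          # exactly one '#': accept (or continue with step 2)
}
```

With the run sketch from slide 12, run(step1, 'a#b') yields 'accept' and run(step1, 'ab') yields 'reject', since step 1 only checks that the input contains exactly one #.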

  16. [Diagram omitted in transcript: a generic arc from p to q labeled cnd / ACs, and the two-state machine realizing step 1, in which p loops on (~# ∧ ~␣)/R, goes to q on #/R and to r on ␣, while q loops on (~# ∧ ~␣)/R, goes to t on ␣ and to r on #.] Graphical representation of a TM: an arc from p to q labeled cnd / ACs means: if (state = p) ∧ (cnd is true for the scanned symbol) then 1. perform ACs and 2. go to q. ACs can be primitive actions (R, L, write a, …) or another subroutine TM M1. Ex: an arc from s to s labeled (a, b, [, -)/R implies the existence of 4 instructions: (s, a, s, R), (s, b, s, R), (s, [, s, R), and (s, -, s, R).

  17. Tabular form of an STM • Translation of the graphical form of an STM into tabular form: [table omitted in transcript: δ laid out with one row per state in Q and one column per tape symbol in Γ]. The rows for t and r indeed need not be listed!

  18. [Diagram omitted in transcript: the complete STM accepting L1, combining steps 1–6 — scan for a unique #, then repeatedly mark a matching pair of symbols on the two sides of # with ‘-’, and accept when both sides are exhausted.] The complete STM accepting L1.

  19. R.e. and recursive languages Recall the following definitions: 1. M is said to halt on input x if M either accepts x or rejects x. 2. M is said to loop on x if it does not halt on x. 3. A TM is said to be total if it halts on all inputs. 4. The language accepted by a TM M is L(M) =def { x ∈ Σ* | x is accepted by M, i.e., (s, [x, 0) |--*M (t, -, -) } 5. If L = L(M) for some STM M ==> L is said to be recursively enumerable (r.e.) 6. If L = L(M) for some total STM M ==> L is said to be recursive 7. If ~L =def Σ* − L = L(M) for some STM M (or total STM M) ==> L is said to be co-r.e. (or co-recursive, respectively)

  20. Recursive languages are closed under complement Theorem 1: Recursive languages are closed under complement. (i.e., if L is recursive, then ~L = Σ* − L is recursive.) pf: Suppose L is recursive. Then L = L(M) for some total TM M. Now let M* be the machine M with accept and reject states switched (t* = r, r* = t). Then for any input x, • x ∉ ~L => x ∈ L(M) => icfg_M(x) |--*M (t, -, -) => icfg_M*(x) |--*M* (r*, -, -) => x ∉ L(M*). • x ∈ ~L => x ∉ L(M) => icfg_M(x) |--*M (r, -, -) => icfg_M*(x) |--*M* (t*, -, -) => x ∈ L(M*). Hence ~L = L(M*) and is recursive. Note: the same argument cannot be applied to r.e. languages. (why?) Exercise: Are recursive sets closed under union, intersection, concatenation and/or the Kleene star operation?
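
The construction in the proof is purely syntactic: swap the roles of t and r in every transition. A minimal sketch in the dictionary encoding used earlier; as the note stresses, the resulting machine accepts ~L only when M is total.

```python
def complement_machine(delta, t='t', r='r'):
    """Swap the accept and reject targets in delta (yields the complement only for total M)."""
    swap = {t: r, r: t}
    return {(q, a): (swap.get(p, p), d) for (q, a), (p, d) in delta.items()}
```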

  21. Some more terminology Sets: recursive and recursively enumerable (r.e.). Predicates: decidability and semidecidability. Problems: solvability and semisolvability. • P : a statement about strings (or a property of strings) • A : a set of strings • Q : a (decision) problem. We say that 1. P is decidable <==> { x | P(x) is true } is recursive 2. A is recursive <==> “x ∈ A” is decidable 3. P is semidecidable <==> { x | P(x) is true } is r.e. 4. A is r.e. <==> “x ∈ A” is semidecidable 5. Q is solvable <==> Rep(Q) =def { “P” | P is a positive instance of Q } is recursive 6. Q is semisolvable <==> Rep(Q) is r.e.

  22. The Chomsky Hierarchy • Relationship of Languages, Grammars and machines

  23. The Chomsky Hierarchy [Diagram omitted in transcript: nested language classes.] type 3 (regular languages) ⊂ CFLs (type 2 languages) ⊂ CSLs (type 1 languages) ⊂ recursive languages ⊂ recursively enumerable (type 0) languages ⊂ all languages.

  24. Phrase-structure grammar Def.: A phrase-structure grammar G is a tuple G = (N, Σ, S, P) where • N, Σ, and S are the same as for a CFG, and • P, a finite subset of (N∪Σ)* N (N∪Σ)* × (N∪Σ)*, is a set of production rules of the form • α → β where • α ∈ (N∪Σ)* N (N∪Σ)* is a string over N∪Σ containing at least one nonterminal, and • β ∈ (N∪Σ)* is a string over N∪Σ. Def: G is of type • 1 (context-sensitive) if every rule is S → ε or satisfies |α| ≤ |β| • 2 (context-free) if every rule has α ∈ N and β ≠ ε, or is S → ε • 3 (right-linear) if every rule has one of the forms A → aB or A → a (a ≠ ε) or S → ε.

  25. Derivations • The derivation relation ⇒G ⊆ (N∪Σ)* × (N∪Σ)* is the least set of pairs such that: ∀ x, y ∈ (Σ∪N)* and α → β ∈ P, x α y ⇒G x β y. • Let ⇒*G be the reflexive and transitive closure of ⇒G. • L(G), the language generated by the grammar G, is the set L(G) =def { x ∈ Σ* | S ⇒*G x }.
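
A small sketch of the one-step relation, representing P as a list of (α, β) string pairs: rewriting any single occurrence of α by β yields one successor form (the function name is hypothetical).

```python
def derive_one_step(form: str, productions):
    """All sentential forms reachable from `form` in one application of =>G."""
    successors = set()
    for alpha, beta in productions:
        i = form.find(alpha)
        while i != -1:                      # rewrite every occurrence x alpha y -> x beta y
            successors.add(form[:i] + beta + form[i + len(alpha):])
            i = form.find(alpha, i + 1)
    return successors
```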

  26. Example • Design a CSG to generate the language L = { 0^n 1^n 2^n | n ≥ 0 }, which is known not to be context-free. Sol: Consider the CSG G1 with the following productions: S → ε, S → 0SA2, 2A → A2, 0A → 01, 1A → 11. For G1 we have S ⇒ 0SA2 ⇒ … ⇒ 0^k S (A2)^k ⇒ 0^k (A2)^k ⇒* 0^k A^k 2^k ⇒* 0^k 1^k 2^k ∴ L ⊆ L(G1). Also note that • if S ⇒* α ==> #0(α) = #(A or 1)(α) = #2(α). • if S ⇒* α ∈ {0,1,2}* => (αk = 0 ==> αj = 0 for all j < k) and (αk = 1 ==> αj = 1 or 0 for all j < k). • Hence α must be of the form 0*1*2* => α ∈ L. QED
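
As a sanity check of this argument, derivations of G1 can be searched exhaustively. A brute-force sketch, assuming a length bound of len(target) + 1 on sentential forms; this bound suffices for G1 because a form contains at most one S, so the only shrinking rule S → ε fires at most once and shrinks the form by one symbol.

```python
from collections import deque

def generates(productions, target: str, start: str = 'S') -> bool:
    """Search all sentential forms of length <= len(target)+1 reachable from `start`."""
    bound = len(target) + 1
    seen, frontier = {start}, deque([start])
    while frontier:
        form = frontier.popleft()
        if form == target:
            return True
        for alpha, beta in productions:
            i = form.find(alpha)
            while i != -1:                              # rewrite each occurrence of alpha
                new = form[:i] + beta + form[i + len(alpha):]
                if len(new) <= bound and new not in seen:
                    seen.add(new)
                    frontier.append(new)
                i = form.find(alpha, i + 1)
    return False

# The productions of G1 from this slide; '' encodes the empty string ε.
G1 = [('S', ''), ('S', '0SA2'), ('2A', 'A2'), ('0A', '01'), ('1A', '11')]
assert generates(G1, '')                 # ε is in L
assert generates(G1, '001122')           # 0^2 1^2 2^2 is in L
assert not generates(G1, '0122')         # unbalanced counts are not derivable
```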

  27. Lemma 1: if S ⇒* α ∈ Σ*, then the derivation must have the form S ⇒* 0* S (A + 2)* ⇒ 0* (A + 2)* ⇒* 0* 1 (A + 2)* ⇒* 0* 1* 2*.
