
Randomized Distributed Decision

Pierre Fraigniaud, Amos Korman, Merav Parter and David Peleg. DISC 2012. The basic questions: What global information can be deduced from local structure? Does randomization help? To what extent?



Presentation Transcript


  1. Randomized Distributed Decision Pierre Fraigniaud, Amos Korman, Merav Parter and David Peleg DISC 2012

  2. The Basic Questions • What global information can be deduced from local structure? • Does randomization help? • To what extent?

  3. Outline • The LOCAL Model • Related Work • Decision Problems • Randomized Local Decision • Contributions • Open Problems

  4. The LOCAL model • Input: a pair (G, x): • G, a connected graph • x, a vector of local inputs.* [Figure: a 20-node connected graph; each node carries an ID and an input pair such as (0,1).] *To distinguish nodes, assume an ID assignment.

  5. The LOCAL model • Simultaneous wakeup, fault-free synchronous communication. • Computation: in each round, every processor: • Receives messages from its neighbors. • Computes (internally). • Sends messages to its neighbors. • Complexity measure: number of communication rounds. • No restriction on memory, local computation or message size. [Figure: a 12-node graph with IDs and input pairs.]
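The round structure above can be sketched in a few lines of Python. This is a toy simulator (all names, e.g. `local_rounds`, are illustrative and not from the slides), assuming a node's message is simply its current state:

```python
# Toy simulator for the synchronous LOCAL model (illustrative names only).
# In each round every node receives its neighbors' messages, computes, and
# sends; after t rounds a node's state depends only on its radius-t ball.

def local_rounds(adj, state, update, t):
    """adj: node -> neighbor list; state: node -> local state;
    update(own_state, received) -> new state. A node's message is its state."""
    for _ in range(t):
        msgs = {v: [state[u] for u in adj[v]] for v in adj}   # receive
        state = {v: update(state[v], msgs[v]) for v in adj}   # compute + send
    return state

# Example: gathering IDs. After t rounds each node knows all IDs within distance t.
adj = {1: [2], 2: [1, 3], 3: [2]}            # the path 1 - 2 - 3
init = {v: {v} for v in adj}                 # each node starts knowing its own ID
gather = lambda own, received: own.union(*received)
after_one_round = local_rounds(adj, init, gather, 1)   # node 2 learns {1, 2, 3}
```

Since there is no bound on message size, t rounds let every node collect its entire radius-t neighborhood, which is why LOCAL complexity is purely a question of distance.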

  6. Outline • The LOCAL Model • Related Work • Decision problems • Randomized local decision • Contribution • Open problems

  7. The Impact of randomization in local computation • Negative indications: • Naor and Stockmeyer [STOC ’93]: define the LCL* class. • Every constant-time algorithm for constructing LCL can be derandomized. • Naor [SIAM J. Disc. Math. ’96]: • Randomization does not help for 3-coloring the ring. * Restricted to constant time, constant degree and constant alphabet.

  8. The Impact of randomization in local computation • Positive indications: • MIS can be computed randomly in O(log n) rounds w.h.p. [Alon, Babai, Itai, J. Alg. ’86; Luby, SIAM J. Comput. ’86], but deterministically only in 2^O(√log n) rounds [Panconesi, Srinivasan, J. Algorithms ’96]. • Local decision tasks [Fraigniaud, Korman, Peleg, FOCS ’11].

  9. Distributed Complexity Theory • Locally checkable proofs. • [M. Göös and J. Suomela, PODC ’11] • Decidability Classes for Mobile Agents Computing. • [P. Fraigniaud and A. Pelc, Proc. 10th LATIN, 2012] • Locality and Checkability in Wait-free Computing. • [P. Fraigniaud, S. Rajsbaum, and C. Travers, DISC ’11] • Local Distributed Decision. • [P. Fraigniaud, A. Korman, and D. Peleg, FOCS ’11]

  10. Outline • The LOCAL Model • Related Work • Decision problems • Randomized local decision • Contribution • Open problems

  11. Local Decision Tasks [Fraigniaud, Korman, Peleg FOCS ’11] Goal: nodes need to collectively decide whether the instance they live in belongs to a given distributed language.

  12. Distributed Languages Def: A distributed language is a decidable collection of instances. Coloring = {(G,x) s.t. x is a proper coloring of G}. At-Most-One-Selected = {(G,x) s.t. ∑xi ≤ 1}. MIS = {(G,x) s.t. x encodes a maximal independent set of G}.

  13. Local Decision Tasks [FKP11] • Input: a pair (G, x): • G, a connected graph • x, a vector of local inputs. • A language L. • Output: Yes / No. [Figure: the same 20-node example graph with IDs and input pairs.]

  14. Local Decision [FKP11] [Figure: a node u inspects its radius-t neighborhood and outputs Yes or No.]

  15. The Global Picture of Local Decision The final decision is the conjunction of the outputs. [Figure: each node of the example graph outputs Yes or No; the instance is accepted iff every node says Yes.]

  16. The Local Decision (LD) Class A local decider A for a language L is a local algorithm such that: • (G,x) ∈ L: everyone says yes. • (G,x) ∉ L: at least one node says no (for every ID assignment). LD(t) (Local Decision): the class of languages that have a t-round local decider (the class-P analogue).

  17. Example: Coloring Coloring = {(G,x) s.t. x is a proper coloring of G}. A 1-round decider: each node says yes iff no neighbor has its own color.
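A proper coloring can be verified in one round: each node compares its color against its neighbors' colors, and the global answer is the conjunction of the local outputs. A sketch (hypothetical helper names, not the slides' code):

```python
# 1-round local decider for Coloring (illustrative sketch).
# Each node collects its neighbors' colors (the single communication round)
# and says "yes" iff none equals its own; the instance is accepted iff
# every node says "yes" (conjunction of the outputs).

def coloring_decider(adj, color):
    outputs = {}
    for v, nbrs in adj.items():
        nbr_colors = [color[u] for u in nbrs]    # the one round of communication
        outputs[v] = color[v] not in nbr_colors  # local verification
    return outputs, all(outputs.values())        # per-node outputs, global AND

adj = {1: [2, 3], 2: [1, 3], 3: [1, 2]}          # a triangle
_, ok  = coloring_decider(adj, {1: 'r', 2: 'g', 3: 'b'})   # proper coloring
_, bad = coloring_decider(adj, {1: 'r', 2: 'r', 3: 'b'})   # nodes 1 and 2 clash
```

Note that a "no" answer is always witnessed locally by some node, which is exactly the LD acceptance condition.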

  18. Very few languages can be decided locally At-Most-One-Selected (AMOS-1) = {(G,x) s.t. ∑xi ≤ 1}. [Figure: a path with a single selected node; two far-apart selected nodes cannot both be seen in constant time.] Extension: Use randomness to decide.

  19. Outline • The LOCAL Model • Decision problems • Randomized local decision • Related Work • Contribution • Open problems

  20. Randomized Local Decision [Figure: a node u inspects its radius-t neighborhood and, using random coins, outputs Yes or No.]

  21. Randomized Local Decision A (p,q)-decider for a language L is a local two-sided-error Monte Carlo algorithm such that: • (G,x) ∈ L: everyone says yes with probability* ≥ p. • (G,x) ∉ L: at least one node says no with probability* ≥ q. BPLD(p,q,t) (Bounded-Probability Local Decision): the class of languages that have a t-round (p,q)-decider (the BPP analogue). * The probabilities are taken over all coin tosses performed by the nodes.

  22. The Question • What is the connection between the BPLD(p,q,t) classes? • Can one boost the success probability of a (p,q)-decider?

  23. Does randomization help in local decision? [FKP11] p² + q = 1 is a sharp threshold for hereditary languages*. [Figure: the (p,q) plane split by the curve p² + q = 1; randomization helps below the curve.] * Languages that are closed under inclusion.

  24. If p² + q ≤ 1, randomization helps! [FKP11] At-Most-One-Selected (AMOS-1) • A 0-round (p,q)-decider: • every unmarked node says “yes” with probability 1; • every marked node says “yes” with probability p.

  25. AMOS-1 At-Most-One-Selected (AMOS-1) YES instance: at most one node is marked, so the probability that everyone says yes is ≥ p.

  26. AMOS-1 At-Most-One-Selected (AMOS-1) NO instance: at least two nodes are marked, each saying yes independently with probability p, so the probability that at least one says no is ≥ 1 − p².
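The acceptance probabilities on slides 24-26 can be checked exactly. A small sketch (the function name is illustrative), assuming nodes toss their coins independently so per-node probabilities multiply:

```python
# Exact success probabilities of the 0-round AMOS-1 decider from the slides
# (function name `accept_probability` is illustrative).
# Unmarked nodes say "yes" with probability 1, marked nodes with probability p;
# the coins are independent, so the acceptance probability is a product.

def accept_probability(marks, p):
    """Probability that every node says 'yes' on the input vector `marks`."""
    prob = 1.0
    for x in marks:
        prob *= p if x == 1 else 1.0
    return prob

p = 0.8
yes_instance = [0, 0, 1, 0, 0]   # one selected node: in AMOS-1
no_instance  = [0, 1, 0, 1, 0]   # two selected nodes: not in AMOS-1

p_yes = accept_probability(yes_instance, p)      # equals p
q_no  = 1 - accept_probability(no_instance, p)   # equals 1 - p**2
```

Note that q = 1 − p² sits exactly on the curve p² + q = 1, which is why this 0-round decider exists precisely when p² + q ≤ 1.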

  27. Outline • The LOCAL Model • Decision problems • Randomized local decision • Related Work • Contribution • Open problems

  28. Contribution (1) The randomization threshold p² + q = 1 holds for any language on a path topology. [Figure: the (p,q) plane, split by the threshold curve into a determinism region and a randomization region.]

  29. Contribution (2) A finer picture of the (p,q) plane between determinism and full randomization: the Bk hierarchy. [Figure: the (p,q) plane with intermediate curves between the determinism and randomization regions.]

  30. The Bk hierarchy Bk(t): the class of languages that have a t-round (p,q)-decider such that p^(1+1/k) + q > 1, where k ≥ 1 is an integer.

  31. Theorem: The Bk hierarchy is strict B1 (p² + q > 1, determinism, ~P) ⊊ B2 (p^(3/2) + q > 1) ⊊ B3 (p^(4/3) + q > 1) ⊊ … ⊊ B∞ (p + q > 1, BPLD, ~BPP) ⊊ ALL. [Figure: nested regions in the (p,q) plane, bounded by the curves p^(1+1/k) + q = 1.]

  32. At-Most-k-Selected (AMOS-k) At-Most-k-Selected = {(G,x) s.t. ∑xi ≤ k}. Lemma: AMOS-k ∈ Bk+1 \ Bk. [Figure: AMOS-k sits inside the region Bk+1 but outside Bk in the (p,q) plane.]

  33. At-Most-2-Selected (AMOS-2) YES instance: at most 2 nodes are marked, and each marked node says yes with probability p^(1/2), so the probability that everyone says yes is ≥ p.

  34. At-Most-2-Selected (AMOS-2) NO instance: 3 nodes are marked, each saying yes with probability p^(1/2), so the probability that at least one says no is q ≥ 1 − p^(3/2). Thus p^(4/3) + q > 1, placing AMOS-2 in B3.
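The AMOS-2 construction generalizes to AMOS-k. Assuming (as the natural extension of the decider on slides 33-34) that each marked node says yes with probability p^(1/k), a quick calculation confirms the Bk+1 guarantees; function and variable names are illustrative:

```python
# Guarantees of the natural AMOS-k extension of the slides' 0-round decider
# (assumption: each marked node says "yes" with probability p**(1/k)).

def amos_k_guarantees(k, p):
    r = p ** (1 / k)               # per-marked-node "yes" probability
    yes_side = r ** k              # >= p when at most k nodes are marked
    q = 1 - r ** (k + 1)           # rejection prob. with k+1 marked nodes
    in_b_next = p ** (1 + 1 / (k + 1)) + q > 1   # B_(k+1) membership condition
    return yes_side, q, in_b_next

yes_side, q, ok = amos_k_guarantees(2, 0.8)   # AMOS-2, as on slides 33-34
# yes_side == p, q == 1 - p**(3/2), and p**(4/3) + q > 1, so AMOS-2 is in B3
```

For any 0 < p < 1 we have p^(4/3) > p^(3/2), so p^(1+1/(k+1)) + q > 1 always holds for this decider, while no decider can reach p^(1+1/k) + q > 1.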

  35. The Challenge of a (p,q)-decider Instance space for a language L: for a NO instance I′, Pillegal := the probability to accept I′ (so Pillegal ≤ 1 − q); for a YES instance I, Plegal := the probability to accept I (so Plegal ≥ p). If p^(3/2) + q > 1 then Pillegal ≤ 1 − q < p^(3/2) < p ≤ Plegal.

  36. Tool: δ-Secure Zone • Instance (G,x) • A t-round (p,q)-decider A

  37. Tool: δ-Secure Zone A zone of the instance is δ-secure (with respect to A) if the probability that some node of the zone says no is < δ. [Figure: a subpath of width 2t marked inside the instance.]

  38. Tool: δ-Secure Zone Two zones at distance > 2t from each other are independent: each node's output depends only on its radius-t ball and its own coins. So if everyone in each zone says yes with some probability, the two events are independent, and the probability that some node of either zone says NO is < the sum of the two zones' δ-bounds.

  39. Tool: δ-Secure Zone Claim: every large enough legal subpath contains a δ-secure subpath: the probability that one of its nodes says no is < δ, while all of its nodes say yes with probability > p.

  40. At-Most-2-Selected ∉ B2 • Assume towards contradiction that there exists a t-round (p,q)-decider A s.t. p^(3/2) + q > 1. • Define a suitably small δ > 0.

  41. At-Most-2-Selected ∉ B2 The nodes execute the t-round (p,q)-decider A on a NO instance split into subpaths P1, P2, P3 (each pair at distance > 2t). By the secure-zone claim, the probability that some node of the secure subpaths P1 and P3 says no is small.

  42. At-Most-2-Selected ∉ B2 From the NO instance P1, P2, P3, drop P2 (and its selected node) to obtain a YES instance consisting of P1 and P3; since P1 and P3 are more than 2t apart, A's behavior on them is unchanged.

  43. At-Most-2-Selected ∉ B2 A is a (p,q)-decider, so on the YES instance built from P1 and P3 everyone says yes with probability ≥ p, while on the NO instance some node must say no with probability ≥ q. Since p^(3/2) + q > 1, the two bounds are incompatible: contradiction!

  44. B∞(t) ≠ ALL for every t = o(n) Tree = {(G,x) s.t. G is a tree}. Assume, towards contradiction, the existence of a t-round (p,q)-decider A s.t. p + q > 1.

  45. Tree ∉ B∞(t) for every t = o(n) The nodes of a path execute A. [Figure: two long paths with node IDs; both are trees, hence YES instances.] Since each path is a YES instance, everyone says yes with probability ≥ p; in particular, the probability that one of the n − 2t interior nodes says no is small.

  46. Tree ∉ B∞(t) for every t = o(n) [Figure: the two paths glued into a single cycle, a NO instance.] On the cycle, every node's radius-t view is identical to its view on one of the paths, so the probability to say no is at most 1 − p, while the probability to say yes is at least p; since A requires q > 1 − p, contradiction!
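The heart of the argument is that, for t = o(n), interior nodes cannot distinguish a long path (a YES instance) from the cycle obtained by closing it (a NO instance). A minimal sketch comparing radius-t balls (illustrative code, not the paper's proof):

```python
# Radius-t views on a path vs. the cycle obtained by joining its endpoints
# (illustrative sketch; `view` returns the set of IDs within distance t).

def view(neighbors, v, t):
    """IDs reachable from v in at most t hops (v's radius-t ball)."""
    ball, frontier = {v}, {v}
    for _ in range(t):
        frontier = {u for w in frontier for u in neighbors(w)} - ball
        ball |= frontier
    return frozenset(ball)

n, t = 100, 3
path_nbrs  = lambda v: [u for u in (v - 1, v + 1) if 0 <= u < n]
cycle_nbrs = lambda v: [(v - 1) % n, (v + 1) % n]

# An interior node sees exactly the same ball in both instances ...
same = view(path_nbrs, n // 2, t) == view(cycle_nbrs, n // 2, t)
# ... only the O(t) nodes near the glued endpoints can tell them apart.
diff = view(path_nbrs, 0, t) != view(cycle_nbrs, 0, t)
```

Only the 2t nodes near the endpoints see a difference, which is where the t = o(n) assumption bites: a t-round algorithm must behave almost identically on the tree and on the cycle.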

  47. Outline • The LOCAL Model • Related Work • Decision problems • Randomized local decision • Contribution • Open problems

  48. Towards Distributed Computational Complexity Theory • Does the class Bk+1(t) actually collapse to Bk(t), or do there exist intermediate classes? • The power of a decoder: decoders dealing with other interpretations and more values (not only “yes” and “no”). • Randomization and nondeterminism: the interplay between certificate size and success guarantees.
