Some thoughts about the role of Randomness in algorithms


  1. Some thoughts about the role of Randomness in algorithms Guy

  2. Using randomization in algorithms • Why add randomization to such an organized process as algorithms? • History: we needed to select primes with, say, 512 bits, and check that they are prime. • The density of primes: about 1/ln n of the first n numbers are prime, for large n.

  3. Typical computation • The probability that in 10·(ln n)^2 random choices we do not get a prime is (1 − 1/ln n)^(10·(ln n)^2) < 1/n^10. • We once learned in calculus that (1 − 1/k)^k < 1/e.
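
A quick numeric check of this bound (an editor's sketch in Python, not part of the original slides; the variable names are mine):

    import math

    # For a 512-bit candidate n, the prime density is about 1/ln n, and ln n ~ 355.
    # Working in log space, verify that 10*(ln n)^2 random choices miss every prime
    # with probability below 1/n^10.
    ln_n = 512 * math.log(2)                        # ln n for n ~ 2^512
    trials = int(10 * ln_n ** 2)                    # 10 * (ln n)^2 random candidates
    log_p_fail = trials * math.log1p(-1 / ln_n)     # log of (1 - 1/ln n)^trials
    print(log_p_fail < -10 * ln_n)                  # True: failure probability < n^(-10)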

  4. Checking if a number is prime • But how do we check whether a random choice is prime? • I will show a simple randomized algorithm to check if n is prime, given that n is not a Carmichael number. • Rather well known (Fermat's little theorem): any a that is not zero modulo a prime p satisfies a^(p−1) ≡ 1 (mod p). • But it is not an if and only if. Truly unfortunate.

  5. Carmichael numbers • Rather unfortunately, there are non-prime numbers n such that a^(n−1) ≡ 1 (mod n) for every a that is relatively prime to n! • But such numbers are rare. • Let's do something that was unthinkable 40 years ago: • 1) Draw at random 'many' a ∈ {2, 3, …, n−1}. • 2) Check a^(n−1) ≡ 1 (mod n) for all of them. • 3) If all tests pass, output 'Prime'. • 4) If one of the tests fails, output 'Composite'.
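
A minimal Python sketch of this test (editor's addition; the function name and the 300-trial default are my choices, and the stated error bound assumes n is not a Carmichael number):

    import random

    def fermat_is_probably_prime(n, trials=300):
        """Steps 1-4 from the slide: draw many random a and check a^(n-1) = 1 (mod n).
        'Composite' answers are always correct; a 'Prime' answer can be wrong, but
        for non-Carmichael n only with probability at most (1/2)^trials."""
        if n < 4:
            return n in (2, 3)
        for _ in range(trials):
            a = random.randrange(2, n)              # step 1: random a in {2, ..., n-1}
            if pow(a, n - 1, n) != 1:               # step 2: the Fermat check
                return False                        # step 4: some test failed -> Composite
        return True                                 # step 3: all tests passed -> Prime

    # Hypothetical use for the 512-bit setting of slide 2: sample random odd
    # candidates until one passes the test.
    candidate = 0
    while not fermat_is_probably_prime(candidate):
        candidate = random.getrandbits(512) | (1 << 511) | 1
    print(candidate)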

  6. Say that n is not a Carmichael number • What is the probability of a mistake? • If the algorithm says COMPOSITE it is surely right. • Now let a (relatively prime to n) be such that a^(n−1) is not 1 mod n. • Bad witnesses: numbers b such that b^(n−1) ≡ 1 (mod n). • How many of the numbers are bad witnesses?

  7. A mapping: • Let b be a bad witness. Define the mapping b ↦ a·b (mod n). • If b is bad then (ab)^(n−1) = a^(n−1)·b^(n−1) = a^(n−1), which is not 1 mod n, so a·b is a good witness. • The mapping is one to one: b·a ≡ b'·a (mod n) implies a·(b'−b) ≡ 0 (mod n), and since a is invertible mod n this means b = b'. • So the bad witnesses map injectively into the good ones, which means that at least half the numbers are good witnesses.

  8. The world of single-shot good and bad witnesses • [Figure: the witnesses split into a GOOD half and a BAD half.] • Even if only one of the randomly chosen witnesses is good, we find out that n is not a prime.

  9. What is the probability that we do not hit a good witness in 300 consecutive independent trials? • The number of bad witnesses is at most the number of good witnesses. • Thus the probability of not hitting a good witness in one trial is at most 1/2. • And in 300 trials it is at most (1/2)^300. • 2^300 is more than the number of atoms in the known universe.

  10. The cursing stage • Michael Rabin suggested one-sided-error randomized algorithms, around '78. • People made fun of him. Badly. • They said: this is math; an algorithm is not an algorithm if it errs. • History shines on the bold: now randomization is totally central in algorithms. • Rabin won the Turing award. Certainly not just for the above! • An interesting question: if we are allowed to err with low probability, can we solve more things?

  11. Apparently, randomization cannot solve non-polynomial problems • Most people think that RP = P. • A long-standing example: deciding whether a number is prime had, for the longest time, only a randomized algorithm. • A sensation: in 2002, a deterministic polynomial-time algorithm to test primality. • Manindra Agrawal, Neeraj Kayal and Nitin Saxena: a primality test in Õ((log n)^7.5) time. • Lenstra and Pomerance: Õ((log n)^6) time.

  12. Let's not call it quits yet • A question that, as far as I know, has only an RP answer. • Consider a matrix whose entries are symbols (variables). • We want to know whether its determinant is identically zero or not (it can be zero if terms cancel). • Expanding a symbolic determinant takes exponential time (it is a sum over all permutations). • We have a simple RP algorithm for it.

  13. A well-known theorem • Consider a multivariate polynomial of degree d. For example, x1·x2·x3 + (−x1)·x3 + x4·x5·x6·(−x2) has degree 4. • Take a finite field F and a set S ⊆ F. • Let the values of the xi be chosen independently and uniformly at random from S. • Then, if the polynomial is not identically zero, the probability that these choices give 0 is at most d/|S|.

  14. Proof by induction • For n = 1 we have a univariate polynomial of degree d. • We know a polynomial of degree d has at most d roots, and thus the claim is clear. • For n > 1, write Q(x1, …, xn) as Σ_{i ≤ k} x1^i · Qi(x2, …, xn), where k is the largest power of x1 that appears. • Let P = Qk be the polynomial that multiplies x1^k; its degree is at most d − k. • By induction, P evaluates to zero with probability at most (d − k)/|S|.

  15. Proof continued • Say that the random choices did not make P zero, so P(r2, r3, …, rn) ≠ 0. • With x2, …, xn fixed to these values, Q becomes a univariate polynomial in x1 of degree exactly k, so by the n = 1 case it evaluates to zero with probability at most k/|S|. • Altogether: Pr(Q = 0) ≤ Pr(P = 0) + Pr(Q = 0 | P ≠ 0) ≤ (d − k)/|S| + k/|S| = d/|S|.
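
A small Python sketch of how the theorem is used as an identity test (editor's addition; the black-box interface, the prime P, and the example polynomials are my choices):

    import random

    P = 2**61 - 1   # work over GF(P) with S = {0, ..., P-1}, so d/|S| is tiny

    def probably_zero_polynomial(f, num_vars, trials=50):
        """f evaluates a low-degree polynomial (mod P) at integer points.
        One-sided error: 'False' is always correct, and 'True' is wrong with
        probability at most (d/P)^trials when the polynomial is nonzero."""
        for _ in range(trials):
            point = [random.randrange(P) for _ in range(num_vars)]
            if f(*point) % P != 0:
                return False        # a nonzero evaluation: certainly not the zero polynomial
        return True                 # every evaluation was 0: almost surely identically zero

    # (x+y)^2 - (x^2 + 2xy + y^2) is identically zero; (x+y)^2 - (x^2 + y^2) is not.
    print(probably_zero_polynomial(lambda x, y: (x + y)**2 - (x*x + 2*x*y + y*y), 2))  # True
    print(probably_zero_polynomial(lambda x, y: (x + y)**2 - (x*x + y*y), 2))          # False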

  16. An application • It is enough to choose |S| ≥ 2n (the polynomial below will have degree n) and we get the same situation as in checking primes: one-sided error that 'just does not happen' after repetition. • A deterministic polynomial-time algorithm is not known for this problem, as far as I know. • Why should we care: perfect matching in a bipartite graph. • Consider the matrix A of a bipartite graph B(U,V,E): Aij = 1 iff (i,j) ∈ E, for i ∈ U and j ∈ V.

  17. Finding a perfect matching in a bipartite graph • Change the 1 in Aij to a variable xij. • Recall what a determinant is: det(A) = Σ_Π sign(Π) · A_{1,Π(1)} · A_{2,Π(2)} ··· A_{n,Π(n)}, summing over all permutations Π. • Note that the monomials in the xij coming from different permutations cannot cancel. • The sum above is not identically zero if and only if a perfect matching exists (otherwise one factor in every product is 0). • Remark: this problem (for general graphs) can be solved in RNC but is not known to be in NC!
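
A Python sketch of this RP test (editor's addition; the prime P, the Gaussian-elimination helper, and the 30-trial default are my choices; the slide only specifies the idea of plugging random values into the xij):

    import random

    P = 2**31 - 1   # prime modulus for the random substitutions

    def det_mod_p(mat, p=P):
        """Determinant over GF(p) by Gaussian elimination."""
        n = len(mat)
        a = [row[:] for row in mat]
        det = 1
        for col in range(n):
            pivot = next((r for r in range(col, n) if a[r][col] % p != 0), None)
            if pivot is None:
                return 0
            if pivot != col:
                a[col], a[pivot] = a[pivot], a[col]
                det = -det
            det = det * a[col][col] % p
            inv = pow(a[col][col], p - 2, p)        # modular inverse (p is prime)
            for r in range(col + 1, n):
                factor = a[r][col] * inv % p
                for c in range(col, n):
                    a[r][c] = (a[r][c] - factor * a[col][c]) % p
        return det % p

    def probably_has_perfect_matching(n, edges, trials=30):
        """Replace each edge entry x_ij by a random field element and test det != 0.
        A nonzero determinant certifies a perfect matching; answering 'no matching'
        after all trials is wrong with probability at most (n/P)^trials."""
        for _ in range(trials):
            a = [[0] * n for _ in range(n)]
            for i, j in edges:                      # i indexes U, j indexes V (0-based)
                a[i][j] = random.randrange(1, P)
            if det_mod_p(a) != 0:
                return True
        return False

    # A 3x3 example that has the perfect matching (0,0), (1,1), (2,2).
    print(probably_has_perfect_matching(3, [(0, 0), (0, 1), (1, 1), (2, 2)]))   # True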

  18. Randomized algorithms are faster • A famous problem: finding the median of an unsorted set of numbers. Assume that the numbers are pairwise distinct and n is odd. • The median is the element that would be in the middle if we sorted the array. But we want to find it without sorting. • There is a randomized algorithm with expected running time about 2n comparisons. • Dorit Dor, Michael Tarsi: 2n is not possible deterministically; lower bound (2 + ε)·n. • 3n is easy, and even (3 − ε)·n is possible deterministically.
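
For flavor, a randomized selection sketch in Python (editor's addition): this is classical quickselect with expected linear time, not the specific 2n-comparison algorithm the slide refers to.

    import random

    def quickselect(items, k):
        """Return the k-th smallest element (0-based) in expected linear time.
        Assumes the items are pairwise distinct."""
        pivot = random.choice(items)
        smaller = [x for x in items if x < pivot]
        larger = [x for x in items if x > pivot]
        if k < len(smaller):
            return quickselect(smaller, k)
        if k == len(smaller):
            return pivot
        return quickselect(larger, k - len(smaller) - 1)

    def median(items):
        return quickselect(list(items), len(items) // 2)    # n assumed odd

    print(median([7, 1, 9, 3, 5]))   # 5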

  19. Probability is used for things not related to probability at all • Let G(V,E) be a graph. A set U is an independent set if for all u, v ∈ U, (u,v) ∉ E. • [Figure: a small example graph on the vertices A, B, C, D, E, F, G, H.] • For example, {A, D, C} and {B, F, E} are independent sets in that graph.

  20. Finding an independent set of size n/(d+1), where d is the average degree of the graph • Randomly order all the vertices on a line. This sample space has n! points. • We say that v is good if none of its neighbors appears after v. • The set of good vertices is independent. • What is the probability that v is good? • Clearly 1/(dv + 1): v must come last among itself and its dv neighbors.

  21. Proof continued • The sum Σ_v 1/(dv + 1) is minimized, for a given average degree d, when all the dv are equal (convexity). • Define xv = 1 if v ∈ S (the set of good vertices) and xv = 0 otherwise. • |S| = Σ xv, since xv is 1 exactly when v is in S. • E(|S|) = Σ E(xv) = Σ_v 1/(dv + 1) ≥ n/(d + 1). • This is called the probabilistic method. • It contains strong tools like the Lovász Local Lemma, martingales, and more.
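
The argument in slides 20-21 is already an algorithm; here is a Python sketch of it (editor's addition, with my own function and variable names):

    import random

    def random_independent_set(vertices, edges):
        """Order the vertices uniformly at random and keep every 'good' vertex,
        i.e. one none of whose neighbors appears after it.  The result is
        independent, and its expected size is sum_v 1/(d_v + 1) >= n/(d + 1)."""
        neighbors = {v: set() for v in vertices}
        for u, v in edges:
            neighbors[u].add(v)
            neighbors[v].add(u)
        order = list(vertices)
        random.shuffle(order)
        position = {v: i for i, v in enumerate(order)}
        return {v for v in vertices
                if all(position[u] < position[v] for u in neighbors[v])}

    print(random_independent_set("ABCD", [("A", "B"), ("B", "C"), ("C", "D")]))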

  22. Using randomization: huge improvements • In online algorithms: an exponential gap for the caching (paging) problem. • Distributed algorithms: an exponential gap for Byzantine agreement. • Exponential gaps for fast routing. • There are subexponential randomized pivot rules for the simplex algorithm (but no deterministic ones are known). • Cryptography is mostly based on randomization.

  23. On the philosophy of proofs • Say that two parties communicate by sending messages. X wants to send Y a proof, or evidence, that something holds. • For example, to show that a graph G is not an expander, X can send just a subset of the vertices (one with a small boundary). • Can we convince Y that two given graphs are not isomorphic (they are isomorphic if some permutation of one gives the other)? • It becomes possible if Y is allowed randomization.

  24. Like convincing Y that X is not (red/green) color blind • The verifier prepares a slide: she draws a circle and fills it with red on one side. • On the other side she draws the same circle, with the same diameter and the same center, but fills it with green. • The verifier chooses 0 or 1 at random, many times. If 0, she shows the red side; if 1, the green side. • If the prover gets it right in 300 trials, she probably is not color blind…
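
A toy simulation of this protocol in Python (editor's addition): a prover who sees colors always names the side shown; a color-blind prover can only guess.

    import random

    def run_protocol(prover_sees_color, rounds=300):
        """The verifier shows a random side each round; the prover names it.
        A guessing prover survives all rounds with probability only (1/2)^rounds."""
        for _ in range(rounds):
            side = random.randint(0, 1)                           # 0 = red, 1 = green
            answer = side if prover_sees_color else random.randint(0, 1)
            if answer != side:
                return False                                      # caught: cannot tell the colors apart
        return True                                               # probably not color blind

    print(run_protocol(True), run_protocol(False))                # True False (almost surely)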

  25. The PCP theorem: the AMAZING POWER OF PROBABILITY • A prover sends (in a certain special form) a proof that some input x belongs to an NP-complete language L. • The verifier looks only at a CONSTANT number of randomly chosen bits of the proof!! • It uses only O(log n) random bits. • If x ∈ L, the verifier says yes with probability 1. • If x ∉ L, the verifier claims that x ∈ L with probability at most 1/2. • So if we have good proofs, written in a specific way, we don't even have to read them all to check that they are correct. Food for thought.

  26. The interesting behavior of random walks • A random walk on a line: the vertices are 0, 1, 2, …, n. • If the walk is at 0 it moves to 1. Otherwise, from i it moves to i+1 or to i−1, each with probability 1/2. • What is the expected number of steps to reach n?

  27. The expected time function • Let T(i) be the expected number of steps to reach n starting from i. • T(n) = 0 • T(i) = 1 + (T(i+1) + T(i−1))/2 for 0 < i < n • T(0) = 1 + T(1) • Adding all the equations gives T(n−1) = 2n − 1. • From that we get T(n−2) = 4n − 4, and in general T(i) = 2(n−i)·n − (n−i)^2 = n^2 − i^2. • T(0) = n^2.
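
A quick simulation checking T(0) = n^2 (editor's sketch in Python):

    import random

    def steps_to_reach(n):
        """Walk from 0: forced to 1 at the left end, otherwise +1 / -1 with prob 1/2."""
        position, steps = 0, 0
        while position < n:
            position += 1 if position == 0 or random.random() < 0.5 else -1
            steps += 1
        return steps

    n = 30
    average = sum(steps_to_reach(n) for _ in range(2000)) / 2000
    print(average, n * n)    # the average should be close to 900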

  28. The randomized algorithm for 2-SAT • 2-SAT: start with an arbitrary assignment. • Let C be an unsatisfied clause. Choose one of the two literals of C at random and flip its value. • We know that if C's variables are x1 and x2, then any fixed satisfying assignment (OPT) disagrees with our current assignment on x1 or on x2 (otherwise C would be satisfied). • Distance to OPT: with probability at least 1/2 it shrinks by 1, and with probability at most 1/2 it grows by 1 (worst case). • Thus, as in the random walk on a line, E(number of flips) ≤ n^2.
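
A Python sketch of this random-walk algorithm (editor's addition; the clause encoding, a positive integer v for x_v and −v for its negation, is my choice):

    import random

    def random_walk_2sat(num_vars, clauses, max_steps=None):
        """Start from a random assignment; while some clause is unsatisfied, pick an
        unsatisfied clause and flip one of its two literals at random.  If the formula
        is satisfiable, the expected number of flips is at most num_vars^2, so 2*n^2
        steps succeed with probability at least 1/2 (Markov)."""
        if max_steps is None:
            max_steps = 2 * num_vars * num_vars
        assign = {v: random.random() < 0.5 for v in range(1, num_vars + 1)}
        satisfied = lambda lit: assign[abs(lit)] == (lit > 0)
        for _ in range(max_steps):
            unsatisfied = [c for c in clauses if not (satisfied(c[0]) or satisfied(c[1]))]
            if not unsatisfied:
                return assign                               # a satisfying assignment
            lit = random.choice(random.choice(unsatisfied)) # pick a clause, then a literal
            assign[abs(lit)] = not assign[abs(lit)]         # flip that variable
        return None                                         # gave up (or unsatisfiable)

    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
    print(random_walk_2sat(3, [(1, 2), (-1, 3), (-2, -3)]))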

  29. P/poly • The expected number of steps for a random walk to cover a general graph is less than 2n^3. • Say that we have only O(log n) memory. How can we tell whether s and t are in the same connected component? • Do a random walk of length 4n^3 from s. If we find t then the answer is surely correct. If t and s are in the same connected component, Pr ≥ 1/2 of a correct answer. • If we do 2n^4 steps, Pr ≥ 1 − 1/n of a correct answer.
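
A log-space-flavored Python sketch of this random-walk test (editor's addition; the adjacency-list interface is my choice; note that only the current vertex and a step counter need to be remembered):

    import random

    def probably_connected(adj, s, t, length=None):
        """Walk at random from s for about 4n^3 steps and report whether t was reached.
        'True' is always correct; 'False' is wrong with probability at most 1/2 when
        s and t really are in the same connected component."""
        n = len(adj)
        if length is None:
            length = 4 * n ** 3
        v = s
        for _ in range(length):
            if v == t:
                return True
            v = random.choice(adj[v])        # adj[v] is assumed non-empty
        return v == t

    adj = [[1], [0, 2], [1], [4], [3]]       # two components: {0, 1, 2} and {3, 4}
    print(probably_connected(adj, 0, 2), probably_connected(adj, 0, 4))   # True False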

  30. A universal sequence • For simplicity, let us assume that the graph is d-regular. • In a d-regular graph, each vertex can label its incident edges 1, 2, …, d. • A universal sequence: a sequence of numbers, each belonging to {1, 2, …, d}, such that for every labeled d-regular graph, following the sequence covers the graph. • The probabilistic method gives a universal sequence of length O(n^3·d·log n).

  31. What is P/Poly? • In NP-complete problems, given X ∈ L we get an advice (a witness) f(X), and then we can check quickly that indeed X ∈ L. • The 'finding Woody Allen in a huge party' explanation. • P/Poly is quite different: the same advice is given for all instances of size n, and with it we should be able to solve every instance. Much stronger. • NP is not contained in P/Poly unless the polynomial hierarchy collapses.

  32. The 's and t are in the same connected component' problem: surprise! • What happens if we don't want P/Poly and don't allow randomization? • The new sensation, Omer Reingold from the Weizmann Institute: the problem of s,t connectivity can be solved in DETERMINISTIC O(log n) space. • Like Harry Hoo said in 'Get Smart': AMAZING.
