
Hashing Out Random Graphs



  1. Hashing Out Random Graphs • Nick Jones • Sean Porter • Erik Weyers • Andy Schieber • Jon Kroening

  2. Introduction • We will look at some applications of probability in computer science: first hash functions, and then random graphs.

  3. Hash Functions • We are going to map a set of n records, denoted r1, r2, …, rn, into m locations, m > n, with at most one record stored in each location. • A hashing function is a function that maps the record values into the m locations.

  4. We use a sequence of hash functions, denoted h1, h2, h3, …, to map the records ri into the m locations. • The records are placed sequentially as indicated below: • h1(r1) = m1 • r2 tries h1(r2), h2(r2), h3(r2), … until an empty location is found, and likewise for each later record.

  5. Every time we are unsuccessful in placing a record (because the chosen location is already occupied), a collision occurs. • We will let the random variable X denote the total number of collisions that occur when placing the n records. • We would like to find E[X] and Var(X).

  6. These values are hard to compute directly, but we can derive a formula for each of them. • In order to do this we need to define some auxiliary random variables.

  7. Yk = # of collisions in placing rk, so that X = Y1 + Y2 + … + Yn. • When rk is placed, k − 1 locations are already occupied, so each attempt lands in an empty location with probability p = (m − k + 1)/m. • Therefore Zk = Yk + 1, the number of attempts needed to place rk, is geometric with p = (m − k + 1)/m.

  8. We can then find E[Zk]. • Since Zk is geometric with p = (m − k + 1)/m, E[Zk] = 1/p = m/(m − k + 1). • Hence E[Yk] = E[Zk] − 1 = (k − 1)/(m − k + 1), and E[X] = Σk=1..n (k − 1)/(m − k + 1).

  9. We would also like to find Var(X). • The Yk are independent, so Var(X) = Σk=1..n Var(Yk) = Σk=1..n (k − 1)m/(m − k + 1)².

  10. We now have formulas for both E[X] and Var(X).
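These formulas are easy to sanity-check by simulation. The sketch below (Python is assumed, since the slides name no language, and the helper names are ours) places n = 5 records into m = 10 locations, treating each hash value as uniform over the m locations and independent across attempts, and compares the average collision count with E[X] = Σ (k − 1)/(m − k + 1):

```python
import random

def place_records(n, m, rng):
    """Place n records into m slots by repeated uniform hashing;
    return the total number of collisions."""
    occupied = set()
    collisions = 0
    for _ in range(n):
        while True:
            slot = rng.randrange(m)  # next hash value for this record
            if slot in occupied:
                collisions += 1      # collision: try the next hash
            else:
                occupied.add(slot)
                break
    return collisions

def expected_collisions(n, m):
    """E[X] = sum over k of (k-1)/(m-k+1)."""
    return sum((k - 1) / (m - k + 1) for k in range(1, n + 1))

rng = random.Random(1)
trials = 20000
avg = sum(place_records(5, 10, rng) for _ in range(trials)) / trials
print(avg, expected_collisions(5, 10))
```

With n = 5 and m = 10 the formula gives 1/9 + 2/8 + 3/7 + 4/6 ≈ 1.456, and the empirical average should land close to it.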

  11. Alfréd Rényi • March 20, 1921 – February 1, 1970 • Died at age 48

  12. The Hungarian mathematician spent six months in hiding after being forced into a Fascist labor camp in 1944 • During that time he rescued his parents from a Budapest prison by dressing up in a soldier's uniform • He earned his Ph.D. at the University of Szeged in Hungary

  13. Rényi worked with Erdös on random graphs, and they published joint work • He also worked on number theory and graph theory, which led him to results about measures of the dependency of random variables

  14. Paul Erdös “A Mathematician is a machine for turning coffee into theorems”

  15. Born: March 26, 1913 • May have been the most prolific mathematician of all time • Wrote or co-authored over 1,475 papers

  16. Erdös was born to two high school math teachers • His mother kept him out of school until his teen years because she feared its influence • At home he did mental arithmetic, and at three he could multiply numbers in his head

  17. Fortified by espresso, Erdös did math for 19 hours a day, 7 days a week • He devoted his life to a single narrow mission: uncovering mathematical truth • He traveled around for six decades with a suitcase, looking for mathematicians to pick his brain • His motto was: “Another roof, another proof”

  18. “Property is a nuisance” • “Erdös posed and solved thorny problems in number theory and other areas and founded the field of discrete mathematics, which is a foundation of computer science” • Awarded his doctorate in 1934 at Pázmány Péter University in Budapest

  19. Graphs • A graph consists of a set V of elements called vertices and a set E of pairs of vertices called edges • A sequence of vertices i, i1, i2, …, ik, j for which (i, i1), (i1, i2), …, (ik, j) ∈ E is called a path from i to j

  20. Connected Graphs • A graph is said to be connected if there is a path between each pair of vertices • If a graph is not connected it is called disconnected

  21. Random Graphs • In a random graph, we start with a set of vertices and put in edges at random, thus creating paths • An interesting question is to find P{graph is connected}, the probability that there is a path between every pair of vertices

  22. James Stirling

  23. Who is James Stirling? • Lived 1692 – 1770. • His family was Roman Catholic in a Protestant England. • The family supported the Jacobite cause. • Matriculated at Balliol College, Oxford. • Believed to have studied and matriculated at two other universities, but this is not certain. • Did not graduate because he refused to take an oath, on account of his Jacobite beliefs. • Spent years studying, traveling, and making friends with people such as Sir Isaac Newton and Nicolaus(I) Bernoulli.

  24. Methodus Differentialis • Stirling became a teacher in London. • There he wrote the book Methodus Differentialis in 1730. • The book's purpose was to speed up the convergence of series. • Stirling's Formula is recorded in this book, in Example 2 of Proposition 28.

  25. Stirling’s Formula • Used to approximate n!: n! ≈ √(2πn)(n/e)^n • It is an asymptotic expansion. • The series does not converge. • Truncations of it can be used to approximate a lower bound for n!. • The percentage error is extremely low. • The bigger the number inserted, the lower the percentage error.

  26. Stirling’s Formula Error • About 8.00% off for 1! • About 0.80% off for 10! • About 0.08% off for 100! • Etc… • The percentage error is close to 1/(12n), so if the formula is multiplied by (1 + 1/(12n)) it only gets better, with errors only at order 1/n².
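The error figures above can be reproduced directly. A minimal sketch (Python assumed; helper name ours) that compares √(2πn)(n/e)^n with the exact factorial:

```python
import math

def stirling(n):
    """Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)**n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (1, 10, 100):
    rel_err = 1 - stirling(n) / math.factorial(n)
    # The relative error tracks 1/(12n): about 8%, 0.8%, 0.08%.
    print(f"{n}!: error {rel_err:.2%} vs 1/(12n) = {1 / (12 * n):.2%}")
```

Stirling's approximation always slightly undershoots n!, which is why the errors come out positive and why multiplying by (1 + 1/(12n)) improves it.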

  27. Probability Background • Normal Distribution and Central Limit theorem • Poisson Distribution • Multinomial Distribution

  28. The Normal Distribution • A continuous random variable X with pdf f(x) = (1/(σ√(2π))) e^−(x − µ)²/(2σ²), −∞ < x < ∞

  29. Normal Distribution

  30. Normal Distribution • Note: when the mean = 0 and the standard deviation = 1, we get the standard normal random variable • Z ~ N(0,1)

  31. Central Limit Theorem • If X1, X2, … are independent, identically distributed with common mean µ and standard deviation σ, then (X1 + … + Xn − nµ)/(σ√n) converges in distribution to N(0,1) as n → ∞
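A small simulation illustrates the theorem. The choice of exponential(1) draws (mean 1, standard deviation 1) is ours, just to start from something visibly non-normal; the standardized sums should behave like a standard normal, with roughly 68% falling within one standard deviation of 0:

```python
import math
import random

# Standardize sums of n i.i.d. exponential(1) draws: (S - n*mu)/(sigma*sqrt(n))
# with mu = sigma = 1.
rng = random.Random(42)
n, num_sums = 100, 5000
zs = [(sum(rng.expovariate(1.0) for _ in range(n)) - n) / math.sqrt(n)
      for _ in range(num_sums)]

# For a standard normal, P(-1 <= Z <= 1) is about 0.683.
within_one_sd = sum(abs(z) <= 1 for z in zs) / num_sums
print(within_one_sd)
```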

  32. Central Limit Theorem

  33. Poisson Distribution • A random variable X is Poisson with parameter λ > 0 if P{X = k} = e^−λ λ^k / k!, k = 0, 1, 2, … • Its mean and variance both equal λ
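As a quick check of the pmf (a sketch in Python; the helper name is ours), the probabilities should sum to 1 and have mean λ:

```python
import math

def poisson_pmf(k, lam):
    """P{X = k} = e**(-lam) * lam**k / k!"""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 2.0
probs = [poisson_pmf(k, lam) for k in range(50)]  # tail past 50 is negligible
total = sum(probs)
mean = sum(k * p for k, p in enumerate(probs))
print(total, mean)  # total ~ 1, mean ~ lam
```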

  34. Multinomial Distribution • n independent, identical trials of events A1, A2, …, Ak with probabilities p1, p2, …, pk • Define Xj = number of times Aj occurs, j = 1…k • Since X1 + X2 + … + Xk = n, we have P{X1 = n1, …, Xk = nk} = (n! / (n1! n2! ⋯ nk!)) p1^n1 p2^n2 ⋯ pk^nk

  35. Multinomial Distribution • where n = n1 + n2 + … + nk
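The multinomial pmf can be evaluated directly from factorials. A minimal sketch (the helper name is ours), using 4 trials with three outcomes as a worked example:

```python
import math

def multinomial_pmf(counts, probs):
    """n!/(n1!*...*nk!) * p1**n1 * ... * pk**nk, with n = sum(counts)."""
    coef = math.factorial(sum(counts))
    for c in counts:
        coef //= math.factorial(c)   # divide out each nj!
    prob = 1.0
    for c, p in zip(counts, probs):
        prob *= p ** c
    return coef * prob

# P{X1=2, X2=1, X3=1} with p = (0.5, 0.3, 0.2):
# coefficient 4!/(2!1!1!) = 12, times 0.5^2 * 0.3 * 0.2 = 0.015
print(multinomial_pmf([2, 1, 1], [0.5, 0.3, 0.2]))  # ~ 0.18
```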

  36. Connected Graphs • Recall: A random graph G consists of vertices V = {1, 2, …, n} and random variables X(i), i = 1, …, n, along with probabilities Pj = P{X(i) = j}, j = 1, …, n, where ΣPj = 1

  37. Connected Graphs • The set of random edges is then {(i, X(i)) : i = 1, …, n}, where (i, X(i)) is the edge emanating from vertex i

  38. Connected Graphs • The probability that a random graph is connected: P{graph is connected} = ? • A special case: suppose vertex 1 is ‘dead’ (doesn’t spawn an edge) • n = 2
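Before the formal derivation, P{graph is connected} can be estimated by Monte Carlo. The sketch below assumes the special case where each X(i) is uniform over {1, …, n} (uniform Pj, self-loops allowed); for n = 2 the graph is connected exactly when X(1) = 2 or X(2) = 1, giving probability 1 − (1/2)(1/2) = 3/4:

```python
import random

def is_connected(n, rng):
    """One draw of the random graph with edges (i, X(i)), X(i) uniform
    on {1,...,n}; checks undirected connectivity via union-find."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for i in range(n):
        parent[find(i)] = find(rng.randrange(n))  # merge i with X(i)
    return len({find(v) for v in range(n)}) == 1

rng = random.Random(7)
trials = 20000
est = sum(is_connected(2, rng) for _ in range(trials)) / trials
print(est)  # should be near 3/4 for n = 2
```

The same estimator works for larger n, which makes it a useful cross-check on the exact formulas derived in the remaining slides.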

  39. Dead Vertex Lemma • Consider a random graph consisting of vertices 0, 1, 2, …, r and edges (i, Yi), i = 1, 2, …, r, where the Yi are independent and P{Yi = j} = Qj, j = 0, 1, …, r

  40. Dead Vertex Lemma

  41. Maximal Non-Self-Intersecting (MNSI) • Consider the maximal non-self-intersecting path emanating from vertex 1: 1, X(1), X(X(1)), …, stopped just before the first repeated vertex

  42. Maximal Non-Self-Intersecting (MNSI) • Define and set

  43. Maximal Non-Self-Intersecting (MNSI) • By applying the Dead Vertex Lemma to the MNSI path,

  44. Conditional Probability

  45. Conditional Probability

  46. Conditional Probability

  47. Conditional Probability

  48. Conditional Probability

  49. Poisson Distribution
