
EEL 4930 §6 / 5930 §5, Spring ‘06 Physical Limits of Computing


Presentation Transcript


  1. http://www.eng.fsu.edu/~mpf EEL 4930 §6 / 5930 §5, Spring ’06: Physical Limits of Computing. Slides for a course taught by Michael P. Frank in the Department of Electrical & Computer Engineering

  2. Physical Limits of Computing: Course Outline • I. Course Introduction: Moore’s Law vs. Modern Physics • II. Foundations: Required Background Material in Computing & Physics • III. Fundamentals: The Deep Relationships between Physics and Computation • IV. Core Principles: The Two Revolutionary Paradigms of Physical Computation • V. Technologies: Present and Future Physical Mechanisms for the Practical Realization of Information Processing • VI. Conclusion • Currently I am working on writing up a set of course notes based on this outline, intended to someday evolve into a textbook. M. Frank, "Physical Limits of Computing"

  3. Part II. Foundations • This first part of the course quickly reviews some key background knowledge that you will need to be familiar with in order to follow the later material. • You may have seen some of this material before. • Part II is divided into two “chapters:” • Chapter II.A. The Theory of Information and Computation • Chapter II.B. Required Physics Background M. Frank, "Physical Limits of Computing"

  4. Chapter II.A. The Theory of Information and Computation • In this chapter of the course, we review a few important things that you need to know about: • §II.A.1. Combinatorics, Probability, & Statistics • §II.A.2. Information & Communication Theory • §II.A.3. The Theory of Computation M. Frank, "Physical Limits of Computing"

  5. Section II.A.1: Basic Elements of Combinatorics, Probability, and Statistics Topics covered in this section: • Basic Combinatorial Laws • Sum and product rules • Rules for counting sequences, permutations, and combinations • Basic Probability Theory • Events, Probabilities, Conditional & Mutual Probabilities • Basic Statistical Quantities • Expected Value, Variance, Standard Deviation M. Frank, "Physical Limits of Computing"

  6. Subsection II.A.1.a: Basic Laws of Combinatorics Sum and Product Rules Rules for Counting Sequences, Permutations, and Combinations

  7. Combinatorics • Combinatorics is the mathematical study of how to quickly count the number of ways to combine entities together in a specified fashion. • In combinatorics, we are always (explicitly or implicitly) counting the cardinality or number of elements |X| in some set X of “possibilities,” where each possibility is a particular way of combining entities together in the designated fashion. • Example problem: How many ways are there to deal out a hand of 5 standard playing cards that are all of the suit “clubs” (♣)? (Order doesn’t matter.) • Mathematically, the problem can be interpreted as saying that we are supposed to find the value of |X|, where X = {hands H | H is a set of 5 different cards all of suit ♣} • We’ll see how to solve this problem shortly. M. Frank, "Physical Limits of Computing"

  8. Sum Rule for Disjoint Unions • Theorem: Sum rule. Suppose each possible arrangement is of one of two distinct kinds, and there are y possibilities of the first kind, and z of the second kind. Then there are x = y + z total arrangements. • Mathematically: Let X = Y ∪ Z and let Y ∩ Z = ∅. Let x = |X|, y = |Y|, z = |Z|. Then x = y + z. • Example: My home movie collection consists entirely of comedies and action movies. I own 3 comedies and 4 action movies. None of my movies are action-comedies. How many movies do I have? • Answer: 3+4 = 7. M. Frank, "Physical Limits of Computing"
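
As a quick sanity check of the sum rule, here is a minimal Python sketch (not part of the original slides; the movie titles are placeholders):

    # Disjoint sets of possibilities simply add (sum rule).
    comedies = {"comedy_1", "comedy_2", "comedy_3"}
    action = {"action_1", "action_2", "action_3", "action_4"}

    assert comedies.isdisjoint(action)  # no action-comedies: Y ∩ Z = ∅
    collection = comedies | action      # X = Y ∪ Z
    assert len(collection) == len(comedies) + len(action) == 7
    print(len(collection))              # 7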

  9. Product Rule for Ordered Pairs • Theorem: Product rule. Suppose there is a one-to-one correspondence between the possible arrangements and ordered pairs of entities of two kinds (possibly the same kind), where there are y entities of the first kind and z of the second kind. (The two kinds of entities do not need to be disjoint.) Then there are x = y·z total arrangements. • Mathematically: Let there be a one-to-one map f: X → Y×Z, where Y×Z = {(a,b) | a ∈ Y, b ∈ Z}. Then |X| = |Y|·|Z|. • Example: A “meal deal” at a certain restaurant consists of a choice of one appetizer and one entrée. The restaurant has 6 different appetizers, 3 entrées, and a bowl of chili which can be served as either an appetizer or an entrée (or as both). How many different “meal deals” could one order? • Answer: (6+1)·(3+1) = 7·4 = 28. Note that the sets Y and Z did not need to be disjoint (unlike in the case with the sum rule). M. Frank, "Physical Limits of Computing"
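
The meal-deal count can be verified by brute-force enumeration. A minimal Python sketch (not from the original slides; the menu-item names are made up, with "chili" deliberately appearing on both lists):

    from itertools import product

    appetizers = {"app_1", "app_2", "app_3", "app_4", "app_5", "app_6", "chili"}
    entrees = {"entree_1", "entree_2", "entree_3", "chili"}

    # Every ordered pair (appetizer, entrée) is a distinct meal deal,
    # even though the underlying sets overlap.
    meal_deals = set(product(appetizers, entrees))
    assert len(meal_deals) == len(appetizers) * len(entrees) == 28
    print(len(meal_deals))  # 28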

  10. Exponential Rule for Sequences This will lead to the logarithmic measure of information & entropy… • Theorem: Exponential rule. Suppose the arrangements correspond to sequences of n items, where any of y items could go at each position in the sequence. (Repetition of items is allowed, and the order of items matters.) Then there are x = yⁿ possible arrangements. • Mathematically: |{(a₁, a₂, …, aₙ) | ∀i: aᵢ ∈ Y}| = |Y|ⁿ. • Proof: By repeated application of product rule. • Example: How many different 4-digit PIN numbers are there? • Answer: 10⁴ = 10,000. M. Frank, "Physical Limits of Computing"
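
A similar brute-force check of the exponential rule on the PIN example (illustrative Python, not from the slides):

    from itertools import product

    digits = "0123456789"
    # All length-4 sequences over 10 symbols; repetition allowed, order matters.
    pins = list(product(digits, repeat=4))
    assert len(pins) == 10**4  # the exponential rule: y^n arrangements
    print(len(pins))           # 10000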

  11. Rule for Permutations • Definition: A k-permutation of a set Y is a sequence of k elements of Y in which no element appears more than once. • Theorem: Permutation rule. If |Y| = y, then there are P(y,k) = y!/(y−k)! k-permutations of the set Y. • Proof: Using the product rule on a sequence of items from sets of size y, y−1, …, down to y−k+1. • Example: A railroad yard has 20 different cars in it. How many ways are there to assemble a train of 5 cars to take away? (If the order of the cars matters.) • Answer: 20!/(20−5)! = 20·19·18·17·16 = 1,860,480. M. Frank, "Physical Limits of Computing"
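
The train example is small enough to enumerate directly. An illustrative Python check (about 1.9 million sequences, so it takes a moment to run):

    from itertools import permutations
    from math import factorial

    # Every ordered sequence of 5 distinct cars drawn from 20.
    trains = sum(1 for _ in permutations(range(20), 5))
    assert trains == factorial(20) // factorial(20 - 5) == 1_860_480
    print(trains)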

  12. Rule for Combinations • Definition: A k-combination of a set Y is a subset consisting of k elements of Y. • Theorem: Combination rule. If |Y| = y, then there are C(y,k) = P(y,k)/k! = y!/[k!(y−k)!] k-combinations of Y. • Proof: The set of all k-permutations can be partitioned into disjoint subsets, each consisting of the k! different k-permutations of each k-combination. • Example: In the previous example, what if the order of cars in the train does not matter? • Answer: 20!/(15!·5!) = 15,504. • Example: How many hands of 5 clubs are there? • Answer: C(13,5) = 13!/(8!·5!) = 1,287. M. Frank, "Physical Limits of Computing"
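
An illustrative Python check of both answers, including the 5-club-hand problem posed on slide 7 (math.comb requires Python 3.8+):

    from itertools import combinations
    from math import comb, factorial

    # Unordered trains of 5 cars chosen from 20:
    assert comb(20, 5) == factorial(20) // (factorial(15) * factorial(5)) == 15_504

    # Hands of 5 clubs: choose 5 of the 13 club cards, order ignored.
    ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
    clubs = [rank + "♣" for rank in ranks]
    hands = list(combinations(clubs, 5))
    assert len(hands) == comb(13, 5) == 1_287
    print(len(hands))  # 1287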

  13. Subsection II.A.1.b: Basic Probability Theory Events, Probabilities, Conditional and Mutual Probabilities

  14. Events & Probabilities • In statistics, an event E is any possible situation (occurrence, state of affairs) that might or might not be the actual situation. • The proposition P = “the event E occurred” (or will occur) could turn out to be either true or false. • The probability of an event E is a real number p in the range [0,1] which gives our degree of belief in the truth of proposition P, i.e., the proposition that E will/did occur, where • The value p = 0 means that P is false with complete certainty, and • The value p = 1 means that P is true with complete certainty, • The value p = ½ means that the truth value of P is completely unknown • That is, as far as we know, it is equally likely to be either true or false. • The probability p(E) is also the fraction of times that we would expect the event E to occur in a repeated experiment. • That is, on average, if the experiment could be repeated infinitely often, and if each repetition was independent of the others. • If the probability of E is p, then we would expect E to occur once for every 1/p independent repetitions of the experiment, on average. • We’ll call 1/p the improbability i of E, and write i(E) = 1/p(E). M. Frank, "Physical Limits of Computing"
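
A Monte Carlo illustration of the frequency interpretation (a hypothetical Python experiment, not from the slides; the event E is "a fair die shows a six", so p(E) = 1/6 and i(E) = 6):

    import random

    random.seed(0)
    trials = 100_000
    hits = sum(random.randrange(6) == 0 for _ in range(trials))

    freq = hits / trials     # observed fraction of trials where E occurred
    print(freq)              # close to p(E) ≈ 0.1667
    print(1 / freq)          # observed improbability i(E), close to 6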

  15. Joint Probability • Let X and Y be events, and let XY denote the event that events X and Y both occur together (that is, “jointly”). • Then p(XY) is called the joint probability of X and Y. • Product rule: If X and Y are independent events, then p(XY) = p(X) · p(Y). • This follows from basic combinatorics. • It can also be considered a definition of what it means for X and Y to be independent. M. Frank, "Physical Limits of Computing"
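
A simulation sketch of the product rule for independent events (illustrative Python; X is "a fair coin lands heads" and Y is "an independently rolled die shows a six"):

    import random

    random.seed(1)
    trials = 200_000
    n_x = n_y = n_xy = 0
    for _ in range(trials):
        x = random.random() < 0.5      # event X
        y = random.randrange(6) == 0   # event Y, independent of X
        n_x += x
        n_y += y
        n_xy += x and y

    # Joint frequency ≈ product of marginal frequencies, both ≈ 1/12.
    print(n_xy / trials, (n_x / trials) * (n_y / trials))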

  16. Event Complements, Mutual Exclusivity, Exhaustiveness • For any event E, its complement ~E is the event that event E does not occur. • Complement rule: p(E) + p(~E) = 1. • Two events E and F are called mutually exclusive if it is impossible for E and F to occur together. • That is, p(EF) = 0. • Note that E and ~E are always mutually exclusive. • A set S = {E₁, E₂, …} of events is exhaustive if the event that some event in S occurs has probability 1. • Note that S = {E, ~E} is always an exhaustive set. • Theorem: The sum of the probabilities of any exhaustive set S of mutually exclusive events is 1. M. Frank, "Physical Limits of Computing"
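
A small exact check of the complement rule and the closing theorem, taking the six faces of a fair die as the exhaustive set of mutually exclusive events (illustrative Python):

    from fractions import Fraction

    # The six faces are mutually exclusive and exhaustive.
    faces = {face: Fraction(1, 6) for face in range(1, 7)}
    assert sum(faces.values()) == 1   # probabilities sum to exactly 1

    # Complement rule for E = "rolled a six": p(E) + p(~E) = 1.
    p_six = faces[6]
    p_not_six = sum(prob for face, prob in faces.items() if face != 6)
    assert p_six + p_not_six == 1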

  17. Conditional Probability • Let XY be the event that X and Y occur jointly. • Then the conditional probability of X given Y is defined by p(X|Y) :≡ p(XY) / p(Y). • It is the probability that if we are given that Y occurs, then X would also occur. • Bayes’ rule: p(X|Y) = p(X) · p(Y|X) / p(Y). [Figure: Venn diagram showing the space of possible outcomes, with event X, event Y, and their overlap, the event XY, labeled with the mutual probability ratio r(XY).] M. Frank, "Physical Limits of Computing"
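
A worked check of the definition and of Bayes’ rule on a single fair die roll (illustrative Python; the events X = "roll is even" and Y = "roll is greater than 3" are chosen arbitrarily):

    from fractions import Fraction

    outcomes = range(1, 7)   # one roll of a fair die

    def p(event):
        return Fraction(sum(1 for o in outcomes if event(o)), 6)

    X = lambda o: o % 2 == 0
    Y = lambda o: o > 3
    XY = lambda o: X(o) and Y(o)

    assert p(XY) / p(Y) == Fraction(2, 3)   # p(X|Y) :≡ p(XY)/p(Y)
    # Bayes' rule: p(X|Y) = p(X)·p(Y|X)/p(Y)
    assert p(XY) / p(Y) == p(X) * (p(XY) / p(X)) / p(Y)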

  18. Mutual Probability Ratio • The mutual probability ratio of X and Y is defined as r(XY) :≡ p(XY)/[p(X)p(Y)]. • Note that r(XY) = p(X|Y)/p(X) = p(Y|X)/p(Y). • I.e., r is the factor by which the probability of either X or Y gets boosted upon learning that the other event occurs. • WARNING: Many authors define the term “mutual probability” to be the reciprocal of our quantity r. • Don’t get confused! I call that “mutual improbability ratio.” • Note that for independent events, r = 1. • Whereas for dependent, positively correlated events, r > 1. • And for dependent, anti-correlated events, r < 1. M. Frank, "Physical Limits of Computing"
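
Continuing the die-roll sketch, r can be computed exactly for a positively correlated pair and an anti-correlated pair of events (illustrative Python; the event choices are arbitrary):

    from fractions import Fraction

    outcomes = range(1, 7)

    def p(event):
        return Fraction(sum(1 for o in outcomes if event(o)), 6)

    def r(a, b):   # mutual probability ratio r(ab) = p(ab)/[p(a)p(b)]
        return p(lambda o: a(o) and b(o)) / (p(a) * p(b))

    even = lambda o: o % 2 == 0
    high = lambda o: o > 3
    odd = lambda o: o % 2 == 1

    print(r(even, high))   # 4/3 > 1: positively correlated
    print(r(even, odd))    # 0 < 1: anti-correlated (here, mutually exclusive)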

  19. Subsection II.A.1.c: Basic Statistical Quantities Norm, Variance, Standard Deviation

  20. Expectation Values • Let S be an exhaustive set of mutually exclusive events Eᵢ. • This is sometimes known as a “sample space.” • Let f(Eᵢ) be any function of the events in S. • This is sometimes called a “random variable.” • The expectation value or “expected value” or norm of f, written Ex[f] or even just ⟨f⟩, is just the mean or average value of f(Eᵢ), as weighted by the probability of the event Eᵢ. • WARNING: The “expected value” may actually be quite unexpected, or even impossible to occur! • It’s not the ordinary English meaning of the word “expected.” • Expected values combine linearly: ⟨af + g⟩ = a⟨f⟩ + ⟨g⟩. M. Frank, "Physical Limits of Computing"
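
A minimal sketch of the definition and its linearity, taking the faces of a fair die as the sample space and the face value as the random variable (illustrative Python, not from the slides):

    from fractions import Fraction

    probs = {face: Fraction(1, 6) for face in range(1, 7)}   # sample space

    def ex(f):   # expectation value ⟨f⟩, weighted by event probabilities
        return sum(probs[e] * f(e) for e in probs)

    f = lambda e: e       # face value
    g = lambda e: e * e
    print(ex(f))          # 7/2: the "expected" value, impossible on any single roll

    a = 3
    assert ex(lambda e: a * f(e) + g(e)) == a * ex(f) + ex(g)   # ⟨af+g⟩ = a⟨f⟩+⟨g⟩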

  21. Variance & Standard Deviation • The variance of a random variable f is σ²(f) = ⟨(f − ⟨f⟩)²⟩. • The expected value of the squared deviation of f from the norm. (The squaring makes it positive.) • The standard deviation or root-mean-square (RMS) difference of f is σ(f) = [σ²(f)]^(1/2). • This is comparable, in absolute magnitude, to a typical value of f − ⟨f⟩. M. Frank, "Physical Limits of Computing"
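
Continuing the same example, a sketch computing the variance and standard deviation of the face value (illustrative Python):

    from fractions import Fraction
    from math import sqrt

    probs = {face: Fraction(1, 6) for face in range(1, 7)}

    def ex(f):
        return sum(probs[e] * f(e) for e in probs)

    f = lambda e: e
    mean = ex(f)                              # ⟨f⟩ = 7/2
    var = ex(lambda e: (f(e) - mean) ** 2)    # σ²(f) = ⟨(f − ⟨f⟩)²⟩ = 35/12
    print(var, sqrt(var))                     # σ(f) ≈ 1.71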
