
Function Computation in Simple Broadcast Networks


Presentation Transcript


  1. Function Computation in Simple Broadcast Networks Presented by Lei Ying

  2. Model • Network Model • Broadcast network: a transmission by one node can be received by every other node. • Noisy channels: the channel between any pair of nodes is a binary symmetric channel with error probability p. • There are N nodes in the network. [Figure: data nodes (sensors) reporting to a receiver (fusion center).]

  3. Model • Data Model • Each node has one bit bi --- "0" or "1." • The goal of the network is to compute some function of the nodes' data. (1) Parity Computation: (Σi bi) mod 2 (Gallager'88) (2) Threshold Detection: Σi bi ≥ T (Kushilevitz & Mansour'98)
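The two target functions are simple to state in code. A toy example with hypothetical bit values and threshold (no channel noise yet):

```python
bits = [1, 0, 1, 1, 0, 1]            # hypothetical one-bit readings, one per node
parity = sum(bits) % 2               # (sum_i b_i) mod 2
threshold_reached = sum(bits) >= 3   # is sum_i b_i >= T, for T = 3?
print(parity, threshold_reached)     # -> 0 True
```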

  4. Goal • Design algorithms such that lim_{N→∞} Pr(correct parity computation (threshold detection)) > 1−ε. • Goal: Find out how many transmissions we need (one transmission means transmission of one bit). • Trivial lower bound: Each node has to transmit its bit at least once, so it requires N transmissions. • Trivial upper bound: O(N ln(N)). Notations: f(n)=O(g(n)) means f(n) ≤ c·g(n) for n ≥ m; f(n)=Ω(g(n)) means f(n) ≥ c·g(n) for n ≥ m; f(n)=Θ(g(n)) means f(n)=O(g(n)) and f(n)=Ω(g(n)).

  5. Lower and Upper Bounds • Lower bound: Since the channels are noisy, N transmissions are not sufficient. • Upper bound: Lemma 1: Suppose one bit of data is transmitted m times over a binary symmetric channel with error probability p, and the receiver decodes the bit using the majority rule. Then the probability of decoding error is no greater than (4p(1−p))^(m/2).
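A quick Monte Carlo check of Lemma 1, with illustrative values p = 0.1 and m = 11 (both chosen here, not taken from the slides):

```python
import random

def majority_decode_error(p, m, trials=20000, seed=1):
    """Empirical error rate when one bit is sent m times over a BSC(p)
    and the receiver takes a majority vote (m odd, so no ties)."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(m))  # copies received flipped
        if 2 * flips > m:                                # majority of copies wrong
            errors += 1
    return errors / trials

p, m = 0.1, 11
print(majority_decode_error(p, m), "<=", (4 * p * (1 - p)) ** (m / 2))
```

The empirical error rate (around 3e-4 here) stays well below the bound (4p(1−p))^(m/2) ≈ 3.6e-3.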

  6. Upper Bounds Proof: Define m independent binary random variables {Ii}, where Ii=0 (copy i received incorrectly) with probability p and Ii=1 with probability 1−p. The majority rule errs only if Σi Ii ≤ m/2, so using Chernoff's bound, we have Pr(error) ≤ Pr(Σi Ii ≤ m/2) ≤ (4p(1−p))^(m/2).

  7. Upper Bounds Upper Bound: If one node broadcasts its bit a·ln(N) times, then the fusion center can decode the bit correctly with probability 1−(4p(1−p))^((a/2)ln(N)). There are N nodes, so using the union bound, the probability that the fusion center obtains all bits correctly is at least 1−N·(4p(1−p))^((a/2)ln(N)). Choose a large enough that (4p(1−p))^(a/2) < e^(−2); then 1−N·(4p(1−p))^((a/2)ln(N)) > 1−N/N² → 1.
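The constant a can be computed directly from the condition (4p(1−p))^(a/2) < e^(−2). A small sketch (the helper name and the sample values of p are illustrative):

```python
import math

def repetitions_per_bit(p):
    """Smallest integer a with (4p(1-p))**(a/2) < e**-2; broadcasting each
    bit a*ln(N) times then drives the per-bit error below N**-2."""
    return math.floor(4 / -math.log(4 * p * (1 - p))) + 1

for p in (0.01, 0.1, 0.2):
    a = repetitions_per_bit(p)
    assert (4 * p * (1 - p)) ** (a / 2) < math.exp(-2)
    print(p, a)
```

Noisier channels (larger p) need more repetitions per bit, but a stays a constant independent of N.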

  8. Upper Bound So O(N ln(N)) transmissions are enough. Can the number of transmissions be further reduced? Yes! Difficulty: each node has only one bit, so block coding cannot be used. Idea: Note that this is a broadcast network: when one node transmits, all other nodes can hear the transmission. So the other nodes collectively can make a good decision.

  9. Main Results Parity computation: [Gallager'88] develops an algorithm that requires O(N lnln(N)) transmissions. Threshold detection: [Kushilevitz & Mansour'98] provides an algorithm that requires Θ(N) transmissions.

  10. Parity Computation • Algorithm: (1) Partition the N nodes into subsets such that each subset has a1·ln(N) nodes. (2) Each node broadcasts its bit a2·lnln(N) times. (3) Each node computes the parity of its subset, and broadcasts the parity once. (4) The fusion center decodes the parity of each subset, and then obtains the parity of the network. • Number of transmissions: a2·N·lnln(N)+N.

  11. Parity Computation (Algorithm as on slide 10.) • Error Probability: (1) Choose a2 so that (4p(1−p))^(a2/2) < e^(−2). Then each node obtains the parity of its subset with error probability less than a1·ln(N)·(4p(1−p))^((a2/2)lnln(N)) < a1/ln(N).

  12. Parity Computation (Algorithm as on slide 10.) • Error Probability: (1) Each node obtains the parity of its subset with error probability less than a1·ln(N)·(4p(1−p))^((a2/2)lnln(N)) < a1/ln(N). (2) For each subset, the fusion center receives the parity of that subset from a1·ln(N) nodes, and the error probability of each received bit is less than (1−a1/ln(N))·p + (1−p)·a1/ln(N) = p+δ, where δ = (1−2p)·a1/ln(N). Note that δ can be made arbitrarily small as N goes to infinity.

  13. Parity Computation • Error Probability: (2) The fusion center receives the parity of each subset from a1·ln(N) nodes, and the error probability of each received bit is less than p+δ. (3) The parity of each subset can therefore be decoded (by majority) with error probability less than (4(p+δ)(1−p−δ))^((a1/2)ln(N)). There are N/(a1·ln(N)) subsets, so by the union bound the overall error probability is at most N/(a1·ln(N))·(4(p+δ)(1−p−δ))^((a1/2)ln(N)). Choose a1 so that (4(p+δ)(1−p−δ))^(a1/2) ≤ e^(−2); then the fusion center obtains the correct parity with probability at least 1 − 1/(a1·N·ln(N)).

  14. Threshold Detection • Goal: Every node (not only the fusion center) wants to know whether Σi bi ≥ T (Kushilevitz & Mansour'98). • Algorithm: (1) Each node broadcasts its bit M times. (2) Each node i thus receives MN bits; let Ai be their sum, and set γi=1 if Ai > f(T) and γi=0 otherwise. Transmit γi once. (3) Let gi be the majority value of the γ-bits received; transmit gi once. (4) Let Fi be the majority value of the g-bits received. Node i decides the threshold is reached if Fi=1, and is not reached if Fi=0.
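The four steps can again be sketched as a simulation; the parameter values (N = 40, M = 120, p = 0.1) and the function name are illustrative choices:

```python
import random

def threshold_detect(bits, T, p, M, seed=0):
    """Monte Carlo sketch of the four-step protocol: M noisy broadcasts per
    node, a threshold test of A_i against f(T), then two one-bit
    majority-voting rounds to align all nodes on the decision."""
    rng = random.Random(seed)
    N = len(bits)
    bsc = lambda b: b ^ (rng.random() < p)          # one noisy channel use
    M2 = M * (1 - 2 * p) / 2
    fT = N * M * p + 2 * M2 * T - M2                # decision threshold on A_i
    # Steps (1)-(2): each node sums its M*N received bits and thresholds.
    gamma = [int(sum(bsc(b) for b in bits for _ in range(M)) > fT)
             for _ in range(N)]
    # Step (3): majority of the received gamma-bits, broadcast once more.
    g = [int(2 * sum(bsc(x) for x in gamma) > N) for _ in range(N)]
    # Step (4): majority of the received g-bits is the final decision F_i.
    return [int(2 * sum(bsc(x) for x in g) > N) for _ in range(N)]

bits = [1] * 20 + [0] * 20                          # L = 20 ones among N = 40
print(threshold_detect(bits, T=10, p=0.1, M=120))   # all nodes output 1
```

The total cost is MN + N + N = (M+2)N transmissions, i.e. linear in N for fixed M.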

  15. Threshold Detection • f(T)=? • Lemma 1: Suppose Σi bi = L. Then (1) {Ai} are i.i.d.; (2) E[Ai] = NMp + 2M2·L, where M2 = M(1−2p)/2. Proof: E[Ai] = M(L(1−p)+(N−L)p) = M(L(1−2p)+Np) = NMp + 2M2·L.

  16. Threshold Detection • Consider the worst case: Σi bi = T or Σi bi = T−1. • Choose f(T) = NMp + 2M2·T − M2. [Figure: E[Ai] on a number line — L=T−2: NMp+2M2·T−4M2; L=T−1: NMp+2M2·T−2M2; L=T: NMp+2M2·T; L=T+1: NMp+2M2·T+2M2; the threshold f(T)=NMp+2M2·T−M2 lies midway between the L=T−1 and L=T values.]
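A quick numeric check (with hypothetical values of N, M, p, T) that this choice of f(T) sits exactly halfway between E[Ai] for L = T−1 and L = T:

```python
# Hypothetical parameter values: N nodes, M repetitions, BSC(p), threshold T.
N, M, p, T = 100, 50, 0.1, 30
M2 = M * (1 - 2 * p) / 2

def mean_A(L):
    """E[A_i] when exactly L nodes hold a 1 (Lemma 1 on slide 15)."""
    return M * (L * (1 - p) + (N - L) * p)

fT = N * M * p + 2 * M2 * T - M2
midpoint = (mean_A(T - 1) + mean_A(T)) / 2
print(mean_A(T - 1), fT, mean_A(T))
assert abs(midpoint - fT) < 1e-9   # f(T) sits halfway between the two means
```

Placing f(T) midway gives every node a margin of M2 in either direction in the worst cases L = T and L = T−1.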

  17. Threshold Detection Lemma 2: Let {Xi} be N i.i.d. binary random variables with p = Pr[Xi=1]. Then Pr[Σi Xi < (p−ε)N] < e^(−2ε²N). Lemma 3: Let {Xi} be N independent binary random variables, and let μ = E[Σi Xi]. Then Pr[Σi Xi < (μ/N−ε)N] < e^(−2ε²N).
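Lemma 2 is a Hoeffding-type bound; a Monte Carlo sanity check with illustrative parameters (p = 0.5, N = 200, ε = 0.1):

```python
import math, random

def lower_tail_estimate(p, N, eps, trials=4000, seed=3):
    """Monte Carlo estimate of Pr[sum_i X_i < (p - eps) * N] for N i.i.d.
    Bernoulli(p) variables (the event bounded in Lemma 2)."""
    rng = random.Random(seed)
    hits = sum(sum(rng.random() < p for _ in range(N)) < (p - eps) * N
               for _ in range(trials))
    return hits / trials

p, N, eps = 0.5, 200, 0.1
estimate = lower_tail_estimate(p, N, eps)
print(estimate, "<", math.exp(-2 * eps**2 * N))   # Hoeffding bound e^-4
```

The empirical tail (around 0.002 here) is comfortably below the bound e^(−2ε²N) ≈ 0.018.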

  18. Threshold Detection (Algorithm as on slide 14.) • Lemma 4: After step (2), each node correctly detects the threshold with probability at least 1 − e^(−2M2²/(NM)): if L ≥ T, then E[Ai] ≥ f(T)+M2, and if L ≤ T−1, then E[Ai] ≤ f(T)−M2, so the claim follows from Lemma 3 (and its symmetric version) with ε = M2/(NM).

  19. Threshold Detection (Algorithm as on slide 14.) • Now with high probability each node has the correct γi. • Recall that {γi} are i.i.d. since {Ai} are i.i.d. • Lemma 5: At the end of step (2), with probability at least 1 − e^(−2ε²N), at least (1−η−ε)N nodes have the correct γi, where η = e^(−2M2²/(NM)) is the per-node error probability from Lemma 4. Proof: Suppose L ≥ T, so γi = 1 is correct; apply Lemma 3 to Σi γi.

  20. Threshold Detection (Algorithm as on slide 14.) • Lemma 6: Assume that at least (1−η−ε)N nodes have the correct γi. Then after step (3), with probability at least 1 − e^(−2q²N), at least (1−2q)N nodes have the correct gi, where 1−q is a lower bound on the probability that a single gi is correct. Proof: First, consider the probability that gi is correct.

  21. Threshold Detection (Algorithm as on slide 14.) • Suppose γji is the output of the binary symmetric channel between node j and node i with input γj; node i sets gi to the majority value of the received bits {γji}.

  22. Threshold Detection (Algorithm as on slide 14.) • Now gi = 1 with probability at least 1−q, so Pr(Σi gi > (1−2q)N) = Pr(Σi gi > (1−q−q)N) ≥ 1 − e^(−2q²N) by Lemma 3 with ε = q. • Suppose at least (1−2q)N nodes have the correct gi = 1, and suppose gji is the output of the binary symmetric channel between node j and node i with input gj.

  23. Threshold Detection (Algorithm as on slide 14.) • By the union bound, the probability that all Fi are correct is at least 1 minus the sum of the failure probabilities above. • Choosing M large enough, the probability that all nodes have the correct Fi is at least 1−ε. • The number of transmissions is MN+N+N = (M+2)N = Θ(N).

  24. Thanks

  25. Finding the States of All Nodes • Partition the N nodes into subsets as follows: • Each subset contains a1·ln(N) nodes. • Each node belongs to a1·ln(N) subsets. • Any two nodes belong to at most one common subset. • Each node is associated with one subset, and each subset has exactly one node associated with it.

  26. Finding the States of All Nodes • Algorithm: (1) Each node broadcasts its bit a2·lnln(N) times. (2) Each node computes the parity of its associated subset, and broadcasts the parity once. (3) The fusion center first estimates every node's bit from step (1). Suppose node i belongs to subsets {Si,k}, and the parities obtained at the fusion center are {Pi,k}. Each subset Si,k yields an estimate of bi, namely Pi,k minus the step-(1) estimates of the other nodes in Si,k (mod 2), and the state of node i is taken as the majority of these estimates.
