
Grassmannian Packings for Efficient Quantization in MIMO Broadcast Transmission






Presentation Transcript


1. Grassmannian Packings for Efficient Quantization in MIMO Broadcast Transmission
• MIMO Broadcast Transmission
• Examples of GRM(m) for MIMO Broadcast Systems
• transmission to mobiles with orthogonal channel vectors
• transmission to mobiles with almost orthogonal channel vectors
• Simulation Results
• Algebraic Construction of GRM(m)
Alexei Ashikhmin (Bell Labs) and RaviKiran Gopalan (Texas Instruments)

2. MIMO Broadcast Transmission
The Base Station (BS):
• chooses some mobiles, for example mobiles 1, 2, 3
• forms the matrix of their quantized channel vectors and uses it to compute a precoding matrix
• transmits to mobiles 1, 2, 3 using the precoding matrix
The codebook the mobiles quantize against is a quantization code.
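The no-interference property behind this setup can be sketched numerically. The toy example below (not from the slides; all names and numbers are illustrative) uses the simple matched precoder w_k = conj(h_k), which, for orthonormal channel vectors, delivers each mobile's symbol with no cross-talk:

```python
s = 0.5 ** 0.5  # 1 / sqrt(2)

# Three orthonormal channel vectors in C^4 (M = 4 BS antennas, 3 mobiles).
h = [
    (s, s, 0, 0),
    (s, -s, 0, 0),
    (0, 0, 1, 0),
]
symbols = [1 + 0j, -1 + 0j, 1j]  # data symbols for the three mobiles

# Precoded transmit vector: x = sum_k conj(h_k) * s_k.
x = [sum(h[k][i].conjugate() * symbols[k] for k in range(3)) for i in range(4)]

# Mobile k receives y_k = h_k . x; orthogonality kills the cross terms.
y = [sum(h[k][i] * x[i] for i in range(4)) for k in range(3)]
assert all(abs(y[k] - symbols[k]) < 1e-12 for k in range(3))
print("no inter-user interference:", y)
```

With merely *almost* orthogonal channel vectors the cross terms are small but nonzero, which is exactly why the code design below cares about (near-)orthogonal codeword sets.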

3. Requirements for a quantization code
Each mobile has a channel vector to the BS and feeds back its quantized version. The code:
• should provide good quantization (for a given code size)
• should afford simple decoding
• should have many sets of M orthogonal codewords (orthogonal bases)
If the channel vectors are pairwise orthogonal, then the signals sent to the corresponding mobiles do not interfere with each other.

4. Base Station strategy
• The mobiles quantize their channel vectors into codewords.
• Among the occupied codewords, the Base Station finds orthogonal ones and transmits to the corresponding mobiles, say mobiles 1, 3, 5.
• The channel vectors of these mobiles will be almost orthogonal.
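A minimal sketch of the quantization step. The selection rule used here is an assumption (the slide's formula was lost): pick the codeword maximizing the normalized absolute inner product, which is invariant to the phase and scale of the channel vector — the natural Grassmannian criterion.

```python
# Hypothetical helper names; the codebook below is a tiny illustrative one.
def inner(u, v):
    """Complex inner product <u, v> = sum u_i * conj(v_i)."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    return abs(inner(u, u)) ** 0.5

def quantize(h, codebook):
    """Index of the codeword closest to h in the Grassmannian sense."""
    return max(range(len(codebook)),
               key=lambda k: abs(inner(h, codebook[k])) / norm(codebook[k]))

codebook = [(1, 0, 0, 0), (0, 1, 0, 0), (1, 1, 0, 0), (1, -1, 0, 0)]
h = (0.9, 1.1, 0.05, 0.0)   # a channel vector roughly along (1, 1, 0, 0)
print(quantize(h, codebook))  # -> 2, the codeword (1, 1, 0, 0)
```

Each mobile feeds back only the winning index, which is what makes a small, well-spread codebook valuable.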

5. Orthogonal codewords
Let us have a quantization code. If a channel vector is quantized into a codeword, we say that the codeword is occupied and mark it.
• If the number of mobiles (channel vectors) is large, then with high probability all codewords will be occupied.
• In this case, even if we have only a few sets of orthogonal codewords, we can easily find a set of occupied orthogonal codewords.
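A toy simulation of the occupancy argument. The uniform-occupancy model below is an assumption for illustration only (real occupancy statistics depend on the code and the channel distribution): each of K mobiles lands on one of N codewords uniformly at random.

```python
import random

def all_occupied_prob(N, K, trials=2000, seed=1):
    """Estimate the probability that K uniform draws cover all N codewords."""
    rng = random.Random(seed)
    hits = sum(
        len({rng.randrange(N) for _ in range(K)}) == N
        for _ in range(trials)
    )
    return hits / trials

print(all_occupied_prob(N=64, K=1000))  # close to 1
print(all_occupied_prob(N=64, K=100))   # essentially 0
```

This matches the slide's point: with many mobiles every codeword is occupied, so any orthogonal set of codewords is automatically an occupied one; with few mobiles it is not, which motivates the next slides.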

6. The number of mobiles is small
• Still, if there are many sets of orthogonal codewords, there is a chance to find occupied orthogonal codewords.
• For example, if several sets of orthogonal codewords are available, the probability that at least one set is fully occupied increases with the number of such sets.

7. Example: the number of antennas M = 4
The first code in the family (for practical applications we add four vectors to the code to make the code size 64):
(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
(1, 0, 1, 0), (0, 1, 0, 1), (1, 0, -1, 0), (0, -1, 0, 1)
(1, 0, -i, 0), (0, 1, 0, -i), (1, 0, i, 0), (0, 1, 0, i)
(1, 1, 0, 0), (0, 0, 1, 1), (1, -1, 0, 0), (0, 0, -1, 1)
(1, -i, 0, 0), (0, 0, 1, -i), (1, i, 0, 0), (0, 0, 1, i)
(1, 0, 0, 1), (0, 1, 1, 0), (1, 0, 0, -1), (0, 1, -1, 0)
(1, 0, 0, -i), (0, 1, i, 0), (1, 0, 0, i), (0, 1, -i, 0)
(1, 1, 1, 1), (1, -1, 1, -1), (1, 1, -1, -1), (1, -1, -1, 1)
(1, 1, -i, -i), (1, -1, -i, i), (1, 1, i, i), (1, -1, i, -i)
(1, -i, 1, -i), (1, i, 1, i), (1, -i, -1, i), (1, i, -1, -i)
(1, -i, -i, -1), (1, i, -i, 1), (1, -i, i, 1), (1, i, i, -1)
(1, -i, -i, 1), (1, i, -i, -1), (1, -i, i, -1), (1, i, i, 1)
(1, -i, 1, i), (1, i, 1, -i), (1, -i, -1, -i), (1, i, -1, i)
(1, 1, 1, -1), (1, -1, 1, 1), (1, 1, -1, 1), (1, -1, -1, -1)
(1, 1, -i, i), (1, -1, -i, -i), (1, 1, i, -i), (1, -1, i, i)
105 orthogonal bases.
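The listed code can be checked programmatically. The sketch below (vectors transcribed from the slide) verifies that the 60 codewords are distinct and that each of the 15 listed rows is an orthogonal basis of C^4; the full counts — 105 orthogonal bases, 7 per codeword — are the slide's claims and are not re-derived here.

```python
import itertools

j = 1j  # the imaginary unit, written i on the slide
rows = [
    [(1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1)],
    [(1,0,1,0), (0,1,0,1), (1,0,-1,0), (0,-1,0,1)],
    [(1,0,-j,0), (0,1,0,-j), (1,0,j,0), (0,1,0,j)],
    [(1,1,0,0), (0,0,1,1), (1,-1,0,0), (0,0,-1,1)],
    [(1,-j,0,0), (0,0,1,-j), (1,j,0,0), (0,0,1,j)],
    [(1,0,0,1), (0,1,1,0), (1,0,0,-1), (0,1,-1,0)],
    [(1,0,0,-j), (0,1,j,0), (1,0,0,j), (0,1,-j,0)],
    [(1,1,1,1), (1,-1,1,-1), (1,1,-1,-1), (1,-1,-1,1)],
    [(1,1,-j,-j), (1,-1,-j,j), (1,1,j,j), (1,-1,j,-j)],
    [(1,-j,1,-j), (1,j,1,j), (1,-j,-1,j), (1,j,-1,-j)],
    [(1,-j,-j,-1), (1,j,-j,1), (1,-j,j,1), (1,j,j,-1)],
    [(1,-j,-j,1), (1,j,-j,-1), (1,-j,j,-1), (1,j,j,1)],
    [(1,-j,1,j), (1,j,1,-j), (1,-j,-1,-j), (1,j,-1,j)],
    [(1,1,1,-1), (1,-1,1,1), (1,1,-1,1), (1,-1,-1,-1)],
    [(1,1,-j,j), (1,-1,-j,-j), (1,1,j,-j), (1,-1,j,j)],
]
code = [v for row in rows for v in row]
assert len(set(code)) == 60  # 60 distinct codewords

def inner(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

# Each listed row is a set of four pairwise-orthogonal vectors, i.e. a basis.
for row in rows:
    for u, v in itertools.combinations(row, 2):
        assert abs(inner(u, v)) < 1e-12
print("60 codewords; all 15 listed rows are orthogonal bases of C^4")
```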

8. The number of mobiles
• The bases form the constant weight code (n=60, |C|=105, w=4).
• With probability 0.65 we will find four orthogonal occupied codewords.
• With probability 0.349 we will find three orthogonal occupied codewords.

9. Examples (continued)
1. The number of orthogonal bases is 105. Each codeword belongs to 7 bases. The bases form the constant weight code (n=60, |C|=105, w=4).
2. The number of orthogonal bases is 1076625. Each codeword belongs to 7975 bases. The bases form the constant weight code (n=1080, |C|=1076625, w=8).
If K is small, then the probability of finding M occupied orthogonal codewords is also small. What to do? Use almost orthogonal codewords.

10. Mutually Unbiased Bases (MUB)
Def. Two orthonormal bases of C^M are mutually unbiased if for any vector u of one basis and any vector v of the other we have |<u, v>| = 1/sqrt(M).
Theorem. The number of mutually unbiased bases is at most M + 1.
Def. A set of M + 1 mutually unbiased bases (i.e. of the maximum possible size) is a full-size MUB set.
The bases of the example form a full-size MUB set.
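A minimal check of the MUB condition. Since the slide's formulas were lost in transcription, this uses the standard definition stated above; the three bases of C^2 below (computational, Hadamard, circular) are a classic full-size MUB set with M + 1 = 3 bases.

```python
import itertools

s = 0.5 ** 0.5  # 1 / sqrt(2)
bases = [
    [(1, 0), (0, 1)],             # computational basis of C^2
    [(s, s), (s, -s)],            # Hadamard basis
    [(s, s * 1j), (s, -s * 1j)],  # circular basis
]

def inner(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

def mutually_unbiased(B1, B2, M):
    """|<u, v>| = 1/sqrt(M) for every u in B1, v in B2."""
    return all(abs(abs(inner(u, v)) - M ** -0.5) < 1e-12
               for u in B1 for v in B2)

assert all(mutually_unbiased(B1, B2, 2)
           for B1, B2 in itertools.combinations(bases, 2))
print("full-size MUB set in C^2: M + 1 = 3 bases")
```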

11. MUB sets form a constant weight code C (n=15, |C|=6, w=5)
• If K is small, the chance that M occupied codewords are covered by an MUB set is significantly higher than the chance that they are covered by a single basis.

12. There are 840 full-size MUB sets; each belongs to 56 full-size MUB sets.

13. Simulation Results
All results are for M = 8, i.e. the number of Base Station antennas is 8; K = 1000.
[Figure: sum-rate comparison of GRM(3), the Yoo-Goldsmith greedy algorithm with RVQ, RVQ with regularized ZF, and RVQ with ZF.]

14. [Figure: GRM(3) performance.]
If K = 50, typically we can find 5 or 6 occupied codewords.

15. [Figure: GRM(3) vs. the greedy algorithm.]

16. [Figure: transmission to two groups of mobiles; within each group the codewords are orthogonal.]

17. Construction of GRM(m)
• GRM(m) is a code in C^(2^m).
• There are two methods for constructing GRM(m):
1. Group-theoretic approach: a particular case of the Operator Reed-Muller codes (A. Ashikhmin and A. R. Calderbank, ISIT 2005)
2. Coding-theory approach

18. Group-Theoretic Construction of GRM(m)
Pauli matrices: I, X, Z, and XZ, where X = (0 1; 1 0) and Z = (1 0; 0 -1).

19. Construction of GRM(m)
Def. Vectors (a|b) and (a'|b') are orthogonal (with respect to the symplectic inner product) if a·b' + a'·b = 0 mod 2.
• Take a set of orthogonal, independent vectors.
Lemma 2. The corresponding operator is an orthogonal projector onto a subspace.
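The symplectic condition can be connected to the Pauli matrices of the previous slide. This sketch uses the standard facts (the slide's own formulas did not survive transcription): with E(a,b) = X^{a_1}Z^{b_1} ⊗ ... ⊗ X^{a_m}Z^{b_m}, two Pauli operators commute exactly when their label vectors are symplectically orthogonal over F_2.

```python
import itertools

I2 = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]

def matmul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def kron(A, B):
    return [[A[i][j] * B[k][l]
             for j in range(len(A[0])) for l in range(len(B[0]))]
            for i in range(len(A)) for k in range(len(B))]

def pauli(a, b):
    """E(a, b): tensor product of X^a_i Z^b_i over the m tensor factors."""
    M = [[1]]
    for ai, bi in zip(a, b):
        M = kron(M, matmul(X, Z) if ai and bi else X if ai else Z if bi else I2)
    return M

def symplectic(a, b, c, d):
    """Symplectic inner product of (a|b) and (c|d) over F_2."""
    return (sum(x * y for x, y in zip(a, d))
            + sum(x * y for x, y in zip(c, b))) % 2

def commutation_matches_symplectic(m):
    vecs = list(itertools.product((0, 1), repeat=m))
    for a, b, c, d in itertools.product(vecs, repeat=4):
        E1, E2 = pauli(a, b), pauli(c, d)
        if (matmul(E1, E2) == matmul(E2, E1)) != (symplectic(a, b, c, d) == 0):
            return False
    return True

assert commutation_matches_symplectic(2)
print("E(a,b), E(c,d) commute exactly when (a|b), (c|d) are symplectically orthogonal")
```

The orthogonal, independent label vectors in the construction therefore correspond to commuting Pauli operators, which can be simultaneously diagonalized into the projectors of Lemma 2.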

20. Coding-Theory Approach to the Construction of GRM(m)
GRM(m) is obtained by merging:
1. binary Reed-Muller codes RM(r,m)
2. Reed-Muller codes ZRM(2,m) over Z4
ZRM(2,m) is generated by Boolean functions.

21. ZRM(2,m) is generated by Boolean functions. Let us construct the code from ZRM(2,m) by mapping its Z4 symbols componentwise to complex phases.
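The mapping itself did not survive transcription; the standard choice for Z4-linear codes such as ZRM(2,m) is the componentwise map a -> i^a, sending 0, 1, 2, 3 to 1, i, -1, -i:

```python
# Componentwise map from Z4 symbols to fourth roots of unity (assumed here;
# it is the usual Z4 -> {1, i, -1, -i} map for Z4-linear codes).
def z4_to_complex(codeword):
    return tuple(1j ** a for a in codeword)

quaternary = (0, 1, 2, 3, 2)
print(z4_to_complex(quaternary))  # maps to 1, i, -1, -i, -1
```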

22. Merging RM(r,2) and CRM(2,2) into GRM(2); r runs from m = 2 down to 0:
1. r = m = 2: take all the minimum weight codewords of RM(r,m) = RM(2,2): (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
2. r = m-1 = 1: substitute the codewords (1,1), (1,i), (1,-1), (1,-i) of CRM(2,1) into the nonzero positions of the minimum weight codewords of RM(r,m) = RM(1,2). For the minimum weight codeword (1,1,0,0) this gives the GRM(2) codewords (1,1,0,0), (1,i,0,0), (1,-1,0,0), (1,-i,0,0); for (0,1,1,0) it gives (0,1,1,0), (0,1,i,0), (0,1,-1,0), (0,1,-i,0).
3. r = m-2 = 0: take the only minimum weight codeword of RM(r,m) = RM(0,2), namely (1,1,1,1), and substitute into its nonzero positions codewords of CRM(2,2).
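The substitution step above can be sketched in a few lines (function names here are illustrative): plug a length-w phase codeword into the w nonzero positions of a minimum weight binary Reed-Muller codeword.

```python
def substitute(binary_word, phase_word):
    """Place phase_word's entries at the nonzero positions of binary_word."""
    it = iter(phase_word)
    return tuple(next(it) if bit else 0 for bit in binary_word)

phases = [(1, 1), (1, 1j), (1, -1), (1, -1j)]  # (1,1), (1,i), (1,-1), (1,-i)
minwt = (1, 1, 0, 0)  # a minimum weight codeword of RM(1,2)
words = [substitute(minwt, p) for p in phases]
print(words)  # the slide's (1,1,0,0), (1,i,0,0), (1,-1,0,0), (1,-i,0,0)
```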

23. The merged code:
• r = 2: minimum weight codewords of RM(2,2):
(1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1)
• r = 1: minimum weight codewords of RM(1,2) with codewords of CRM(2,1) substituted:
(1,1,0,0), (1,i,0,0), (1,-1,0,0), (1,-i,0,0)
(1,0,1,0), (1,0,i,0), (1,0,-1,0), (1,0,-i,0)
(1,0,0,1), (1,0,0,i), (1,0,0,-1), (1,0,0,-i)
(0,1,1,0), (0,1,i,0), (0,1,-1,0), (0,1,-i,0)
(0,1,0,1), (0,1,0,i), (0,1,0,-1), (0,1,0,-i)
(0,0,1,1), (0,0,1,i), (0,0,-1,1), (0,0,1,-i)
• r = 0: minimum weight codeword of RM(0,2) with codewords of CRM(2,2) substituted:
(1,1,1,1), (1,-1,1,-1), (1,1,-1,-1), (1,-1,-1,1)
(1,1,-i,-i), (1,-1,-i,i), (1,1,i,i), (1,-1,i,-i)
(1,-i,1,-i), (1,i,1,i), (1,-i,-1,i), (1,i,-1,-i)
(1,-i,-i,-1), (1,i,-i,1), (1,-i,i,1), (1,i,i,-1)
(1,-i,-i,1), (1,i,-i,-1), (1,-i,i,-1), (1,i,i,1)
(1,-i,1,i), (1,i,1,-i), (1,-i,-1,-i), (1,i,-1,i)
(1,1,1,-1), (1,-1,1,1), (1,1,-1,1), (1,-1,-1,-1)
(1,1,-i,i), (1,-1,-i,-i), (1,1,i,-i), (1,-1,i,i)

24. Theorem (Inner product distribution of GRM(m)). For any codeword, the number of codewords at each possible inner product value is the same.
Example: in GRM(2) there are 15 vectors at the corresponding inner product value; in GRM(3) there are 315.

25. Theorem. For any basis there exist bases such that together they form an MUB set.
Theorem. The maximum root-mean-square (RMS) inner product is

26. Decoding Example, M = 8
GRM(3), |GRM(3)| = 1080, vs. a random code C, |C| = 1080:
• Complex multiplications: 0 vs. 8·1080
• Complex summations: 1500 vs. 7·1080

27. The number of mobiles is large
In this case, even if we have only one set of orthogonal vectors, we are doing fine.
Mobiles quantize their channel vectors. If some channel vector is quantized into a codeword, we say that the codeword is occupied.




31. Lemma 1. The operator is an orthogonal projector.
Def. Vectors (a|b) and (a'|b') are orthogonal (with respect to the symplectic inner product) if a·b' + a'·b = 0 mod 2.
• Take a set of orthogonal vectors.
Lemma 2. The corresponding operator is an orthogonal projector onto a subspace.

