
Context-Based Adaptive Binary Arithmetic Coding in the H.264/AVC Video Compression Standard


Presentation Transcript


  1. Context-Based Adaptive Binary Arithmetic Coding in the H.264/AVC Video Compression Standard Detlev Marpe, Heiko Schwarz, and Thomas Wiegand IEEE Transactions on Circuits and Systems for Video Technology, JULY 2003

  2. Outline • Introduction • The CABAC Framework • Binarization • Context Modeling • Binary Arithmetic Coding • Example of detailed CABAC

  3. Introduction (1) • CAVLC is used in the Baseline profile • CABAC is used in the Main profile • Compared to CAVLC, CABAC typically provides a bit-rate reduction of 5%–15%.

  4. Introduction (2) • Binarization • Context modeling • Binary arithmetic coding • CABAC is selected when entropy_coding_mode = 1 [Block diagram: Binarization → Context modeling → Binary arithmetic coding]

  5. Binarization [Figure: binarization tree with internal nodes C0–C3 and leaves for the symbols “0”, “1”, “2”, “3”, and “5”–“30”] • Alphabet reduction • Reduce a nonbinary syntax element to a unique intermediate binary codeword (bin string). • Similar to converting a data symbol into a variable-length code, but the binary code is further encoded prior to transmission. • Nothing is lost in terms of modeling, e.g. P(“3”) = P_C0(“0”) × P_C1(“0”) × P_C2(“1”).

  6. Binarization (2) • No multiplications needed • Adaptive m-ary arithmetic coding requires at least two multiplications for each symbol. • Enables context modeling on a subsymbol level. • Conditional probabilities can be used for the most frequently observed bins, whereas other bins use a zero-order probability model.

  7. Binarization – four basic schemes (1) • Unary code (U) • x = 4 → 11110 • Truncated unary code (TU) with cutoff S • x = 4, S = 5 → 11110 • x = 5, S = 5 → 11111 (the terminating “0” is dropped at the cutoff) • kth-order Exp-Golomb code (EGk):

    while (1) {
      if ( x >= (1<<k) ) {
        put( 1 )                      /* prefix part */
        x = x - (1<<k)
        k++
      } else {
        put( 0 )                      /* termination of prefix part */
        while ( k-- )
          put( (x>>k) & 0x01 )        /* suffix part */
        break
      }
    }
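The following is a minimal C sketch of the U and TU schemes above (the helper names append_bit, binarize_unary, and binarize_tu are illustrative, not part of the standard); the EGk scheme follows the pseudocode shown on the slide.

    #include <stdio.h>

    /* Append one bin ('0' or '1') to a bin-string buffer. */
    static void append_bit(char *bins, int *len, int b) {
        bins[(*len)++] = b ? '1' : '0';
        bins[*len] = '\0';
    }

    /* Unary (U): x ones followed by a terminating zero. */
    static void binarize_unary(unsigned x, char *bins, int *len) {
        for (unsigned i = 0; i < x; i++) append_bit(bins, len, 1);
        append_bit(bins, len, 0);
    }

    /* Truncated unary (TU) with cutoff S (0 <= x <= S): like U,
     * but the terminating zero is dropped when x == S. */
    static void binarize_tu(unsigned x, unsigned S, char *bins, int *len) {
        for (unsigned i = 0; i < x; i++) append_bit(bins, len, 1);
        if (x < S) append_bit(bins, len, 0);
    }

    int main(void) {
        char bins[64]; int len;
        len = 0; binarize_unary(4, bins, &len);  printf("U(4)       = %s\n", bins); /* 11110 */
        len = 0; binarize_tu(4, 5, bins, &len);  printf("TU(4, S=5) = %s\n", bins); /* 11110 */
        len = 0; binarize_tu(5, 5, bins, &len);  printf("TU(5, S=5) = %s\n", bins); /* 11111 */
        return 0;
    }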

  8. Binarization – four basic schemes (2) • Fixed-length code (FL) • S = 7 → ⌈log2 7⌉ = 3 bits • Applied to syntax elements with an approximately uniform distribution
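A corresponding sketch of the FL scheme, assuming the bin string is simply the fixed-width binary representation of x (the function name binarize_fl is illustrative):

    #include <stdio.h>

    /* Fixed-length (FL) binarization: write x as a width-bit
     * unsigned value, most significant bin first. */
    static void binarize_fl(unsigned x, int width) {
        for (int i = width - 1; i >= 0; i--)
            putchar(((x >> i) & 1) ? '1' : '0');
        putchar('\n');
    }

    int main(void) {
        binarize_fl(5, 3);   /* S = 7 -> 3 bins; x = 5 -> 101 */
        return 0;
    }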

  9. Binarization – concatenation of basic schemes (1) • Coded_block_pattern • Indicates which blocks contain nonzero transform coefficients in a MB • Prefix: 4-bit FL for luminance • Suffix: TU with S = 2 for chrominance • Motion vector difference • Prefix: TU with S = 9 applied to min(|mvd|, 9) • Suffix: EG3 for (|mvd| − 9) if |mvd| ≥ 9 • Sign bit
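A C sketch of the concatenated mvd binarization described above, built from the TU and EGk building blocks of the previous slides (function names are illustrative, the sign-bin convention 0 = positive is an assumption, and this is not the normative bin-string derivation of the standard):

    #include <stdio.h>
    #include <stdlib.h>

    /* Emit one bin as a character (illustrative output only). */
    static void put_bin(int b) { putchar(b ? '1' : '0'); }

    /* Truncated unary with cutoff S. */
    static void tu(unsigned x, unsigned S) {
        for (unsigned i = 0; i < x; i++) put_bin(1);
        if (x < S) put_bin(0);
    }

    /* kth-order Exp-Golomb, following the pseudocode on slide 7. */
    static void egk(unsigned x, unsigned k) {
        for (;;) {
            if (x >= (1u << k)) { put_bin(1); x -= (1u << k); k++; }
            else {
                put_bin(0);
                while (k--) put_bin((x >> k) & 1u);
                break;
            }
        }
    }

    /* Concatenated binarization of a motion vector difference:
     * TU prefix (cutoff 9) of min(|mvd|, 9), EG3 suffix of |mvd| - 9
     * when |mvd| >= 9, then a sign bin for nonzero mvd. */
    static void binarize_mvd(int mvd) {
        unsigned a = (unsigned)abs(mvd);
        tu(a < 9 ? a : 9, 9);
        if (a >= 9) egk(a - 9, 3);
        if (mvd != 0) put_bin(mvd < 0);  /* sign bin (assumed: 0 = positive, 1 = negative) */
        putchar('\n');
    }

    int main(void) {
        binarize_mvd(4);    /* TU prefix only */
        binarize_mvd(10);   /* TU prefix of 9, then EG3 suffix of 1, then sign */
        return 0;
    }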

  10. Binarization – concatenation of basic schemes (2) • Transform coefficient level • Prefix: TU with S = 14 applied to the absolute level • Suffix: EG0 for (|level| − 14) if |level| ≥ 14

  11. Context Modeling (1) • A "context model" is a probability model for one or more bins of the binarized symbol. • This model may be chosen from a selection of available models depending on the statistics of recently coded data symbols. • The context model stores the probability of each bin being "1" or "0".

  12. Context Modeling (2) • Four basic design types • (1) Two neighboring syntax elements (left neighbor A, top neighbor B) in the past of the current syntax element C • (2) The prior coded bins (b0, b1, …, bi−1), e.g. for mb_type and sub_mb_type • (3) The position in the scanning path, e.g. for the significance map (residual data only) • (4) The accumulated number of encoded levels, e.g. for coefficient levels (residual data only) • [Figures: neighborhood template with blocks A, B and current block C; binarization tree of mb_type for P/SP slices with nodes C0–C3 and leaves “0”–“3” and “5”–“30”]

  13. Context Modeling - Context index γ (1) • The set of all probability models can be arranged in a linear fashion such that each model is identified by a so-called context index γ. • For each context index γ, the probability model is determined by the pair (αγ, βγ), with 0 ≤ γ ≤ 398. • 6 bits for αγ and 1 bit for βγ. • αγ is the probability state index, selecting one of 64 representative probability values, and the (binary) βγ represents the most probable symbol (MPS).
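A minimal C sketch of the per-context state described above (struct and field names are illustrative; the fixed state-transition tables of the standard, transIdxMPS and transIdxLPS, are only indicated in comments, and the LPS branch below is a simplified stand-in for the table lookup):

    #include <stdint.h>

    /* One CABAC context: a 7-bit probability model. */
    typedef struct {
        uint8_t state;  /* alpha: probability state index, 0..63 (one of 64 representative LPS probabilities) */
        uint8_t mps;    /* beta: value of the most probable symbol, 0 or 1 */
    } context_model;

    /* Conceptual adaptation after coding one bin in this context. */
    static void update_context(context_model *ctx, int bin) {
        if (bin == ctx->mps) {
            /* MPS observed: move toward a lower LPS probability,
             * i.e. ctx->state = transIdxMPS[ctx->state]; */
            if (ctx->state < 62) ctx->state++;
        } else {
            /* LPS observed: in state 0 the MPS value flips; otherwise
             * move toward a higher LPS probability, i.e.
             * ctx->state = transIdxLPS[ctx->state]; (simplified here) */
            if (ctx->state == 0) ctx->mps = 1 - ctx->mps;
            else ctx->state--;
        }
    }

    int main(void) {
        context_model ctx = { 0, 0 };  /* illustrative initialization; the real init depends on slice QP */
        update_context(&ctx, 0);       /* MPS observed */
        update_context(&ctx, 1);       /* LPS observed */
        return 0;
    }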

  14. Context Modeling - Context index γ (2) • Indices 0 to 72 are related to syntax elements of macroblock and sub-macroblock type, spatial and temporal prediction modes, as well as slice-based and macroblock-based control information. • γ = ΓS + χS • ΓS denotes the context index offset, the lower value of the range. • χS denotes the context index increment of a given syntax element S. • Indices 73 to 398 are related to the coding of residual data. • significant_coeff_flag and last_significant_coeff_flag are conditioned on the scanning position. • coded_block_pattern: γ = ΓS + χS • Others: γ = ΓS + ΔS(ctx_cat) + χS, where the context-category (ctx_cat) dependent offset ΔS is employed.
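The index computation itself is a simple sum, sketched below in C (parameter names mirror ΓS, ΔS(ctx_cat), and χS; the actual offset values come from fixed tables in the standard and are not reproduced here):

    /* Conceptual computation of a context index gamma for a syntax element S. */
    int context_index(int gamma_s,        /* Gamma_S: context index offset for S */
                      int delta_ctx_cat,  /* Delta_S(ctx_cat): category-dependent offset (0 for non-residual elements) */
                      int chi_s)          /* chi_S: context index increment from the chosen context template */
    {
        return gamma_s + delta_ctx_cat + chi_s;
    }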

  15. Context Modeling - Context index γ (3) • Values of ΔS depending on context category and syntax element

  16. Binary arithmetic coding • An arithmetic coder encodes each bin according to the selected probability model. • Binary arithmetic coding is based on the principle of recursive interval subdivision. • Another distinct feature in H.264/AVC is its simplified bypass coding mode for bins assumed to be uniformly distributed. [Figure: example of recursive interval subdivision]
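A conceptual C sketch of recursive interval subdivision for coding one bin (floating-point interval bookkeeping, no renormalization, carry handling, or bit output; the actual H.264/AVC engine is a multiplication-free, table-driven approximation of this idea, which the sketch does not reproduce):

    #include <stdio.h>

    /* Conceptual encoder state: the current interval [low, low + range). */
    typedef struct { double low, range; } coder;

    /* Encode one bin with LPS probability p_lps and MPS value mps by
     * splitting the current interval into an MPS part (lower) and an
     * LPS part (upper). */
    static void encode_bin(coder *c, int bin, double p_lps, int mps) {
        double r_lps = c->range * p_lps;   /* width of the LPS subinterval */
        if (bin == mps) {
            c->range -= r_lps;             /* keep the lower (MPS) part */
        } else {
            c->low  += c->range - r_lps;   /* move to the upper (LPS) part */
            c->range = r_lps;
        }
        /* The bypass mode corresponds to p_lps = 0.5, so no probability
         * model lookup or adaptation is needed for such bins. */
    }

    int main(void) {
        coder c = { 0.0, 1.0 };
        encode_bin(&c, 1, 0.2, 1);   /* MPS = 1, encode an MPS */
        encode_bin(&c, 0, 0.2, 1);   /* encode an LPS */
        printf("interval = [%f, %f)\n", c.low, c.low + c.range);
        return 0;
    }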

  17. Example of detailed CABAC – motion vector difference (1) • Binarization • Prefix: TU (for |mvdx| < 9) • Suffix: EG3 (if |mvdx| ≥ 9) • |mvdx| = 10 → the TU prefix codes 9 and the EG3 suffix codes 10 − 9 = 1 • Context model • One of 3 models is selected for bin 1, based on previously coded MVD values of the neighboring blocks A and B: e = |mvdA| + |mvdB|
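A small C sketch of the context selection for bin 1 based on e, using the thresholds 3 and 32 given in the paper (the function name and the returned increment values 0/1/2 are illustrative):

    #include <stdlib.h>

    /* Select the context model increment for the first bin of an mvd
     * component from the mvd values of the left (A) and top (B) blocks. */
    int mvd_bin1_context(int mvd_a, int mvd_b) {
        int e = abs(mvd_a) + abs(mvd_b);
        if (e < 3)  return 0;   /* little neighboring motion: bin "0" is likely */
        if (e > 32) return 2;   /* large neighboring motion */
        return 1;               /* 3 <= e <= 32 */
    }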

  18. Example of detailed CABAC – motion vector difference (2) • The remaining bins are coded using one of 4 further context models: [Table: context model assignment for the remaining bins]

  19. Example of detailed CABAC – mb_type and sub_mb_type (2) • Binarization • Context model • mb_type: tree nodes C0…C3 • sub_mb_type: tree nodes C′0…C′2

  20. Experimental result • In our experiments, we compare the coding efficiency of CABAC to the coding efficiency of the baseline entropy coding method of H.264/AVC. The baseline entropy coding method uses the zero-order Exp-Golomb code for all syntax elements with the exception of the residual data, which are coded using the coding method of CAVLC. • Bit-rate savings of 9% to 14% are achieved, where higher gains are obtained at lower rates.
