
Fundamental concepts in video


Presentation Transcript


  1. Fundamental concepts in video Dr. Wissam Alkhadour

  2. Types of Video Signals • Three types of video signals are explained in this module. • Component Signal. • Composite Signal. • S-Video Signal

  3. Component Signal Higher-end video systems use three separate video wires for the red, green, and blue channels, so each color channel is sent as a separate video signal. Most computer systems use Component Video, with separate signals for R, G, and B.

  4. Component Signal • For any color separation scheme, Component Video gives the best color reproduction, since there is no crosstalk between the three channels. However, component video requires more bandwidth and good synchronization of the three components.

  5. Composite Signal • Color (chrominance) and intensity (luminance) signals are mixed into a single carrier wave. • Chrominance is a composition of two color components (I and Q, or U and V). In NTSC TV, for example, I and Q are combined into a chroma signal, and a color subcarrier is used to place the chroma signal at the high-frequency end of the channel shared with the luminance signal.
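
A rough sketch of this quadrature combination: I and Q ride on the same color subcarrier, 90 degrees apart, on top of the luminance. The subcarrier frequency is the nominal NTSC value, but the signal values and durations below are purely illustrative.

```python
import numpy as np

# Toy composite scan-line signal: luminance plus quadrature-modulated chroma.
# Values and durations are illustrative, not real NTSC line timing.
f_sc = 3.58e6                      # NTSC color subcarrier, approx. 3.58 MHz
t = np.arange(0, 20e-6, 1e-8)      # 20 microseconds sampled every 10 ns

Y = 0.5 * np.ones_like(t)          # constant luminance for this example
I = 0.3 * np.ones_like(t)          # constant I chroma component
Q = 0.2 * np.ones_like(t)          # constant Q chroma component

# Quadrature modulation: both chroma components share one subcarrier,
# offset by 90 degrees, so they can later be separated again.
C = I * np.cos(2 * np.pi * f_sc * t) + Q * np.sin(2 * np.pi * f_sc * t)

composite = Y + C                  # a single wire carries luminance + chroma
```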

  6. Composite Signal • The chrominance and luminance components can be separated at the receiver end and then the two color components can be further recovered. • When connecting to TVs or VCRs, Composite Video uses only one wire and video color signals are mixed, not sent separately.

  7. Composite Signal • The audio and sync signals are additions to this one signal. Since color and intensity are wrapped into the same signal, some interference between the luminance and chrominance signals is inevitable

  8. S-Video Signal • Separated video, or Super-video, uses two wires: one for the luminance signal and another for the chrominance signal. As a result, there is less crosstalk between the color information and the crucial gray-scale information. Luminance is placed in its own part of the signal because black-and-white information is most crucial for visual perception.

  9. Comparison between types of video signals

  10. Analog Video • An analog signal f(t) samples a time-varying image. Progressive scanning traces through a complete picture (a frame) row by row in each time interval. • In TV, and in some monitors and multimedia standards as well, another system, called interlaced scanning, is used: the odd-numbered lines are traced first, and then the even-numbered lines are traced.

  11. Analog Video • In the scanning figure, the solid lines traced from P to Q form the odd field, and the dotted lines form the even field; two fields make up one frame. • The odd field starts at the top-left corner of the frame (P) and ends at the bottom-center point (T); the even field starts at the center of the top edge (U) and ends at the lower-right corner (V). • The solid (odd) lines are traced first, then the even field is traced from U to V.
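
A minimal sketch of how the two fields interleave, treating a frame as a 2-D array of scan lines (the frame size is arbitrary):

```python
import numpy as np

# Toy frame: 480 scan lines of 640 samples each (size chosen for illustration).
frame = np.arange(480 * 640).reshape(480, 640)

# Interlaced scanning: the odd field carries lines 1, 3, 5, ... and the even
# field carries lines 2, 4, 6, ...; the two fields together make one frame.
odd_field = frame[0::2, :]     # first, third, fifth, ... scan lines
even_field = frame[1::2, :]    # second, fourth, sixth, ... scan lines

# Re-interleaving the two fields restores the complete frame.
rebuilt = np.empty_like(frame)
rebuilt[0::2, :] = odd_field
rebuilt[1::2, :] = even_field
assert np.array_equal(rebuilt, frame)
```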

  12. Analog Video Representations • There are three analog video representation techniques as follows: 1) NTSC Video. 2) PAL Video. 3) SECAM Video

  13. Analog Video NTSC Video • NTSC (National Television System Committee) TV standard is mostly used in North America and Japan. It uses the familiar 4:3 aspect ratio (i.e., the ratio of picture width to its height) and uses 525 scan lines per frame at 30 frames per second (fps). NTSC follows interlaced scanning system & each frame is divided into two fields with 262.5 lines/field.

  14. Analog Video NTSC Video • NTSC uses YIQ color model, It uses quadrate modulation to combine I & Q signals into single chroma signal which is known as Color Sub carrier. • NTSC uses band width of 6.0 MHz, in which various sub carriers used frequencies as follows: -Picture Carrier is at 1.25 MHz - Audio Sub Carrier frequency is 4.5 MHz

  15. Analog Video NTSC Video • NTSC composite signal is composition of Luminance signal (Y) and Chroma signal (C). • Decoding composite signal at receiver: -Separating Y & C, Low pass filters can be used to extract Y signal. -Apply Low pass filter to obtain I and Similarly C is multiplied by 2.sin(FSCt) to get Q

  16. PAL Video • The PAL (Phase Alternating Line) TV standard was originally developed in Germany. It uses 625 scan lines per frame at 25 fps with a 4:3 aspect ratio. PAL uses the YUV color model with an 8 MHz channel, allocating a bandwidth of 5.5 MHz to Y and 1.8 MHz to each of U and V.

  17. PAL Video • The chroma signals have alternating signs (e.g., +U and -U) in successive scan lines, hence the name Phase Alternating Line. The signals in consecutive lines are averaged so as to cancel the chroma signal and separate Y and C; a comb filter is used at the receiver for this purpose.
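
A minimal sketch of the line-averaging idea, under the simplifying assumption that luminance is identical on two consecutive lines while the chroma contribution flips sign:

```python
import numpy as np

# Two consecutive scan lines: the same luminance, chroma with opposite signs
# (a simplification of PAL's phase alternation, for illustration only).
n = np.arange(720)
Y = 0.5 + 0.1 * np.sin(2 * np.pi * n / 180)   # slowly varying luminance
C = 0.2 * np.cos(2 * np.pi * n / 4)           # chroma riding on its carrier

line_a = Y + C        # this line carries +C
line_b = Y - C        # the next line carries -C

# Averaging cancels the chroma and recovers Y; differencing cancels the
# luminance and recovers the chroma, which is the comb-filter principle.
Y_rec = (line_a + line_b) / 2
C_rec = (line_a - line_b) / 2
assert np.allclose(Y_rec, Y) and np.allclose(C_rec, C)
```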

  18. SECAM Video • SECAM (Système Électronique Couleur Avec Mémoire) was developed in France for TV broadcast. It uses 625 scan lines per frame at 25 fps with a 4:3 aspect ratio and interlaced fields. • SECAM and PAL are similar, differing slightly in their color coding scheme. In SECAM, the U and V signals are modulated using separate color subcarriers at 4.25 MHz and 4.41 MHz, and they are sent on alternate lines; that is, only one of the U or V signals is sent on each scan line.

  19. PAL Video

  20. Digital Video The advantages of digital representation for video: • Video can be stored on digital devices or in memory, ready to be processed and integrated into various multimedia applications • Direct access is possible, which makes nonlinear video editing simple • Repeated recording does not degrade image quality • Ease of encryption and better tolerance to channel noise.

  21. Digital Video Representation • Chroma Sub-sampling: • Human vision is much less sensitive to the spatial resolution of color information than to that of brightness, so chroma can be stored at lower resolution than luminance. Digital video uses the YCbCr color model for sub-sampling: Y is the luminance signal, which by itself gives the gray-scale image, and the combined Cb, Cr signal, known as chroma, carries the color information. The Y signal is present in every pixel sample, but chroma need not be.
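
A small RGB-to-YCbCr conversion sketch; the BT.601-style coefficients used here are an assumption, since the slides do not specify an exact conversion matrix:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an 8-bit RGB image of shape (H, W, 3) to YCbCr using
    BT.601-style coefficients (assumed; the slides give no exact matrix)."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    y  = 0.299 * r + 0.587 * g + 0.114 * b   # luminance: the gray-scale image
    cb = 128 + 0.564 * (b - y)               # blue-difference chroma
    cr = 128 + 0.713 * (r - y)               # red-difference chroma
    return np.stack([y, cb, cr], axis=-1)

# Toy 2x2 image: the Y plane alone already gives a usable grayscale picture.
img = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)
print(rgb_to_ycbcr(img)[..., 0])   # Y values: approx. 76, 150, 29, 255
```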

  22. Digital Video Representation • Chroma Sub-sampling: the sub-sampling scheme states how many of every four original pixels carry chroma values. There are four common schemes, as follows: • 4:4:4 Scheme: no chroma sub-sampling is used - Each pixel's Y, Cb, and Cr values are transmitted - All four of every four pixels have Y, Cb, Cr values, which gives the highest-quality images

  23. Digital Video Representation • 4:2:2 Scheme: horizontal sub-sampling of the Cb, Cr signals by a factor of 2 - With four pixels horizontally labeled 0 to 3, all four Ys are sent, and every two Cbs and two Crs are sent, as (Cb0, Y0) (Cr0, Y1) (Cb2, Y2) (Cr2, Y3) (Cb4, Y4), and so on - Only two of every four pixels carry chroma values, which gives medium-quality images

  24. Digital Video Representation • 4:1:1 Scheme: horizontal sub-sampling by a factor of 4 - With four pixels horizontally labeled 0 to 3, all four Ys are sent, but only one Cb and one Cr are sent - Only one of every four pixels carries chroma values, which gives lower-quality images

  25. Digital Video Representation • 4:2:0 Scheme: sub-samples in both the horizontal and vertical dimensions by a factor of 2 (see the sketch below). • Scheme 4:2:0, along with the other schemes, is commonly used in JPEG and MPEG, as well as in the CIF and QCIF formats
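
The sketch below illustrates 4:2:0 sub-sampling on a YCbCr frame: the Y plane is kept at full resolution while each 2x2 block of Cb and Cr is averaged down to one value (simple block averaging is an assumption; real encoders may filter differently):

```python
import numpy as np

def subsample_420(ycbcr):
    """4:2:0 sketch: keep Y at full resolution; average every 2x2 block of
    Cb and Cr so each chroma plane shrinks by 2 both horizontally and
    vertically (block averaging is one simple choice of filter)."""
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1], ycbcr[..., 2]
    h, w = y.shape
    cb_sub = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    cr_sub = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, cb_sub, cr_sub

frame = np.random.rand(288, 352, 3)       # CIF-sized YCbCr frame, toy values
y, cb, cr = subsample_420(frame)
print(y.shape, cb.shape, cr.shape)        # (288, 352) (144, 176) (144, 176)
```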

  26. CIF Standards • CIF stands for Common Intermediate Format, specified by the CCITT. The idea of CIF is to specify a format for lower bit rates. It uses a progressive (non-interlaced) scan. QCIF stands for Quarter-CIF. All the CIF/QCIF resolutions are evenly divisible by 8, and all except 88 are divisible by 16; this is convenient for block-based video coding in H.261 and H.263
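
The divisibility remark can be checked against the commonly quoted CIF-family luminance resolutions; the resolution values below are assumptions drawn from standard references rather than from the slides:

```python
# Luminance resolutions of the CIF family (commonly quoted values, assumed
# here for illustration; the slides themselves only name CIF and QCIF).
luma = {
    "CIF":      (352, 288),
    "QCIF":     (176, 144),
    "Sub-QCIF": (128, 96),
    "4CIF":     (704, 576),
    "16CIF":    (1408, 1152),
}

for name, (w, h) in luma.items():
    print(f"{name:9s} {w}x{h}"
          f"  div by 8: {w % 8 == 0 and h % 8 == 0}"
          f"  div by 16: {w % 16 == 0 and h % 16 == 0}")

# H.261/H.263 code 16x16 macroblocks built from 8x8 blocks, so dimensions
# divisible by 8 and 16 need no padding. The "88" excluded above is the QCIF
# chrominance width (176 / 2 = 88), which divides by 8 but not by 16.
```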

  27. HDTV (High Definition TV) • The first generation of HDTV was based on an analog technology developed by Sony and NHK in Japan in the late 1970s. MUSE (MUltiple sub-Nyquist Sampling Encoding) was an improved NHK HDTV with hybrid analog/digital technologies that was put in use in the 1990s.

  28. HDTV (High Definition TV) • It has 1,125 scan lines, interlaced (60 fields per second), and a 16:9 aspect ratio. Since uncompressed HDTV easily demands more than 20 MHz of bandwidth, which will not fit in the current 6 MHz or 8 MHz channels, various compression techniques are being investigated. For video, MPEG-2 is chosen as the compression standard; for audio, AC-3 is the standard

  29. The salient difference between conventional TV and HDTV • HDTV has a much wider aspect ratio of 16:9 instead of 4:3. HDTV moves toward progressive (non-interlaced) scan, since interlacing introduces serrated edges on moving objects and flicker along horizontal edges. The FCC planned to replace all analog broadcast services with digital TV broadcasting by 2006

  30. The salient difference between conventional TV and HDTV • SDTV (Standard Definition TV): the current NTSC TV or higher. • EDTV (Enhanced Definition TV): 480 active lines or higher • HDTV (High Definition TV): 720 active lines or higher.
