ISO/IEC 13818-2, third edition: Information technology — Generic coding of moving pictures and associated audio information — Part 2: Video. Amendment 2 to the International Standard was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology.
P-frames provide more compression than I-frames because they take advantage of the data in a previous I-frame or P-frame, called a reference frame. Sometimes no suitable match is found. If one applies the inverse transform to the matrix after it has been quantized, one gets an image that looks very similar to the original image but that is not quite as nuanced.
TV cameras used in broadcasting usually generate 50 pictures a second in Europe or 59.94 pictures a second in North America. After quantization, many of the coefficients, usually the higher-frequency components, will be zero.
Unlike P-frames and B-frames, I-frames do not depend on data in the preceding or the following frames. B-frames are never reference frames.
H.262/MPEG-2 Part 2
From Wikipedia, the free encyclopedia.

The transform converts spatial variations into frequency variations, but it does not change the information in the block; the original block can be recreated exactly by applying the inverse cosine transform.
MPEG-2 includes three basic types of coded frames: intra-coded frames (I-frames), predictive-coded frames (P-frames), and bidirectionally predictive-coded frames (B-frames). If nothing in the picture has moved, the offset is zero; but if something in the picture is moving, the offset might be something like 23 pixels to the right and 4 pixels up.
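Because a B-frame depends on the reference frame that follows it in display order, encoders transmit frames out of display order so the decoder always has both references first. The helper below is hypothetical (not from the standard), but it sketches the reordering for a typical I/P/B pattern:

```python
# Sketch of display-order -> coded-order reordering for B-frames.
# Frame labels are "display_index + type", e.g. "1B".
def coded_order(display_order):
    """Reorder a display-order sequence so every B-frame follows both of its
    reference frames (the nearest non-B frame on each side)."""
    out, pending_b = [], []
    for frame in display_order:
        if frame.endswith("B"):
            pending_b.append(frame)   # hold until the next reference arrives
        else:                          # I or P: a reference frame
            out.append(frame)
            out.extend(pending_b)      # earlier B-frames can now be decoded
            pending_b = []
    return out + pending_b

display = ["0I", "1B", "2B", "3P", "4B", "5B", "6P"]
print(coded_order(display))  # ['0I', '3P', '1B', '2B', '6P', '4B', '5B']
```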
The level limits the memory and processing power needed, defining maximum bit rates, frame sizes, and frame rates. MPEG-2 supports all three sampling types, although 4:2:0 is by far the most common.
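A decoder's conformance check against level limits can be sketched as a table lookup. The figures below are the commonly cited Main Profile limits; consult the normative tables in ISO/IEC 13818-2 before relying on exact values:

```python
# Illustrative level limits (commonly cited Main Profile figures; the
# normative values are in ISO/IEC 13818-2).
LEVEL_LIMITS = {
    # level: (max_width, max_height, max_fps, max_bitrate_bits_per_s)
    "Low":       (352,  288,  30,  4_000_000),
    "Main":      (720,  576,  30, 15_000_000),
    "High 1440": (1440, 1152, 60, 60_000_000),
    "High":      (1920, 1152, 60, 80_000_000),
}

def conforms(level, width, height, fps, bitrate):
    """True if the stream parameters fit within the given level's limits."""
    mw, mh, mf, mb = LEVEL_LIMITS[level]
    return width <= mw and height <= mh and fps <= mf and bitrate <= mb

print(conforms("Main", 720, 576, 25, 9_000_000))    # True: DVD-class stream
print(conforms("Main", 1920, 1080, 25, 9_000_000))  # False: needs High level
```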
The match between the two macroblocks will often not be perfect. Digital television requires that these pictures be digitized so that they can be processed by computer hardware.
A profile defines sets of features such as B-pictures, 3D video, chroma format, etc.
It takes advantage of spatial redundancy and of the inability of the eye to detect certain changes in the image. The penalty of this step is the loss of some subtle distinctions in brightness and color.
Another common practice to reduce the amount of data to be processed is to subsample the two chroma planes after low-pass filtering to avoid aliasing. An MPEG application then specifies the capabilities in terms of profile and level.
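The chroma subsampling just described can be sketched as follows; this is a minimal 4:2:0-style example in which the low-pass filter is a simple 2×2 average (an illustrative choice, not the filter any particular encoder uses), leaving each chroma plane at half resolution in both directions:

```python
# Sketch of 4:2:0-style chroma subsampling via 2x2 averaging.
def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane down to one sample."""
    h, w = len(chroma), len(chroma[0])
    return [[(chroma[y][x] + chroma[y][x + 1]
              + chroma[y + 1][x] + chroma[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

plane = [[16, 18, 20, 22],
         [16, 18, 20, 22],
         [24, 26, 28, 30],
         [24, 26, 28, 30]]
print(subsample_420(plane))  # [[17.0, 21.0], [25.0, 29.0]]
```

The averaging is the lossy step: the full-resolution plane cannot be recovered exactly from the subsampled one.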
By starting in one corner of the matrix, then zigzagging through the matrix to combine the coefficients into a string, then substituting run-length codes for consecutive zeros in that string, and then applying Huffman coding to that result, one reduces the matrix to a smaller array of numbers. It is this array that is broadcast or that is put on DVDs.
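The zigzag-plus-run-length steps above can be sketched in code. This is a simplified illustration (the final Huffman stage and the standard's variable-length code tables are omitted); the scan walks the anti-diagonals so that the zero-valued high-frequency coefficients cluster at the end:

```python
# Sketch: zigzag scan of a coefficient matrix plus run-length coding of zeros.
def zigzag(matrix):
    """Serialize an n x n matrix along anti-diagonals, alternating direction."""
    n = len(matrix)
    order = sorted(((u, v) for u in range(n) for v in range(n)),
                   key=lambda p: (p[0] + p[1],
                                  p[0] if (p[0] + p[1]) % 2 else p[1]))
    return [matrix[u][v] for u, v in order]

def run_length(seq):
    """Encode as (zeros_before, value) pairs, with an end-of-block mark."""
    pairs, zeros = [], 0
    for x in seq:
        if x == 0:
            zeros += 1
        else:
            pairs.append((zeros, x))
            zeros = 0
    pairs.append("EOB")  # remaining trailing zeros are implied
    return pairs

coeffs = [[64, -3, 0, 0],
          [2,  0,  0, 0],
          [1,  0,  0, 0],
          [0,  0,  0, 0]]
scan = zigzag(coeffs)
print(scan)              # [64, -3, 2, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(run_length(scan))  # [(0, 64), (0, -3), (0, 2), (0, 1), 'EOB']
```

Twelve trailing zeros collapse into a single end-of-block mark, which is where most of the entropy-coding gain comes from.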
The processing of B-frames is similar to that of P-frames except that B-frames use the picture in a subsequent reference frame as well as the picture in a preceding reference frame. The offset is encoded as a "motion vector".
Video that has luma and chroma at the same resolution is called 4:4:4.
Then, for each of those macroblocks, the reconstructed reference frame is searched to find the 16×16 macroblock that best matches the macroblock being compressed.
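The macroblock search above can be sketched as an exhaustive block-matching loop using the sum of absolute differences (SAD) as the match criterion. Real encoders work on 16×16 macroblocks with much larger windows and faster search strategies; a 4×4 block and a tiny window keep the example readable:

```python
# Sketch of exhaustive block-matching motion estimation with SAD.
def sad(ref, frame, rx, ry, fx, fy, n):
    """Sum of absolute differences between an n x n block in each frame."""
    return sum(abs(ref[ry + j][rx + i] - frame[fy + j][fx + i])
               for j in range(n) for i in range(n))

def best_motion_vector(ref, block_x, block_y, frame, n=4, search=2):
    """Return the (dx, dy) offset into ref minimizing SAD over the window."""
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            rx, ry = block_x + dx, block_y + dy
            if 0 <= rx <= len(ref[0]) - n and 0 <= ry <= len(ref) - n:
                cost = sad(ref, frame, rx, ry, block_x, block_y, n)
                if best is None or cost < best[0]:
                    best = (cost, dx, dy)
    return best[1], best[2]

ref = [[0] * 8 for _ in range(8)]
for j in range(4):
    for i in range(4):
        ref[2 + j][3 + i] = 10 + i + j       # a small textured patch

frame = [[0] * 8 for _ in range(8)]
for j in range(4):
    for i in range(4):
        frame[2 + j][2 + i] = 10 + i + j     # same patch, shifted 1 px left

print(best_motion_vector(ref, 2, 2, frame))  # (1, 0): match is 1 px right
```

When the best SAD is nonzero, the residual difference between the block and its match is what gets transformed and quantized.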
These describe the brightness and the color of the pixel (see YCbCr). This page was last edited on 20 November.
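The RGB-to-YCbCr separation mentioned above can be sketched with the full-range BT.601-style matrix (the JFIF variant); MPEG-2 itself uses a studio-range variant of the same coefficients, so treat this as illustrative only:

```python
# Sketch: full-range BT.601-style RGB -> YCbCr (JFIF variant, illustrative).
def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b            # luma: brightness
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b  # blue-difference chroma
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b  # red-difference chroma
    return y, cb, cr

print(tuple(round(v, 3) for v in rgb_to_ycbcr(255, 255, 255)))  # (255.0, 128.0, 128.0)
print(tuple(round(v, 3) for v in rgb_to_ycbcr(0, 0, 0)))        # (0.0, 128.0, 128.0)
```

Neutral grays map to Cb = Cr = 128, which is why the chroma planes carry so little energy and tolerate subsampling well.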