MPEG-4 compression in digital cameras based on DSP


Although digital still cameras (DSCs) have been on the market for only a few years, they have revolutionized consumer imaging. Currently, about one-third of the cameras sold worldwide are digital, and their share is steadily rising. As multi-megapixel DSCs generate ever higher-resolution images and begin to challenge traditional film cameras, consumer DSCs are also providing intelligent operating modes that help users capture better photos under all conditions. Video mode has become a standard feature of consumer DSCs, enabling users to take multiple shots in quick succession to select the best snapshot, and to save short clips of major events. In addition, DSCs are being integrated with mobile phones, enabling fast transfer of still pictures and clips anytime, anywhere.

As the fast-changing DSC market continues to diversify, developers must continue to leverage technological innovation to differentiate their products. One such innovation is the introduction of MPEG-4 video compression in consumer DSCs based on high-performance digital signal processors (DSPs). The MPEG-4 standard enables a DSC to provide video and other operating modes efficiently, increases the number of video clips that can be stored, and supports robust, reliable transmission of video. DSPs provide the computational performance required for MPEG-4 encoding and decoding in low-cost camera products, especially DSPs with architectures that support fast image processing. Programmability allows developers to use the same DSP platform across an entire DSC product line, optimizing the imaging pipeline for different products through software.

New compression standard

DSCs have traditionally relied on the JPEG compression standard, which was designed for storing still images and became popular through the Internet. JPEG applies a discrete cosine transform (DCT) followed by quantization to eliminate most of the spatial redundancy within each minimum coded unit (MCU), an 8x8 pixel array. The algorithm then uses entropy coding, or variable-length coding (VLC), to further reduce the data to be stored and transmitted. Decompression reverses these steps. Although the compression ratio varies from image to image depending on content, the JPEG algorithm can generally compress pixel data by an order of magnitude without visible loss of quality.
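The intra-frame idea can be sketched in a few lines of code. The sketch below applies a naive 2-D DCT to a smooth 8x8 block and then quantizes with a single uniform step size; the step size of 16 is purely illustrative (real JPEG uses a full quantization table), but it shows how the transform concentrates a smooth block's energy into a few low-frequency coefficients that survive quantization.

```python
import math

def dct2(block):
    """Naive 8x8 2-D DCT-II, as JPEG applies to each minimum coded unit."""
    N = 8
    def c(u):
        return math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = c(u) * c(v) * s
    return out

# A smooth horizontal gradient, typical of real image content.
block = [[8 * y for y in range(8)] for _ in range(8)]
coeffs = dct2(block)

# Uniform quantization (step 16, illustrative only) zeroes most of the
# high-frequency coefficients -- the source of JPEG's compression.
quantized = [[round(coeffs[u][v] / 16) for v in range(8)] for u in range(8)]
zeros = sum(row.count(0) for row in quantized)
print(f"{zeros} of 64 quantized coefficients are zero")
```

After quantization, the long runs of zeros are exactly what the VLC stage encodes compactly.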

The various MPEG standards for animation and video use the same intra-frame techniques as JPEG to compress reference still images, or I-frames, and then apply additional inter-frame techniques to eliminate temporal redundancy between successive frames. The inter-frame technique compares each 16x16-pixel macroblock of a frame against the previous frame, then uses motion estimation and compensation to describe the macroblock's frame-to-frame motion. These predicted frames, or P-frames, need only describe their changes from the previous frame. I-frames are re-encoded periodically, at intervals determined by the application.
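Motion estimation at its simplest is an exhaustive block-matching search. The toy sketch below uses 4x4 blocks and a tiny search window to keep it short (real MPEG encoders use 16x16 macroblocks and much larger windows), and minimizes the sum of absolute differences (SAD), a common matching criterion.

```python
def sad(cur, ref, cx, cy, rx, ry, n=4):
    """Sum of absolute differences between the n x n block of `cur`
    anchored at (cx, cy) and the candidate block of `ref` at (rx, ry)."""
    return sum(abs(cur[cy + j][cx + i] - ref[ry + j][rx + i])
               for j in range(n) for i in range(n))

def motion_search(cur, ref, cx, cy, n=4, r=2):
    """Exhaustive block-matching search within +/-r pixels: returns the
    motion vector minimizing SAD, as in MPEG inter-frame prediction."""
    best, best_sad = (0, 0), sad(cur, ref, cx, cy, cx, cy, n)
    h, w = len(ref), len(ref[0])
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            rx, ry = cx + dx, cy + dy
            if 0 <= rx <= w - n and 0 <= ry <= h - n:
                s = sad(cur, ref, cx, cy, rx, ry, n)
                if s < best_sad:
                    best_sad, best = s, (dx, dy)
    return best, best_sad

# Reference frame with a bright 4x4 patch at (3, 2); in the current
# frame the patch has moved one pixel right and one pixel down.
W, H = 12, 10
ref = [[0] * W for _ in range(H)]
cur = [[0] * W for _ in range(H)]
for j in range(4):
    for i in range(4):
        ref[2 + j][3 + i] = 200
        cur[3 + j][4 + i] = 200

mv, err = motion_search(cur, ref, 4, 3)
print("motion vector:", mv, "residual SAD:", err)
```

A zero residual means the P-frame needs to store only the motion vector for this block, not its pixels.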

Figure 1 illustrates the steps involved in generic MPEG video compression. The intra-frame compression steps (DCT, quantization, and VLC), running from input to output across the top of the figure, are sufficient to generate an I-frame. To create a P-frame, the frame just encoded must also be decoded and stored in a local frame buffer, enabling block-by-block comparison of the next frame against it (i.e., motion estimation) for inter-frame compression. Video decoding involves the steps in the lower part of the figure (inverse quantization, inverse DCT, and motion compensation). Not shown in the figure, the MPEG standards also specify audio compression and decompression as a separate process.


Figure 1: MPEG video compression flowchart

MPEG-4: Multimedia Standard

The MPEG family of standards has evolved to accommodate emerging video applications. The original MPEG-1 standard was developed for storage and retrieval on mass-market systems such as interactive CD-ROMs and VCDs. The standard was then extended in MPEG-2 to support the higher resolutions, wider range of formats, and digital encoding associated with HDTV; thanks to its use in DVDs, MPEG-2 is the most widely deployed. Driven by the requirements of video databases, the MPEG-7 standard specifies a content representation for information search.

MPEG-4 was developed for interactive multimedia applications, including those delivered over wireless connections. It shares algorithms with the baseline H.263 video compression standard. Compared with earlier MPEG standards, MPEG-4 provides better compression at a given image quality and greater error resilience for more robust, reliable transmission. In addition, MPEG-4 supports object types within frames, allowing different image and graphics elements to be independently specified, compressed, transmitted, and recombined. However, practical implementations of the standard's object support remain to be developed. Until then, most MPEG-4 applications, including DSCs, can continue to be based on a single object that typically corresponds to the complete rectangular frame of the image.

High compression efficiency

The compression ratio of a particular clip varies greatly with the subject, but in general MPEG compression improves by roughly an order of magnitude, at the same resolution, on Motion JPEG (M-JPEG), which simply stores a sequence of JPEG frames. The additional compression comes from the inter-frame techniques. Video frames are typically about 100,000 pixels (352 x 288, CIF resolution) or about 25,000 pixels (176 x 144, QCIF resolution), rather than the 2 to 5 megapixels typical of JPEG stills. Although this reduction in resolution would be unacceptable for high-quality photos, it is sufficient for many consumer DSC products, especially considering that it enables video capture.

The MPEG-4 algorithm refines earlier compression techniques to reduce bit rates by approximately 20% relative to previous MPEG standards. Advanced MPEG-4 compression reduces QCIF video at 15 frames per second (fps) from 4.5 Mbps of raw video data to less than 64 kbps while maintaining acceptable viewing quality. In a DSC, MPEG-4 compression therefore lets the camera store several times more video in memory than M-JPEG.
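The figures above can be checked with simple arithmetic, assuming YUV 4:2:0 sampling at 12 bits per pixel (an assumption; the article does not state the pixel format):

```python
# Raw QCIF video bit rate, assuming YUV 4:2:0 sampling (12 bits/pixel:
# 8 bits of luma plus 4 bits of subsampled chroma on average).
width, height, fps = 176, 144, 15
bits_per_pixel = 12
raw_bps = width * height * bits_per_pixel * fps
print(f"raw: {raw_bps / 1e6:.2f} Mbps")       # ~4.56 Mbps, matching the text

compressed_bps = 64_000
print(f"compression ratio: {raw_bps / compressed_bps:.0f}:1")
```

The result, roughly 70:1, shows how far inter-frame coding goes beyond the order-of-magnitude ratios of still-image JPEG.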

Better error resilience

MPEG-4 integrates several new techniques to improve error resilience, a useful feature as people increasingly transmit the photos and clips captured with a DSC. With the growing popularity of DSC phones, robust and reliable transmission has become an essential requirement. MPEG-4's error-resilience techniques include:

  • More frequent resynchronization markers, which divide the transmitted data into small video packets and enable the receiver to recover from transmission errors with minimal data loss;
  • A header extension code that repeats each frame's header information in subsequent packets, preventing the loss of significant header data when the first video packet of a frame is corrupted;
  • Partitioning of video data into motion and texture (spatial) data, which aids recovery by increasing the probability that at least part of the data is received intact;
  • Reversible VLC, which allows the receiver to decode both forwards and backwards from a resynchronization marker, recovering as much of the image as possible after a transmission error;
  • Error concealment techniques for spatial and temporal errors (several techniques suggested in MPEG-4 are complementary and not a normative part of the algorithm).
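The value of resynchronization markers can be shown with a toy model (hypothetical packet sizes, not the real MPEG-4 syntax): encoded macroblock data is split into small packets, and a corrupted packet costs only its own macroblocks because the decoder resumes at the next marker.

```python
def packetize(macroblocks, per_packet=4):
    """Split encoded macroblock data into small video packets, each of
    which would begin with a resync marker in a real MPEG-4 stream."""
    return [macroblocks[i:i + per_packet]
            for i in range(0, len(macroblocks), per_packet)]

def decode(packets, corrupted):
    """A decoder that discards only corrupted packets, resuming at the
    next resync marker so every other packet is recovered intact."""
    recovered = []
    for idx, pkt in enumerate(packets):
        if idx in corrupted:
            continue               # conceal this packet's macroblocks
        recovered.extend(pkt)
    return recovered

frame = list(range(16))            # 16 macroblocks' worth of data
packets = packetize(frame)
ok = decode(packets, corrupted={1})
print(f"recovered {len(ok)} of {len(frame)} macroblocks")
```

Without the markers, a single bit error could force the decoder to discard everything up to the next frame; with them, the loss is confined to one small packet.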

Demand for DSP performance and flexibility

The MPEG-4 compression and decompression algorithms require far more processing power than JPEG because of the additional inter-frame motion estimation and compensation steps. The image processing engine in a DSC must therefore reach higher performance levels. An ASIC can achieve this, but it is not easily adapted to the imaging pipelines of different DSC products; a programmable DSP, on the other hand, can not only deliver the performance the MPEG-4 algorithm requires but also be optimized for different systems through software. In addition, the same DSP can be programmed to execute the JPEG algorithm for use in higher-resolution DSCs. As a result, an entire DSC product line can be based on a single DSP platform, saving significant development time and cost while facilitating product segmentation.

DSP example with imaging architecture

Texas Instruments' TMS320DM270 digital media processor is a high-performance DSP designed specifically for imaging applications such as DSCs. The DM270 is based on a multiprocessor architecture: an ARM7 32-bit RISC microcontroller handles non-imaging functions and serves as the primary controller for the entire system, while a programmable C54x™ DSP core handles audio encoding and decoding. In addition, the DM270 integrates programmable coprocessors designed to handle the most computation-intensive imaging tasks. One of these, the SIMD image processing engine (iMX), performs the DCT, inverse DCT, motion estimation and compensation, and many other processing operations. Other coprocessors perform variable-length encoding/decoding, quantization, and inverse quantization.

Figure 2 shows the DM270's main functional blocks and data flow. In addition to the main processors, the device integrates a cache, an image block buffer, and controllers for external memory, the CCD, LCD or TV output, and other communication interfaces through a variety of general-purpose I/O pins. Dedicated image pre-processing hardware offloads tasks such as white balance, auto exposure, and auto focus from the main processor.

Externally, only SDRAM is needed to complete the DSC's image processing engine. Because MPEG must maintain additional frames for motion estimation and compensation, encoding requires approximately 110 KB of SDRAM at QCIF (176x144) resolution. With its highly integrated, dedicated architecture, the DM270 can handle MPEG-4 encoding above 30 fps at CIF (352x288) resolution, and can process the roughly 50% greater pixel count of HVGA (640x240) resolution at over 30 fps. The device also supports the other major video, audio, and voice standards used in multimedia, and can be paired with DSPs designed to serve as mobile phone engines.
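The ~110 KB figure is consistent with three QCIF frame buffers in YUV 4:2:0 format; the three-buffer breakdown (current, reference, and reconstructed frames) is an assumption for illustration, not a breakdown given in the article.

```python
# Per-frame buffer size at QCIF, assuming YUV 4:2:0 (1.5 bytes/pixel).
width, height = 176, 144
frame_bytes = int(width * height * 1.5)     # 38,016 bytes per frame

# Motion estimation needs the reconstructed previous frame in addition
# to the frame being encoded, so assume roughly three working buffers
# (current, reference, reconstruction) -- close to the ~110 KB figure.
total = 3 * frame_bytes
print(f"{total / 1024:.0f} KB")
```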


Figure 2: DM270 Architecture

DSC phones and other emerging applications

In the fast-changing consumer video imaging market, the importance of programming flexibility can hardly be overestimated. DSCs are rapidly evolving and being integrated into new applications. One of these is the mobile phone with an integrated camera, which captures and transmits still images and video clips; such systems are now on the market. MPEG-4 data can be embedded in the Multimedia Messaging Service (MMS) protocol stack, making it easy to transmit video packets over wireless networks using industry standards for wireless IP messaging.

Developers may also want the flexibility a DSP brings for designing other types of camera systems for wireless products, such as systems supporting H.324-based video conferencing. A video-conferencing radiotelephone encodes and decodes video using H.263 or MPEG-4. In addition, Session Initiation Protocol (SIP) support may be required to integrate messaging and video conferencing. Future developments in MPEG-4, such as object functionality, may require reprogramming of units already shipped as well as those under development. One such development is the higher compression efficiency of the emerging MPEG-4 AVC (Advanced Video Coding) standard, also known as H.264. Programmable DSPs make it possible to support all of these standards and more, helping imaging system developers segment their products and create new market demand.

In terms of image quality, DSCs will still need some time to compete with the highest-quality traditional film cameras. In the low-end market, however, DSCs can provide video and other features that traditional cameras cannot match. DSC developers are now exploiting MPEG-4's higher compression ratios and greater error resilience to deliver more of what consumers expect. Programmable DSPs provide the performance needed to implement MPEG-4 algorithms on low-cost DSCs, along with the flexibility developers need to serve a variety of needs in a highly differentiated market. DSP-based MPEG-4 compression opens the door to future integration of DSCs with wireless devices and to support for other new applications. With MPEG-4 and DSPs, low-cost consumer DSCs have a bright future ahead.
