Design Con 2015

CRC testing in video applications

September 10, 2013

Introduction
Quantifying the impact of a small engineering change in a complex video signal chain can often be a thankless task: evaluating whether a lower-cost digital video cable has degraded system performance, whether a power supply tweak has increased the system's jitter tolerance, or whether an alternate PLL configuration has provided greater power supply noise immunity. These are typical of the challenges that the design and production engineers at today's video product designers and manufacturers must overcome.

Although numerous video evaluation tools are available to assist in such activities, these often consume significant portions of capital budgets, take time to set up, require training to operate properly, and offer results which can be difficult to interpret. A simple error-detection algorithm such as a Cyclic Redundancy Check (CRC) can, despite a number of limitations, be an effective tool to apply before investing significant effort in perfecting systems with the more complicated and expensive evaluation tools, especially when time to market, cost and resources are important considerations.
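To make the idea concrete, the sketch below shows a minimal bitwise CRC-16-CCITT in Python. The polynomial and parameters here are chosen purely for illustration; the CRC engines built into video silicon may use different polynomials and widths.

```python
def crc16_ccitt(data: bytes, poly: int = 0x1021, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16-CCITT (poly 0x1021, init 0xFFFF, no reflection)."""
    for byte in data:
        crc ^= byte << 8                      # feed next byte into the register
        for _ in range(8):                    # shift out one bit at a time
            if crc & 0x8000:                  # MSB set: shift and apply polynomial
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

# Standard check value for this CRC variant over the ASCII string "123456789":
print(hex(crc16_ccitt(b"123456789")))   # 0x29b1
```

The same checksum computed at the transmitter and the receiver will match only if every bit arrived intact, which is what makes a CRC a cheap first-pass link-quality probe.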

Digital Video Systems
The proliferation of digital video transmission media for consumer, professional and automotive applications in recent years has triggered a change in focus for many video product designers and manufacturers; the requirement to achieve superior analog performance has plateaued and has been superseded by a demand to achieve the highest possible digital data rates. These transmission media include DVI, HDMI, LVDS, MHL and APIX.

The growth of HDMI has been one of the primary drivers in this race to higher data rates. At its inception, support for video transmission at up to 1.65GHz facilitated the transfer of 1080p video (1920 pixels x 1080 lines) with an 8-bit colour depth - a video format offering over ten times the video resolution of analog NTSC video. Further developments to the HDMI specification in recent years have stretched the maximum supported data rate through 2.25GHz to 3GHz, with further increases most likely on the cards in future specification revisions (see Figure 1).
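The relationship between pixel clock and link rate is simple arithmetic: TMDS encodes each 8-bit pixel component as a 10-bit character, so the per-channel bit rate is ten times the pixel clock. The sketch below assumes standard CEA-861 pixel clocks (e.g. 148.5 MHz for 1080p60) and is illustrative only.

```python
TMDS_BITS_PER_PIXEL_CLOCK = 10   # 8b/10b-style TMDS character per component

def tmds_rate_gbps(pixel_clock_mhz: float) -> float:
    """Per-channel TMDS bit rate (Gbps) for a given pixel clock (MHz)."""
    return pixel_clock_mhz * TMDS_BITS_PER_PIXEL_CLOCK / 1000

# 1080p60 runs on a 148.5 MHz pixel clock, sitting just inside the
# original 1.65 Gbps (165 MHz) HDMI limit:
print(tmds_rate_gbps(148.5))   # 1.485
print(tmds_rate_gbps(165.0))   # 1.65
```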

The Ultra HD 3GHz maximum video resolution specified in HDMI 1.4a (4k x 2k @ 24 Hz, 25 Hz or 30 Hz) enables cinema-style clarity in the home entertainment system. A single frame of 4k x 2k data is comprised of 4096 pixels and 2160 lines, over 8 million pixels per frame; with 24 frames being transferred every second, 3GHz video sources and sinks must be capable of transmitting or receiving over 200 million pixels of active video data every second. No mean feat…

Figure 1: The Evolution of Video Formats

As the magnitude of data transferred across the link grows, the period of each transmitted bit shrinks and the prospect of bit errors on the link increases. But what of the occasional random bit error? If it occurs during the active video region, the result is an incorrect pixel on the display. However, if that random bit error occurs during the control period in the HDMI stream, the synchronisation data may be disturbed, which could result in a disturbance on the screen (e.g. a horizontal or vertical streak or picture flash). This risk is compounded when considered in conjunction with the data encryption protocol employed by the HDMI specification: High-bandwidth Digital Content Protection (HDCP).
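A CRC comparison catches exactly this kind of fault. The sketch below uses Python's standard-library CRC-32 over stand-in frame bytes (the buffer contents and error position are invented for illustration); a CRC of this kind detects any single-bit error, so one flipped bit is guaranteed to produce a mismatch.

```python
import zlib

# Stand-in for the raw pixel bytes of a reference frame and a received copy.
reference = bytes(range(256)) * 100
received = bytearray(reference)

received[12345] ^= 0x04   # inject a single random bit error

# Checksums of transmitted and received data no longer agree:
print(zlib.crc32(reference) == zlib.crc32(bytes(received)))   # False
```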

HDCP is employed to protect high-value video content as it is transmitted across a video link, preventing the unlawful copying of movie and television content whilst it is transmitted between a source device (e.g. a DVD player or set-top box) and a sink device (e.g. a television). Establishing an HDCP link between a source device and a sink device can take over 2 seconds, and the link is maintained by transferring a key between the source and sink devices every 2 seconds.

Should the aforementioned random bit error cause the picture to flash, triggering a break in the authenticated link, the user may see "snow noise" (an indication that HDCP authentication has failed - see Figure 2). It could then take over 2 seconds for the authenticated link to be re-established; a contributor to user frustration and to field returns.

Figure 2: A Sample of Original Content and HDCP Snow Noise

Modern video signal chains can be comprised of a host of different devices. For example, the bill of materials of an Audio Video Receiver (AVR) can include HDMI buffers, a HDMI mux, a HDMI and analog video receiver, a HDMI transmitter and a video signal processor integrating scaling, de-interlacing and on-screen display functions. To add further complication, these devices can also often be sourced from a broad range of semiconductor vendors.

Developing a reliable video signal chain incorporating all of these devices, supporting video formats with such high data rates, is becoming a significant challenge for video product designers and manufacturers. Cable quality, power supply design, signal integrity, PCB quality, and silicon settings need to be at their absolute optimum to successfully support such video formats. But how can a video product designer and manufacturer easily evaluate the impact of tweaks to any of the previously mentioned system elements?
