Connecting systems to displays with DVI, HDMI, DisplayPort: What we got here is failure to communicate

January 04, 2007


Peer at the back of a modern home-theater receiver, and you'll note a preponderance of analog-signal-carrying connectors amid the plethora of plugs (Figure 1). RCA jacks route low-level analog-audio signals, and banana plugs and binding posts handle postamplification connections to speakers. RCA jacks also tackle video duties: one for each composite-video output and three for each higher-quality component-video bus, with S-Video an intermediary option in both quality and complexity (Reference 1). RGB video comes in a variety of plug flavors: RCA, RF, and nine- and 15-pin versions, along with a variety of more proprietary alternatives.

Peer closely at the bottom two rows of the AVR-5805 back panel, however, and you'll notice that Denon is also embracing the digital-audio future. RCA jacks can also handle S/PDIF (Sony/Philips digital-interface) bit streams, as can optical-fiber plugs. Both proprietary Denon Link and industry-standard Ethernet RJ-45 interconnects tackle networked audio, as does IEEE-1394—that is, FireWire (Reference 2). Wires are no longer absolutely necessary, of course; a burgeoning number of wireless schemes are also contending for your next-generation design consideration (see the followup blog post, "On the air"). And, for video, direct your attention to the DVI (Digital Visual Interface) and HDMI (high-definition-multimedia-interface) plugs on the receiver back panel's top row.

Before reading more about DVI and HDMI, first consider a "bigger picture" question: Why is the transition from analog to digital interconnects happening at all? The answer begins with multimedia sources: Audio historically came from cassette tapes and LP records, but its primary starting points nowadays are optical discs and both downloaded and streamed varieties of Internet-housed binary files (references 3 and 4). Optical discs, along with digital bit streams over terrestrial-antenna, cable, DSL (digital-subscriber-line), and satellite links, are also common video sources, and all of them replace earlier-generation content, such as analog broadcasts and videotapes (Reference 5).

Next, look at the other end of the distribution chain, the last step or few before the content reaches your eyes and ears. The processes that generate sound waves and photons are inherently analog, but, for audio, EDN has extensively covered the emergence of Class D digital amplifiers for driving the transducers (Reference 6). And, for video, digital-centric alternatives, such as DLP (digital-light-processing), LCD, LCOS (liquid-crystal-on-silicon), and plasma technology are replacing analog CRTs, which are fading from prominence (references 7 and 8).

Now, consider the multistep process that audio and video traverse between their source and their destination. Digital-domain processing—resampling, resizing, mixing, format transcoding, and other tasks—can by itself induce potentially detectable quality-degrading transformations to your system's multimedia material. Eliminating unnecessary additional digital-to-analog and analog-to-digital conversions not only potentially reduces system cost, but also keeps the source content in as pristine condition as possible before reaching its ultimate endpoint: you. Analog content is also subject to degradation from its transmission medium, due to such factors as cable-impedance attenuation and imperfect load-impedance matching, and from its operating environment due to EMI coupling. And it's difficult to multiplex multiple analog signals on a single wire in an interference-free fashion; the multiplexing process is simpler in the digital domain.

One other important digital "advantage," at least to some folks, bears mentioning: Media-content rights-holders have struggled for years, and are still struggling, to copy-protect analog content, most notably through technologies such as Macrovision. Digital-domain material is much more straightforward to encrypt. This lockdown plugs what rights-holders perceive as the Achilles' heel of the distribution chain, the interconnection between video source and display, thereby controlling such factors as who accesses the content, how many times they can access it, how soon they must first access it and for how long thereafter, with what quality they can listen to and view it, and whether they can make a copy of it and, if so, how many copies and at what quality level.

DRM (digital-rights management) also more readily lends itself to several ideal characteristics of a content-control system: renewability, revocability, and upgradability (Reference 9). And, when DRM breaches inevitably occur, digital-domain watermarking identifies the source of the infringement, speeding prosecution. Yet, as you'll see as you read on, DRM flaws are at the root of the problems many end users have when they try to use modern consumer-electronics devices. When a digital-interconnect scheme not only results in restricted media access compared with its "fair-use" analog predecessor, but also spawns more usage glitches than this precursor, consumer backlash is inevitable.

DVI: the digital debutante

When Intel announced the formation of the DDWG (Digital Display Working Group) at the fall 1998 Developer Forum, the company put the stamp of approval on an interface technology, DVI, which Silicon Image had been promoting for several years as the TMDS (transition-minimized-differential-signaling)-based PanelLink topology. At the time, DVI wasn't the only game in town; in fact, the VESA (Video Electronics Standards Association) was championing two other TMDS-based approaches: P&D (plug and display) and DFP (digital flat panel). Apple's ADC (Apple Display Connector) was also TMDS-derived, but the connector and cabling additionally carried USB and display power buses. National Semiconductor championed another notable contender, OpenLDI (Open LVDS Display Interface), a fact that's probably not surprising to those of you who know of the company's long involvement with LVDS (low-voltage differential signaling).

Intel's influential blessing swung industry momentum in DVI's direction, however, and the alternative approaches faded from prominence. OpenLDI, for example, saw its most visible success with Silicon Graphics' 1600SW LCD, which touted then-leading-edge features, such as a 17-in.-wide screen, 1600×1024-pixel resolution, a 350-to-1 contrast ratio, and a 0.23-mm (110-dpi) dot pitch. However, only three graphics cards, from 3Dlabs, Formac, and now-defunct Number Nine, natively supported its OpenLDI interconnect. Out of necessity, SGI developed a MultiLink adapter that translated OpenLDI to the more common VGA (analog) and DVI (digital) protocols. And, whereas P&D was conceptually similar to DVI, VESA attempted to comprehend not only analog and digital video, but also USB and IEEE 1394 (FireWire)-tethered peripherals, such as mice, keyboards, printers, and audio devices. The DDWG instead used this additional connector real estate to implement an optional second parallel DVI link, thereby supporting higher resolution displays.

At the time it unveiled its support of DVI, Intel had a long and generally successful legacy of driving de facto standards, such as PCI (Peripheral Component Interconnect) and PCMCIA (Personal Computer Memory Card International Association), into the marketplace. Rigid compliance testing and periodic industry "plugfests" were key factors in that success. However, the development of DVI lacked a similar formal compatibility-validation process, an omission that was to its detriment. (Intel didn't develop DVI and therefore had less control over it, which may partially explain this omission.) DVI runs at a 165-MHz maximum clock speed with a 10-bit-per-clock transfer rate. After you account for 8B/10B encoding and the eight-wire (four-twisted-pair) bundle per link, in which three pairs carry red, green, and blue data and the fourth carries the clock, this clock frequency translates to a peak single-link bandwidth of 3.96 Gbps and a peak single-link resolution of 1920×1200 pixels (24-bit color, 60 frames/sec) (Figure 2). However, some silicon suppliers, particularly those that attempted to integrate a DVI transceiver within a larger piece of silicon, such as a graphics chip, were unable to meet the 165-MHz design target; such limitations get advertised over DVI's I2C-based DDC (display-data-channel) bus, the means by which graphics chips and displays communicate their respective capabilities and limitations to each other. And neither video-output devices nor displays commonly supported the dual-link DVI implementation, which delivers as much as 7.92 Gbps of bandwidth and 2560×1600-pixel resolution.
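If you'd like to verify those figures yourself, the arithmetic fits in a few lines of Python; the following sketch merely reproduces the math above and is an illustration, not anything from the DVI specification:

```python
# Sanity-checking DVI's peak single-link numbers (illustrative arithmetic only).
clock_hz = 165e6       # maximum single-link clock
data_pairs = 3         # red, green, blue; the fourth pair carries the clock
payload_bits = 8       # 8B/10B: 8 payload bits per 10-bit character per clock

payload_bps = clock_hz * payload_bits * data_pairs
print(f"peak payload: {payload_bps / 1e9:.2f} Gbps")        # 3.96 Gbps

# A 1920x1200, 24-bit, 60-frame/sec stream fits, with headroom for blanking:
mode_bps = 1920 * 1200 * 24 * 60
print(f"1920x1200x24 @ 60: {mode_bps / 1e9:.2f} Gbps")      # 3.32 Gbps
print(f"dual link doubles payload to {2 * payload_bps / 1e9:.2f} Gbps")  # 7.92
```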

Design difficulty led to another DVI shortcoming—this one more financial in nature. In exchange for Intel's blessing, Silicon Image vowed to offer up its fundamental technology patents in a royalty-free fashion. However, the company retained control over its DVI implementation patents, and Silicon Image thereby obtained a lucrative revenue stream not only from other companies that bought its DVI chips, but also from those whose DVI-circuit designs overlapped its own. Adding to the industry angst over DVI was the fact that Intel was an investor in Silicon Image; a portion of Silicon Image's DVI-related income, therefore, ended up bolstering Intel's coffers.

When DVI entered the market, it faced a substantial installed base of VGA-only displays in users' hands. Graphics cards might add DVI support, but, at least in the near term, eliminating VGA capability would have been fiscal suicide. Indicative of this fact, the most common DVI-connector option, DVI-I, incorporated both analog and digital interfaces and, through a dongle, could transform into a legacy 15-pin VGA plug. Intel's early promotional materials for DVI claimed that the computer industry was nearing an inflection point at which burgeoning display resolutions would couple with finer dot pitches to leave analog interfaces incapable of delivering adequate-quality images. Nearly 10 years later, this inflection point largely hasn't materialized. In fairness to Intel, at least some of the reason for the delay is out of the company's control: Microsoft's operating systems don't yet robustly implement resolution-independent rendering of GUI elements, such as fonts and icons, so these elements are difficult to discern on fine-pitch displays. (Windows Vista should make significant improvements in this regard.) Silicon suppliers have also made tangible improvements in the SNR, switching speed, and other attributes of their analog-video transmitters and receivers, further delaying DVI's ascendancy.

HDMI: an evolving derivative

DVI's connector form factor was adequate for computer applications, but consumer-electronics suppliers needed something smaller and more user-friendly—without screws, for example. Yet you shouldn't view HDMI as simply a shrunken DVI port (Figure 3). As its name implies, its developers, chief among them Silicon Image, incorporated in a DVI-backward-compatible manner the ability to transmit both video and eight-channel audio data (compressed and uncompressed, 24-bit sample size, 192-kHz sample rate) down a single cable. (See Figure 1 to understand the appeal of this enhancement.) Initial HDMI-draft versions modulated the audio information on the clock signal; nowadays, audio-data transfer occurs within "data-island" intervals—that is, during horizontal- and vertical-display blanking periods. HDMI's beyond-the-PC focus also necessitated that it support not only RGB, but also 4:4:4 and 4:2:2 component-video formats. And HDMI supports three encoding protocols: 8B/10B for video that can tolerate an occasional dropped bit, 4B/10B for audio, and 2B/10B for the most critical control information.
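To get a feel for why those blanking intervals offer ample room for audio, consider a back-of-the-envelope calculation. The Python sketch below uses a simplified model (standard 1080p60 timing, data-island packet overhead ignored):

```python
# Rough arithmetic (simplified; ignores data-island packet overhead): HDMI's
# maximum audio payload is tiny next to the capacity of the blanking intervals.
audio_bps = 8 * 24 * 192_000                 # 8 channels, 24-bit samples, 192 kHz
print(f"audio payload: {audio_bps / 1e6:.1f} Mbps")          # ~36.9 Mbps

# In standard 1080p60 timing, 2200x1125 total pixels frame the 1920x1080
# active ones, so roughly 16% of all pixel periods are blanking:
total_px, active_px = 2200 * 1125, 1920 * 1080
print(f"blanking share: {1 - active_px / total_px:.1%}")     # ~16.2%
```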

Ever since the initial Version 1.0 specification release in December 2002, HDMI has undergone regular enhancements in a backward-compatible manner (Table 1). Note, for example, that Version 1.2a formalized support for CEC (Consumer Electronics Control), a remote-control scheme employing the AV Link protocol; CEC implementation in HDMI sources and destinations is optional, but wiring support for it in cabling is required. The latest HDMI iteration, Version 1.3, increases the maximum single-link clock rate to 340 MHz, an enhancement that requires not only Version 1.3-compliant endpoints on both ends of the HDMI link, but also a Category 2 speed-certified cable, for end users to benefit from it. Those benefits arrive through 10.2 Gbps of raw bandwidth, a figure that includes error-detection-and-correction encoding, and might include boosting per-link image resolutions, boosting per-link image-frame rates, and increasing image-color depth beyond 24 bits/pixel. The depth increase comes through HDMI's support for 30-, 36-, and 48-bit—that is, 10-, 12-, and 16-bit/component—color in both RGB and component-video formats.
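Again, the headline numbers fall out of simple arithmetic. This sketch, which ignores blanking and the specification's exact pixel-packing rules, shows where the 10.2-Gbps figure comes from and what deep color demands of it:

```python
# Where Version 1.3's 10.2-Gbps raw figure comes from, and what deep color costs
# (illustrative; ignores blanking and the spec's exact pixel-packing rules).
raw_bps = 340e6 * 10 * 3                     # 340-MHz clock, 10 bits, 3 channels
print(f"raw: {raw_bps / 1e9:.1f} Gbps")      # 10.2 Gbps

def mode_bps(h, v, bits_per_pixel, fps):
    """Active-pixel payload only; real modes add blanking overhead."""
    return h * v * bits_per_pixel * fps

print(f"{mode_bps(1920, 1080, 36, 60) / 1e9:.2f} Gbps")   # 4.48: 12-bit/component 1080p60
print(f"{mode_bps(1920, 1080, 48, 60) / 1e9:.2f} Gbps")   # 5.97: 16-bit/component 1080p60
```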

Another benefit end users would derive from Version 1.3 is an expansion of the image-color gamut through support for the next-generation xvYCC color-space standard. HDMI Version 1.3 also broadens audio-transport support to encompass the latest high-fidelity lossless-compression formats from Dolby Labs and DTS. This addition is significant only if the transmitting device is incapable of decoding these formats; if it can decode them, it could alternatively employ earlier HDMI versions' support for uncompressed-audio transport, along with multichannel-analog-audio connections. Version 1.3 additionally specifies lip-synch correction, which compensates for the differing latencies you incur when processing audio and video throughout the home-theater-equipment chain, and a miniature Type C connector, which its developers designed with multimedia transfers from compact digital still cameras and video cameras in mind.

Skeptics might wonder whether HDMI Version 1.3's higher bandwidth capabilities are meaningful in real life, because previous HDMI iterations already handle traditional video sources, including standard- and high-definition optical discs and standard- and high-definition television from cable, IPTV (Internet Protocol television), terrestrial, and satellite providers. Three words suffice to respond to that cynicism: cameras, computers, and consoles. The importance that console- and computer-game enthusiasts place on high frame rates, enabling gamers' fast-response aspirations, is indisputable (Reference 10). Modern cameras can easily capture high-resolution, HDR (high-dynamic-range) images, and modern computers can easily render and output them. And display innovations, such as deep-black capability, LED backlights, multicolor-backlight arrays, and BrightSide Technologies' impressive per-LED, per-frame control of both white- and multicolor-LED arrays, are increasingly able to deliver these rich images to viewers (Reference 11). Displays are no longer handling just traditional video sources, and, with HDMI Version 1.3, the link between system and display is no longer the quality bottleneck.

No discussion of HDMI would be complete without covering DRM. It's a common misconception that every HDMI-equipped device also implements DRM; in actuality, HDCP (high-bandwidth-digital-content-protection) support is optional, albeit common, in HDMI, just as it was with DVI in the form of DVI-HDCP, and its implementation incurs additional royalty payments to HDCP intellectual-property-rights holders. Although the HDMI Founders organization learned from DVI's shortcomings and implemented a formal validation process, this validation historically has not extended to cover the optional HDCP. As a result, and perhaps not surprisingly, most consumers' issues to date with HDMI trace back to HDCP-created root causes.

A scathing article by well-known consumer-electronics-accessories supplier Monster Cable succinctly documented these HDMI woes and their HDCP nexus (Reference 12). Common consumer complaints include, for example, the inability to get an HDMI-equipped DVD player and display to communicate when an audio/video receiver sits between them, even though the player and display work fine when directly connected. Consumers also complain that various pieces of equipment refuse to work unless they power them up in a specific order and that previously stable operation is impossible to restore once they switch away from a particular video source at the display and then switch back.

The root cause of all these problems is inevitably a disruption in the supposed-to-be-continuous HDCP "handshake" between source and destination, which the video source incorrectly interprets as a DRM breach and responds to by disabling its output. "Ugly" fixes for the problem include power-cycling the equipment or unplugging and replugging connectors to restore normal function. Even when DRM functions as intended, DVI- and HDMI-equipped video sources often ship factory-configured with their digital outputs inactive, so an owner needs to first connect them to a display over analog connections, reconfigure them in their setup menus, and then reconnect them to the display over a digital link. And, invariably, Monster Cable reports, consumers throw up their hands in dismay and return cabling and gear to the store for a refund, a scenario that benefits neither the supplier, the retailer, nor the end user.
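To make that failure mode concrete, consider the following deliberately simplified Python model. It is my sketch of the behavior Monster Cable describes, not the actual HDCP state machine; each boolean merely stands in for one periodic verification-value exchange between source and sink:

```python
# Deliberately simplified model of the failure mode (not the real HDCP state machine).
def run_source(integrity_checks, reauthenticate_on_mismatch):
    """Each boolean stands in for one periodic source/sink verification exchange."""
    for i, ok in enumerate(integrity_checks):
        if not ok:
            if reauthenticate_on_mismatch:
                print(f"check {i}: mismatch; re-authenticating (picture blinks briefly)")
            else:
                print(f"check {i}: mismatch; output disabled until a power cycle")
                return
    print("playback completed normally")

glitchy_link = [True] * 5 + [False] + [True] * 5   # one transient glitch, e.g. input switching
run_source(glitchy_link, reauthenticate_on_mismatch=False)  # fragile behavior in early gear
run_source(glitchy_link, reauthenticate_on_mismatch=True)   # graceful-recovery alternative
```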

DisplayPort: falling short?

The electronics industry has long struggled to discern the differences between, and decide among, the dueling outputs of industry-standards bodies and the de facto standards of individual companies and multicompany consortiums. One recent example of this creative tension is the mind-share battle between HDMI and VESA's response: DisplayPort. Although VESA approved DisplayPort specification Version 1.0 in May 2006, I saw a demo of its predecessor, IBM's Digital Packet Video Link, many years earlier at an Intel Developer Forum. As its precursor's name implies, DisplayPort dispenses with the raw video-streaming approach of technologies such as DVI and HDMI, instead bundling audio, video, and control information in packets akin to those found in data networks.

Each DisplayPort Main Link comprises one, two, or four double-terminated differential-signal pairs with no dedicated clock signal; instead, the 8B/10B-encoded data stream embeds the clock (Figure 4). AC coupling enables DisplayPort transmitters and receivers to operate at different common-mode voltages and, therefore, to be fabricated on different process lithographies. DisplayPort Version 1.0 specifies two per-lane link rates: 2.7 Gbps, which yields 270 Mbytes/sec of bandwidth per differential-pair lane after subtracting 8B/10B overhead, and 1.62 Gbps, which yields 162 Mbytes/sec per lane. The main link is not only fast but also, like HDMI, unidirectional, and it exhibits claimed, albeit unspecified, low latency. The link rate and the pixel rate are decoupled from each other; you can freely trade off pixel depth, resolution, frame rate, and the presence and amount of additional data, such as audio and DRM information, in the link stream.
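The per-lane payload figures fall directly out of the 8B/10B overhead, as a quick calculation confirms (an illustration, not an excerpt from the specification):

```python
# Reproducing DisplayPort 1.0's per-lane payload figures (illustrative).
for lane_gbps in (2.7, 1.62):
    payload_bps = lane_gbps * 1e9 * 8 / 10          # strip 8B/10B overhead
    print(f"{lane_gbps}-Gbps lane -> {payload_bps / 8e6:.0f} Mbytes/sec")
# 2.7-Gbps lane -> 270 Mbytes/sec
# 1.62-Gbps lane -> 162 Mbytes/sec
```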

For example, with a one-lane, 2.7-Gbps link, you could alternatively implement a 30-bit/pixel, 4:4:4 YCrCb video stream of 1920×1080-pixel interlaced resolution at 60 fields/sec or an 18-bit/pixel RGB video stream of 1680×1050-pixel progressive-scan resolution at 60 frames/sec. A four-lane DisplayPort link enables you to, for example, implement a 36-bit/pixel, 4:4:4 YCrCb video stream of 1920×1080-pixel progressive-scan resolution at 96 frames/sec; a 24-bit/pixel, 4:2:2 YCrCb video stream of 1920×1080-pixel progressive-scan resolution at 120 frames/sec; or a 30-bit/pixel RGB video stream of 2560×1536-pixel progressive-scan resolution at 60 frames/sec. Innumerable other combinations are possible for one-, two-, and four-lane main-link configurations, including those that intermingle audio, video, DRM, and other information. A separate half-duplex, bidirectional auxiliary channel with 1-Mbps bandwidth and 500-msec maximum latency handles source/destination handshaking and the exchange of source- and sink-capability information; a hot-plug-detection interrupt-request signal further supplements this channel.
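You can capture that trade-off in a simple feasibility check. The following sketch uses an intentionally crude model, ignoring packetization overhead and any audio or DRM payload, to test the example configurations above:

```python
# Simplified feasibility check for DisplayPort 1.0 main-link configurations
# (ignores packetization overhead and secondary data such as audio and DRM).
def fits(h, v, bits_per_pixel, fps, lanes, lane_gbps=2.7):
    payload_bps = lanes * lane_gbps * 1e9 * 0.8    # 8B/10B leaves 80% for payload
    return h * v * bits_per_pixel * fps <= payload_bps

print(fits(1920, 540, 30, 60, lanes=1))    # True: 1080i60 (540 lines/field), 30-bit 4:4:4
print(fits(1680, 1050, 18, 60, lanes=1))   # True: single-lane 18-bit RGB example
print(fits(1920, 1080, 36, 96, lanes=4))   # True: four-lane 36-bit 4:4:4 at 96 frames/sec
print(fits(2560, 1536, 30, 60, lanes=4))   # True: four-lane 30-bit RGB example
```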

I attended a detailed technical presentation on DisplayPort at the 2005 SID (Society for Information Display) ADEAC (Americas Display Engineering and Applications Conference) in Portland, OR, when the specification was still in draft form. Both at that event and at a more recent presentation at the SMPTE (Society of Motion Picture and Television Engineers) Technical Conference and Exhibition in Hollywood, CA, I was struck by the fundamental contrast between the technically heavy DisplayPort material and the comparative absence of any tangible industry support. Plenty of companies, many fueled by their distaste for paying DVI and HDMI royalties, are willing to give press-release credence to the DisplayPort approach. But, by press time, Analogix was the only company that had publicly unveiled DisplayPort silicon.

DRM until recently also provided a point of differentiation between the dueling display-interface alternatives. Originally, DisplayPort planned to optionally implement Certicom, an obscure DRM technology that Philips developed. HDMI and DVI, in contrast, support HDCP, which has a nearly decade-old implementation history and the all-important backing of heavyweight content-rights holders in Hollywood and elsewhere. Perhaps this fact is the reason that VESA announced in early November 2006 that the upcoming Version 1.1 DisplayPort specification release would add support for HDCP.

Future forecasts

A third contender, the Intel-championed UDI (Unified Display Interface), joins DisplayPort and HDMI in the battle to become the next-generation digital interface. The UDI Working Group consortium, also comprising companies such as Silicon Image, Apple Computer, LG, Samsung, and Nvidia, launched itself with fanfare in December 2005, but subsequent progress has been more subdued, even though the Version 1 specification gained approval in July 2006. UDI is a descendant of and backward-compatible with HDMI but delivers as much as 16 Gbps of raw per-link bandwidth. As Wikipedia describes it, "The connector has a single row of 26 contacts pitched 0.6 mm apart from each other, looking very similar to the Intel-initiated USB plug, which has a single row with only four contacts. Three of the 26 contacts will not be wired but are reserved for undetermined future upgrade possibilities. Transmit and receive plugs are slightly different, and a UDI cable will fit only one way. Bidirectional communication works at a much lower data rate than available for the single direction video-data stream" (Reference 13). And, at a recent press briefing in San Francisco, HDMI representatives positioned UDI as a business-class-PC-targeted complementary follow-on to DVI, with greater bandwidth for higher single-link resolution but without HDMI's audio and other enhanced features.

So, what happened to UDI? Intel won't speak on the record about the specification's status, but several anonymous and well-placed industry sources say that Intel has put UDI on the back burner and has shifted its implementation focus to DisplayPort. These sources cite several reasons for the company's change of heart. One is long-standing industry animosity toward HDMI's royalty requirements, which significantly benefited Silicon Image and indirectly also Intel by virtue of its investment relationship. Another reason they cite is a desire to embrace a single standard that could serve both external and integrated graphics subsystem-to-display interconnect schemes, an area in which VESA claimed—and, apparently, Intel agreed—that DisplayPort had an edge over HDMI. Yet another reason could be VESA's belated embrace of HDCP (for which Intel also owns fundamental intellectual-property rights).

If the rumors of Intel's still-powerful loyalty switch from HDMI-derived UDI to DisplayPort are true, this change of heart may significantly boost VESA's fortunes. However, you still cannot discount HDMI's notable market lead. HDMI- and DVI-equipped, HDCP-enabled graphics cards are now ramping into production, thereby addressing Windows Vista's DRM requirements (Reference 14). HDMI ports are pervasive on HDTVs (high-definition televisions), along with a recently introduced Epson home-theater projector, and are beginning to appear on computer monitors, as well. And HDMI 1.3-equipped consumer-electronics video-source devices, notably Sony's PlayStation 3 and Toshiba's second-generation HD DVD players, are also entering retail channels, based on chips from Silicon Image and other suppliers. Without an immediately obvious technical advantage over HDMI and with slow germination hampering its perception in the market, DisplayPort will be hard-pressed to make any headway at whatever indeterminate point in the future it's ready to do battle.




References
  1. Dipert, Brian, "A crash course in color conversion," EDN, June 7, 2001, pg 46.

  2. Dipert, Brian, "CAT5 tracks: Audio goes the distance, reliably and on time," EDN, July 7, 2005, pg 47.

  3. Dipert, Brian, "Upward spiral: optical storage," EDN, Aug 7, 2003, pg 38.

  4. Dipert, Brian, "Song wars: striking back against the iPod empire," EDN, June 9, 2005, pg 52.

  5. Dipert, Brian, "Subpar wars: high-resolution-disc formats fight each other, consumers push back," EDN, March 2, 2006, pg 40.

  6. Israelsohn, Joshua, "Class D Gen 3," EDN, April 15, 2004, pg 49.

  7. Dipert, Brian, "Master of some: direct-view-display technology," EDN, March 3, 2005, pg 38.

  8. Quinnell, Richard A, "Microdisplay technologies: Projection systems lose contrast," EDN, April 14, 2005, pg 35.

  9. Dipert, Brian,
