OLED: Better Off Once The Delusion's Dead
Speaking of LEDs…I rely on a few basic rules to guide (but not constrain) my analysis of various technology topics. One of them, ‘a true leader acts and doesn’t react to competition’, I discussed back in June. Here’s another: ‘when fiscal times get tight, R&D budgets fade.’ I realize that at first glance what I’ve just said seems elementary and obvious. However, you might be surprised (or, then again, maybe not) by how quickly those with a vested interest in finding exceptions to the rule are able to rationalize them.
How does all this relate to OLEDs? The technology is still largely in the realm of R&D, with a few low-volume production exceptions; scattered digital camera and portable media player applications, for example, which represent small-sized display opportunities that large-format-focused LCD suppliers would frankly prefer not to bother with, anyway. Yet, Monday found Samsung forecasting that once everyone had bought an LCD TV, the company would be able to upgrade consumers to OLED-based displays…and the very next day, Sony announced massive layoffs and budget cutbacks.
This is the same Sony that in late September 2007 introduced, and at the subsequent CES showcased, a minuscule (11"), expensive ($2500) OLED TV that the company actually managed to get into limited production…an OLED TV that, however, quickly ended up in Sam’s Club’s bargain bin because…umm…it was minuscule and expensive. And power-hungry. And (exemplifying a well-known OLED Achilles’ heel) it had a prematurely short screen lifespan in spite of over-aggressive dimming algorithms. So Samsung, which hasn’t yet brought an OLED TV to market, thinks it’ll move the world from LCD to OLED in a few years. And Sony, which has brought an OLED TV to market, is rapidly retrenching, especially in risky technology and product areas. Any guesses as to which of them I think has the smarter strategy?
As I believe my editorial coverage has consistently suggested, I’ve always found the ‘LCD killer’ aspirations of OLED supporters to be pretty much a fool’s delusion. On the one hand, I understand it…televisions, standalone computer monitors and laptop displays collectively comprise a huge amount of LCD volume each year, and snagging even a small percentage of that business is nothing to sneeze at. But when I try to think of what might motivate an LCD customer in one of these segments to seriously consider a switch to OLED, that’s where I draw a blank.
Self-illuminated OLED could have had a slender chance in the CCFL backlight era. One might be able to make a credible argument that CCFLs, although a low-cost and proven technology, were too thick, too power-hungry, too illumination-uneven, or too-something-else to keep up with evolving high-volume computer and television display requirements. But fast-ramping and cost-effective LED backlights make tangible (and, I’d argue, more than sufficient) improvements in all of these areas. The LED-backlit LCD in my new-to-me MacBook Air is, in a word, stunning. And, via approaches such as the per-LED control pioneered by BrightSide Technologies (acquired by Dolby Laboratories in early 2007), LED backlights can enable LCDs to deliver impressive color gamuts and contrast ratios, thereby neutering another historical OLED strength.
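The basic idea behind per-LED (or, more practically, per-zone) backlight control can be sketched in a few lines. This is a deliberately simplified toy model of my own, not BrightSide’s or Dolby’s actual algorithm: split the image into zones, dim each zone’s LED down to that zone’s peak luminance, and open the LCD pixels wider to compensate. Dark zones end up with dim LEDs (deeper blacks, lower power) while bright zones stay fully driven.

```python
import numpy as np

def local_dimming(image, zones=(8, 8)):
    """Toy per-zone LED backlight model (hypothetical, simplified).

    image: 2D array of luminance values in [0, 1].
    Returns (backlight, lcd): one LED drive level per zone, plus the
    compensated per-pixel LCD transmittance values. The displayed
    luminance is (zone backlight level) * (pixel transmittance).
    """
    h, w = image.shape
    zh, zw = h // zones[0], w // zones[1]
    backlight = np.zeros(zones)
    lcd = np.zeros_like(image, dtype=float)
    for i in range(zones[0]):
        for j in range(zones[1]):
            block = image[i*zh:(i+1)*zh, j*zw:(j+1)*zw]
            level = block.max()       # dim the LED to the zone's peak
            backlight[i, j] = level
            # open the LCD pixels wider to offset the dimmer LED
            lcd[i*zh:(i+1)*zh, j*zw:(j+1)*zw] = block / max(level, 1e-6)
    return backlight, lcd
```

A real implementation also has to smooth LED levels across zone boundaries to hide “halo” artifacts, but even this crude version shows why zone-dimmed LED backlights erode OLED’s contrast and power advantages.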
Don’t interpret from what I’ve said that no opportunities for OLEDs exist, because you’d be misconstruing my intent. Plenty of applications exist, for example, which (as I said earlier) require screen sizes so small that it’d take an innumerable volume of them to fill a single LCD glass plate…especially as plate sizes steadily increase thanks to Moore’s Law-analogous manufacturing trends. Plenty of applications exist for which any backlight thickness (or, for that matter, incremental power consumption) would be a deal killer…or at least a major pain in the rump to design around. Plenty of applications exist that can exploit OLED’s flexibility and other unique attributes. And plenty of (hint: consumer electronics) applications exist that can tolerate OLEDs’ limited lifetimes.
I’ll close with an admittedly oft-used analogy from my personal history. From the very beginning of flash memory’s life in the mid-1980s, plenty of ‘pundits’ proclaimed the pending demise of DRAM. After all, flash memory’s single-transistor structure rendered it even more lithography-scalable than DRAM’s transistor-plus-capacitor combo, and unlike DRAM, flash memory was nonvolatile to boot (pun intended). Those of us ‘in the know’ chuckled at such pie-in-the-sky predictions, no matter that our wallets wished they were true…flash memory’s slow write speeds, even slower erase speeds, lack of per-bit erase capability and not-unlimited erase cycle counts were showstoppers.
History, of course, proved out the more conservative stance, but only after a lot of time, money and manpower were spent foolishly chasing after the ‘DRAM killer’ dream. Flash memory did end up clobbering EPROM and mask ROM, but only, arguably, due to the volume-boosting assistance of two timely applications whose immature and unstable software demanded in-the-field updates: PC power management (therefore flash BIOS) and the GSM digital cellular protocol (therefore flash firmware). But the bulk of flash memory today sells into mass storage applications uniquely tailored to its strengths and able to deal with its shortcomings: portable audio and (later) multimedia players, PDAs, digital still and (later) video cameras, digital audio recorders, etc…applications that arguably would never have appeared at all, and that at minimum probably wouldn’t have achieved even a shadow of their current widespread success, without the welcome assistance of flash memory as a foundation building block.
Nowadays, solid-state storage is beginning to seriously compete with longtime-dominant hard drive technology, but only because SSDs now deliver the capacity that historical HDD applications demand, which has allowed the other attributes for which SSDs are often superior to come to the forefront. And, keep in mind, it’s taken 20+ years for the longstanding HDD-replacement vision to begin to translate into reality. I’d wager that OLED technology will follow a similar evolutionary path.
The initial widespread ‘LCD replacement’ hype bubble for OLED is already beginning to deflate, and will continue to do so (although there’ll be an inevitable parade of large-OLED prototypes, absent even remotely firm production dates, at CES again next month). Slowly but surely, OLED-optimal applications will emerge and ramp, and additional OLED volume will come from applications that LCD vendors consciously choose to exit (thereby opening the door to the OLED alternative). Eventually, it’s conceivable that (as HDDs are arguably doing today to SSDs’ benefit, and as plasma displays have ironically also done to LCDs’ benefit) LCD glass economics trends will translate into panels that are too big for the bulk of potential customers’ needs, thereby prompting a widespread OLED conversion. But don’t hold your breath waiting for that day to come any time soon.
Ditch the delusion, for OLED’s ultimate benefit. Agree or disagree, folks?