AMD's Radeon HD 6800 GPUs: For Nvidia, Even Worse News
Two months ago, as regular readers may recall, I published an analysis that was critical of Nvidia’s long-term relevance in the company’s current form. Part of my skepticism was category-generic rather than vendor-specific: graphics cores integrated within chipsets and CPUs are increasingly able to handle traditional discrete-GPU duties, and alternative mainstream applications for GPUs have been slow to emerge and underwhelming when they do arrive.
Another key problem plaguing Nvidia is its competitive disadvantage versus competitor ATI Technologies, a division of AMD, in the dwindling graphics business segments for which discrete GPUs remain relevant. At the time that my writeup appeared back on August 19, Nvidia still hadn’t rolled out the entirety of its first generation of products based on the power- and silicon area-hungry Fermi architecture; the proliferation of the suite wasn’t complete until last week. AMD had completed its entire DirectX 11-cognizant Radeon HD 5000 series rollout before the first Nvidia Fermi-based GPU hit the street.
And ironically, only 11 days after Nvidia finished its first-generation Fermi family launch, AMD has unveiled the first two members of its second-generation DX11-supportive family, the Radeon HD 6000 series:
Specifically, the company unveiled the Radeon HD 6850 and 6870, together comprising ‘Barts’, the first output of the project code-named Northern Islands and one of the worst-kept tech secrets in recent history. Note that, at least for the moment, AMD plans to keep the Radeon HD 5700 family in production:
Key specifications include:
- Peak board power consumption (filling in the above TBDs) is 127W for the 6850 and 151W for the 6870.
- The 6850 carries a $179 MSRP in board form with a 1 GByte frame buffer; the 6870 carries a $229 MSRP, also mated with 1 GByte of GDDR5 SDRAM.
- Both chips interface to their frame buffer memory arrays over a 256-bit bus.
- Both chips embed dual DVI and dual mini DisplayPort interfaces, along with an HDMI port. 6870-based boards require two six-pin power connectors; 6850-based boards require only a single power input.
Here’s how AMD is positioning the Radeon HD 6850 and 6870 from an absolute performance standpoint versus prior-generation devices:
The new products’ naming is, I feel, a bit deceptive, given that the 6850 is claimed to deliver slightly lower performance than the 5850, with the same trend holding for the 6870 versus the 5870. Factor pricing into the mix, however, and the situation looks much better for the newcomers:
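To put rough numbers on that pricing argument, here’s a back-of-the-envelope performance-per-dollar comparison. The 6850 and 6870 MSRPs come from the specifications above; the 5800-series prices and all of the relative-performance figures are hypothetical placeholders for illustration, not measured results:

```python
# Back-of-the-envelope performance-per-dollar sketch.
# 6850/6870 prices are the article's MSRPs; 5850/5870 prices and
# all relative-performance values are hypothetical placeholders.
cards = {
    # name: (price in USD, relative performance; 5870 == 1.00)
    "Radeon HD 5850": (260, 0.85),   # hypothetical price and perf
    "Radeon HD 5870": (380, 1.00),   # hypothetical price
    "Radeon HD 6850": (179, 0.81),   # MSRP; perf assumed ~5% below 5850
    "Radeon HD 6870": (229, 0.95),   # MSRP; perf assumed ~5% below 5870
}

def perf_per_dollar(price, perf):
    """Relative performance delivered per hundred dollars spent."""
    return perf / price * 100

for name, (price, perf) in cards.items():
    print(f"{name}: {perf_per_dollar(price, perf):.3f} perf/$100")
```

Even with the newcomers assumed slightly slower than their predecessors, the lower price points leave them well ahead on a performance-per-dollar basis.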
And anyway, from the earlier slide you already know that AMD has the 5870-clobbering ‘Cayman’ (the 6950 and 6970) on the way, along with the dual-GPU, 5970-exceeding ‘Antilles’ (the 6990). Here’s the irony, though: AMD fabricates the 6850 and 6870 on the exact same 40 nm process used for the 5800-series predecessors. So how did the company accomplish its performance-per-dollar boosting trick, along with making video processing, display output and other enhancements along the way? These three slides summarize the story:
AMD was able to revisit its understandably rushed first-generation 40 nm designs, making circuit-compaction and re-layout moves that shrank the resultant die size and boosted the clock speeds achievable at reasonable yields. The company also enhanced the shader processors, in order to (for example) deliver up to twice the tessellation rate with the 6870 versus the 5870. The last bullet of the second slide above gives, I think, a concise bottom-line summary: with the 6870, AMD was able to deliver better performance than the 5850 at a 25% die size decrement, on a common process lithography. That’s huge.
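Why that die-size claim matters can be sanity-checked with simple arithmetic: equal-or-better performance in 75% of the silicon area, on the same process, means performance per square millimeter rises by at least a third. A minimal sketch, with normalized (illustrative) baseline figures:

```python
# Sanity check of the performance-per-area claim: equal-or-better
# performance in 75% of the die area implies >= 1/0.75 ~= 1.33x
# performance per unit area on the same 40 nm process.
baseline_area = 1.0               # normalized 5850-class die area
new_area = 0.75 * baseline_area   # 25% die-size decrement (from the article)
baseline_perf = 1.0               # normalized performance (illustrative)
new_perf = 1.0                    # "better performance" -> at least equal

gain = (new_perf / new_area) / (baseline_perf / baseline_area)
print(f"performance-per-area gain: at least {gain:.2f}x")
```

Since die area largely determines manufacturing cost at a given process node, that floor of roughly 1.33x translates directly into the pricing flexibility AMD is exercising here.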
Ironically, a recent AMD branding decision, paired with a near-simultaneous move of Nvidia’s own, has given Nvidia a potential short-term window of upside opportunity. AMD has decided to drop the ATI brand name, a few days shy of four years after the acquisition was finalized. Almost simultaneously, Nvidia has begun bypassing its former add-in board partners (such as XFX) and selling Nvidia-branded boards through major retailers such as Best Buy and Newegg. Much as I admire Nvidia’s bravado, I predict that the company (which has spent many years and mucho dinero establishing its brand with consumers, much more successfully than AMD has in re-branding ATI, in my opinion) will have near-term profit-boosting success with this cut-out-the-middleman approach.
But inherent in Nvidia’s move is, I believe, desperation. Any incremental profit garnered by a direct-to-retail strategy is made moot by market-share-preserving price cuts like the ones Nvidia just made, considering the company’s products’ substantial die size disadvantages versus AMD competitors at comparable performance and price points. It’s exactly the same discard-the-partners strategy undertaken by 3dfx shortly before that particular graphics pioneer imploded. When it did, Nvidia snatched up the remnants…including then-PR manager Brian Burke, who’s currently a senior PR representative at Nvidia. Do I think Burke is feeling some déjà vu right now? Yes, I do.
P.S. I’m still awaiting my review cards, but AnandTech has a lot of benchmark data up for your analysis.