AMD Radeon HD 6970 review
A capable performer that’s just edged out by the cooler, quieter competition from Nvidia
Review Date: 15 Dec 2010
Reviewed By: Mike Jennings
Price when reviewed: £250 (£294 inc VAT)
AMD introduced its latest series of graphics cards a month ago with the mid-range HD 6800 series. That was enough to prompt Nvidia to rush the GTX 570 to market, and now AMD is following up with its single-chip flagships: the Radeon HD 6970 reviewed here, and the HD 6950.
It's certainly got the specification to go head-to-head with Nvidia's finest. The 40nm chip, code-named Cayman, pairs an 880MHz core clock with 2GB of GDDR5 memory running at 1,350MHz. The 389mm² die includes 2.64 billion transistors, and the core provides 1,536 stream processors – an impressive figure, but slightly fewer than the HD 5870, the firm's previous top-end card.
That sounds like a retrograde step, but it's indicative of the major changes AMD has introduced in its latest architecture. Previous cards grouped their stream processors into five-way units, pairing four simple ALUs with a fifth, more capable unit for complex instructions; this year's models do away with that structure, instead building each unit from four equally capable ALUs.
The chip has fewer stream processors as a result, but it's more efficient and more powerful than its predecessors. AMD divides them between two overarching graphics engines, which between them contain 24 SIMD engines, each holding 16 of those four-way units for 64 stream processors apiece.
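As a quick sanity check, those unit counts multiply out to the quoted 1,536 stream processors, and – assuming one fused multiply-add per ALU per clock, which is not stated in the review – to roughly 2.7 TFLOPS of peak single-precision throughput:

```python
# Back-of-the-envelope check of the Cayman shader figures quoted above.
simd_engines = 24         # split evenly between the two graphics engines
vliw_units_per_simd = 16  # four-way VLIW units per SIMD engine
alus_per_vliw = 4         # four equally capable ALUs per unit
core_clock_ghz = 0.880    # 880MHz core clock

stream_processors = simd_engines * vliw_units_per_simd * alus_per_vliw
print(stream_processors)  # 1536

# Assuming each ALU issues one fused multiply-add (2 FLOPs) per clock:
peak_gflops = stream_processors * 2 * core_clock_ghz
print(round(peak_gflops))  # 2703
```

The 16-units-per-SIMD figure is standard for this generation of AMD hardware rather than something the review spells out.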
AMD’s revised architecture proved its worth in our Crysis benchmarks. The Radeon HD 6970 ran through our 1,920 x 1,080 Very High quality test at 48fps, with this score falling to 42fps and then 38fps when we enabled 4x and 8x anti-aliasing. AMD’s card even ran through our Very High test at the huge resolution of 2,560 x 1,600 with a playable score of 31fps, although adding anti-aliasing dropped this to 25fps.
These scores indicate that the Radeon HD 6970 will handle any game you care to throw at it, but the situation is less clear-cut when you consider Nvidia’s cards. The GTX 580 retains its crown as the fastest single-GPU product on the market, but it’s the GTX 570 that AMD should be worrying about. Nvidia’s card is around the same price as the HD 6970 and almost as impressive in our tests, running only 2fps slower than the HD 6970 in our 1,920 x 1,080 Very High quality test with 4x anti-aliasing activated.
For our secondary gaming test we're moving to Just Cause 2 as of this month – a demanding test for any graphics card thanks to its detailed environments, complex lighting effects and long draw distances. The Radeon HD 6970 motored through our Very High quality test (run at 1,920 x 1,080 with 8x anti-aliasing) at 37fps, but again the GTX 570 was better, romping through the same test with an average of 63fps.
Elsewhere, AMD has introduced a vapour chamber cooler on the Radeon HD 6970, aping Nvidia’s GTX cards. It works by evaporating a liquid at the hot surface above the GPU and condensing it at the cooler’s fins, drawing heat away from the chip – and it makes the HD 6970 a very large card indeed. At 274mm long it’s a shade longer than the GTX 580, and it requires both eight- and six-pin power plugs.
Unfortunately, AMD’s vapour chamber isn’t as efficient as Nvidia’s. An idle temperature of 57˚C is fine, but the HD 6970 rocketed to 92˚C during our stress tests – 8˚C hotter than the GTX 570. It’s power intensive, too: measured at the wall, our test rig drew a total of 377W with the card under stress; with the GTX 570 it drew 283W; and with the GTX 580 in situ it pulled 292W. It’s also loud, especially during more intensive benchmarks.
Overall, the AMD Radeon HD 6970 is impressive, but it doesn’t quite do enough to take over from Nvidia at the graphics card top table. The price is too similar to the GTX 570, the noise levels higher, efficiency lower and overall performance slower. AMD needs to do better than this to grab its crown back.
Output to a CRT TV?
Is there an easy (read cheap) way to output to a CRT TV?
By milesfinch on 15 Dec 2010
Are you asking how to output this specific card to a CRT TV Miles, or just how to output a PC signal to a CRT TV in general?
What connectors does the TV have? S-video? Component?
I only ask as it seems a bit strange to output a £300 GPU to such an old TV - especially as you can buy a flat-screen 1080p TV for less than the price of this card. (Even more so as to make proper use of a card like this you'll need a PC considerably north of the £1,000 mark...)
By Mr_John_T on 15 Dec 2010
Other sites that have reviewed the cards place them better than Nvidia's GTX 570 & 580 on temps and power usage.
By Duggie on 15 Dec 2010
Something wrong with these results.
The Radeon 6900 series' PowerTune tech means that it is physically impossible for the card to draw more than 250W - and the 580's power-limiting tech is hard-coded to kick in only when Furmark and/or OCCT is running, which makes a like-for-like comparison difficult.
Regardless, your conclusions about performance are plainly wrong - or the games tested are outliers and not representative of general gaming performance with the card. The 580 is between 3 and 8% faster than the 6970 on average, whilst costing 50% more ($515 vs $375), drawing more power (at both idle and load), and generating higher temperatures.
I would suggest PC Pro reasserts its testing methodology since it appears it is delivering pretty useless results as it stands.
For people interested in proper benchmarks of these cards (and comparisons with other high-end models from previous generations), I would recommend checking out sites like Hothardware.co.uk or Anandtech.com.
By Omoronovo on 16 Dec 2010
Something wrong with these results.
@Omoronovo Yes the results do strike me as odd too but there was no claim of power consumption being over 250w. They are talking about total system draw or what comes out of the wall not what the card itself takes.
By urmaster on 16 Dec 2010
Why on earth is the benchmark set at low settings and res, as most people who are going to buy the card are not going to set the res so low.
By holdenmd on 17 Dec 2010