2.6 billion. The first figure is the number of transistors in AMD's new Cayman graphics processor. The second is the number of days we've had to spend with it prior to its release. Today's GPUs are incredibly complex beasts, and the companies that produce them don't waste any time in shoving 'em out the door once they're ready. Consequently, our task of getting a handle on these things and relaying our sense of it to you... isn't easy. We're gonna have to cut some corners, leave out a few vowels and consonants, and pare back some of the lame jokes in order to get you a review before these graphics cards go on sale.

"What's all the fuss?" you might be asking. "Isn't this just another rehashed version of AMD's existing GPU architecture, like the Radeon HD 6800 series?" Oh, but the answer to your question, so cynically posed, is: "Nope."

As you may recall, TSMC, the chip fabrication firm that produces GPUs for both of the major players, upset the apple cart last year by unexpectedly canceling its 32-nanometer fabrication process. Both AMD and Nvidia had to scramble to rebuild their plans for next-generation chips, which were intended for 32 nm. At that time, AMD had a choice: to push ahead with an ambitious new graphics architecture, re-targeting the chips for 40 nanometers, or to play it safe and settle for smaller, incremental changes while waiting for TSMC to work out its production issues.

The safer, more incremental improvements were incorporated into the GPU code-named Barts, which became the Radeon HD 6800 series. That chip retained the same core architectural DNA as its predecessor, but it added tailored efficiency improvements and some new display and multimedia features. Barts was also downsized to hit a nice balance of price and performance. At the same time, work quietly continued, at what had to be a breakneck pace, on another, larger chip code-named Cayman. Many of us in the outside world had heard the name, but AMD did a surprisingly good job (as these things go) of keeping a secret, at least for a while. Cayman ain't your daddy's Radeon. Or even your slightly older twin brother's, perhaps.

Unlike Barts, Cayman is based on a fundamentally new GPU architecture, with improvements extending from its graphics front end through its shader core and into its render back-ends. The highlights include higher geometry throughput, more efficient shader execution, and smarter edge antialiasing. In other words, more goodness abounds throughout.

A logical block diagram of the Cayman GPU architecture. Our hardware reviewer's license stipulates that we must include a block diagram on page one of any review of a new GPU, and so you have it above. So when we say our task of cramming a review of Cayman into a few short days isn't easy, that's because this chip is the most distinctive member of the recent, bumper crop of new GPUs.

So I bought a new XFX Radeon HD 6950 2 gig card from Newegg about a month ago, and whenever I play BF3 and the temperature exceeds 65-70C, my computer shuts down and "Power Saver Mode" pops up on my monitor. I have only experienced this problem in BF3, and it even happens on Low settings. Room temp is 72-74F, I have a ground-based room fan pointed at the computer, and a nearby window is letting in the winter breeze. The GPU idles at 42-47C and stays around 50-60C throughout the game, but when the explosions go off and it climbs to just under 70C, my whole computer just shuts down.

Anybody have any idea on what could be going on here? Here's what I've considered so far:
1) Take the computer apart and try cleaning again
2) Possibly get an aftermarket cooler, but I have no idea what to get - TWIN TURBO Pro?
3) Go into BIOS and change the auto-shutdown temp to something slightly higher

Could it be the PSU? I plan to get a 1050W between now and Christmas, to leave room for CrossFire.