AMD's Phenom Unveiled: A Somber Farewell to K8
by Anand Lal Shimpi on November 19, 2007 1:25 AM EST - Posted in CPUs
It's surreal, isn't it? Is this how you pictured it? With forty-three days left in the year, AMD is finally letting us publish benchmarks of its long-awaited Phenom microprocessor. The successor to K8, AMD's most successful micro-architecture to date, and the cornerstone of AMD's desktop microprocessor business for 2008: Phenom is here.
But shouldn't there be fireworks? Where's the catchy title? The Star Wars references were bound to continue, right? Why were there no benchmarks before today, and why are the next several pages going to be such a surprise?
AMD had been doing such a great job of opening the kimono, as its employees liked to say, giving us a great amount of detail on Barcelona, Phenom and even the company's plans for 2008 - 2009. The closer we got to Phenom's official launch, however, the quieter AMD got.
We were beginning to worry, and for a while there it seemed like Phenom wouldn't even come out this year. At the last minute, plans solidified, and we received our first Socket-AM2+ motherboard, with our first official Phenom sample. What a beautiful sight it was:
These chips are launching today, with availability promised by the end of the week. Phenom launches as quad-core only; you'll see dual- and triple-core parts in 2008, but for now this is what we get.
The architecture remains mostly unchanged from what we've reported on in the past. This is an evolutionary upgrade to K8 and we've already dedicated many pages to explaining exactly what's new. If you need a refresher, we suggest heading back to our older articles on the topic.
The Long Road to Phenom
Ever wonder why we didn't have an early look at Phenom like we did for every Core 2 processor before the embargo lifted? Not only are CPUs scarce, but AMD itself didn't really know what would be launching until the last moment.
At first Phenom was going to launch at either 2.8GHz or 2.6GHz; then we got word that it would be either 2.6GHz or 2.4GHz. A week ago the story was 2.4GHz and lower, then a few days ago we got the final launch frequencies: 2.2GHz and 2.3GHz.
Then there's the pricing; at 2.2GHz the Phenom 9500 will set you back $251, and at 2.3GHz you'd have to part with $283 (that extra 100MHz is pricey but tastes oh so good).
The problem is, and I hate to ruin the surprise here, Phenom isn't faster than Intel's Core 2 Quad clock for clock. In other words, a 2.3GHz Phenom 9600 will set you back at least $283, and it's slower than a 2.4GHz Core 2 Quad Q6600, which will only cost you $269. And you were wondering why this review wasn't called The Return of the Jedi.
AMD simply couldn't get the 2.4GHz Phenom in large enough quantities to have a sizable launch this year (not to mention a late discovery of a TLB erratum in the chips), and the company was committed to delivering Phenom before the holiday buying season; these are tough times, and simply waiting to introduce its first quad-core desktop parts was not an option. Rather than paper launch a 2.4GHz part, AMD chose to go with more modest frequencies, promising faster, more competitive chips in Q1 2008. It's not the best PR story in the world, but it's the honest truth.
Two more quad-core Phenoms will come out in Q1: the 9900 and 9700, clocked at 2.6GHz and 2.4GHz respectively. The Phenom 9900 will be priced below $350 while the 9700 will be a sub-$300 part. As you can probably guess, the introduction of those two will push down the pricing of the 9600 and 9500, which will help Phenom be a bit more competitive.
It's worth mentioning that at the 11th hour AMD decided to introduce a multiplier-unlocked version of the Phenom 9600 sometime this year, priced at the same $283 mark. Whether or not it's called a Black Edition is yet to be determined.
124 Comments
B166ER - Monday, November 19, 2007 - link
Had to reply and clarify. The phrase refers to current setups in which quad-core gaming is not a primary reason to purchase said processors. Alan Wake, Crysis, and others, while being able to take advantage of quad-core setups, are not out yet, and I would guess that less than 3% of games out there now are quad-core capable, and scaling in such games is probably hardly optimized even if there were more of them. You speak in a future context in which these games will be available; even then, the abundance of them is still a ways off.

Nonetheless, Anand, that might be your best review in my book to date. It speaks honestly and depicts a seemingly structureless company that only has time on its side to pull itself up from potential disaster. AMD has not shown too many positive strides as a company lately, and mindless spending on what should have been a direct "let em have at it" approach only shows what was speculated previously: the company has a vacuum in leadership that needs to be filled by capable hands. And we all know proper administration starts from the very top; Hector's stepping down will not be mourned by me. Dirk has his cup filled, but his past credentials speak highly and show him capable. I can only wish well.
Phenom needs to be priced competitively, simple enough. And it needs higher-quality yields for overclocking. It's amazing how they manage to stay just one step behind at almost every turn currently. I hope the 7-series mobos bring about better competition vs Intel and Nvidia boards. We as a community need this to happen.
leexgx - Monday, November 19, 2007 - link
there are about 2-3 games i think that use quad

wingless - Monday, November 19, 2007 - link
That's such a mixed bag it makes me sick. Phenoms are mediocre at best at almost everything but somehow magically rape Intel in Crysis. I'll have nightmares. WTF is going on internally in those four cores to make this happen? I hope software manufacturers code well for AMD so they can shine. The pro-Intel software market is huge and that's where the fight is. Unfortunately it doesn't look good for AMD there either because programmers hate having to learn new code.

defter - Monday, November 19, 2007 - link
Rape Intel in Crysis?? Crysis was one of the few benchmarks where the fastest Phenom was faster than the slowest Core 2 Quad. Still, the Phenom's advantage was less than a percent.
I think it's better to say that Crysis is the benchmark where Phenom doesn't utterly suck (it just sucks a lot).
eye smite - Tuesday, November 20, 2007 - link
You really think those shining numbers are realistic from pre-production sample CPUs? I think you should all wait til sites have full production MBs and CPUs and can give real data with all their tests; then you can decide it's a steaming sack of buffalo droppings. Until then, you'll just sound like a raucous bunch of squabbling crows.

JumpingJack - Monday, November 19, 2007 - link
Ohhhh, here we go again with the 'It's not coded well for AMD' conspiracy theories.

wingless - Monday, November 19, 2007 - link
AMD recently released new software libraries for these processors....

JumpingJack - Monday, November 19, 2007 - link
Yeah, great for the FPU library, which they already compiled into their PGI for the SPEC.org runs, which consequently are slowly getting the non-compliant branding pasted all over them.

TSS - Monday, November 19, 2007 - link
"It turns out that gaming performance is really a mixed bag; there are a couple of benchmarks where AMD really falls behind (e.g. Half Life 2 and Unreal Tournament 3), while in other tests AMD is actually quite competitive (Oblivion & Crysis)."

the UT series and halflife 2 are both very CPU intensive, while oblivion and especially crysis are video card killers. it's hard to say, but it sucks in games as well. if you wanted to see a difference in crysis though, you'd have to scale it down to about 640x480; it's just too demanding on the graphics card to compare at 1024x768. in layman's terms, in HL2 the graphics card is usually picking its nose waiting for the CPU, so a stronger CPU will make a big difference. in crysis especially it's exactly the other way around, so that's why scores are closer together.
why is this true? in half life 2, there are 50 frames between the best and worst of the lineup. in UT3, there are also 50 frames between the best and the worst of the lineup (meaning regardless of clockspeeds or architecture). the frames per second are also measured in the hundreds, even in such a new and graphics-intensive game as UT3 (though it should be noted, as far as i know, the beta demo did NOT ship with the highest resolution textures to keep file size down). now crysis has about 9 frames per second of difference between a 2.2GHz phenom and a 2.66GHz intel proc, and oblivion manages about 16. same system, different game: it can only be concluded that the game is much more graphics card intensive, which shouldn't be hard to imagine since crysis is well known to be the graphics card killer of the moment... and oblivion of the last generation (HDR and AA anybody?).
AMD's 10-30% slower in every other test, and they are in gaming as well. i believe the difference would've shown more if they used an SLI or CrossFire solution, though i understand that's not possible with the chipsets and drivers existing at the moment.
MDme - Monday, November 19, 2007 - link
Time to upgrade.....to the dark side.