The Elder Scrolls IV: Oblivion CPU Performance
by Anand Lal Shimpi on April 28, 2006 10:00 AM EST - Posted in CPUs
The Test
Thankfully, ATI's CrossFire runs on ATI's own chipsets as well as Intel's 975X, so we were able to use our ultra high end GPU of choice to compare CPU performance under Oblivion. Remember that, just like in our first Oblivion article, we're manually walking through portions of the game and using FRAPS to generate our results, so the margin for error in our tests is much higher than normal; differences in performance of 5% or less aren't significant and shouldn't be treated as such.
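As a concrete illustration of that threshold, here is a quick sketch (editor's example; the frame rates are made up, and only the 5% margin comes from the article) of how two FRAPS run-throughs would be treated as equivalent or not:

```python
# Rough significance check for FRAPS-style manual run-throughs.
# The 5% threshold is the article's stated margin of error; the
# sample frame rates below are illustrative, not measured results.

def is_significant(fps_a, fps_b, margin=0.05):
    """Treat a difference as meaningful only if it exceeds the
    margin of error relative to the slower result."""
    slower = min(fps_a, fps_b)
    return abs(fps_a - fps_b) / slower > margin

print(is_significant(42.0, 43.5))  # ~3.6% apart -> False (within noise)
print(is_significant(42.0, 48.0))  # ~14% apart -> True
```

Anything inside the margin should be read as a tie between the two CPUs, not a win for either.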
While we tested with a number of AMD CPUs, we ran into issues with our Intel test bed: we couldn't adjust clock multipliers to cover the full spread of Intel CPU options, so we were only able to highlight the performance of a handful of Intel CPUs. Even so, what we had was enough to adequately characterize the performance offered by Intel solutions under Oblivion. We also didn't have an Extreme Edition 965 on hand, so the EE 955 is the fastest Intel offering in the test. In case you're curious, based on the tests we've done the EE 965 should offer roughly another 5% over the EE 955.
CPU: | AMD Athlon 64 and Athlon 64 X2 / Intel Pentium Extreme Edition, Pentium D and Pentium 4 |
Motherboard: | ASUS A8R32-MVP (AMD) / Intel D975XBX (Intel) |
Chipset: | ATI CrossFire Xpress 3200 / Intel 975X |
Chipset Drivers: | ATI Catalyst 6.4 / Intel 7.2.2.1007 |
Hard Disk: | Seagate 7200.9 300GB SATA |
Memory: | 2 x 1GB OCZ PC3500 DDR 2-3-2-7 1T (AMD) / 2 x 1GB OCZ PC8000 DDR2 4-4-4-12 (Intel) |
Video Card(s): | ATI Radeon X1900 XT CrossFire |
Video Drivers: | ATI Catalyst 6.4 w/ Chuck Patch |
Desktop Resolution: | 1280 x 1024 - 32-bit @ 60Hz |
OS: | Windows XP Professional SP2 |
Armed with a pair of X1900 XTs running in CrossFire mode - the clear GPU performance leader in our first Oblivion article - we set out to run some additional tests. Pay attention to the rest of the system as well: we've installed 2GB of high quality (i.e. low latency) RAM, which also helps performance. 1GB is sufficient, but Oblivion appears to do a good job of making use of additional memory; load times and area transitions are noticeably quicker with 2GB of RAM. We used the same "High Quality" settings we introduced in the last review:
Oblivion Performance Settings | High Quality |
Resolution | 1280x1024 |
Texture Size | Large |
Tree Fade | 50% |
Actor Fade | 65% |
Item Fade | 65% |
Object Fade | 65% |
Grass Distance | 50% |
View Distance | 100% |
Distant Land | On |
Distant Buildings | On |
Distant Trees | On |
Interior Shadows | 50% |
Exterior Shadows | 50% |
Self Shadows | On |
Shadows on Grass | On |
Tree Canopy Shadows | On |
Shadow Filtering | High |
Specular Distance | 50% |
HDR Lighting | On |
Bloom Lighting | Off |
Water Detail | High |
Water Reflections | On |
Water Ripples | On |
Window Reflections | On |
Blood Decals | High |
Anti-aliasing | Off |
36 Comments
RyanHirst - Sunday, April 30, 2006 - link
hehe it's ok. Just chompin' at the bit curious, that's all. If anyone on eBay wanted the collectible books I no longer need, I'd have turned them into a magic pair of 275s and I'd know. Alas. Pantone 072 Blue.

bob661 - Saturday, April 29, 2006 - link
I guess this round of testing only applies to ATI video cards. I guess us Nvidia owners are left to guess how CPU performance affects GPU performance. Oh well. :(

PrinceGaz - Sunday, April 30, 2006 - link
Just refer to part 1 of the Oblivion article and find out where your nVidia card of choice lies in relation to the four ATI cards tested this time, and it is easy to see how it will perform with various CPUs.

Bladen - Saturday, April 29, 2006 - link
My guess as to why CPUs help so much in towns is because the Radiant AI takes a fair amount of power.

BitByBit - Saturday, April 29, 2006 - link
I don't know why, but the benchmark graphs didn't appear at all in that review, nor did they in the previous Oblivion review; I get nothing where the graphs should be. Has anyone else had this problem?
blackbrrd - Saturday, April 29, 2006 - link
If you have turned off referrers you won't see any images.

JarredWalton - Saturday, April 29, 2006 - link
Someone else had a problem with Norton Internet Security blocking images for some reason.

Madellga - Monday, May 1, 2006 - link
That was me. Turn off privacy control.

RyanHirst - Saturday, April 29, 2006 - link
This article still left me with the nagging question about multithread performance. The Oblivion team made reference to the game being heavily optimized for multithreading because they knew from the beginning they'd be writing it simultaneously for the Xbox 360.

So the debate about the potential of multithreaded code in games has been going on for a while, and here we have the perfect test game, and we happen to know AnandTech had a couple of four-way servers in the shop over the last few weeks... but the CPU guide leaves that question unanswered.
If it's not unreasonable to heap $1200 in graphics hardware onto a motherboard for a game that is GPU bound only half of the time (outdoors), is it too much to ask that a $1200 pair of Opteron 275s be tested to see how multithreaded the first advertised multithreaded game really is? Is it not possible that the game can offload a large number of threads to extra cores?
If we can talk about throwing over $1K at a game, isn't anyone the least bit curious how a 4-core opteron rig with 4 gigs of RAM in NUMA might handle this game?
JarredWalton - Saturday, April 29, 2006 - link
P4 EE 955 runs up to four threads, and it doesn't seem to get any help from the extra capability. It could be that further INI tweaks would allow Oblivion to run better on multi-core (more than 2 core) systems, but if going from 1 to 2 cores gives 10 to 20% more performance, there's a very good chance that moving from 2 to 4 cores wouldn't give more than another 5%. Ideally, a CPU-limited game should be able to get as much as 50% or more performance from multi-threading, but rarely can we realize the ideal case.

Also, FYI, the servers are at a completely different location than the GPUs for this testing. They also don't support dual X1900 cards in CrossFire - they might not even have x16 PCIe slots, let alone two of them. Servers are, quite simply, not interested in improving gaming performance. There are a few out there targeting the 3D graphics workstation market that might support SLI, but not CrossFire. Multi-core will really only become important when we have multi-core CPUs. The desktop/home PC market isn't interested in multiple socket motherboards (other than a few extreme enthusiasts).
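The 2-to-4-core estimate above can be sanity-checked with Amdahl's law. This sketch (an editor's illustration using the comment's rough 10-20% dual-core gains, not new measurements) infers the parallel fraction of the workload and projects the additional gain from four cores:

```python
# Amdahl's-law sketch: if going 1 -> 2 cores buys 10-20%, infer the
# parallel fraction of the workload and project the 2 -> 4 core gain.
# Input speedups are the comment's rough figures, not measurements.

def parallel_fraction(dual_core_speedup, cores=2):
    """Invert Amdahl's law: speedup = 1 / ((1 - p) + p / cores)."""
    return (1 - 1 / dual_core_speedup) * cores / (cores - 1)

def speedup(p, cores):
    """Amdahl's law for a workload whose parallel fraction is p."""
    return 1 / ((1 - p) + p / cores)

for gain in (1.10, 1.20):  # 10% and 20% dual-core gains
    p = parallel_fraction(gain)
    extra = speedup(p, 4) / speedup(p, 2) - 1
    print(f"dual-core gain {gain:.2f} -> parallel fraction {p:.2f}, "
          f"extra gain from 4 cores {extra:.1%}")
```

With a 10% dual-core gain the model predicts only about 5% more from four cores, matching the estimate above; even the optimistic 20% case projects only around 11%.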