FEAR 1.03 Performance

The final game we'll look at is FEAR, as it represents the most demanding game engine currently out there. That's not to say it's the best looking game available, but as you'll see in the results, it is definitely not CPU limited. We didn't have time to run benchmarks on all of the configurations for FEAR, and the 6800 GS would not run in SLI mode -- or rather, it would run, but the results were the same as with a single card.

That highlights one of the biggest problems with SLI: game executable detection/support. It could be that the latest patch broke support on the lower-end NVIDIA cards. It's unfortunate, as we can't see how 6800 GS SLI compares to a single 7800 GTX. I tried five different driver versions from NVIDIA, including their latest beta drivers, and various methods of forcing SLI support. The eVGA KO cards worked in SLI after a few tweaks, but nothing I did would enable SLI on the 6800 GS cards. We also have graphs for minimum frame rates as well as average frame rates, since FEAR conveniently reports that information. (We could list maximum frame rates too, but those don't matter nearly as much as the other two.)
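
For reference, the usual community method of forcing SLI in this era involved editing the driver's application profile list (nvapps.xml in the Windows directory) to add or modify an entry for fear.exe. The sketch below is purely illustrative: the property name and value are assumptions based on such tweaks and vary between ForceWare releases, so don't treat this as a verified recipe.

    <!-- Hypothetical nvapps.xml profile entry. The property name and
         value shown here are assumptions and differ between driver
         releases. -->
    <PROFILE Label="FEAR">
        <APPLICATION Label="fear.exe"/>
        <!-- Requests a multi-GPU rendering mode (e.g. AFR). -->
        <PROPERTY Label="multichip_rendering_mode" Value="0x00000004"/>
    </PROFILE>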

How's that for a graphically demanding game? Even the GTX KO SLI configuration shifts the bottleneck to the graphics cards starting as low as 1024x768 (or 800x600 with anti-aliasing). We benchmark without soft shadows, but enabling soft shadows incurs a performance hit roughly equal to that of 4xAA. The truly amazing thing about FEAR is that it isn't really playable at high resolutions with anti-aliasing unless you have high-powered SLI graphics. If FEAR is your game, a system like the Monarch Hornet Pro Revenge will manage to run it properly. Hopefully, FEAR represents the worst-case scenario for the next year or more, unless you're one of those people who likes finding excuses to upgrade frequently.
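
As an aside, since FEAR reports both minimum and average frame rates, it's worth spelling out what those figures mean: average fps is total frames over total time, while minimum fps corresponds to the worst stretch of the run (approximated below by the single longest frame). Here's a minimal sketch of the arithmetic, not FEAR's actual code, with made-up sample frame times:

    // Minimal sketch: deriving min/avg frame rates from per-frame times.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        // Hypothetical per-frame times (milliseconds) from a benchmark run.
        std::vector<double> frameTimesMs = {16.7, 18.2, 33.4, 17.1, 25.0, 16.9};

        double totalMs = 0.0;
        double worstMs = 0.0;  // longest frame = lowest instantaneous fps
        for (double t : frameTimesMs) {
            totalMs += t;
            worstMs = std::max(worstMs, t);
        }

        double avgFps = 1000.0 * frameTimesMs.size() / totalMs;
        double minFps = 1000.0 / worstMs;

        std::printf("Average: %.1f fps, Minimum: %.1f fps\n", avgFps, minFps);
        return 0;
    }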

Comments

  • JarredWalton - Monday, March 6, 2006 - link

    HDCP support is a graphics/display issue. As has been reported, HDCP is not supported on any current retail graphics cards, nor is it supported under Windows XP. We should start seeing HDCP-enabled cards (meaning, with the necessary decryption chip) in the near future. The GPUs are ready, but they still need the appropriate chip soldered onto the boards.

    Personally, I'm really not happy with HDCP at all, so I'm doing my best to avoid it. 1280x720 DivX looks quite nice and runs flawlessly on current hardware. Here's an example from the Olympics (18GB compressed to 4.5GB at 1280x720):

    2006 Olympics Men's Hockey Gold Match: http://images.anandtech.com/reviews/multimedia/tvt...
  • AGAC - Tuesday, March 7, 2006 - link

    Hey, what's to love about HDCP? That said, it seems we'll just have to swallow that frog... I mean, DivX does look nice indeed. The problem is the availability of mainstream content. I think it will be a very cold day in hell before you can walk into a regular video rental store and get the latest blockbuster title in beautiful 1280x720 DivX.

    HDCP will be broken, we all know that. It only harms the legitimate user, because one will have to upgrade video cards, monitors, and God knows what else that isn't HDCP compliant. Thanks for your tip and sympathy. Keep up the good work.

    AGAC
