Not trying to start another GeForce vs ATI war or anything, just a simple question. If the graphics card rankings thread lists the HD 3870 X2 higher than every GeForce card except three (the GTX 280, GTX 260, and 9800 GX2), why does it always seem to fail horribly in the video FPS benchmarks I see posted on various websites? A popular example would be AoC: in it, the HD 3870 X2 gets lower FPS than a GeForce 9600 GT and a GeForce 8800 GT on average... so, any comments? On top of that, there's GameSpot's comment:
" During our testing, we found that Age of Conan doesn't play nice with ATI video cards. Aside from performance concerns, the ATI cards we tested locked two of the view distance sliders permanently to their maximum settings, and there isn't much you can do about it yet. Because of the locked sliders, we could only directly compare ATI and Nvidia cards at high quality settings. The game also has two entirely different preset medium quality settings for Nvidia and ATI cards; as a result, the "medium" image quality for the two sets of cards look nothing alike. The main difference between the two sets of medium settings comes from the main view distance range setting. On Nvidia cards, medium settings force the main view distance range to 2000 meters, down from 2800 meters in high quality mode. The draw distance number plunges to 200 meters on ATI cards. We tested both GPU brands with matching high quality settings but had to separate out the rest into their own charts for the other quality settings. "
That really makes me wonder why people wouldn't just go with the tried-and-true Nvidia series of cards.
" During our testing, we found that Age of Conan doesn't play nice with ATI video cards. Aside from performance concerns, the ATI cards we tested locked two of the view distance sliders permanently to their maximum settings, and there isn't much you can do about it yet. Because of the locked sliders, we could only directly compare ATI and Nvidia cards at high quality settings. The game also has two entirely different preset medium quality settings for Nvidia and ATI cards; as a result, the "medium" image quality for the two sets of cards look nothing alike. The main difference between the two sets of medium settings comes from the main view distance range setting. On Nvidia cards, medium settings force the main view distance range to 2000 meters, down from 2800 meters in high quality mode. The draw distance number plunges to 200 meters on ATI cards. We tested both GPU brands with matching high quality settings but had to separate out the rest into their own charts for the other quality settings. "
that really make me not understand why people would not just want to go with the tried and true series of Nvidia cards?