Has ATI finally got NVIDIA on the ropes?

Like I said -- you don't know what you're talking about. The X1800XT is on par with the 256MB 7800GTX and manages to beat it by a few FPS in several games, with both cards at stock speeds. I bet that whenever you look at benchmarks comparing the two cards, you go straight to the Doom 3 numbers. You have to accept the truth: BOTH CARDS ARE EQUAL. That won't be the case when the 512MB 7800GTX comes out, but right now, the two cards are ON PAR.
 
The X1800XT beats the 7800GTX in Direct3D games and is on par with it in OpenGL games now that the new drivers are out. This forum is obviously Nvidia-biased, so Nvidia fans will say anything to make Nvidia look good.

Oh yeah, and it isn't pixel pipelines that matter so much as fill rate, and the X1800XT has more of that than the GTX right now.
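Just to put rough numbers on the fill-rate point: theoretical fill rate is basically units × clock. Here's a quick back-of-the-envelope calculation in Python; the pipe/ROP counts and clocks are the commonly quoted launch specs, so treat them as assumptions rather than gospel.

```python
# Back-of-the-envelope fill-rate comparison.  The unit counts and clocks
# below are the commonly quoted launch specs (assumptions, not measurements).

def fill_rate_g(units: int, clock_mhz: int) -> float:
    """Theoretical fill rate in Gpixels/s or Gtexels/s: units * clock."""
    return units * clock_mhz / 1000.0

cards = {
    # name: (ROPs, texture units, core clock in MHz)
    "Radeon X1800 XT":  (16, 16, 625),
    "GeForce 7800 GTX": (16, 24, 430),
}

for name, (rops, tmus, clock) in cards.items():
    print(f"{name}: pixel fill ~{fill_rate_g(rops, clock):.1f} Gpixel/s, "
          f"texture fill ~{fill_rate_g(tmus, clock):.1f} Gtexel/s")
```

On those numbers the XT's 625 MHz clock keeps its texture fill in the same ballpark as the GTX's and puts its pixel fill ahead, which is the point: counting pipes alone doesn't tell you the fill rate.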

(If you look at the Hard forum benchmarks, you will notice that the X1800XT loses slightly, BUT IF YOU LOOK CLOSELY, you will see that the XT is being run at higher image quality settings than the GTX.)
 
apokalipse said:
the ATI cards are better in Direct3D (like 3DMark05)
the Nvidia cards are better in OpenGL (like Doom 3)

Direct3D games:

BF2 - oh yeah, no one plays that anymore.
HL2 - yeah sure, it's an obsolete game.
FEAR - hmm, that's such an old game it doesn't even matter (by the way, there's a fix that gets you about 10 extra frames in FEAR on ATI cards just by renaming Fear.exe to something else; see the sketch after this list).
AoE III - who plays that anyway???
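About that Fear.exe rename trick: here's a rough, hypothetical sketch of what it boils down to. The install path and the new file name below are made up, and the usual forum explanation (the driver keying a per-game profile off the executable name, so an unrecognized name sidesteps it) is hearsay rather than anything ATI ever confirmed.

```python
# Hypothetical sketch of the Fear.exe rename trick.  The path and the new
# name are assumptions -- adjust them to your own install.
import shutil
from pathlib import Path

fear_dir = Path(r"C:\Program Files\Sierra\FEAR")   # assumed install location
original = fear_dir / "Fear.exe"
renamed  = fear_dir / "FearRenamed.exe"            # any name the driver won't recognize

if original.exists() and not renamed.exists():
    shutil.copy2(original, renamed)                # copy, so the original stays intact
    print(f"Created {renamed} -- launch the game from this copy instead.")
```

Copying instead of renaming keeps the original launcher working, and you can just delete the copy if it makes no difference on your system.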
 
http://www.hexus.net/content/item.php?item=3668
This says it improves things in the anti-aliasing department.

http://downloads.guru3d.com/download.php?det=1254
Looking at what ATI's drivers actually bring, one of the features is "Adaptive Anti-Aliasing",
which basically means the card turns anti-aliasing up or down depending on the load.
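For what it's worth, here's a minimal sketch of the behaviour as described there, i.e. an AA level that steps down when frame times get too long. It's only an illustration of that description, not ATI's actual driver logic, and the levels and frame-time target are made-up numbers.

```python
# Minimal sketch of "adaptive" AA as described above: start at the user's
# requested AA level and step down when the frame time blows the budget.
# Purely illustrative -- not ATI's driver code.

AA_LEVELS = [0, 2, 4, 6]     # assumed multisample levels
TARGET_FRAME_MS = 16.7       # assumed ~60 fps frame-time budget

def adjust_aa(current_aa: int, last_frame_ms: float, requested_aa: int) -> int:
    """Step AA down one level under load, back up when there's headroom."""
    idx = AA_LEVELS.index(current_aa)
    if last_frame_ms > TARGET_FRAME_MS and idx > 0:
        return AA_LEVELS[idx - 1]      # overloaded: drop one AA level
    if last_frame_ms < 0.8 * TARGET_FRAME_MS and current_aa < requested_aa:
        return AA_LEVELS[idx + 1]      # headroom: climb back toward the requested level
    return current_aa

# Example: the user asks for 6x AA, but a heavy scene pushes the frame to 22 ms
print(adjust_aa(current_aa=6, last_frame_ms=22.0, requested_aa=6))  # -> 4
```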

I'm kind of skeptical about that, because it really just means that even if you set anti-aliasing to a certain level, the card might still drop itself down a level or two to increase performance.

I.e., if you tried to test it in a benchmark at X amount of AA, it could adjust itself lower under heavier load. That means it's not really running at that X amount of AA all the time, which also means that against other cards (even other ATI cards on previous drivers) it might not really be a fair test.

If the performance increase comes from automatically cutting down the level of AA, then this is just crap.
 
I don't really care which card is better, but this is just stupid: getting better benchmark scores by automatically lowering the level of AA even when you've set it at a certain level. It's just crap. It's not a performance enhancement, it's cutting corners.
 
No it's not. First of all, you can choose whether you want to use it or not. Second, if you're in a small room where you'd only need 2x AA to anti-alias the whole scene, then it's actually a very good idea, because it boosts performance without any loss in image quality. If you want to talk about cutting corners, then let's talk about Nvidia's anisotropic filtering. They cut some major corners on it: if you leave image quality at the default "Quality" setting in the Nvidia drivers, you'll notice a lot of texture shimmering with AF, and when you set it to "High Quality" the shimmering goes away, but at the cost of 10-30 FPS depending on what game you're playing and what you're looking at.
 
The GT gets 180 FPS in Source. I don't know about y'all, but that's pretty **** good. I would definitely put the resolution up and crank the AA and AF to lose 10-30 FPS; it's not like your eye can see over 100 FPS anyway.
 
We're talking real-world performance here. Try playing FEAR on the GTX at 1280x1024 with everything on maximum, with 4x AA and 16x high-quality AF. The PC will choke. Don't use the Source engine as an example, that's already several years old. Try today's games.
 