Has ATI Finally Got NVIDIA on the Ropes?

Tom's Hardware is ALWAYS using outdated drivers, for both ATI and Nvidia... that's why I dislike them.

Their reviews just seem half-assed most of the time.

HardOCP, though, I think is ridiculously Nvidia-biased... I can't remember 100% if it's that site or not, but the guy who owns it posts on pimprig.com and really favors the Nvidia line, so I don't much trust that one either.

I THINK...>THINK< it's that site anyway; not 100% sure on that, so don't quote me.
 
Nah, HardOCP gave very good, unbiased benchmarks of the X1000 series. I love HardOCP and Xbit Labs as my favorite hardware sites. BColes just got owned :)
 
You guys are hilarious, keep it coming. I'm enjoying this thread.

Hey Nubius, whaddya think of DriverHeaven.net and that review I posted? Are they a trustworthy bench?
 
Don't know, never been to DriverHeaven.

For the most part I avoid websites that do reviews and just look for member posts across various sites; however, this of course generally means waiting a month or more after the cards are released.
 
Well, I'm a member there and it's very professional from what I can tell. You should check 'em out.
 
right........

anyway.....
I don't care for this "ATI sucks" or "Nvidia sucks" stuff. Why do people so often look at the company before the card?
The card is not the company.

Let's get back to the facts:
The X1800 is a good card that is on par with the 256MB 7800 GTX, and the 512MB GTX is even better.

Adaptive Anti-Aliasing is not a performance enhancement; it is cutting corners, no matter how you look at it.
ATI is saying that it increases Doom 3 performance. Of course it does, but not by making the card faster; it does so by lowering settings.
Now, people who buy a high-end card from ATI will want it to play at high settings all the time. Adaptive Anti-Aliasing does the opposite.
That being said, I am not saying that ATI or its X1800 "sucks."
The X1800 is still a good card however you look at it, but adaptive AA is stupid.

The 512MB 7800 GTX is really just an upgraded 256MB GTX, but the upgrade actually does something for the card. The 256MB 7800 GTX wasn't the best Nvidia could have put out; Nvidia held that back deliberately as a move against ATI.

Now, a lot of benchmark results are ATI-biased, even if the author never intended them to be.
See, 3DMark05 and HL2 are two of the most widely used benchmarks, and ATI cards do better in those.

People often tell others not to count benchmarks that favour Nvidia, such as Doom 3, because Doom 3 does heavily favour Nvidia; we all know that. But by that logic you may as well not use 3DMark05 or HL2 either, since they favour ATI.

Or you could include a wider range of benchmarks, INCLUDING some that favour one card and some that favour the other.

You could run HL2, 3DMark05, AND Doom 3. Come on, why hide what really happens?
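
To make that concrete, here is a minimal sketch (in Python, with completely made-up numbers, not real benchmark results) of how you could summarise a mixed suite so that no single title dominates the overall score:

# Sketch only: made-up scores, just to show the idea of a mixed suite.
# Normalising each test to one card and taking a geometric mean keeps a
# single outlier title (e.g. Doom 3 or HL2) from dominating the result
# the way a plain average of raw fps would.
from math import prod

results = {                                              # hypothetical data
    "Doom 3":   {"7800 GTX": 95.0,  "X1800 XT": 78.0},   # favours Nvidia
    "HL2":      {"7800 GTX": 88.0,  "X1800 XT": 101.0},  # favours ATI
    "3DMark05": {"7800 GTX": 7600,  "X1800 XT": 8200},   # favours ATI
}

def geomean(values):
    return prod(values) ** (1.0 / len(values))

for card in ("7800 GTX", "X1800 XT"):
    # normalise every test to the 7800 GTX so fps and 3DMark scores can mix
    relative = [results[test][card] / results[test]["7800 GTX"] for test in results]
    print(card, round(geomean(relative), 3))

Run a suite like that and the "winner" is whichever card is genuinely faster overall, not whichever card the benchmark selection happened to favour.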
 
This is the second time you have ignored my post:

If you are in a small room where you would only need 2x AA to anti-alias the whole scene, then that is actually a very good idea, because it boosts performance without any loss in image quality. If you want to talk about cutting corners, then let's talk about Nvidia's anisotropic filtering. They cut some major corners on it: if you leave image quality at the default "Quality" setting in the Nvidia drivers, you will notice a lot of texture shimmering with AF, and when you set it to "High Quality" the shimmering goes away, but at the cost of 10-30 fps depending on what game you are playing and what you are looking at.
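
Just to illustrate the idea (this is a toy sketch, NOT ATI's actual driver logic, and the edge_density metric is something I made up for the example):

# Toy illustration of the adaptive AA idea described above, not real driver code:
# pick the lowest sample count that still covers the edge detail on screen,
# instead of always paying for the maximum setting.

def pick_aa_level(edge_density, max_samples=6):
    """edge_density: rough 0.0-1.0 estimate of how much aliasing-prone
    detail is on screen (a hypothetical metric, for illustration only)."""
    for samples in (2, 4, 6):
        if samples >= max_samples or edge_density <= samples / 6.0:
            return min(samples, max_samples)
    return max_samples

print(pick_aa_level(0.25))  # small enclosed room: 2x is enough, frames saved
print(pick_aa_level(0.95))  # busy outdoor scene: falls back to full 6x

Dropping to 2x in a scene where 2x already removes the jaggies costs you nothing visually, which is exactly why I don't call it cutting corners.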
 