GTX 260 core 216 worth it to upgrade? 8800 GTX comparison

Status
Not open for further replies.
You may not notice it until you play with a card that doesn't do it.

Or..try doing a fraps log of your min fps while playing the cod4 map Countdown and flying around inside the smoke. :)
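If you do log it, here's a minimal sketch of how you could pull the min/avg fps back out of a FRAPS frametimes CSV afterwards. This assumes the usual frametimes log layout (a header line, then one cumulative timestamp in ms per frame); the filename and bucket size are just placeholders.

```python
# Minimal sketch: compute min/avg fps from a FRAPS "frametimes" CSV.
# Assumes a header line followed by one cumulative timestamp (ms) per
# frame, as produced by the frametimes logging option.
import csv

def fps_stats(path):
    with open(path) as f:
        reader = csv.reader(f)
        next(reader)  # skip "Frame, Time (ms)" header line
        times = [float(row[1]) for row in reader if len(row) > 1]
    # count frames falling into each consecutive 1000 ms bucket
    buckets = {}
    for t in times:
        sec = int(t // 1000)
        buckets[sec] = buckets.get(sec, 0) + 1
    per_second = list(buckets.values())
    return min(per_second), sum(per_second) / len(per_second)
```

Then `fps_stats("FRAPSLOG.csv")` gives you the worst second (your min fps while in the smoke) and the average, which is more telling than the average alone.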
 

ha lol good point, but I have played through CoD4 (and multiplayer) on my dad's GTX 260 and it's just like mine, locked at 60 fps. I don't know, you might be right though. I remember my FX 5500 used to have horrible smoke lag in KotOR. That card sucked.
 
^ got that right. Mine was a PCI version too :( It was just a hold-over card until I got some cash, after my X1900 XTX died for the second time.
 
all the fx cards sucked :p

heh QFT, that was the era that gave ATI the opening to put itself on the map. The FX cards also date from the time when Nvidia was specifically writing drivers to detect benchmarking programs, which made their cards use various "cheat" methods to get higher scores (dunno if the companies still do this?)

Anyway, most professional benchmark testing is run with AA disabled entirely (for understandable reasons). Does anyone have links to, or their own, benchmarks comparing AA on vs. off? (Preferably in Warhead :p)
 
^^

what???

Nvidia has nothing to do with 3DMark.. if Futuremark wants to add points for PhysX, that's on them.

They bought Ageia out and added PhysX support to their current lineup of GPUs, which already had physics support. Not for points. The PhysX drivers do absolutely nothing for my score.
 
No, remember, they got a lot of flak earlier this year because it was found that some files in 3DMark had been overwritten with Nvidia files, or something to that effect. It had to do with PhysX. I don't remember the particulars, but I know there was a stink about it.
 
"Questionable tactics
Nvidia's GeForce FX era was one of great controversy for the company. The competition had soundly beaten them on the technological front and the only way to get the FX chips competitive with the Radeon R300 chips was to greatly optimize the drivers.

Nvidia historically has been known for their impressive OpenGL driver performance and quality, and the FX series certainly maintained this. However, with regard to image quality in both Direct3D and OpenGL, they aggressively began various questionable optimization techniques not seen before. They started with filtering optimizations by changing how trilinear filtering operated on game textures, reducing its accuracy, and thus visual quality.[9] Anisotropic filtering also saw dramatic tweaks to limit its use on as many textures as possible to save memory bandwidth and fillrate.[9] Tweaks to these types of texture filtering can often be spotted in games from a shimmering phenomenon that occurs with floor textures as the player moves through the environment (often signifying poor transitions between mip-maps). Changing the driver settings to "High Quality" can alleviate this occurrence at the cost of performance.[9]

Nvidia also began to clandestinely replace pixel shader code in software with hand-coded optimized versions with lower accuracy, through detecting what program was being run. These "tweaks" were especially noticed in benchmark software from Futuremark. In 3DMark03 it was found that Nvidia had gone to extremes to limit the complexity of the scenes through driver shader changeouts and aggressive hacks that prevented parts of the scene from even rendering at all.[10] This artificially boosted the scores the FX series received. Side by side analysis of screenshots in games and 3DMark03 showed noticeable differences between what a Radeon 9800/9700 displayed and what the FX series was doing.[10] Nvidia also publicly attacked the usefulness of these programs and the techniques used within them in order to undermine their influence upon consumers. It should however be noted that ATI also created a software profile for 3DMark03.[11] In fact, this is also a frequent occurrence with other software, such as games, in order to work around bugs and performance quirks. With regards to 3DMark, Futuremark began updates to their software and screening driver releases for these optimizations.

Both Nvidia and ATI have optimized drivers for tests like this historically. However, Nvidia went to a new extreme with the FX series. Both companies optimize their drivers for specific applications even today (2008), but a tight rein and watch is kept on the results of these optimizations by a now more educated and aware user community. "

GeForce FX Series - Wikipedia, the free encyclopedia

the wiki entry has a whole bunch more links to the sources it cites
 