another 6600GT thread - ATTN nvidia fans

Status
Not open for further replies.

macdawg

Daemon Poster
Messages
813
Some ATI fanboys are saying that now that there is a 128 MB X800, the 6600GT is dead.

I told them to look at benchmarks of the 6600GT at www.anandtech.com and pointed out that ATI doesn't have Pixel Shader 3.0, and they said this:

"Not that much of a difference between PS2.0 and PS3.0 graphics-wise. PS3.0 allows for more registers, so it's easier to code for. By the time PS3.0 games come out, current cards will barely be able to run them, so buying a 6800GT right now to play PS3.0 games when they come out next year is useless. For example, Unreal 3 has been tested on the 6800 Ultra and the X800 XT, and they ran at around 5-10 fps with all the eye candy on.
http://www.extremetech.com/article2/0,1558,1583548,00.asp"

"PS4.0 will be the big graphics jump."

"you may have been tricked by PR stunts like this one:
http://www.pcper.com/comments.php?nid=145"

"There is absolutely nothing Pixel Shader 3 does that Pixel Shader 2 cannot do. Pixel Shader 3 allows for longer code; that's it."

"There are two good reasons why the X800 (128 MB) has retired the 6600GT:
1. 12 pixel pipelines (vs. 8)
2. No overheating and BIOS problems"
 
lol

The articles linked are nearing a year old, the X800 is hardly anything new, and claiming its superiority over the 6600GT means nothing considering the X800 is a high-end card and the 6600GT is a midrange card.

It's common knowledge that PS3.0 is a very demanding shader model, and even cards like the 6800GT and Ultra may prove futile in games that use it. So what do you propose as the solution? PS4.0, an even more demanding shader model that has absolutely no support as of yet! Every newly introduced platform is usually rather inefficient until the next big breakthroughs occur, and the G70 and R520 will most likely be able to handle PS3.0 with relative ease.

In addition, with dual-GPU cards on the horizon, PS3.0 may be demanding for existing cards, but within a year, probably within a few months, it'll be the mainstream platform and cards will be able to handle it.
 
Well put, Gaara. These two cards are in completely different price ranges, which puts them in completely different performance ranges.
 
Meh, is all I have to say to them. They're just pissed that they spent a ton of money on a flagship card that doesn't even support the latest graphics features that will be in a game like HL2, which was supposed to be built specifically for ATI, haha.
 
They are talking about a 128 MB Radeon X800, saying it's the same price and that it's better!
 