How important is Shader Model 3? (deciding between the ATI X850XT and NVIDIA 7600GT)

Status
Not open for further replies.
Beefcake said:
But I still think the X1800GTO is very overlooked on these forums; Nvidia fanboyism is pretty dominant here.

I agree with you. If I get the job that I'm interviewing for tomorrow, I'll spend the extra $50 and get the HIS X1800GTO, because I hear that all of them perform like the one in this review on VR-Zone.

Basically, it says you can overclock it and reach ~9500 in 3DMark05. The average 7600GT overclock scores ~6500.
 
I wonder how many people have X1900XTs here. There are 5-6 times more 7900s than X1900s, I'm sure.
 
IMO the X1900XT/X is the best single-GPU card on the market; you can get a HIS X1900XT with the IceQ3 cooler for $420 after a $10 mail-in rebate on Newegg. People keep repeating (like nubius said) that ATI's drivers suck. That was largely true for the X series of cards, but not for the X1K series. Nvidia's drivers may still be somewhat better, but that's like saying that because the $60k car is better than the $50k car, the $50k car sucks. That attitude is also why people buy the best of the best and spend too much money on it; just look at SLI, or, god, even quad SLI. ATI's X1K drivers are pretty good, a huge step up from the X series.

People keep recommending Opterons because everybody else recommended them. I know this is true because someone actually said "I was following the crowd," or something to that effect.

The X1900GTO isn't a bad card, except that it costs more than the 7900GT, and ATI really screwed it up. They disabled 12 of the 48 pixel shaders and underclocked it from standard X1900 clocks. Maybe they didn't want people buying the X1900GTO and overclocking it to XT/XTX speeds, but they should have kept all the shaders, since that's what puts the X1900XTX next to the 7900GTX.

I remember when people kept saying they were not going to overclock, and others would reply, "Ooh, you will overclock; you may think you won't, but you will, so get the Opteron," even though the 3800+ is better (and cheaper) than a stock Opteron 165. That's almost as bad as when Green Radience told me (and others) things like "he won't do this" or "he won't do that." YOU ARE NOT HIM! DON'T SAY WHAT OTHERS WANT OR WILL DO, BECAUSE YOU ARE NOT THEM!

I would just go with SM3.0: get either the X1800GTO or the 7600GT. Either will make you happy, and they both have SM3.0.

X1K drivers don't suck
The X1800GTO can do HDR+AA, AAA, and HQAF
The X1800GTO handles AA/AF great, like many ATI cards do (an X1800GTO runs at almost exactly the same speed as the 7600GT in Doom 3 with AA/AF).

But the 7600GT is a good card and plays FEAR better than the GTO, though that isn't quite as important as having better framerates outdoors in Oblivion (where framerates can easily drop to unplayable levels).
 
Trifid said:
No it doesn't; the X1600XT has 12 pixel pipelines and 5 vertex shaders. I had a bad experience with the ATI X1600 Pro (restarts, and my god, the drivers couldn't be worse if they tried). ATI, however, has the technology, but it isn't available in these cards.

SM3 is pointless in multiplayer games, as it is distracting and you can't see the enemy at the end of a tunnel. In single player it is great.


Just do a Google search. I don't understand how you can post before searching.

The X1600XT has 4 pixel pipelines and 12 pixel shaders.

That's the same ratio as the X1900XT:

the X1900XT has 16 pixel pipelines but 48 pixel shaders.


The X1600XT has 4 pixel pipelines and 12 pixel shader units. It's pretty easy to see how a card with that kind of setup will do in games. Compare the X1600XT to the X800 to see what I mean. Both cards have the same memory bandwidth, and they both have the same number of pixel shaders. The only major difference is in the core speed, which is 600MHz on the X1600XT and 392MHz on the X800. With the 53% increase in core speed, you would expect the X1600XT to be at least 20% faster in games, right? This article from Anandtech.com says otherwise. The X1600XT is usually beaten by or equal to the X800, and sometimes even the 8-pipeline 6600GT. The only exception is Battlefield 2, where the X1600XT completely destroyed both cards (not bad for a 4-pipeline card). The X1600XT has 53% more pixel processing power on the core, but can barely compete with cards that have the full 12 pipelines. In fact, the card is closer in performance to an 8-pipeline card with those same specs, suggesting that a single pipeline in a setup like the RV530 is about as good as 2 pipelines in the old pixel pipeline setup.
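The 53% figure above follows directly from the quoted clocks and shader counts; here's a back-of-the-envelope sketch (the formula `shaders × clock` is just a rough theoretical throughput estimate, not a benchmark):

```python
# Rough theoretical pixel-shader throughput: shader units * core clock (MHz).
# This is a simplification for illustration; real performance depends on
# memory bandwidth, texture units, and driver efficiency.
def shader_throughput(shader_units, core_mhz):
    return shader_units * core_mhz

x1600xt = shader_throughput(12, 600)  # 12 shaders at 600 MHz
x800 = shader_throughput(12, 392)     # 12 shaders at 392 MHz

increase = (x1600xt / x800 - 1) * 100
print(f"X1600XT theoretical shader advantage: {increase:.0f}%")  # ~53%
```

Since both cards have the same shader count, the ratio reduces to the clock ratio (600/392 ≈ 1.53), which is where the 53% comes from.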

http://www.gamespot.com/pages/unions/read_article.php?topic_id=24070333&union_id=1927



Also see here:

http://www.motherboards.org/reviews/hardware/1613_3.html

http://en.wikipedia.org/wiki/Comparison_of_ATI_Graphics_Processing_Units
 
Well, I spent the extra $50 and got the X1800GTO, then overclocked it to 730/725 and scored 9250 in 3DMark05. So I'm very happy.
 