AMD Athlon 64 3000+ and ATI Radeon X700 Pro

beedubaya said:
Who cares if it gets like 10 fps higher in CSS if the image quality is utter crap in new games because it lacks the proper shaders? Shader Model 3 is the most important aspect of modern graphics... go ahead and get an ATI X-series card, but don't say you weren't warned. The R520, a.k.a. the X900, will be a good card when it comes out and will have the shaders to run modern games. Wait for that if you must go ATI.
You're overrating Pixel Shader 3.0 way too much; I don't know how you don't realize this. That's like saying everybody needs to buy a 64-bit processor because all future software will use it.
Stop kidding yourself, kid. Barely anything uses that stuff, and barely anything will, because it hardly does anything, so who cares? It won't come even close to mainstream in games for at least another 3 years.
 
3 years... what a statement, lol. All major titles coming out this year and hereafter will be titles in which SM3.0 is ESSENTIAL, including F.E.A.R., S.T.A.L.K.E.R., HL2 "The Lost Coast", HL2 "Aftermath", and Unreal Tournament 2007. In fact, UT2007 will be so SM3-intensive it's ridiculous. Whatever you want to think, the SM3 age is upon us, and like I said, the X900 will be a good card and will have SM3 support, but getting an X800-series card is a complete waste of money.
 
beedubaya said:
3 years... what a statement, lol. All major titles coming out this year and hereafter will be titles in which SM3.0 is ESSENTIAL, including F.E.A.R., S.T.A.L.K.E.R., HL2 "The Lost Coast", HL2 "Aftermath", and Unreal Tournament 2007. In fact, UT2007 will be so SM3-intensive it's ridiculous. Whatever you want to think, the SM3 age is upon us, and like I said, the X900 will be a good card and will have SM3 support, but getting an X800-series card is a complete waste of money.
Everything you just mentioned is a scrub title.
And I didn't waste my money, because I'm sitting here getting 70 fps with 12x AA, 16x AF and whatever else, playin' some AAO, just chilling and owning some noobs. And I couldn't give a shit about S.T.A.L.K.E.R. or whatever other crap, thank you.

And not only that, but even if you did want to play one of those titles, you could play them just fine on a PS 2.0 card, because PS 3.0 hardly makes any difference. So because my ATI card is superior, it would still probably beat the crap out of the Nvidia if I wanted it to. I will leave you with this excerpt, and I hope you feel like a noob.

"The step from 2.0 towards 3.0 is a rather small one and most Shader Model 2.0 games can easily be upgraded towards Model 3.0, which means more performance. DirectX 9 is now updated and we are going to see more support for 3.0 Shaders. Is it a huge visual advantage over 2.0? Not even the slightest bit. Yet any technological advantage is always welcome and preferred over a previous generation development. The general consensus for developers is to use as low a shader version as possible. Shaders 3.0 will be used only in several critical places where it gives a performance boost."
Source: Guru3D.com
http://www.guru3d.com/article/Videocards/184/3/
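For what it's worth, the performance boost that excerpt mentions mostly comes from SM3.0 adding real flow control to pixel shaders. Here's a rough CPU-side analogy in C++, not actual shader code and with made-up numbers, just to show why being able to skip work per pixel can help in those "critical places":

#include <cstdio>

struct Pixel { float shadow; float light[4]; };

// SM2.0-era style: no real per-pixel branching, so every light term is
// evaluated even for pixels that end up fully shadowed.
float shade_no_branch(const Pixel& p) {
    float sum = 0.0f;
    for (int i = 0; i < 4; ++i) sum += p.light[i];
    return sum * p.shadow;
}

// SM3.0-style early out: a fully shadowed pixel skips the lighting work.
float shade_with_branch(const Pixel& p) {
    if (p.shadow <= 0.0f) return 0.0f;
    float sum = 0.0f;
    for (int i = 0; i < 4; ++i) sum += p.light[i];
    return sum * p.shadow;
}

int main() {
    Pixel lit  = {1.0f, {0.2f, 0.3f, 0.1f, 0.4f}};
    Pixel dark = {0.0f, {0.2f, 0.3f, 0.1f, 0.4f}};
    std::printf("lit:  %.2f vs %.2f\n", shade_no_branch(lit),  shade_with_branch(lit));
    std::printf("dark: %.2f vs %.2f\n", shade_no_branch(dark), shade_with_branch(dark));
    return 0;
}

Same results either way; the branching version just avoids the lighting loop for pixels that don't need it, which is where the quoted boost comes from.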
 
Yeah, except hardly anything uses 2.0, duh.
Almost all games now use 1.1, and right now the 6600GT and 6800GT kick the shit out of everything else for the money. I could hardly give a shit if the X800 XT-PE gets 5 fps higher than the 6800 Ultra 512 MB when the Ultra will last another 6-12 months, because it has Pixel Shader 3.0, which is far better than 1.1, since NO ONE uses 2.0.
Oh, AND the fact that the 6x00 GTs are WAY better for the money.
I hope you feel like a noob.
|\|00|3

P.S. Sorry for hijacking the thread. Get an eVGA 6600GT and a 3200+ Venice core; best value for the money right now (if you have the cash, get a 6800GT).

P.P.S. AMDs have lower clock frequencies, but they do more with each clock cycle. That's why they're named xx00+: move the decimal over three places and you get the Intel clock speed they're equivalent or better to in normal programs, and AMDs perform way better in games than Intels. e.g. 3500+ = 3.500 = a 3.5+ GHz Intel in apps.
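A quick sketch of that rule of thumb in C++ (just the rough rating-to-GHz conversion described above, not an official AMD formula):

#include <cstdio>

int main() {
    // Athlon 64 model ratings and the "move the decimal over three" conversion:
    // a 3500+ reads as roughly a 3.5 GHz Intel chip in ordinary apps.
    const int ratings[] = {3000, 3200, 3500};
    for (int r : ratings)
        std::printf("Athlon 64 %d+  ~  %.1f GHz Intel-equivalent\n", r, r / 1000.0);
    return 0;
}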
 
What you say would be true... if game developers included both SM2.0 and SM3.0 paths. However, the day of SM2.0 has come and gone, and many developers are skipping straight from 1.1 to 3.0, as we see in Splinter Cell: Chaos Theory. SM2.0 is a real pain in the arse to code for when developers can code in SM3.0 much more easily, get the same effects, and get higher framerates. Plus, Nvidia sponsors many games and doesn't want SM2.0 effects included, so Nvidia hardware is required to get true DX9 effects.
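As a rough illustration of what picking a path looks like, here is a minimal Direct3D 9 caps check in C++. It's a sketch only: it assumes the DirectX 9 SDK headers, and the printed path descriptions are made up rather than taken from any of these games.

#include <cstdio>
#include <d3d9.h>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // A game that ships only SM1.1 and SM3.0 paths checks the card's reported
    // pixel shader version and skips 2.0 entirely, as described above.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        std::printf("SM3.0 path: HDR, parallax mapping, full effects\n");
    else
        std::printf("SM1.1 (DX8-class) fallback path\n");

    d3d->Release();
    return 0;
}

An X800-class card reports pixel shader 2.0, so in a game with no 2.0 path it falls through to the 1.1 branch, which is the whole argument here.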

Plus, who cares whether or not you care about S.T.A.L.K.E.R.? Most people on this forum are drooling over that game.
 
Don't you need a 64-bit processor if you want to use Windows x64? And don't you need SM3.0 to run games like Splinter Cell: Chaos Theory on high? See, what beedubaya is trying to say is that in games like SCCT, a 6600GT can run it in all its DX9.0c glory, complete with HDR lighting, parallax mapping, and functional specular lighting. An X850 XT-PE runs it in DX8.1 with SM1.1 and has NO HDR lighting, NO parallax mapping, and dysfunctional specular lighting.

Not only that, but an X850 XT-PE will be running Lost Coast on LOW, compared to a 6800GT running it on HIGH with all of its HDR glory.

All of this is because Nvidia cards have a GREAT thing called Shader Model 3.0!!!
 
dale5605 said:
Everything you just mentioned is a scrub title.
And I didn't waste my money, because I'm sitting here getting 70 fps with 12x AA, 16x AF and whatever else, playin' some AAO, just chilling and owning some noobs. And I couldn't give a shit about S.T.A.L.K.E.R. or whatever other crap, thank you.

And not only that, but even if you did want to play one of those titles, you could play them just fine on a PS 2.0 card, because PS 3.0 hardly makes any difference. So because my ATI card is superior, it would still probably beat the crap out of the Nvidia if I wanted it to. I will leave you with this excerpt, and I hope you feel like a noob.

"The step from 2.0 towards 3.0 is a rather small one and most Shader Model 2.0 games can easily be upgraded towards Model 3.0, which means more performance. DirectX 9 is now updated and we are going to see more support for 3.0 Shaders. Is it a huge visual advantage over 2.0? Not even the slightest bit. Yet any technological advantage is always welcome and preferred over a previous generation development. The general consensus for developers is to use as low a shader version as possible. Shaders 3.0 will be used only in several critical places where it gives a performance boost."
Source: Guru3D.com
http://www.guru3d.com/article/Videocards/184/3/

Dale,
If the link doesn't automatically take you to the Splinter Cell: Chaos Theory part of the review, please go to it. Enjoy the inferior image quality of the ATi X-series of cards.

http://www.hardocp.com/article.html?art=NzYwLDc=
 
Sw1tCh[FX] said:
Dale,
If the link doesn't automatically take you to the Splinter Cell: Chaos Theory part of the review, please go to it. Enjoy the inferior image quality of the ATi X-series of cards.

http://www.hardocp.com/article.html?art=NzYwLDc=
I can't even tell the difference between the screenshots; granted, I don't have good eyes, but it's not much. And that link tells you plain and simple that SM3.0 offers very little performance edge and that the game runs just fine without Shader 3. Notice how, when they start enabling the extra Shader 3 options, the game starts to run too slowly. So thanks for proving my point with that link.
 
Dale... pff

Far Cry and SCCT are both heavily Shader Model 3 dependent, and so are future games.

And if you're just pwning some n00bs in AAO, you won't see a difference between your X800 XT and a 5900.
 