Is this performance ok? (regarding CS:S Performance test)

Nubius said:
Nah, probably the simple fact that you don't have any AA on at all. I didn't stress test with no AA and trilinear AF, so I don't know what I would have gotten there, but AA will most definitely take a nice chunk out of the frame rate. Try that and see what kind of difference it makes. It also seems like CS:S does a decent job of testing the CPU and RAM, and the AMD64's memory bandwidth is a lot better than Socket A's.

yea, you're right

4xAA 16xAF - 99.82/100.22
4xAA 8xAF - 100.62
0xAA/6xAA (6 reverts to 0)/16xAF - 122.34
no AA/trilinear AF - ~129-131
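
Quick back-of-the-envelope sketch of what each setting costs, taking the no-AA run as the baseline. The ~130 figure is just the midpoint of the 129-131 range, and the 4xAA/16xAF number is the average of the two runs above, so treat it as rough math rather than exact results:

```python
# Rough cost of each AA/AF setting vs. the no-AA / trilinear baseline (~130 fps,
# midpoint of the 129-131 range above).
baseline = 130.0

runs = {
    "4xAA / 16xAF": 100.0,    # average of the 99.82 / 100.22 runs
    "4xAA / 8xAF":  100.62,
    "0xAA / 16xAF": 122.34,
}

for name, fps in runs.items():
    fps_drop = 100 * (1 - fps / baseline)       # % of frame rate lost
    extra_ms = 1000 / fps - 1000 / baseline     # extra milliseconds per frame
    print(f"{name}: -{fps_drop:.1f}% fps, +{extra_ms:.2f} ms/frame")
```

So roughly 23% of the frame rate goes to 4xAA while 16xAF on its own only costs about 6%, which lines up with what Nubius said about the AA being what takes the chunk out.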
 
dale5605 said:
ApM, maybe it is because you overclocked too high. 3.4 GHz is quite a large overclock for an AMD, I think.

Huh? I'm confused :-s, I have not overclocked my AMD :-s
 
Yeah, if you have an AMD64 3400+ you should label it as such. 3.4 GHz is just misleading.

ApM said:
yea, you're right

4xAA 16xAF - 99.82/100.22
4xAA 8xAF - 100.62
0xAA/6xAA (6 reverts to 0)/16xAF - 122.34
no AA/trilinear AF - ~129-131
Pretty damn good FPS dude. You're running your 6600GT at stock right? 500/1000 for you?

4xAA and 16xAF and still getting 99 is pretty dang good. Even if I raised my memory to 1000 I don't think it'd do that.

You get 8200 in 3dmark03 stock, I have to OC my card to get that.

564/1101 on the card, DDR400, CPU @ 2.5 GHz, 71.84 drivers - 8615 in 3DMark03 (Build 360)

That's the best I've been able to do in 3DMark03, which wouldn't be too hard for you to beat. Looks like the memory bandwidth of the AMD64s really helps, because I know the CPU isn't giving the boost here with yours only being a 3200+.
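
Rough numbers behind that, as a sketch only (the per-channel DDR400 figure and the 6600GT having a 128-bit memory bus are assumptions on my part):

```python
# Theoretical peak bandwidth = effective transfer rate (MT/s) x bus width (bytes).
def peak_gbs(effective_mts, bus_bits):
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

# System RAM: DDR400 = 400 MT/s on a 64-bit channel -> ~3.2 GB/s per channel.
print(f"DDR400, one channel:        {peak_gbs(400, 64):.1f} GB/s")

# 6600GT video memory, assuming the usual 128-bit bus:
print(f"6600GT @ 1000 MHz (stock):  {peak_gbs(1000, 128):.1f} GB/s")
print(f"6600GT @ 1101 MHz (the OC): {peak_gbs(1101, 128):.1f} GB/s")
```

The OC on the card only buys about 10% more memory bandwidth, and the Athlon 64's on-die memory controller mainly cuts latency compared to going through a Socket A northbridge, which is probably part of where the gap comes from.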
 
The trick is seeing how many fps you get on 6x temporal aa.

I like how people say Nvidia is better just because they have Pixel Shader 3.0. You don't see anyone saying that ATI is better just because their cards can run 12x AA. One small feature that makes little difference shouldn't factor in so heavily.
 
Because, simply put... who cares? It's not like you can even tell the difference between 6x and 12x, besides your framerate dropping to hell.

Regarding Pixel Shader 3.0, when that's being used as Nvidia's sole argument I apply "who cares?" to that too, so I'll say it to both companies.

My biggest beef is people complaining about 10 FPS... who freakin' cares... oh no, his gets 150 FPS and mine gets 140... yet my eyeball considers anything above 40 FPS smooth :whhaaaaaaa:
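
Putting that in frame-time terms (just a quick sketch of the arithmetic):

```python
# Frame rate -> frame time, to show why 140 vs 150 fps is hard to notice
# while anything above ~40 fps already looks smooth.
for fps in (40, 60, 140, 150):
    print(f"{fps:3d} fps = {1000 / fps:5.2f} ms per frame")

print(f"150 vs 140 fps: {1000 / 140 - 1000 / 150:.2f} ms per frame difference")
print(f" 60 vs  40 fps: {1000 / 40 - 1000 / 60:.2f} ms per frame difference")
```

That 10 fps people argue about is less than half a millisecond per frame at those rates.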
 
I don't lose out on much fps though, and I don't play at high framerates either. I usually run the highest settings between 40-60 fps (vsynced at 60), and that is with 12x AA, so 12x AA doesn't make the fps unbearable... I also use things like geometry instancing and alternate pixel center, and the quality is great. When I run everything low I get like 250 fps.
 
Nubius said:
Yeah, if you have an AMD64 3400+ you should label it as such. 3.4 GHz is just misleading.

Pretty damn good FPS dude. You're running your 6600GT at stock right? 500/1000 for you?

4xAA and 16xAF and still getting 99 is pretty dang good. Even if I raised my memory to 1000 I don't think it'd do that.

You get 8200 in 3dmark03 stock, I have to OC my card to get that.

564/1101 on the card, DDR400, CPU @ 2.5 GHz, 71.84 drivers - 8615 in 3DMark03 (Build 360)

That's the best I've been able to do in 3DMark03, which wouldn't be too hard for you to beat. Looks like the memory bandwidth of the AMD64s really helps, because I know the CPU isn't giving the boost here with yours only being a 3200+.

Weird, yesterday I ran stock and got ~99-100 for 4xAA/16xAF, but I just tested and I'm getting 90-91 (both on 76.10).

I got 7999 in 3DMark03 on the 66.93 drivers and 8218 on 76.10 (all stock).

Edit: I've also played CS:S on my friend's Dell (Intel Extreme integrated graphics :D), and he gets 26 fps and it's still playable.
 