GTX 260 Core 216 vs. HD 4870 1GB

Wouldn't 40 fps be just fine, and probably just as good as 70 as far as the human eye perceives it? I think I'd be fine with 35 and over...

EDIT: And just how much better do games run with Crossfire than with SLI? And I've never heard about SLI being more error-prone than Xfire... I don't think I screwed up too badly with the C216, though, even though I had been planning on the 4870 1GB for months and changed my mind in the last three days...
 
Yes, because the human eye can't tell the difference past 30; it'd look smooth... (well, for some games the smooth feeling comes at 35 or 40 FPS, depending on the engine)
 
You'll be fine, and you really just want the higher FPS (over 35) for the intense parts where the action drops it considerably. See, I'm always straddling 35 FPS, and any time my computer has to render more of the upcoming environment or I get into a firefight, frames drop to the lower 20s :| (mostly a CPU bottleneck, though)
 
Just curious: why do people want more than 30 FPS out of any game? The human eye cannot see more than that anyway.
 
Well, it's 30 fps on average, which means a lot of the frames fall below that.

But probably most importantly, because it makes people feel better that they spent their money on something very powerful to get the extra "necessary" frames.
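
To put a number on the "average hides the lows" point, here's a quick sketch with made-up frame times (hypothetical numbers, not real benchmark data) showing how a 30 fps average can coexist with much worse moments:

# Hypothetical per-frame render times in milliseconds (illustrative only).
frame_times_ms = [25, 25, 30, 28, 55, 70, 22, 26, 24, 28]

# Average FPS = total frames / total time in seconds.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
# The worst moment is set by the single slowest frame.
worst_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} FPS")         # average: 30.0 FPS
print(f"slowest frame: {worst_fps:.1f} FPS") # slowest frame: 14.3 FPS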
 
You can't see more, but the game is more fluid.

Also, it gives you headroom for areas that get intense, where the frame rate can dip below 30 fps very, very quickly.

I like to stay in the 60 fps range myself... anything past that is just a waste, as that's the maximum my screen can display.
 
OK, SERIOUSLY, LEARN TO LOOK AT SOME BENCHMARKS. Even three GTX 280s can only do 80 FPS. Two 4870s will do decent, but not 70!!!

I average 67-71 FPS, all high, no AA, at 1280x1024 with the two 8800GTs in SLI, so how can two 4870s not get 70? Makes no sense :confused:

Also, the guy above is right: intense action sequences drop to about 23 fps, which is not good at all, and anything less than 45 fps in some games is very choppy, but when you reach 55+ FPS the screen is smooth!
 
You can tell the difference between 30 fps and 60 fps. At 30 fps there is no lag, but the frames are still choppy, and there might be "screen lag." At 60 fps, lag is reduced, the frames become "smooth," and the "screen lag" is reduced. Now, if you're looking at a CRT, which can hit 120 fps, you can totally tell a difference between 60 fps and 120 fps. At 120, there is almost zero lag, or "screen lag," and the frames are more than smooth.

You could compare FPS to the HDTV world: a lot of people say 720p and 1080p are basically the same, but if you put them side by side and compare, 1080p is way better. Same thing with monitors: if you put 30 fps beside 60 fps, you can tell the difference.
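
The "lag" difference here is really just frame-time arithmetic: each frame at a given rate stays on screen for 1000/FPS milliseconds. A minimal sketch of the numbers behind the 30/60/120 comparison above:

# Frame time per frame: time (ms) = 1000 / FPS.
for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")

# 30 FPS -> 33.3 ms per frame
# 60 FPS -> 16.7 ms per frame
# 120 FPS ->  8.3 ms per frame
# Each doubling halves how old the newest frame can be when you see it.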
 
Absolutely, that's what I was trying to say ;)
 
The HDTV world can't be compared with the PC world. The max resolution on a retail TV is 1920x1080, and that includes the massive 106" plasmas and projectors. In comparison, my screen is 24" and has a resolution of 1920x1200; the pixels are much smaller, allowing more detail. You can get 2560x1600 screens that are only 20". The difference is too vast, and for a TV the resolution you choose depends on the size of the screen. Put a cheapo 40" 1080p next to an equally priced 20" 720p screen and the 720p will win, as it's relatively more expensive in its respective market placement and therefore should have better contrast, brightness, backlight, and pixel quality.

A TV generally has to depend on a feed of 25-30 fps, and the image is calibrated to correlate. Monitors can be fed an almost unlimited amount of data, but normally the refresh rate restricts it to 75 fps; try to give it too many frames and you can max out the buffer and experience tearing. Everyone's different: I like to play Crysis around 20-25 fps and an RTS like Age of E at 40-50 fps; when it's around 70-75 it's too much, I don't like it, and it irritates my eyes, unless I turn on the "perfect motion" function and every other frame is black. The general rule is that human eyes cannot supply the brain with more than 60 fps; just like a computer, you need memory buffers, cable bandwidth, and processing time. The brain isn't an infinite supercomputer, and it has the limitation/benefit of non-linear processing. It will be a long time until we can make a program complex enough to have true AI with free thought, i.e., ask it for a random number and it will be truly random.
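
On the tearing point: the usual fix is to cap frame delivery at the display's refresh rate (vsync does this in hardware). A minimal software frame-limiter sketch, assuming the 75 Hz refresh mentioned above and a hypothetical render_frame() standing in for the game's actual rendering:

import time

TARGET_FPS = 75                  # assumed refresh rate from the post above
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def render_frame():
    pass  # placeholder for the game's actual rendering work

# Render, then sleep off whatever is left of the frame budget,
# so the display is never fed more frames than it can show.
for _ in range(1000):
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)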
 