NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

re: NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

By that standard all video cards are 'inefficient'; an 'efficient' GPU would experience little or no power loss. Where do YOU draw the line?

Ultimately it's not a deal breaker for me if I have to pay for an extra 300W with a pair of 480s over a pair of 5870s. Everyone's gonna draw their 'line' somewhere different, where the performance gain won't justify the price they're paying. Even if you earn just $12.50 an hour, you could buy one of these in a week, and to pay for the extra power you'd have to work 2 hours more a MONTH.
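Just to sanity-check my own claim, here's the back-of-envelope math as a quick C snippet (the 24/7 usage and the roughly $0.11/kWh rate are assumptions I'm making for illustration, not figures from any review):

/* Rough cost of the extra ~300W of a pair of 480s over a pair of 5870s.
   Assumptions (mine): the full 300 W difference runs nonstop all month,
   and electricity costs about $0.11/kWh. */
#include <stdio.h>

int main(void)
{
    double extra_watts = 300.0;                         /* 2x GTX 480 vs 2x HD 5870 */
    double hours       = 24.0 * 30.0;                   /* worst case: on 24/7      */
    double kwh         = extra_watts * hours / 1000.0;  /* 216 kWh per month        */
    double cost        = kwh * 0.11;                    /* about $23.76 per month   */
    printf("%.0f kWh -> $%.2f per month (~2 hours at $12.50/hour)\n", kwh, cost);
    return 0;
}

Even in that worst case it lands right around two hours of pay at $12.50 an hour, and nobody games 24/7.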

I just want to be able to play the games I have and will have, do the work I do, etc.
To me, at the end of the day, if I'm running my games faster and they're looking better than yours and Photoshop is working faster, then I couldn't give two hoots if you come up to me and say 'well MY system is using less power'.

edit: btw, sorry if this post comes off a bit rough. I'm not trying to be a dick or anything, and you make a good point.

What work will GTX 480s help you get done faster than HD 5870s? Photoshop does not support CUDA, and the few plugins that do support it are extremely expensive and only work on Quadros. Video encoding doesn't count either, as the current GPGPU applications are widely considered to produce inferior-quality results compared to CPU encoders.

I draw the line when one GPU draws considerably more power for a negligible performance increase.

And that score is with the card crippled. ATI has had plenty of time to work on enabling GPU physics calculations; they claimed they could do it back in 2006, and they even said it had merit.
ATI - Effects Physics & Gameplay Physics Explored | [H]ard|OCP

Until a PhysX game manages to beat BF:BC2's Havok physics and Crysis' CryPhysics, both of which are CPU-based, GPU-accelerated physics is irrelevant imo.
 
re: NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

Until a PhysX game manages to beat BF:BC2's Havok physics and Crysis' CryPhysics, both of which are CPU-based, GPU-accelerated physics is irrelevant imo.

If ATI would enable GPU physics calculations, then we would see more games utilizing them, and CPU-based physics calculations would go away.

Godfrey Cheng said:
The same rule applies to effects physics, only on a different scale. As developers implement physics they'll judge what is the most immersive physics simulation they can produce that still yields playable frame rates. Because of the massively parallel architecture of ATI's GPUs, adding even 5,000 objects to a scene doesn't result in a drop in frame rates. Bumping things up to 10,000 objects shows a slight drop in frame rates, but it's nothing significant – the performance decline is not as linear as you'd see from the CPU. Today our early physics demos show that at high resolutions, we can simulate the collisions of 20,000 boulders running at frame rates well over 100 FPS.

I like the next paragraph even better.
Godfrey Cheng said:
As an interesting aside when talking about taxing the GPU with physics processing, physics acceleration actually allows gamers to get the full benefit of the X1K architecture. With physics processing, you have that geometric complexity that can push the vertex shaders to their limits, and the complex parallel processing needs that maximize the dedicated branching processor of ATI's GPUs. In essence, you put the whole architecture through its paces, which, as a gamer, you like to hear as it means you're getting the most value from your graphics card purchase.

It just seems to me that ATI is hindering technology to create controversy.
 
re: NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

Folders are going to use these, I think. Did you see that link I posted? It already has Fermi support, but still no 5xxx support? What's with that?
 
re: NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

What's with the 2006 articles? It's 2010 and a lot has changed since then.

ATI cards are plenty capable of implementing PhysX, and quite good at it too, but Nvidia bought Ageia and made PhysX exclusive to NVIDIA cards. Then they basically paid a bunch of game developers to implement PhysX in their games. This was their attempt to create a PhysX standard that would require a gamer to purchase an NVIDIA card.

Sadly, some morons actually fell for this... PhysX is dying and being replaced with OpenCL calculations and Havok, as it doesn't matter what kind of card you have for those.

It's just another example of how NVIDIA plays dirty.
 
re: NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

If ATI would enable GPU physics calculations, then we would see more games utilizing them, and CPU-based physics calculations would go away.

It just seems to me that ATI is hindering technology to create controversy.

Any game developer that wanted to could implement GPU-accelerated physics using OpenCL or DirectCompute, which would run on both ATI and Nvidia cards.
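For what it's worth, a vendor-neutral effects kernel doesn't have to be complicated. Here's a minimal sketch in OpenCL C of a per-particle integration step; the kernel name, the naive Euler integrator, and the ground-plane bounce are my own illustration, not code from any shipping engine:

/* Hypothetical minimal effects-physics kernel: one Euler integration step
   per particle, one work-item per particle. The same source compiles
   through ATI's or Nvidia's OpenCL driver. */
__kernel void integrate(__global float4 *pos,
                        __global float4 *vel,
                        const float dt)
{
    int i = get_global_id(0);      /* this work-item's particle index   */
    vel[i].y -= 9.81f * dt;        /* apply gravity                     */
    pos[i]   += vel[i] * dt;       /* naive Euler position update       */
    if (pos[i].y < 0.0f) {         /* bounce off a ground plane at y=0  */
        pos[i].y  = 0.0f;
        vel[i].y *= -0.5f;         /* lose half the energy on impact    */
    }
}

Nothing in there cares which vendor made the GPU, which is exactly the point.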

Folders are going to use these, I think. Did you see that link I posted? It already has Fermi support, but still no 5xxx support? What's with that?

I can't say I didn't expect that to happen. Folding seems to have become more of a Nvidia marketing ploy than anything else. If it were really all about the science, it seems like they would want to make the most of all the hardware they could, instead of deliberately ignoring millions of the most powerful cards on the market.
 
re: NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

Hmmm... WOT flagged AnandTech as an attack page.

Anyway, here's the Folding benchmark:
[image: Folding benchmark chart, 22218.png]


You definitely need to Fold with them, Slay.
 
re: NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed!

Exactly, which is what most are doing, but a few are accepting money from NVIDIA to go exclusively with PhysX.
 