GTX 580 Reviews

And I'm not surprised to see that nVidia's presumably cherry-picked results don't stand up to real-world tests; all companies do it, and it's not a bad thing by any means.

Actually, it is a bad thing, in that it is not realistic compared to what the average consumer will get from an off-the-shelf unit. It's like those pictures burger chains use: no one ever gets a burger that looks like the picture, because it is an idealistic representation, not a reflection of what will be sitting in the warmer when you walk through the door.

Give me a company that shows the average performance of their product and I will buy from that company without hesitation.
 
Found this a bit interesting.

NVIDIA GeForce GTX 580 Barely Faster than GTX 480 at Same Clock Speeds - Legit Reviews

[Chart: metro_same.jpg — Metro 2033 FPS, GTX 580 vs. GTX 480 at the same clock speeds]


We are seeing a 1.33 to 1.66 FPS increase from just the extra CUDA cores, which translates to only a 3.5% boost in performance in Metro 2033 at 1920x1200. It looks like going from 480 to 512 CUDA cores is minor, and the bulk of the performance boost comes from the higher clock frequencies the card ships with. NVIDIA claimed up to a 30% performance increase, and we saw a 31% increase in 3DMark Vantage, but in real games the most we saw was 18% during the testing we did for our review. What are your thoughts?
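For a bit of context, here's a quick back-of-envelope sketch (just illustrative; it assumes the cards' reference specs of 480 cores at a 700 MHz core clock for the GTX 480 and 512 cores at 772 MHz for the GTX 580) showing why ~18% in games is roughly what perfect scaling would predict:

```python
# Back-of-envelope check on the numbers quoted above.
# Assumed reference specs: GTX 480 = 480 CUDA cores @ 700 MHz core clock;
#                          GTX 580 = 512 CUDA cores @ 772 MHz core clock.

cores_gain = 512 / 480 - 1                            # ~6.7% more shader hardware
clock_gain = 772 / 700 - 1                            # ~10.3% higher core clock
combined   = (1 + cores_gain) * (1 + clock_gain) - 1  # ~17.6% if both scaled perfectly

print(f"extra cores alone : {cores_gain:.1%} theoretical (review measured ~3.5%)")
print(f"clock bump alone  : {clock_gain:.1%} theoretical")
print(f"both combined     : {combined:.1%} theoretical vs ~18% measured in games")
```

In other words, the 32 extra cores buy only about 6.7% even in theory, so most of the headline gain has to come from the clock bump, which lines up with what Legit Reviews measured.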

I'm guessing the reason is that most games and applications can't make use of all the extra processing power? That's just a guess.

Plus, let's see some more games tested, not just one.
 
"Give me a company that shows the average performance of their product and I will buy from that company without hesitation."

Any good Hi-Fi manufacturer.

Actually, Rotel have amps they rate at 100 watts per channel that have been tested up to 600 watts per channel into 1 ohm with no problem.
 
"Give me a company that shows the average performance of their product and I will buy from that company without hesitation."

Well, you wouldn't know if they did, so you'd just assume their product wasn't as good.
 
"Give me a company that shows the average performance of their product and I will buy from that company without hesitation."

I suppose it is a bad thing, but it stopped mattering to me because I automatically adjust my expectations of what the numbers really mean. So if a company were to be honest, I wouldn't notice and would probably assume their product was a fair bit worse :p (edit: as Druid said)
 
Vantage takes more advantage of raw GPU power than any game, so going from 30% claimed to 18% in games really is not that bad. I suppose whiners will be whiners anyway. Besides, they claimed 30%, and we got 31% in Vantage. I really don't see the problem with this.
 