Coolermaster Black Storm Sniper vs. Antec Red Lanboy

Status
Not open for further replies.
That is quite nifty! I knew there was a way to run one or the other; I was just mistaken in thinking you had to manually switch between integrated and discrete.

I am curious as to what the power draw difference is going to be, though; it might be something worth testing for techies everywhere, if you have a way to measure your system's power draw.
It puts your discrete GPU into idle mode, which should mean it's still drawing roughly 160-170 W (source | source).
So the question becomes: with your GPU always drawing idle power, is your integrated graphics really more power efficient at low-end tasks?
I'm guessing the answer is probably "yes"; otherwise they wouldn't have added it to the motherboard, unless it's just a total gimmick (yeah, I'm a little cynical).

Yes, it sure sounds nice. I do have a couple of those Kill A Watt electricity usage monitors, so I will test and post the wattage results comparing integrated and discrete after I build this new rig. On the other hand, hopefully I can find the answer before I build it, so that if it is a gimmick, I can save myself the heartache and just build a normal machine.

Either way, I fully intend to discover whether this really is an energy-saving bell and whistle.

Thanks for those URLs, but I am not so sure they are super accurate. The reason I question the accuracy and validity of the wattage ratings is that I have already tested both rigs using the Kill A Watt monitor, and nothing is over 150 watts at idle. Example: I just plugged in my 2500K rig, and at idle the entire machine is pulling only 79 W, and a massive 92 W when playing the I, Robot digital movie. I'm not sure how my video card alone could be pulling 162 watts at idle; it doesn't make sense to me.

EDIT: Maybe I just discovered the true wattage draw at idle on the 6870...

I see another area that says this:

Average IDLE wattage for 6870 ~ 19W

19 watts is not bad at all! I think this could save me money over the long term...
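For the long-term-savings angle, here is the back-of-envelope math that wattage difference translates into. The 12 h/day and $0.12/kWh figures are assumptions, not from any source; plug in your own usage and utility rate.

```python
# Back-of-envelope yearly cost of a GPU's idle power draw.
# The 12 h/day and $0.12/kWh defaults are assumptions -- swap in
# your own usage pattern and electricity rate.
def yearly_cost(watts, hours_per_day=12.0, usd_per_kwh=0.12):
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(f"19 W idle:  ${yearly_cost(19):.2f}/yr")   # about $10/yr
print(f"160 W idle: ${yearly_cost(160):.2f}/yr")  # about $84/yr
```

So whether the savings matter mostly depends on which of those idle figures is the real one.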



Soar
 
soarwitheagles said:
I just plugged in my 2500k rig and at idle the full amount of watts the entire machine is pulling is only 79w
I was really confused as to how the number could be so different, until I realized their test setup itself is drawing much more wattage.
AnandTech uses an i7 920 (45nm/130 W TDP) OC'd to 3.33GHz, and Guru3D uses an i7 965 (45nm/130 W TDP) OC'd to 3.75GHz.

Dug around a little and found some wattage tests on Lucid Virtu; apparently at idle it does draw more wattage than just having a discrete GPU.
But depending on what you're doing with your computer when not at idle, it may use less...
Does Lucid Virtu Work? | bit-tech.net
 

Roark,

Once again, you are incredibly helpful! OK, I read and reread the articles you posted from bit-tech. Dude, after reading those articles on Smart Response and Lucid Virtu, I wanted to throw my nice, new, shiny Gigabyte Z68 motherboard either out the window of the tallest building I could find or into a super heavy-duty paper shredder! Gosh, what a difference you helped me discover between the manufacturer's promises and the real-life failures!

The guys at bit-tech appear to really know what they're talking about, and their benchmarks and tests are undeniable.

Oh, the wonderful blessing of having forums like this that can save us hours of heartache, terrible deception, and going down the wrong road of a seemingly wonderful upgrade!

Thanks again Roark, and I will probably sell the new Gigabyte Z68 motherboard on CL and pray that God has mercy on the new owner!

The Gigabyte GA-P67A-UD3-B3 has been working flawlessly for me since day one. I suppose I will choose to be content with it and not worry about saving a couple of pennies with a motherboard that says it can switch between GPUs and save me money...when in reality, it appears to be a major disaster just waiting for some poor, unsuspecting person to step into!

Ok, one more question....

If you had access to an AMD 6870 and a Nvidia GTX 460, which would you choose and why [and yes, I do see the performance difference]?

Soar
 

Attachments

  • 6870 vs 460.JPG (90 KB)
Yeah, if I were building a new rig I would go for a Gen3 Z68 board, since they have PCIe 3.0 slots for when Ivy Bridge CPUs drop.
However, for someone who upgrades as regularly as you do, it's certainly not worth it, since new boards should come with the IB launch.

GTX 460 vs HD 6870 ... *sigh*
The 6870 is undeniably more powerful and doesn't use much more wattage under full load, if the numbers can be trusted (source).
That being said, I'm an Nvidia fanboy to the core, and most of the games I play don't require more than a GTX 460 can handle, so that would be my choice.

Depending on what games you play and what other GPU-intensive stuff you do, there is a pretty cool option.
Use your AMD rig as your daily driver with the 6870 (and the LanBoy), and move its GTX 460 over to SLI with the one already in the Intel rig.
GTX 460 SLI is around GTX 570/580 levels, and with an i5 2500K behind it you could blow anything away.
 

Roark,

Thanks again for the great advice!

PCIe 3? Dude, I haven't even heard of that before! And in regard to Ivy Bridge, I heard reports that they would cost much more than the current Sandy Bridge i7 and i5...somewhere to the tune of $1000...and that would be way out of my budget and price range. I read some articles stating that the present Sandy Bridge CPUs would be in demand for quite some time due to the excessive prices of the Ivy Bridge CPUs...have I heard correctly, or was that hogwash?

Yes, you are the second person that has recommended the GTX 460 over the 6870...I am seriously considering the GTX 460. OK, even though I rarely game, I received two free copies of Battlefield 3, so I probably will not be able to resist the urge to install one and play it over Christmas break here, which begins for me this Saturday [I am a teacher] and goes until Jan. 9th.

Which card would do better for playing BF3?

Finally, I am not even sure what all these Nvidia bells and whistles such as PhysX, CUDA, and SLI are all about. Do you have any short and simple explanations?

It's me,

Soar

PS EDIT: Dude, I just found the recommended hardware for BF3, and it appears that neither of my present cards lives up to the recommended GPU:

Recommended system requirements for Battlefield 3

OS: Windows 7 64-bit
Processor: Quad-core Intel or AMD CPU
RAM: 4GB
Graphics card: DirectX 11 Nvidia or AMD ATI card, Nvidia GeForce GTX 560 or ATI Radeon 6950.
Graphics card memory: 1 GB
Sound card: DirectX compatible sound card
Hard drive: 15 GB for disc version or 10 GB for digital version
 
As I understand it, PCIe 3.0 will run at twice the speed of PCIe 2.0 while running an Ivy Bridge CPU, though I'm not sure how necessary the equivalent of 32x PCIe 2.0 lanes will be.
I haven't done any real research into Ivy Bridge, since at this point most of the information is going to be speculation.
Considering how well Sandy Bridge CPUs are working, I can certainly see them being a staple for general users for quite some time.
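For what it's worth, the "twice the speed" figure checks out from the published line rates; the near-doubling comes partly from the faster clock and partly from the more efficient encoding. A quick sketch:

```python
# Per-direction bandwidth of an x16 slot, from published line rates:
# PCIe 2.0 runs 5 GT/s with 8b/10b encoding, while PCIe 3.0 runs
# 8 GT/s with the leaner 128b/130b encoding.
def x16_bandwidth_gbytes(gt_per_s, encoding_efficiency, lanes=16):
    # GT/s x efficiency = usable Gbit/s per lane; x lanes, /8 -> GB/s
    return gt_per_s * encoding_efficiency * lanes / 8

pcie2 = x16_bandwidth_gbytes(5.0, 8 / 10)     # 8.0 GB/s
pcie3 = x16_bandwidth_gbytes(8.0, 128 / 130)  # ~15.75 GB/s
print(f"PCIe 2.0 x16: {pcie2:.2f} GB/s")
print(f"PCIe 3.0 x16: {pcie3:.2f} GB/s")
```

So x16 PCIe 3.0 lands right about where a hypothetical x32 PCIe 2.0 slot would, which is where the "32 lanes" framing comes from.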

The GTX 560 falls roughly between the GTX 460 and the HD 6870 (source).
I know that when it launched, AMD cards were having driver problems with BF3; not sure about now.
Like I said, though, for a dedicated gaming rig, even if just for the next month or so, I would set up your 2500K build with the two GTX 460s in SLI; it will give you the best performance.

SLI is the Nvidia equivalent of AMD's CrossFire. It is essentially the linking of two discrete GPUs so that they work in conjunction for increased performance.
I have not done either myself, but as I understand it the easiest way is a physical bridge, which comes supplied with most SLI-compatible motherboards, or you can just purchase one.
Once you link them via the bridge there is some software tweaking that needs to be done; easy enough to research if you decide to do so.
As for CUDA and PhysX, I don't have a simple answer; here are the links to Nvidia's pages though: CUDA, PhysX.
 

Roark,

Wow! Double the bandwidth...that sounds fast to me. I will have to keep my eyes and ears open to see the real-world performance difference when the 32-lane-equivalent boards come out.

Yes, I have heard from several sources that the substantial cost difference of Ivy Bridge will cause many people not to upgrade...I will have to post some sources on this, just as you do to document all your good advice.

I too have heard the news about the AMD driver issues...so for now, I will definitely go with the EVGA GTX 460 1GB. I sold off the lower-performing XFX GTX 460s, so I will not have the opportunity to SLI with a similar card for now. I do have some 240s and a 450 lying around...maybe those would suffice?

Yes, I will use the Intel 2500K for the gaming rig as you advised. I plan on taking it to my parents' for Christmas because I have some family members that love computer gaming but do not have very fast rigs. They should like this one.

Well, Roark, thanks again for all your help.

I appreciate you taking the time to give me good advice backed up with legit sources.

Have a wonderful day!

Soar
 
For the PCIe speeds, as I understand it, we are barely utilizing the full potential of x16 PCIe 2.0 (source).
Granted, that is for a 5870; I can't seem to find a comparable review for an HD 6### or GTX 5## series card.
So even with 32x-equivalent PCIe slots, I doubt we'll see cards that can fully utilize the available speed for quite some time.

To SLI you need two of the same GPU, so two GTX 460s (the manufacturer doesn't matter); sadly, you wouldn't be able to SLI a GTX 460 with a 450 or a 240.
I would say do some tests between the 460 and 6870 to see which can run BF3 better for you.
Searching while typing, I found this performance list; you might be better off with the 6870 in the gaming rig (BF3 GPU performance).

Soar, thanks for giving me a focus for my desire to do research.
This is a nice exercise in information hunting to keep my brain from atrophying.
 

Well, thanks again Roark for being so helpful and informative. You've helped me out considerably.

I did have a funny experience last night. I have had a 5870 lying around that I had never tried. So I yanked the Nvidia card out of the 2500K and threw in the 5870. The screen resolution and movies looked much better. But the most shocking part for me was the heat. I am not joking: when I put my hand near the vents at the back of the case, the 5870 literally felt like a space heater...maybe even a little warmer! I did not have time to measure the wattage, but it felt as if more heat was coming out of that vent than from some of the space heaters I have used in the past.

Tonight I will do a wattage check and then yank it out. For some reason, every time I move or bump the computer case, something disturbs the screen/video card and it draws funny lines on the screen.

Something is definitely not right, even though I downloaded and installed the most recent AMD drivers.

I may test the 6870 during our vacation and see if there is much of a difference.

This sure is a great learning experience!

Thanks again for sharing your insights and good advice!

Soar
 
I know the 5870 is supposed to run a little hotter than a 460 (roughly 5°C hotter at idle, 12°C hotter under load), but I wouldn't expect space heater levels.
... So, reading while typing makes me think I should learn conversions: 5-12°C reads as 41-53.6°F on an absolute scale, but as a temperature difference it's only 9-21.6°F, so the temps alone don't explain it; the real heat is just all the wattage the 5870 dumps under load.
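A quick note on that conversion, since it trips everyone up: a temperature difference in °C converts to °F with just the 9/5 factor, while the +32 offset only applies to absolute readings. A tiny sketch:

```python
# Converting temperature differences vs. absolute temperatures.
def delta_c_to_f(delta_c):
    # A *difference* of delta_c Celsius degrees spans 9/5 as many
    # Fahrenheit degrees; no +32 offset involved.
    return delta_c * 9 / 5

def c_to_f(temp_c):
    # An *absolute* Celsius reading also needs the +32 offset.
    return temp_c * 9 / 5 + 32

print(delta_c_to_f(5), delta_c_to_f(12))  # 9.0 21.6  (degrees F hotter)
print(c_to_f(5), c_to_f(12))              # roughly 41.0 and 53.6
```

So a card running 5-12°C hotter is only 9-21.6°F hotter, not 41-53.6°F.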

Feel silly writing this considering you build computers regularly, but you made sure the card was properly seated and the monitor cable was snugly plugged in, right? :p
 