How identical do graphics cards need to be for a dual setup? - Page 3 - Techist - Tech Forum

Old 09-30-2014, 06:58 PM   #21 (permalink)
Build Guru
 
PP Mguire's Avatar
 
Join Date: Dec 2004
Location: Fort Worth, Texas
Posts: 28,306
Re: How identical do graphics cards need to be for a dual setup?

How did you fry it exactly? With all the safety features in place, it's 99.9% impossible to damage your card through conventional overclocking.

Sweet on the board. Free upgrade sort of lol.
__________________
"Resolution is just a number." #Ubisoft
Origin/Steam = PP_Mguire Twitch = pp_mguire Instagram = ppmguire PSN = PP_Mguire

Access to my Plex PM me.
PP Mguire is offline   Reply With Quote
Old 10-01-2014, 05:14 AM   #22 (permalink)
Newb Techie
 
Join Date: Sep 2014
Location: Norway
Posts: 9
Re: How identical do graphics cards need to be for a dual setup?

Well, that's exactly the point: I was overclocking it moderately using the manufacturer's own tweaking software, so for it to get fried it must have been defective from the get-go, which is why I'm probably going to get a new one.

The only thing I did was go into the GPU Tweak software (from Asus) and turn up the GPU clock and the memory clock. The GPU clock I turned up to 1080 from 1020, and the memory I turned up to 7700 from 7000. I tried a bit higher on the GPU clock, but it was unstable in 3DMark, so I turned it back down; 1080 was the highest stable setting.

If I'd been using more "creative" ways to overclock it and had actually managed to override the controls and go outside the safe limits, then it'd be on me and I'd have to buy a new one out of pocket.

On a slightly related note, how does overclocking work if you have more than one graphics card? Can an SLI setup be overclocked at all? Do the cards have to be overclocked as a set, or do you overclock each one separately?
SilverfoxAlpha is offline   Reply With Quote
Old 10-01-2014, 05:23 AM   #23 (permalink)
Build Guru
 
PP Mguire's Avatar
 
Join Date: Dec 2004
Location: Fort Worth, Texas
Posts: 28,306
Re: How identical do graphics cards need to be for a dual setup?

There is literally no way for a manufacturer to know if you overclocked using simple software. My go-to is MSI Afterburner, and the slider limits for voltage (the only thing that can really kill a card) are controlled via the BIOS, so you can't go further than the VRM will allow; with Nvidia the limit is usually lower still. With Nvidia, if a clock is too high or the card gets too hot, the driver simply crashes and resets everything to default until you open the program back up.

On that note, if and when you ever OC, you need to adjust the fan ramp profile to make sure your card stays cool, and monitor temps closely.

When you OC with an SLI setup, the two cards get the clocks applied at the same time; one mimics the other. In an SLI setup, VRAM is mirrored, so VRAM clocks match each other, as do core clocks, fan speeds, and voltage. If you SLI, for example, a reference eVGA 780 Ti and a "Superclocked" 780 Ti, the Superclocked card's clocks are brought down to match the reference card's, and they boost equally. Boost will be limited by the weaker of the two cards.
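To make the clock-matching concrete, here's a toy sketch (hypothetical code, not any real driver API; the clock numbers are illustrative, not actual 780 Ti specs) of how an SLI pair ends up running at the weaker card's settings:

```python
# Toy model of SLI clock mirroring: each setting runs at the lower
# of the two cards' values, so a "Superclocked" card is pulled down
# to match a reference card. (Illustrative numbers, not real specs.)

def effective_sli_clocks(card_a, card_b):
    """Return the settings an SLI pair actually runs at."""
    return {key: min(card_a[key], card_b[key]) for key in card_a}

reference    = {"core_mhz": 876,  "boost_mhz": 928,  "vram_mhz": 7000}
superclocked = {"core_mhz": 1006, "boost_mhz": 1072, "vram_mhz": 7000}

# The pair runs at the reference card's clocks across the board.
print(effective_sli_clocks(reference, superclocked))
```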

To give you an example of how hard it is to kill a card without serious modifications or stupidity: I'm running a 680 FTW+ that has beefed-up VRM circuitry compared to regular 680s, and I'm running a modified BIOS. I upped the voltage to always run maxed and raised my default boost clock to 1300. The catch is that Nvidia volt-limits most of their cards, so even if I set 1.3v it will still hard-limit itself to 1.215v (which sucks). I also raised the TDP to 168% so it will utilize all the volts the hardware will allow. Basically, that equates to a lot of work only to be limited to about 1300MHz solid because of that hard volt limit Nvidia put in place. Lame.
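That hard volt limit is effectively just a clamp. A one-line sketch (the 1.215v figure comes from the post above; the constant and function names are made up for illustration):

```python
# Whatever voltage the modified BIOS requests, the card hard-limits
# itself to Nvidia's cap (1.215v per the post above).
NVIDIA_VOLT_CAP = 1.215

def applied_voltage(requested_volts):
    """Voltage the card actually runs at, after the hard cap."""
    return min(requested_volts, NVIDIA_VOLT_CAP)

print(applied_voltage(1.3))  # clamped to 1.215 despite the modified BIOS
```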

When you overclock Nvidia cards, you have to realize that the base core clock gets boosted by GPU Boost (2.0 on the 700 and 900 series). What this means is that the card will automatically boost your clocks as high as the limit set in the software you're running; in your case, that limit is heat. GPU Boost 1.0 (like on my card), on the other hand, boosts to a specific TDP limit, which is 131%. Back to your case: if you set a higher base core clock and can't contain the heat, your card will only boost as high as the thermal limit you set. If you set that to 90c with your default base clock, it will boost until your card runs at 90c (not real safe). If you raise your base clock and your card still hits 90c, it won't boost any further than before. So basically, the best way to get the most out of your card is to set the thermal limit to something around 85c, max your GPU fans out, and see how high it boosts. When you do this, you'll usually get a higher boost clock when you raise the base clock, because you're keeping temps down.
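As a rough mental model of the GPU Boost 2.0 behavior described above (a hypothetical sketch; the step size and the thermal curve are made-up illustrations, not measured behavior):

```python
# Rough model of GPU Boost 2.0: keep stepping the clock up until the
# next step would push the card past the user-set thermal limit.
# (Step size and temperature curve are illustrative, not measured.)

def boost_clock(base_mhz, temp_at, thermal_limit_c, step_mhz=13):
    """temp_at(mhz) -> projected temperature at that clock."""
    clock = base_mhz
    while temp_at(clock + step_mhz) <= thermal_limit_c:
        clock += step_mhz
    return clock

# Toy thermal curve: 60c at 1000MHz, +1c per extra 10MHz of clock.
def temp(mhz):
    return 60 + (mhz - 1000) / 10

# A higher thermal limit (or better cooling, i.e. a flatter curve)
# lets the card boost further from the same base clock.
print(boost_clock(1000, temp, 85))
print(boost_clock(1000, temp, 80))
```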

Another thing to remember here: if you're using a non-reference GPU cooler (like an Asus DCU2, eVGA ACX, or Gigabyte Windforce), your GPU's heat is being dumped into your case. So if you don't have that heat properly expelled, you'll start having heat-soak issues, making it harder for your GPU to boost higher. This is why I personally prefer the reference blower-style coolers: they exhaust all of that heat out of your case and make sure your GPU is sucking in cool air all the time. The alternative being, point a large fan at your PC and go to town lol.
PP Mguire is offline   Reply With Quote