PP Mguire
Instead of copying and pasting a PR article, I'll briefly explain it so people can understand it.
As most know, when dealing with high graphics settings and FPS, if you don't lock your vertical sync you can get tearing. If you didn't know that, now you do. Tearing is where one frame basically overlaps the other because the monitor can't keep up with the rate of frames the GPU can pump out. Vsync eliminates this by locking the frame rate to the refresh rate of your monitor, which is typically 60Hz. This may not be an issue for some with lower end rigs, but with rigs that can do over 60fps you can get lag, most noticeably input lag and stuttering. This happens because the GPU basically has to wait for the monitor to do its thing before sending out the next frame. Simple enough, right?
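To put that vsync wait in code terms, here's a rough Python sketch. This is my own illustration, not actual driver code; the numbers and function name are made up just to show the idea:

```python
import math

# Rough illustration only: vsync forces a finished frame to wait
# for the monitor's next refresh (vblank) before it's displayed.
REFRESH_HZ = 60
VBLANK = 1.0 / REFRESH_HZ  # ~16.7 ms between refreshes

def present_time_with_vsync(frame_ready_s: float) -> float:
    """The frame is shown at the next vblank after it's ready."""
    return math.ceil(frame_ready_s / VBLANK) * VBLANK

# A frame that finishes at 18 ms just misses the 16.7 ms vblank,
# so it sits until 33.3 ms -- that wait is the input lag/stutter.
wait = present_time_with_vsync(0.018) - 0.018
print(f"extra wait: {wait * 1000:.1f} ms")  # extra wait: 15.3 ms
```

That's the whole problem in one line: the faster your GPU, the more often a frame finishes just after a vblank and has to sit there.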
This is why "gaming" monitors with a true refresh of 120Hz and 144Hz have come out. Overlooking the 3D gimmick BS, if you have a vertical refresh of 120 or 144 you have less of a chance of hitting your cap, especially if you're dealing with single GPU solutions. The problem is you can still get the same effects, which is where adaptive vsync comes in handy, but it isn't really a true solution. It has its ups and downs, and like regular vsync or none at all it has its flaws. (For anybody still reading, this effectively happens on both sides.)
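Adaptive vsync's core trick boils down to a single decision, sketched here in Python. Again, this is my own simplification, not Nvidia's actual driver logic:

```python
# Hypothetical sketch of adaptive vsync's decision (names illustrative):
# sync only when the frame rate can actually keep up with the refresh.
REFRESH_HZ = 144

def vsync_enabled(current_fps: float) -> bool:
    # At or above the refresh rate: lock to it and avoid tearing.
    # Below it: let frames through unsynced rather than add stutter.
    return current_fps >= REFRESH_HZ

print(vsync_enabled(160))  # True  -> tear-free, capped at 144
print(vsync_enabled(90))   # False -> tearing possible, no added stutter
```

You can see why it's not a true fix: below the refresh rate you're right back to tearing.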
G-Sync is a hardware technology that is going to be embedded (probably) in gaming monitors from Asus, BenQ, Philips, and ViewSonic (maybe more to come) that actively changes the refresh rate of the monitor to stay in sync with the GPU. So, hypothetically, let's say you have a GTX 780 and a G-Sync enabled 144Hz BenQ monitor: the G-Sync module will change the refresh rate on the fly to match your FPS for a much smoother experience. In theory, this is incredibly awesome. It resolves an issue that has plagued PC gamers for years, but it's going to come at a price.
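The G-Sync idea in a nutshell, once more as my own hypothetical sketch (the min/max range here is illustrative; actual panels vary): instead of the GPU waiting on a fixed vblank, the monitor waits for the frame.

```python
# Illustration only: the monitor holds its refresh until the next
# frame arrives, within whatever range the panel supports.
MIN_HZ, MAX_HZ = 30, 144

def effective_refresh(frame_time_s: float) -> float:
    """Refresh happens when the frame is ready, clamped to the panel's range."""
    fps = 1.0 / frame_time_s
    return max(MIN_HZ, min(MAX_HZ, fps))

# A 12 ms frame displays at ~83 Hz; there's no fixed 60/120/144 grid,
# so no tearing and no vsync wait.
print(f"{effective_refresh(0.012):.0f} Hz")  # 83 Hz
```

Compare that with the vsync sketch earlier: the roles are flipped, so the GPU never stalls waiting for a vblank.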
You'll need, at minimum, a Kepler GPU (that is, 600 series, 700 series, and Titan) and one of these monitors. Since it's not something that will catch on quickly (again, probably; just my opinion) it will be in select monitors from only those brands, meaning you won't really have much of a choice like you do right now with regular monitors. For technology this new they can charge an arm and a leg, which most are hoping they don't. My guess is this will add an extra 50 to 100 bucks onto the price of an equivalent monitor. Don't quote me, that's just a guess. The problem being, if you're like Kman and prefer 21:9, or others like him who prefer IPS or 1440p, you're probably going to be left out in the cold. I'm purely guessing that this will first come as 1080p solutions on tried and true TN panels, probably in the 120 or 144Hz flavor. Of course I hope I'm wrong, but not everything is perfect.
This technology essentially makes benchmarking at an FPS level almost pointless; the real test becomes who has the smoothest experience with what combo of hardware. To some, having the highest FPS is worth dealing with tearing, while others don't mind some input lag if it reduces that tearing. G-Sync effectively removes that trade-off, but at a cost. Whether it's worth it to you is totally your preference. On the console side of things, you don't really need something like this, as your experience is usually limited to 30 or 50/60 anyways.
Now I'm going to explain why this is bad. I have a love-hate relationship with new technology as it is, but right now I feel any new proprietary technology in the graphics department is horse ****. Like AMD with their TrueAudio and Mantle, we have Nvidia with their PhysX and now G-Sync (along with many other technologies they showcased). You have to sit back and think: this isn't very good for the consumer. You have to pick a side and settle with it, and that's completely set aside from the actual performance of the cards and drivers. If you want what is essentially the most important step towards smoother gameplay but you're an AMD fan, you either have to suck it up and get a Kepler card, or stick with your AMD card and go without. Each technology in itself is a positive, but in the end it's all negative in that neither company will adopt (or license) the other's technology so we ALL benefit. That's just my 2c on the subject.
Me personally? I don't really care. I use vsync in SP games and don't really notice any input lag, and in MP online games I go for the highest FPS possible and don't care about tearing. I'll have to be the judge when I physically use it. I was just bored and decided to write this up.