Nvidia G-Sync: Why We Need It

Instead of copying and pasting a PR article, I'll briefly explain it so people can actually understand it.
As most know, when dealing with demanding graphics and high FPS, if you don't lock your vertical sync you can get tearing (if you didn't know that, now you do). That's where parts of two frames show on screen at once, because the GPU sends out a new frame while the monitor is still in the middle of drawing the last one. Vsync eliminates this by locking the frame rate to the refresh rate of your monitor, typically 60Hz. This may not be an issue for some with lower end rigs, but on rigs that can do over 60fps you can get lag, most noticeably input lag and stuttering. That happens because the GPU basically has to wait for the monitor to finish its refresh before sending out the next frame. Simple enough, right?
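For anyone curious what this looks like in practice, here's a minimal sketch of how a game typically asks the driver for vsync. It assumes GLFW 3 and OpenGL purely for illustration; the same idea applies to Direct3D's present interval.

```c
/* Minimal vsync sketch, assuming GLFW 3.x with OpenGL (illustrative only).
   A swap interval of 1 makes each buffer swap wait for the next vertical
   refresh: no tearing, but FPS is capped at the monitor's refresh rate. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *window = glfwCreateWindow(1280, 720, "vsync demo", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    glfwSwapInterval(1);   /* 1 = vsync on; 0 = swap immediately (tearing) */

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);   /* render work would go here */
        glfwSwapBuffers(window);        /* blocks until vblank with vsync on */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```

That blocking swap is exactly where the input lag comes from: a finished frame sits there waiting on the monitor.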

This is why "gaming" monitors with a true 120Hz or 144Hz refresh have come out. Overlooking the 3D gimmick BS, with a vertical refresh of 120 or 144 you have less of a chance of hitting your cap, especially when dealing with single GPU solutions. The problem here is that you can still get the same effects, which is where adaptive vsync comes in handy, but it isn't really a true solution. It has its ups and downs, and like regular vsync (or none at all) it has its flaws. (For anybody still reading, this effectively happens on both sides.)
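Adaptive vsync is exposed through a driver extension. As a rough sketch (a drop-in replacement for the glfwSwapInterval(1) call in the snippet above, assuming the swap_control_tear extensions are what your driver offers), it stays synced while you're at or above the refresh rate, but tears instead of stuttering when you drop below it:

```c
/* Adaptive vsync sketch: a swap interval of -1 (with the
   WGL/GLX_EXT_swap_control_tear extension) syncs on-time swaps to vblank
   but lets late swaps happen immediately, trading stutter for tearing. */
if (glfwExtensionSupported("WGL_EXT_swap_control_tear") ||
    glfwExtensionSupported("GLX_EXT_swap_control_tear"))
    glfwSwapInterval(-1);   /* adaptive: tear only when below refresh */
else
    glfwSwapInterval(1);    /* fall back to plain vsync */
```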

G-Sync is a hardware technology that is going to be embedded (probably) in gaming monitors from Asus, BenQ, Philips, and ViewSonic (maybe more to come) that effectively and actively changes the refresh of the monitor to stay in sync with the GPU. So hypothetically, let's say you have a GTX 780 and a G-Sync enabled 144Hz BenQ monitor: the G-Sync module will change the refresh value on the fly to follow your FPS for a much smoother experience. In theory, this is incredibly awesome. It resolves an issue that has plagued PC gamers for years, but it's going to come at a price.
You will need at minimum a Kepler GPU (that's the 600 series, 700 series, and Titan) and one of these monitors. Since it's not something that will catch on quickly (again, probably, just my opinion), it will only be in select monitors from those brands, meaning you won't have much of a choice like you do right now with regular monitors. For technology this new they could charge an arm and a leg, which most of us are hoping they don't. My guess is it will add an extra 50 to 100 bucks to the price of an equivalent monitor. Don't quote me, that's just a guess. The problem is, if you're like Kman and prefer 21:9, or like others who prefer IPS or 1440p, you're probably going to be left out in the cold. I'm purely guessing this will first come as 1080p solutions on tried-and-true TN panels, probably in 120 or 144Hz flavors. Of course I hope I'm wrong, but not everything is perfect.
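To make the timing difference concrete, here's a conceptual sketch. The numbers are hypothetical and this is just arithmetic, not how you'd actually talk to the module (that all lives in the driver and the monitor's hardware):

```c
/* Conceptual illustration of fixed vs. variable refresh timing.
   Hypothetical numbers: a 144Hz panel and a GPU frame that takes 11ms. */
#include <stdio.h>

int main(void)
{
    const double refresh_ms = 1000.0 / 144.0;  /* one 144Hz scanout slot */
    const double frame_ms   = 11.0;            /* GPU render time (~91 FPS) */

    /* Fixed refresh + vsync: the finished frame waits for the next slot. */
    double slots = 1.0;
    while (slots * refresh_ms < frame_ms)
        slots += 1.0;
    printf("fixed 144Hz + vsync: shown after %.2f ms (%.2f ms idle wait)\n",
           slots * refresh_ms, slots * refresh_ms - frame_ms);

    /* Variable refresh: the monitor starts a refresh when the frame is
       ready, so the idle wait disappears. */
    printf("variable refresh:    shown after %.2f ms (no wait)\n", frame_ms);
    return 0;
}
```

With these numbers the vsynced frame lands at ~13.9ms instead of 11ms, and that per-frame wobble is exactly the stutter G-Sync removes.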

This technology essentially makes benchmarking at a pure FPS level almost pointless; the real question becomes who has the smoothest experience with what combination of hardware. Some are fine having the highest FPS while dealing with tearing, while others don't mind the input lag that comes with reducing that tearing. G-Sync effectively removes that decision, but at a cost. Whether it's worth it to you is totally your preference. On the console side of things, you don't really need something like this, as your experience is usually limited to 30 or 50/60 anyway.
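If FPS averages stop being the metric, frame times take their place. Here's a minimal sketch of the idea with made-up sample data: two runs can post the same average FPS while one hides far nastier spikes.

```c
/* Frame-time sketch with hypothetical data: the average FPS looks fine,
   but the worst frame reveals the stutter an average would hide. */
#include <stdio.h>
#include <stdlib.h>

static int cmp_double(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    /* made-up per-frame times (ms) captured during a benchmark run */
    double frames[] = { 6.9, 7.1, 7.0, 6.8, 7.2, 24.5, 6.9, 7.0, 7.1, 6.8 };
    size_t n = sizeof frames / sizeof frames[0];

    double total = 0.0;
    for (size_t i = 0; i < n; i++)
        total += frames[i];

    qsort(frames, n, sizeof frames[0], cmp_double);

    printf("average FPS : %.1f\n", 1000.0 * (double)n / total);
    printf("worst frame : %.1f ms  <- the hitch the average hides\n",
           frames[n - 1]);
    return 0;
}
```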

Now I'm going to explain why this is bad. I have a love-hate relationship with new technology as it is, but as of right now I feel any new proprietary technology in the graphics department is horse ****. Like AMD with their TrueAudio and Mantle, we have Nvidia with their PhysX and now G-Sync (along with many other technologies they showcased). You have to sit back and think: this isn't very good for the consumer. You have to make a choice and settle with it while picking sides, all of that completely set aside from the actual performance of the cards and drivers. If you want what is essentially the important step towards smoother gameplay but you're an AMD fan, you either have to suck it up and get a Kepler card, or stick with your AMD card. Each technology in itself has a positive, but in the end it's all negative, in that neither company will adopt (or license) the other's technology so we ALL benefit. That's just my 2c on the subject.

Me personally? I don't really care. I use vsync in SP games and don't really notice any input lag, and in online MP games I go for the highest FPS possible and don't care about tearing. I'll be the judge when I physically use it. I was just bored and decided to write this up. :lol:
 
Going to be 1080p starting out, 144Hz and 3D on that Asus panel. 4K will come later. I'm not positive, but as an educated guess I'd say they'll probably skip 1440p.
You think all of the panels will be 3D-capable? That just seems like a feature I'd be paying for that I'd neither want nor use.
 
You think all of the panels will be 3D-capable? That just seems like a feature I'd be paying for that I'd neither want nor use.
You misunderstand me. The first panel to get this IS an Asus 144Hz 3D panel that is 24" and 1080p. I think the majority of panels are going to be 1080p and 120/144Hz for now, which will be superseded by 4K later.
Idc about 3D either, but I think almost all 120Hz+ panels are 3D-capable with a separate kit.
 
...as of right now I feel any new proprietary technology in the graphics department is horse ****. Like AMD with their TrueAudio and Mantle, we have Nvidia with their PhysX and now G-Sync (along with many other technologies they showcased). You have to sit back and think: this isn't very good for the consumer.

Each technology in itself has a positive, but in the end it's all negative, in that neither company will adopt (or license) the other's technology so we ALL benefit

Proprietary tech is one of the biggest things that drives business! That's how businesses make themselves viable options over their competitors - by taking something anyone *can* do, and doing it best.

Taken to the hyperbolic extreme, the consumer benefits the most when all companies freely share every little bit of code and share standards with each other. Unfortunately, in reality, things don't work out if you try that. You have to consider what benefits the business as well as the consumer, because you kinda need the former to have the latter in the first place.

In other words, no, that will not benefit ALL of us. That will benefit all of us who are *consumers*. The business loses. It's the entire *point* from a business perspective that you have to buy their brand to get X feature. Like it or not, these things do take massive stacks of cash to realise - cash that's not gonna fall out of thin air, and cash that you'll be giving your competitors a cut of if you just said "screw it, everyone can use this for free".

Heck, even licensing out the tech can come back to bite you. As you said, if someone desperately wants G-Sync, they need an Nvidia card. One feature might not be enough to tip the scales, but build up another few (add PhysX, for example) and you've got the start of something which might cause someone to seriously consider picking up your card instead of a competitor's. License that tech out, and suddenly that advantage disappears.

If I understand correctly, they can't license the idea of refreshing the monitor in sync with the FPS instead of vice versa. That would be damn wrong. But keeping your own work on that idea to yourself instead of sharing it with your competitors? Cry me a river :p
 
Except for the fact that almost every single bit of proprietary tech that came about without some serious backing flopped. Hard. Dating as far back as the first low-level API, Glide: excellent, but failed. PhysX is still barely being used - still very much a flop. ATI Stream, failed. We're looking at Mantle and TrueAudio quite possibly failing before they even get released; high-end developers don't appear to really care besides Johan from DICE, and they're in bed with AMD anyway, no surprise there.
G-Sync, even though it has the ability to completely change the game, will probably only be used in the pro-gamer and extreme enthusiast market. Why? Because AMD fanboys (and there are a lot of them) don't want to go Nvidia, and the rest probably won't give a **** about it because they're too ignorant of what it actually does - much like most of the people I've seen comment on the technology, AMD and Nvidia fans alike. Then you have people like myself who think it's great, but I don't want to drop 175 bucks to try and modify my monitor, and I damn sure don't want to spend 400 bucks on a monitor I don't really want.

The business loses if nobody wants it. When good tech dies due to greed, we all lose. Please keep an open mind on that matter. Although companies wouldn't make as much money as they would by forcing everybody to buy JUST their ****, it's better to make it available to both sides of the market as an option rather than getting some sales and flopping. Especially when we're dealing with something like this, which like I said could totally change how things are done. Tearing and stutter have been a problem for years, like Carmack has said quite a bit. It won't go anywhere at all unless both AMD and Nvidia support it.

They can license the software/code used to sync the GPU to the hardware being put in the monitor. Unlike Mantle, which is proprietary to GCN, G-Sync could be coded for both sides to use; it's just how the GPU communicates with the hardware in the monitor that controls the refresh of said monitor. So in other words, it could be done, and it would be better for the industry to share the tech, but Nvidia are greedy *******s and don't give a ****. They want people to buy their Kepler GPUs and couldn't care less about anything else. PhysX and better driver support haven't been enough to tip the scales, and neither has better performance for the money. Hardly anybody truly understands what this technology does, meaning they won't see a need for it, or the want to buy the GPU and monitor it requires.
 
I wouldn't exactly call PhysX a "flop" - tons of games support it, including the Arkham series, Borderlands 2, Metro 2033, Witcher 2, MOH:A, ARMA 3, and other such small-time titles :p

List of games with hardware-accelerated PhysX support - Wikipedia, the free encyclopedia

Not a flop in my books by a long shot.

G-Sync, even though it has the ability to completely change the game, will probably only be used in the pro-gamer and extreme enthusiast market.

Would allowing AMD to use it really remove the "pro-gamer and extreme enthusiast market" restriction? Seems like it'd increase the number of people using it (i.e. by including the AMD pro-gamer market), but it's not going to be something aimed at your average CS or Minecraft player anyway. At least, not for quite a while. Honestly, wtf would the *average* person need this for anyway?

Although companies wouldn't make as much money as they would by forcing everybody to buy JUST their ****, it's better to make it available to both sides of the market as an option rather than getting some sales and flopping

As always, that's a business decision. It could benefit them more to license it out, it could benefit them more to hang on to it.
Whether you're going to class them as greedy capitalist swine for not sharing their tech or not, at the end of the day a business is out to turn a profit. It makes sense for that business to keep itself viable both now and for future prospects. They're not going to intentionally shoot themselves in the foot; I'm sure they're not just flipping a coin to decide what to license out, and pardon me if I'd take the financial advice of their team of market analysts over a dude on a forum on the internet :grin: no offence meant
 