DVI to D-Sub (analog) adapter

jmoussa87

First off, I am not asking where to find one; I can grab one at my local PC shop for $5. What I really wanted to know was whether it would give me a 'true' analog output. I have a 22 inch widescreen digital monitor with DVI inputs and a 7600GT with dual DVI outputs only.

Why do I sound insane wanting to use an 'old-style' analog connection when I have digital? Simply because I believe it scales older games, and non-native resolutions in general, much better than digital, which I find quite rigid and inflexible, with loads of nasty interpolation. DVI is overrated. ;)

The output from the GPU would technically be a digital signal, but it would be converted to analog (D-Sub) and then sent to my monitor via the analog input, so I am assuming the connection would be completely analog.

I am just asking before I try it because, in actual fact, I haven't got the card yet; it's still tenuous, but I will be getting it soon. It is actually getting very difficult to find modern GPUs with a native analog output, such a shame; even most of the relatively older GeForce 7 and 8 series models don't have one. Shame. :(
 
The DVI-to-VGA plugs don't actually convert digital signals to analog. DVI ports are dual-mode: if you use an adapter, the GPU detects it and outputs analog signals to the VGA monitor. I don't think it would improve your games, though, because no matter what, a digital signal gets converted to analog somewhere along the line and a low-res picture gets upscaled. With DVI, the PC may do the upscaling (if it's still outputting a native-resolution image to your monitor), while with VGA (or DVI with a lower resolution set), your monitor's internal circuitry does the upscaling before driving the LCD panel (the signal going to the panel is ALWAYS native resolution).
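To make the upscaling point concrete, here's a rough Python sketch (Pillow assumed installed; the resolutions are just example values) of what any scaler, whether it sits in the GPU driver or inside the monitor, has to do: resample the low-res frame up to the panel's native grid. The choice of filter is where the blur or blockiness comes from.

```python
# Rough sketch of what any upscaler (GPU driver or monitor) has to do.
# Requires Pillow; resolutions are example values only.
from PIL import Image

NATIVE = (1680, 1050)                  # assumed native size of a 22" widescreen panel
frame = Image.new("RGB", (800, 600))   # stand-in for a low-res game frame

# Smooth interpolation: what most scalers do; the result looks soft/"blurry".
smooth = frame.resize(NATIVE, resample=Image.BILINEAR)

# Nearest-neighbour: no blending, but blocky and uneven at non-integer ratios.
blocky = frame.resize(NATIVE, resample=Image.NEAREST)

print(smooth.size, blocky.size)        # both end up at the panel's native size
```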
 
Well, that is a shame, but I guess I won't know for certain until I try it. In the worst case I can always buy another GPU with an analog output and sell this card (since I bought it really cheap I will actually make money). When I used to use a DVI connection, the Nvidia software let me configure scaling as either windowed at the original resolution, aspect-ratio scaling with black bars, or full scaling done by either the monitor or the GPU. Although I found the monitor did a better job, it still tended to stretch and distort images, and the interpolation was almost unbearable. I still reckon analog is superior. :D
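Those three scaling options come down to simple geometry. Here's a rough Python sketch of the idea (the resolutions and mode names are just example values, not taken from any driver):

```python
# Sketch of the geometry behind the three scaling modes mentioned above
# (centred / no scaling, aspect-ratio scaling with black bars, full stretch).

def scaled_rect(src_w, src_h, panel_w, panel_h, mode):
    """Return (x, y, width, height) of the image as placed on the panel."""
    if mode == "centered":        # original size, black border all round
        w, h = src_w, src_h
    elif mode == "aspect":        # fill as much as possible, keep the ratio
        scale = min(panel_w / src_w, panel_h / src_h)
        w, h = round(src_w * scale), round(src_h * scale)
    elif mode == "stretch":       # fill the whole panel, ratio distorted
        w, h = panel_w, panel_h
    else:
        raise ValueError(mode)
    return ((panel_w - w) // 2, (panel_h - h) // 2, w, h)

for m in ("centered", "aspect", "stretch"):
    print(m, scaled_rect(800, 600, 1680, 1050, m))
# centered -> 800x600 sits in the middle, no interpolation at all
# aspect   -> 4:3 image scaled to 1400x1050 with bars left and right
# stretch  -> 1680x1050, circles become ovals
```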
 
...Your GPU HAS an analog output. A DVI output on a GPU is a dual-mode connector. If you connect a DVI monitor to it, it outputs digitally, and if you plug an analog (VGA) monitor into it via an adapter, it outputs analog, just like a VGA port would: the GPU senses the adapter and sends analog signals over the DVI port, which supports both modes.

If you use an adapter and a VGA connection to your monitor, it'll be analog, just as if you had a VGA port directly on your GPU, because that's all the adapter does. The real issue here is that you're upscaling. No matter where the upscaling is done (monitor, GPU, digital, analog, etc.) you're doing the same thing: taking a small image and making it bigger than it was intended to be. Upscaled images always look poor on LCD monitors. On CRTs, the resolution of the screen actually changed, because the beam scanned a different number of lines; this produces "scan lines" but tends to make low-res images appear sharper. LCDs don't do this: they simply stretch the image, resulting in a poor-quality picture. Try a CRT if you have one (using a DVI-to-VGA adapter): try scaling via the GPU and then actually changing the resolution, and you'll see a difference. You won't see that difference with an LCD panel, though, because LCDs have only one native resolution.
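To put some numbers on the CRT-versus-LCD point, here's a tiny Python sketch (resolutions are example values) of why a fixed-pixel panel has to duplicate or blend scanlines whenever the ratio isn't a whole number:

```python
# Sketch of why a fixed-grid LCD can't "become" a lower resolution the way a
# CRT can: every native panel line has to be filled from some source line, so
# at a non-integer ratio some source lines get duplicated more than others.
# That unevenness is the blockiness; blending neighbours instead is the blur.
from collections import Counter

SRC_LINES, PANEL_LINES = 600, 1050   # example: an 800x600 game on a 1680x1050 panel

# Nearest-neighbour: which source scanline feeds each panel scanline?
mapping = [round(y * SRC_LINES / PANEL_LINES) for y in range(PANEL_LINES)]

copies = Counter(mapping)            # how many panel lines each source line fills
print(sorted(set(copies.values())))  # -> [1, 2]: some lines doubled, others not
```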
 
OK, I just found the info on Hardware Secrets. I didn't realise the DVI-I connectors on video cards also carried the analog (DVI-A) signals. So with the adapter the output will definitely be analog. http://www.hardwaresecrets.com/article/157/8

"It is very interesting to note that DVI-I outputs can be transformed into VGA outputs by the use of an adaptor that usually comes with the video card (see Figure 28), if the DVI-I connector has analog signals (DVI-A) present – which is the case with all video cards. Thus you can transform the DVI-I connector of your video card on a second VGA output, allowing you to connect two video monitors to your computer. This connection, however, will be analog, not digital, since the VGA connection uses analog signals and you are using the DVI-A signals from the connector to generate this output.

Connecting two DVI devices to your PC, however, is only possible if you have a video card with dual DVI outputs, since converting a VGA output to DVI is not possible as VGA output uses analog signals and video monitors featuring a DVI input usually require digital signals (i.e. a DVI-D connector)."
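Just to underline how simple that adapter is: it's purely passive wiring. A DVI-I-to-VGA plug only routes the analog (DVI-A) and DDC pins of the DVI-I connector to the matching pins on the VGA (DE-15) side; nothing gets converted. The mapping below is a sketch from the standard published pinouts, written from memory, so double-check it against a real adapter or the spec before relying on it.

```python
# Sketch of the passive DVI-I -> VGA pin routing; assignments are from the
# standard DVI-I and VGA (DE-15) pinouts as I remember them; verify before use.
DVI_I_TO_VGA = {
    "C1 (analog red)":             "1 (red)",
    "C2 (analog green)":           "2 (green)",
    "C3 (analog blue)":            "3 (blue)",
    "C4 (analog horizontal sync)": "13 (h-sync)",
    "C5 (analog ground)":          "6/7/8 (RGB returns)",
    "8 (analog vertical sync)":    "14 (v-sync)",
    "6 (DDC clock)":               "15 (DDC clock)",
    "7 (DDC data)":                "12 (DDC data)",
    "14 (+5 V)":                   "9 (+5 V)",
    "15 (ground)":                 "10 (sync/DDC ground)",
}

for dvi, vga in DVI_I_TO_VGA.items():
    print(f"DVI-I pin {dvi:<28} -> VGA pin {vga}")
```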
 
Thanks for explaining the science behind the technology. I don't think I could go back to a CRT though; they are too bulky and chew through power. Besides, I've got a new 22" LCD, so I actually want to use it. I guess I could do the whole dual-monitor thing, keeping my LCD for native-resolution gaming, productivity tasks and widescreen movies, and getting a smaller CRT for older games and low-res apps. But honestly, upscaling isn't that bad to my eyes; maybe I just have bad eyes, but I find it much more tolerable than the interpolation from digital signals. You should play a low-res game full screen on DVI and then go analog; it looks way better, IMO anyway. It's kind of funny: everyone is moving to HDMI, SCART, component and all these other whiz-bang connections, and I will still be using classic analog VGA from the '80s, hahaha. :D
