DVI to D-Sub (analog) adapter - Techist - Tech Forum

Old 10-05-2008, 08:45 PM   #1 (permalink)
Junior Techie
 
Join Date: May 2008
Posts: 51
DVI to D-Sub (analog) adapter

First off, I am not asking where to buy one; I can grab one at my local PC shop for $5. What I really want to know is whether it would give me a 'true' analog output. I have a 22-inch widescreen digital monitor with DVI inputs and a 7600GT with dual DVI outputs only.

Why do I sound insane for wanting to use an 'old-style' analog connection when I have digital? Simply because I believe it handles older games, and non-native resolutions in general, much better than digital, which I find quite rigid and inflexible, with loads of nasty interpolation. DVI is overrated.

The output from the GPU would technically start out as a digital signal, but it would be converted to analog (D-Sub) and then fed to my monitor via the analog input, so I am assuming the connection would be completely analog.

I am just asking before I try it because, in actual fact, I haven't got the card yet (it's still tentative), but I will be getting it soon. It is getting very difficult to find modern GPUs with a dedicated analog (D-Sub) connector, which is a shame; even most of the relatively older GeForce 7 and 8 series cards don't have one.
jmoussa87 is offline  
Old 10-05-2008, 09:21 PM   #2 (permalink)
 
Join Date: Nov 2006
Location: Illinois, USA
Posts: 2,363
Re: DVI to D-Sub (analog) adapter

The DVI-to-VGA plugs don't actually convert digital signals to analog. DVI ports are dual-mode: if you use an adapter, the card detects it and the GPU outputs analog signals to the VGA monitor. I don't think it would improve your games, though, because no matter what, the signal gets converted to analog somewhere along the line and a low-res picture gets upscaled. With DVI, the PC may do the upscaling (if it's still sending a native-resolution image to your monitor), while with VGA (or DVI with a lower resolution set) your monitor's internal circuitry does the upscaling before driving the LCD panel (the signal going to the panel is ALWAYS at native resolution).
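
To make that concrete, here is a rough Python sketch (using the Pillow library and a made-up 800x600 frame on a 1680x1050 panel; none of these specifics come from this thread) of the resampling step that happens somewhere along the chain either way:

Code:
# Rough illustration only: whether the GPU or the monitor's scaler does it,
# a low-res frame ends up resampled to the panel's native resolution, and the
# filter used is what decides "blocky" versus "soft". Assumes Pillow is
# installed and that "frame_800x600.png" exists -- both are placeholders.
from PIL import Image

NATIVE = (1680, 1050)                      # hypothetical 22" widescreen panel
frame = Image.open("frame_800x600.png")    # hypothetical low-res game frame

blocky = frame.resize(NATIVE, Image.NEAREST)   # sharp-ish but blocky
soft = frame.resize(NATIVE, Image.BILINEAR)    # smooth but blurry interpolation

blocky.save("scaled_nearest.png")
soft.save("scaled_bilinear.png")

Either way the panel ends up displaying 1680x1050 pixels; the only question is which box does the resampling and with what filter.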
CalcProgrammer1 is offline  
Old 10-06-2008, 03:28 AM   #3 (permalink)
Junior Techie
 
Join Date: May 2008
Posts: 51
Re: DVI to D-Sub (analog) adapter

Quote:
Originally Posted by CalcProgrammer1
The DVI-to-VGA plugs don't actually convert digital signals to analog. DVI ports are dual-mode: if you use an adapter, the card detects it and the GPU outputs analog signals to the VGA monitor. I don't think it would improve your games, though, because no matter what, the signal gets converted to analog somewhere along the line and a low-res picture gets upscaled. With DVI, the PC may do the upscaling (if it's still sending a native-resolution image to your monitor), while with VGA (or DVI with a lower resolution set) your monitor's internal circuitry does the upscaling before driving the LCD panel (the signal going to the panel is ALWAYS at native resolution).
Well, that is a shame, but I guess I won't know for certain until I try it. In the worst case I can always buy another GPU with a dedicated analog output and sell this card (since I bought it really cheap, I would actually make money). When I used a DVI connection, the Nvidia software let me configure scaling as either a centred window at the original resolution, aspect-ratio scaling with black bars, or full-screen scaling, done by either the monitor or the GPU. Although I found the monitor did a better job, it still tended to stretch and distort images, and the interpolation was almost unbearable. I still reckon analog is superior.
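
For reference, those three driver options roughly boil down to where the frame gets drawn on the panel. Here is a small Python sketch (not Nvidia's actual code; the 800x600-on-1680x1050 numbers are just illustrative):

Code:
# Illustrative only: compute the rectangle a low-res frame occupies on a
# fixed-resolution panel under the three scaling modes described above.

def scaling_rect(src_w, src_h, panel_w, panel_h, mode):
    """Return (x, y, width, height) of the area the frame is drawn into."""
    if mode == "centered":   # no scaling: 1:1 pixels, black border all around
        return ((panel_w - src_w) // 2, (panel_h - src_h) // 2, src_w, src_h)
    if mode == "aspect":     # scale up, keep aspect ratio, black bars at the sides
        scale = min(panel_w / src_w, panel_h / src_h)
        w, h = int(src_w * scale), int(src_h * scale)
        return ((panel_w - w) // 2, (panel_h - h) // 2, w, h)
    if mode == "stretch":    # full-screen: fills the panel but distorts 4:3 to 16:10
        return (0, 0, panel_w, panel_h)
    raise ValueError(mode)

# An 800x600 (4:3) game on a 1680x1050 (16:10) panel:
for mode in ("centered", "aspect", "stretch"):
    print(mode, scaling_rect(800, 600, 1680, 1050, mode))
# centered (440, 225, 800, 600)
# aspect (140, 0, 1400, 1050)
# stretch (0, 0, 1680, 1050)

The "stretch" case is where the distortion comes from: a 4:3 image gets pulled out to 16:10, on top of the interpolation blur.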
jmoussa87 is offline  
Old 10-06-2008, 08:28 PM   #4 (permalink)
 
Join Date: Nov 2006
Location: Illinois, USA
Posts: 2,363
Re: DVI to D-Sub (analog) adapter

...Your GPU HAS an analog output. A DVI output on a GPU is a dual-mode connector. If you connect a DVI monitor to it, it outputs digitally, and if you plug an analog (VGA) monitor into it via an adapter, it outputs analog, just like a VGA port would: the GPU senses the adapter and sends analog signals over the DVI port, which supports both modes.

If you use an adapter and a VGA connection to your monitor, it'll be analog, exactly as if you had a VGA port directly on your GPU, because that's all the adapter does. The real issue here is that you're upscaling. No matter where the upscaling is done (monitor, GPU, digital, analog, etc.), you're doing the same thing: taking a small image and making it bigger than it was intended to be. Upscaled images always look bad on LCD monitors. On a CRT, the resolution of the screen actually changes, because the electron beam scans a different number of lines across the tube. That produces visible scan lines, but it tends to make low-res images appear sharper. LCDs can't do this; they simply stretch the image, which results in a poor-quality picture. Try a CRT if you have one (using a DVI-to-VGA adapter): compare scaling via the GPU with actually changing the resolution and you'll see a difference. You won't see that difference on an LCD panel, because an LCD has only one native resolution.
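
A quick back-of-the-envelope check makes the LCD side of this obvious (assuming a 1680x1050 native panel, typical for a 22" widescreen but not stated anywhere in this thread): none of the common low game resolutions divide evenly into it, so every output pixel has to be interpolated from several source pixels.

Code:
# Illustrative scale-factor arithmetic for a hypothetical 1680x1050 panel.
native_w, native_h = 1680, 1050
for game_w, game_h in [(640, 480), (800, 600), (1024, 768)]:
    sx, sy = native_w / game_w, native_h / game_h
    print(f"{game_w}x{game_h}: x-scale {sx:.2f}, y-scale {sy:.2f}, "
          f"integer scaling: {sx.is_integer() and sy.is_integer()}")
# 640x480: x-scale 2.62, y-scale 2.19, integer scaling: False
# 800x600: x-scale 2.10, y-scale 1.75, integer scaling: False
# 1024x768: x-scale 1.64, y-scale 1.37, integer scaling: False

A CRT sidesteps the whole problem because it simply rescans at the new resolution instead of mapping it onto a fixed pixel grid.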
CalcProgrammer1 is offline  
Old 10-06-2008, 08:44 PM   #5 (permalink)
Junior Techie
 
Join Date: May 2008
Posts: 51
Re: DVI to D-Sub (analog) adapter

OK, I just found the info on Hardware Secrets. I didn't realise video cards' DVI connectors carried both digital (DVI-D) and analog (DVI-A) signal sets. So with the adapter the output will definitely be analog. http://www.hardwaresecrets.com/article/157/8

"It is very interesting to note that DVI-I outputs can be transformed into VGA outputs by the use of an adaptor that usually comes with the video card (see Figure 28), if the DVI-I connector has analog signals (DVI-A) present which is the case with all video cards. Thus you can transform the DVI-I connector of your video card on a second VGA output, allowing you to connect two video monitors to your computer. This connection, however, will be analog, not digital, since the VGA connection uses analog signals and you are using the DVI-A signals from the connector to generate this output.

Connecting two DVI devices to your PC, however, is only possible if you have a video card with dual DVI outputs, since converting a VGA output to DVI is not possible as VGA output uses analog signals and video monitors featuring a DVI input usually require digital signals (i.e. a DVI-D connector)."
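
The rule in that quote boils down to a few lines (purely an illustrative Python sketch, not anything from the article):

Code:
# Which DVI connector types carry the analog (VGA-compatible) signal set, and
# therefore which ones a passive DVI-to-VGA adapter can work with.
DVI_TYPES = {
    "DVI-I": {"digital": True,  "analog": True},   # integrated: both signal sets
    "DVI-D": {"digital": True,  "analog": False},  # digital only
    "DVI-A": {"digital": False, "analog": True},   # analog only (rare)
}

def vga_adapter_works(connector):
    """A passive DVI-to-VGA adapter just reroutes the analog pins, so it only
    works if the port actually carries the analog signal set."""
    return DVI_TYPES[connector]["analog"]

for c in DVI_TYPES:
    print(c, "-> VGA adapter works:", vga_adapter_works(c))
# DVI-I -> VGA adapter works: True
# DVI-D -> VGA adapter works: False
# DVI-A -> VGA adapter works: True

That is also why the article says you can't go the other way: a plain VGA output has no digital signal set for a DVI-D monitor to use.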
jmoussa87 is offline  
Old 10-06-2008, 08:50 PM   #6 (permalink)
Junior Techie
 
Join Date: May 2008
Posts: 51
Re: DVI to D-Sub (analog) adapter

Thanks for explaining the science behind the technology. I don't think I could go back to a CRT though; they are too bulky and chew up too much power. Besides, I've got a new 22" LCD, so I actually want to use it. I guess I could do the whole dual-monitor thing: keep the LCD for native-resolution gaming, productivity tasks and widescreen movies, and get a smaller CRT for older games and low-res apps. Honestly though, upscaling isn't that bad to my eyes (maybe I just have bad eyes); I find it much more tolerable than the interpolation you get over a digital connection. You should play a low-res game full screen over DVI and then go analog; it looks way better, IMO anyway. It's kind of funny: everyone is moving to HDMI, component, SCART and all these other whiz-bang connections, and I'll still be using classic analog VGA from the '80s, hahaha.

jmoussa87 is offline  