DVI and VGA aren't the same. VGA is always analogue, and DVI is often described as digital, but that's not the whole story: unlike VGA, DVI actually comes in several variants. They are:
- DVI-D - Digital signal only.
- DVI-A - Analogue signal only.
- DVI-I - Digital and analogue signals.
This is why, on some graphics cards, you can use a DVI-to-VGA adapter to hook an LCD TFT monitor up to the card's DVI port via a VGA cable: the port is DVI-I, so it carries an analogue signal as well. Even though the monitor is plugged into the DVI port, the signal it receives is analogue.
When I use DVI (digital) with my LCD TFT monitor, I can notice the difference. If I look closely at the monitor, the image is less fuzzy and has more clarity. That's because each pixel's value is sent as binary data, so there's less interference.
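To see why binary transmission suffers less from interference, here's a rough sketch (my own toy model, not real video hardware): the same pixel value is sent over a "noisy wire" once as an analogue level and once as individual bits. On the digital side, the receiver thresholds each bit at 0.5, so small noise is wiped out entirely.

```python
import random

random.seed(0)

def add_noise(level, amount=0.05):
    """Simulate a small amount of interference on the wire."""
    return level + random.uniform(-amount, amount)

pixel = 0.6  # analogue brightness in [0, 1]

# Analogue (VGA-style): the noise lands directly in the displayed value.
analogue_received = add_noise(pixel)

# Digital (DVI-D-style): encode the pixel as 8 bits, send each bit as a
# 0.0 or 1.0 level, then threshold at 0.5 to recover the exact bits.
bits = [(int(pixel * 255) >> i) & 1 for i in range(8)]
received_bits = [1 if add_noise(float(b)) > 0.5 else 0 for b in bits]
digital_received = sum(bit << i for i, bit in enumerate(received_bits)) / 255

print(abs(analogue_received - pixel))   # small but nonzero error
print(digital_received == 153 / 255)    # True: pixel recovered exactly
```

As long as the noise stays below the threshold margin, the digital link reconstructs the pixel perfectly, while the analogue link always passes the noise straight through to the screen.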
For a better understanding of the types of DVI, and why it's better than VGA, check out this Wikipedia article: Digital Visual Interface - Wikipedia, the free encyclopedia
Spikoman69, the majority of scan converters (external boxes capable of converting video signals to allow output to a CRT or LCD monitor) out there only support VGA output anyway.
As for switching to a PCI or USB tuner: it wouldn't make a difference. Even if you used a PCI or USB TV tuner and had your monitor connected to your graphics card via DVI (digital), the input signal would still be analogue, unless you use a DVB tuner (a digital tuner).