Video card for 56" TV

No, it does not. I have compared two identical monitors, one with VGA and one with DVI, and there is no discernible picture quality loss or gain with either one.
 
Umm, the monitor I am using over VGA is made for DVI and there is no difference... Yes, I tried it in both VGA and DVI.

And ALL LCDs are made for DVI, because their picture is digital instead of analog like a CRT's.
 
JoshSB said:
Umm, the monitor I am using over VGA is made for DVI and there is no difference... Yes, I tried it in both VGA and DVI.

And ALL LCDs are made for DVI, because their picture is digital instead of analog like a CRT's.

Well, there is probably no difference because of the resolution; on an HDTV I'm sure it would make a difference.

And to The General: analog is not bad, it's just not as good as digital.

And not all LCDs are made for DVI. I'm using an LCD monitor right now that only accepts a VGA input.
 
Dude, mine only accepts VGA as well. I meant that they are optimized to use a DVI connection, since the image they display is in digital format, not analog.
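For what it's worth, the round trip being described looks roughly like this. This is a toy sketch in Python, not a real signal model; the noise level is an arbitrary assumption, just to show why the digital-analog-digital detour usually comes back intact:

```python
import random

# Toy model of the VGA path into an LCD: the GPU's digital pixel value
# goes through a DAC, picks up a little cable noise as an analog level,
# then gets re-quantized by the monitor's ADC.
def vga_round_trip(value: int, noise: float = 0.2) -> int:
    analog = value + random.uniform(-noise, noise)  # DAC output plus noise
    return max(0, min(255, round(analog)))          # ADC re-quantizes to 8-bit

random.seed(0)
for p in (0, 64, 128, 192, 255):
    print(p, "->", vga_round_trip(p))  # with low noise, values come back unchanged
```

Bump the assumed noise past 0.5 and values start landing on neighboring codes, which is the degraded-signal case discussed below.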
 
Okay, for future reference, when I say "=/=" that means "does not equal," and when I say "=" it means "equals."

I know analog can be just as good as digital, but in some cases (as stated above in my previous posts) analog is better.

Let me settle this DVI/VGA argument. I had an LCD that had a DVI input AND a VGA input. I plugged them both in, set it to clone mode, switched back and forth between the two, and saw no difference. I consider myself to have a keen eye for quality.

They are the same quality.

Also, people seem to think that HDTV is super hi-res, but no. 1080p (the highest there is) is only 1920x1080. My monitor (with VGA) goes to 2048x1536 and works great.
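Quick back-of-the-envelope on the raw pixel counts, using just the two resolutions mentioned above:

```python
# Total pixel counts: 1080p vs. a 2048x1536 desktop mode.
hdtv = 1920 * 1080     # 2,073,600 pixels
desktop = 2048 * 1536  # 3,145,728 pixels

print(f"1080p:      {hdtv:,} pixels")
print(f"2048x1536:  {desktop:,} pixels")
print(f"ratio:      {desktop / hdtv:.2f}x")  # about 1.52x the pixels of 1080p
```

So a 2048x1536 desktop pushes roughly half again as many pixels as 1080p.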
 
Hey guys, the ONLY reason a DVI hookup would be better is if the analog signal were degraded. This can happen due to attenuation, interference on an unshielded cable, or just a poor-quality video card connection.

None of which happens these days: every VGA cable is shielded against interference, the distance is nowhere near enough to cause attenuation (I think it takes something like 70-100 feet), and video cards have decent connectors now.
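To put rough numbers on that, here's a back-of-the-envelope sketch. The 0.05 dB-per-foot loss rate is an assumed figure for illustration only, not the spec of any real cable:

```python
# Rough sketch: how analog amplitude falls off with cable length,
# assuming a hypothetical attenuation of 0.05 dB per foot.
LOSS_DB_PER_FT = 0.05  # assumption; real values vary with cable and frequency

def remaining_amplitude(length_ft: float) -> float:
    """Fraction of the original amplitude left after length_ft feet."""
    loss_db = LOSS_DB_PER_FT * length_ft
    return 10 ** (-loss_db / 20)  # convert dB loss to an amplitude ratio

for ft in (6, 25, 75, 100):
    print(f"{ft:>3} ft: {remaining_amplitude(ft):.1%} of original amplitude")
```

Under that assumption a typical 6-foot desk cable loses only a few percent of amplitude, while a 100-foot run loses nearly half, which lines up with the claim that distance only matters over long runs.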
 
The General said:
Okay, for future reference, when I say "=/=" that means "does not equal," and when I say "=" it means "equals."

I know analog can be just as good as digital, but in some cases (as stated above in my previous posts) analog is better.

Let me settle this DVI/VGA argument. I had an LCD that had a DVI input AND a VGA input. I plugged them both in, set it to clone mode, switched back and forth between the two, and saw no difference. I consider myself to have a keen eye for quality.

They are the same quality.

Also, people seem to think that HDTV is super hi-res, but no. 1080p (the highest there is) is only 1920x1080. My monitor (with VGA) goes to 2048x1536 and works great.

Yeah, I know that =/= is "does not equal"; that's why I said "analog is not bad"...

But whatever, I'm just saying that if DVI is available he should go with it.
 
Yeah, it's just that when I say something and someone says my name followed by the same thing I said, it seems that they misread me. No problem. :p
 