Radeon HD 6850

Digimortal

I just bought my new video card, and when I switched from the DVI cable to an HDMI cable it looks way worse and doesn't fill the whole screen anymore... Is this normal? Is a DVI cable a better choice for the primary display, or do I just need to change some settings somewhere?

It's already set to 1080p in the Catalyst Control Center, and I scaled it to fit the screen, but it still doesn't look anywhere near as crisp as when I had the DVI cable plugged in...
 
Must be something on your monitor. I can't think what it is though.

HDMI and DVI are both digital connections.
 
I honestly don't know what it could be. The only options on my monitor are pretty much size, brightness, gamma, all that junk... nothing to fix how crappy it looks with HDMI. It still looks amazing with the DVI cable, but I would have thought HDMI should look better. I could be wrong, though. Oh well, I guess I'll stay on DVI for now.
 
I think I might know what you're talking about...

Is there about an inch or less black all around?

And does the text look a little strange, kinda little and dull?
 
Check in the Catalyst Control Center that the selected resolution is the proper one for your screen, and make sure your HDMI cable is of decent quality.
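If you want to double-check what Windows itself is outputting (independent of what the Catalyst Control Center reports), here's a quick Python sketch. It's Windows-only, and the 1920x1080 native figure is just an assumption, so plug in your own panel's spec:

import ctypes

NATIVE_W, NATIVE_H = 1920, 1080  # assumed native resolution; set to your panel's spec

user32 = ctypes.windll.user32
desktop_w = user32.GetSystemMetrics(0)  # SM_CXSCREEN
desktop_h = user32.GetSystemMetrics(1)  # SM_CYSCREEN

if (desktop_w, desktop_h) == (NATIVE_W, NATIVE_H):
    print(f"Desktop is {desktop_w}x{desktop_h} -- matches native, so any blur")
    print("is being added after the GPU output (scaling/overscan), not by Windows.")
else:
    print(f"Desktop is {desktop_w}x{desktop_h}, not {NATIVE_W}x{NATIVE_H} --")
    print("fix the resolution first before touching overscan settings.")

If the desktop resolution already matches, the problem is almost certainly the scaling/overscan stage rather than the mode Windows is running in.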
 
On my system (5870 normal + 5870 x6, three Dell ST2210 monitors) I have issues connecting monitors over HDMI as well. Since all three monitors are identical (Dell ST2210, 1920x1080, 21.5", DVI/HDMI/VGA inputs), it isn't a monitor issue. Over DVI they are fine, but I switched the middle one to HDMI for the audio, and the picture on that monitor then had a 1" black bar around all sides. In Catalyst Control Center there is an overscan adjustment, and for some reason the driver automatically sets it too low. Dragging the slider all the way up fixes the issue and makes the picture crisp and clear on my monitor. This may be your problem.
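In case anyone wonders why underscan makes text look dull rather than just smaller: the GPU still renders 1920x1080 but squeezes it into fewer physical pixels, so nothing lines up with the pixel grid anymore. The rough math below is a sketch assuming my ST2210 (21.5", 16:9) and the ~1" border described above; adjust for your own screen:

import math

PANEL_W_PX, PANEL_H_PX = 1920, 1080
DIAG_IN = 21.5    # panel diagonal in inches (Dell ST2210)
BORDER_IN = 1.0   # black bar on each side, as described above

# Physical panel size from the diagonal and the 16:9 aspect ratio
diag_units = math.hypot(16, 9)
width_in = DIAG_IN * 16 / diag_units
height_in = DIAG_IN * 9 / diag_units
ppi = PANEL_W_PX / width_in  # pixels per inch

used_w = round((width_in - 2 * BORDER_IN) * ppi)
used_h = round((height_in - 2 * BORDER_IN) * ppi)

print(f"Panel is {width_in:.1f} x {height_in:.1f} inches at ~{ppi:.0f} PPI")
print(f"A 1920x1080 desktop gets squeezed into ~{used_w}x{used_h} physical pixels")
print(f"Horizontal scale factor: {used_w / PANEL_W_PX:.2f} -- non-integer, so text blurs")

That works out to roughly 1715x875 physical pixels carrying a 1920x1080 image, which is exactly the kind of non-integer downscale that turns sharp text to mush.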

HOWEVER: if your monitor is actually a 720p (or even 1080p) television (whether big or small), it may have overscan built in. For some idiotic reason, TV manufacturers think a pure digital signal isn't clean enough to display normally, so they distort it (they think this is doing you a favor, but it is painful and obnoxious). It is mainly a carry-over from the analog-signal days, when a small amount of overscan would eliminate edge glitches and noise, but on a digital signal the edges don't have noise (and if they do, it's the fault of the cable box or other source equipment). Anyway, this gives a particularly nasty picture when you connect a PC to a TV with overscan behavior over HDMI. You can try to tune it out with the overscan slider, but usually it is not fixable.
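To put rough numbers on how much a typical TV overscan throws away (the 5% per-edge figure is a common default, not something I measured on any particular set):

SRC_W, SRC_H = 1920, 1080
OVERSCAN = 0.05  # assumed fraction cropped from each edge

visible_w = int(SRC_W * (1 - 2 * OVERSCAN))
visible_h = int(SRC_H * (1 - 2 * OVERSCAN))
cropped = 1 - (visible_w * visible_h) / (SRC_W * SRC_H)

print(f"The TV shows only the center {visible_w}x{visible_h} of the desktop,")
print(f"then stretches it back over all {SRC_W}x{SRC_H} panel pixels.")
print(f"About {cropped:.0%} of the rendered picture falls off the edges.")

So besides losing roughly a fifth of the image (including the taskbar and window edges), every remaining pixel gets resampled during the stretch, which is why text looks so bad.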

On one 1080p TV I've used, feeding the HDMI port a pure DVI signal (HDMI without audio) disables overscan and gives a clean picture, while enabling audio in the HDMI stream turns overscan back on and kills the picture quality. My 720p plasma has overscan no matter what.

If your TV suffers from the overscan problem and has a VGA port, use the VGA port instead. While a VGA signal isn't as good quality, TVs generally do not overscan it, and the overall picture looks better. I have my plasma (a 720p Toshiba 42hp66) connected this way and the picture is fine. Just make sure you set the right resolution (some TVs don't correctly report their resolution).
 