NON preset mode

Status
Not open for further replies.

julz091

Hi guys, I recently upgraded my PC to a nice kit (using an ASUS Radeon HD 6950). When I first installed the drivers the card was working great over an HDMI cable at the full 1920 x 1080 resolution. However, for some reason when I turned it on today my screen (BenQ E900HD) came up with the on-screen message "HDMI", followed by "non-preset mode".

So now when I have it at 1920 x 1080 there is a black border around my screen. I have made sure all my drivers are up to date and cannot find a solution anywhere online. Any help?
 
It should be noted that the video data-stream over HDMI is exactly the same as the data-stream in DVI connections. There are two main differences with HDMI. Coming out of the Home Theater (HT) world, HDMI also carries full 5.1 surround sound, and HDMI supports control signals so, for example, an HT receiver knows how to switch from Blu-ray to TiVo or a cable box when one or the other is selected.

Most computer systems do not use audio through HDMI, and none need the control signals (unless the PC is an HTPC). Unfortunately, the migration from the HT world to the PC world has not been perfectly smooth, and it seems some monitors and graphics cards have intermittent difficulties communicating (handshaking) via HDMI.

So, if you continue to have problems and your card and monitor support DVI, I might suggest you try a DVI cable instead and see what happens.
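For what it's worth, a black border on an HDMI connection is often the graphics driver applying TV-style underscan by default rather than a cable fault. On Windows with an AMD card this is adjusted with the "Scaling Options" slider in Catalyst Control Center. On Linux the same idea can be sketched with xrandr; note the output name "HDMI-1" below is a placeholder (it may be HDMI-0, HDMI-A-0, etc.), so check what your driver actually reports first:

```shell
# List connected outputs and the modes they advertise,
# to find the HDMI connector's actual name.
xrandr --query

# Force the panel's native resolution on the HDMI output.
# "HDMI-1" is a placeholder -- substitute the name reported above.
xrandr --output HDMI-1 --mode 1920x1080
```

If the border remains at the native resolution, the scaling/underscan setting in the driver's control panel is the next thing to check.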
 
I thought all cards that came out with HDMI could do video as well as sound. It might not be 5.1, but they still produce sound. If HDMI carries the same data-stream, then why should it make a difference which one you use?
 
I thought all cards that came out with HDMI could do video as well as sound.
They all do video, of course. But sound is another matter. Many do have "integrated" sound capability, but if you look at the ports on the back of most graphics cards, they don't have audio jacks or ports for audio input or output (line in, line out, mic in, headphone out, front speaker out, rear speaker out, surround speaker out) to run out to your speakers, microphone, or headphones, the way a sound card or motherboard-integrated sound does. Right?

So the only way to get sound in that case with HDMI is if your monitor has integrated speakers - which most don't. Or, if your monitor has audio pass-through support to run your regular computer speakers - which most don't.

And let's face it, the speakers that do come integrated into monitors are not what you would call "high fidelity"! They are not surround sound, and they don't produce thundering bass as from a powered sub-woofer. Most computer users have separate, self-powered computer speakers that don't accept HDMI or DVI inputs. They accept direct connections from the sound card or integrated motherboard sound.

As I noted before, HDMI came out of the home theater world. It is migrating to the computing world for a couple of reasons - mostly, of course, money. Since more and more PCs are being integrated into home theaters, folks needed a way to get video to their TVs or A/V receivers, which use HDMI. It costs more money to put both HDMI and DVI connectors on TVs and receivers, and since the digital quality and signal are exactly the same, why have both?

Also, DVI connectors are huge, compared to HDMI. So space is a concern, especially on graphics cards.

Remember, the folks who make big-screen TVs are, for the most part, the same folks who make computer monitors. So if Samsung, Sony, and Vizio never have to buy, stock, inventory, and design around DVI, they save money. Now whether they pass those savings on to us consumers, that's another issue. But the point is, it is Samsung, Sony, and Vizio who are driving the HDMI push, not AMD/ATI or NVIDIA (or Sound Blaster, either).
 