Weird video card

Status
Not open for further replies.
Horndude:

Yeah. But that just sounds like the actual designer of the card setting a "lower standard" of accuracy for the 6600GT than for the high-end card, rather than the 6600GT not sticking to its specs at run time. That's what I meant by "errors" :) ... it not giving "100% accurate results" at a resolution that it should be capable of rendering.

In terms of "pure" accuracy (compared to a real-world object), I would imagine that even the most expensive cards will have "errors", since they are trying to "digitally" render an image. The real-life detail that exists at the very finest scale would not be captured even by such a card.

I would imagine that when a 6600GT says "I can display images at 1024x768", it "guarantees" that the image will be "100% exact" at this resolution, and it is just incapable of looking at the object at a resolution of, say, 3000x2000. Of course, even a 6600GT can zoom in on any image to any scale, if need be, and provide "finer details" ASSUMING the object is "rendered again" around the new zoomed-in area. But a "simple zoom", without re-rendering, would not give as good results on the 6600GT as on a more expensive card.

This accurate? :)
 
Yes, I think so, though I think it's a little more complicated than that.

Years ago, when video cards really started becoming something, back in the Voodoo2 and Voodoo3 days, Tom's Hardware did a comparison of video output quality among several different gaming cards plus one hi-res workstation card of the time, and they put snapshots of the screens on the page for comparison. You'd be surprised at how differently some cards render certain things. Drivers make a huge difference, and so does every piece in the rendering chain, including the monitor. In the case of workstation cards, OpenGL was, until recently, the graphics API of choice; now DirectX is being used a lot too. But no matter what you do, it's still nothing more than an approximation.

Certain qualities matter more when doing hi-res work than for the high-volume work a gaming card does. I mean, frame rate on a workstation isn't as critical as it is in games, but on the other hand, some of these hi-res monitors have dual DVI inputs and are set up to be as accurate as possible, with fairly high refresh rates as well. So the cards need lots of video RAM, and they have to be very stable and hold that stability. Some cheap gaming cards can even be affected by temperature and refresh rates.

Now, if you took an nVidia gaming card, an ATI gaming card, and a workstation card, gave them identical setups otherwise, and had them all render the same complex image, there are going to be some differences, some minor, some not. No graphics card can be good at everything; what workstation cards attempt to do is focus on rendering quality AND adhere to some very strict driver standards that gaming cards typically do not.

I've seen this happen on my rigs lots of times; some cards are better than others. Some do a really good job with video, for example, and some don't. Some handle shading and other image enhancements better than others; it's hardly an exact science, so to speak. Video drivers in gaming cards in particular can be really weird sometimes. They have to cut corners in order to get speed, and you've probably seen that in some games the video quality can vary quite a bit from game to game. It's not just the software, it's also the card itself. You won't see as many differences with cards that have well-tested and refined drivers, or with games that have been properly designed, but they do handle things differently, and the video output will show it. Workstation cards usually have very accurate, thoroughly tested drivers. This is why OpenGL has been so popular for so long: it hasn't had any of the growing pains DirectX has gone through; it's well established, pretty well known, and hasn't changed much.
 
My knowledge on the subject is sort of vague, so I'm simply giving a vague answer. Programs like AutoCAD use extremely precise measurements and calculations, and the GPU handling them needs to keep them as precise as possible. When you're using AutoCAD, frame rate obviously has no meaning, so you want to aim for a card that has precision and reliability, not speed.

Now, I'm not saying that an nVidia or ATi card makes thousands and thousands of miscalculations; you'd notice that. I'm simply suggesting that in games, frame rate is important because the card has to be able to keep up with gameplay, so graphical quality is somewhat sacrificed to keep the picture smooth.
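To make the precision point concrete, here's a small Python sketch (not actual GPU code; it simulates single-precision arithmetic with the standard `struct` module to stand in for a lower-precision pipeline). Naively accumulating the same value a million times drifts visibly in single precision while staying essentially exact in double precision, which is the kind of precision-vs-speed trade-off being described:

```python
import struct

def f32(x: float) -> float:
    # Round a Python float (double precision) to the nearest
    # IEEE-754 single-precision value, simulating a float32 ALU.
    return struct.unpack("f", struct.pack("f", x))[0]

total32 = 0.0  # accumulator rounded to single precision after every op
total64 = 0.0  # ordinary Python float (double precision)
for _ in range(1_000_000):
    total32 = f32(total32 + f32(0.1))
    total64 += 0.1

print(total32)  # drifts noticeably from the true sum of 100000
print(total64)  # stays within a tiny fraction of 100000
```

The drift happens because once the running total gets large, the single-precision accumulator can no longer represent each added 0.1 exactly, and the rounding errors pile up.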

Think of the two like enabling AA: if you have it off, you'll get a higher frame rate and more speed, but your picture quality won't be as precise. Enabling AA will smooth out edges and other crap, but it costs you FPS in the end. See where this comes in?

Same deal with ECC RAM: it's built more for reliability than for speed, but that's not to say that unbuffered RAM is registering errors all over the place.
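The idea behind ECC is just stored redundancy that lets the hardware notice when a bit has flipped. A toy Python sketch of the simplest form, a single parity bit (detection only; real ECC memory uses Hamming-style codes that can also correct single-bit errors):

```python
def parity_bit(word: int) -> int:
    # Even parity: 1 if the word has an odd number of set bits, else 0.
    return bin(word).count("1") % 2

stored = 0b1011_0010          # data word written to memory
check = parity_bit(stored)    # extra bit stored alongside it

# A single bit flip (say, from electrical noise) changes the parity,
# so re-checking on read detects the corruption.
corrupted = stored ^ 0b0000_1000
assert parity_bit(corrupted) != check  # mismatch: error detected
```

The extra storage and checking are exactly the reliability-over-speed trade mentioned above.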
 