X800 GT gaming compatibility


david_1475

Just a quick question about video cards. I'm putting together a new computer and the best value for money that I have found is an X800 GT DDR3 256MB video card. Do these 'X000' cards tend to have compatibility problems running games? I have a hunch that someone once told me something along those lines, and a moment ago when I tried to find out the compatibility for Battlefield 2 I found a page which said:

"Video card must have 128 MB or more memory and one of the following chipsets:
NVIDIA GeForce FX 5700 or greater
ATI Radeon 8500 or greater"

There is no mention of the X000 types. Has anyone had much experience with these cards?

Thanks.
 
Get a 6600GT, it just destroys the X800GT.

And any of the X8*** series cards from ATI are just crap.
 
Yeah, don't even think about a Radeon X8** series card because they suck tremendously. Take it from me: I had an X800 Pro (supposed to be a good card), then I got a 6800 GT from Nvidia and I can't believe the difference. It's the same way with all ATI vs. Nvidia cards in the same price range. Do yourself a favor and get the 6600 GT.
 
david_1475 said:
Just a quick question about video cards. I'm putting together a new computer and the best value for money that I have found is an X800 GT DDR3 256MB video card. Do these 'X000' cards tend to have compatibility problems running games? I have a hunch that someone once told me something along those lines, and a moment ago when I tried to find out the compatibility for Battlefield 2 I found a page which said:

"Video card must have 128 MB or more memory and one of the following chipsets:
NVIDIA GeForce FX 5700 or greater
ATI Radeon 8500 or greater"

There is no mention of the X000 types. Has anyone had much experience with these cards?

Thanks.

I dunno guys, the X800GT beat the 6600GT in performance benchmarks...

The video card will be fine with Battlefield 2 and almost any other game. The X-series cards are far better than the Radeon 8500, so they easily meet that requirement.

However, some games like Battlefield 2 favour nVidia over ATi, so a 6600GT WOULD perform better in the case of Battlefield 2. The 6600GT is a great video card and has features (such as Shader Model 3.0) that ATi cards up to the X850XT PE lack.

Might I suggest waiting a while for the X1600 series of ATi video cards to come out? The X1600XT will outperform both the 6600GT AND the X800GT, and should cost around what these cards do right now.
 
Yeah, when do those new ATI cards come out? I might have to upgrade to one of those in the future.
 
The X850XT PE soundly beats the 6800 Ultra in performance, that's just a fact, and why shouldn't it? It has some insane clock speeds going for it, but it lacks SM3. One thing I will give ATI is their anisotropic filtering quality. It is beautiful. It has no texture "shimmering" like Nvidia does with AF. The only way to remove the shimmering on an Nvidia card is to turn the image quality all the way up to High Quality in the control panel, but that results in a major performance drop. ATI has perfect AF at the normal performance setting. For this reason I am seriously considering an X1800XL for my next video card upgrade.
 
macdawg said:
this is the only computer forum I know that bashes ATI cards.

At the bottom of this link they show screenshots playing the Lost Coast with a 7800 GTX on the left and an ATI Radeon X850 XT on the right, and there is a difference in quality.

http://www.bit-tech.net/gaming/2005/09/21/lost_coast_benchmark/1.html

There is actually no difference in quality. And yeah, the ATI X series pretty much sucked compared to the GeForce 6 series. However, ATI now has a chance to turn that around with the X1K series. Believe it or not, this forum used to bash Nvidia when I first joined, back in the GeForce FX and Radeon 9800 days. That was right as the GeForce 6 series and Radeon X series cards were hitting.
 
Thanks for the feedback. The reason I started this topic is that, as Flanker mentioned, the X800GT does seem to beat the 6600GT in performance, and yet everyone on here seems to bag them out. I've been using the following pages to compare video cards, using the memory transfer rate as the primary indication of overall speed.

ATI cards: http://www.hardwaresecrets.com/article/131
nVidia cards: http://www.hardwaresecrets.com/article/132

From the stats on those pages (making an estimate for the X800GT based on the others in its class) it appears that an X800GT with a 256-bit memory interface beats the hell out of the 6600GT with only 128 bits. It is also a 256MB card as opposed to only 128MB for the 6600GT. I'm tossing up between the following two cards (prices in Australian dollars):

http://www.umart.com.au/pro/products_listnew.phtml?id=10&id2=82&&bid=2&sid=13631

http://www.umart.com.au/pro/products_listnew.phtml?id=10&id2=82&&bid=2&sid=9035

The only thing the 6600GT seems to have over the X800GT is the DDR3 memory, which isn't specified on the X800GT card, so I'll assume it uses standard stuff.

So here is my problem. What is it that makes the 6600GT better than the X800GT? The only thing I can think of would be compatibility issues...
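For what it's worth, you can sanity-check the memory transfer rate yourself: peak bandwidth is just bus width times effective memory clock. Here is a minimal sketch in Python, using the commonly quoted reference clocks for these two cards (the exact figures are assumptions; check the specs on your particular boards):

```python
# Rough peak memory bandwidth: bus width (bits) / 8 * effective clock (MHz).
# The clock figures below are commonly quoted reference values, not
# guaranteed for any particular board.

def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

cards = {
    "X800GT (256-bit, ~980 MHz effective)": (256, 980),
    "6600GT (128-bit, ~1000 MHz effective)": (128, 1000),
}

for name, (bus, clock) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, clock):.1f} GB/s")
```

That works out to roughly 31 GB/s for the X800GT versus 16 GB/s for the 6600GT, which matches the pattern in the comparison tables: the wider bus roughly doubles the on-paper bandwidth. Raw bandwidth is only one factor, though; core clock, pipeline count and driver maturity all matter in actual games.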
 
The X800GT is DDR3 too.

Here is an interesting article about Pixel Shader 3.0 (PS3.0).

http://www.techbits.ca/index.php?m=show&id=94

Have you ever wondered why you are compelled to buy the newest and the fastest? Simple: you like the performance and new abilities of new hardware. But what if what you were about to buy didn't deliver what you paid for right away? Unfortunately, this is the case today: Pixel Shader 3.0 has arrived on the edge of view. This article is a light review of Pixel Shader 3.0, and of what you can expect from it and from your card.
Amidst the controversy that is the seemingly never-ending battle of Pixel Shader 2.0 versus Pixel Shader 3.0, arises a cloud of technicalities and professional hogwash that not even the most astute fan can clearly follow. My adventure for the pursuit of the truth began as a mere whim of the imagination when I was perusing the depths of news updates that lurk about the internet. For a long time I was baffled as to why there was controversy at all with bringing about PS3.0 in the future generation chips of the hardware giants, Nvidia and ATI, that is until I recollected the fact that PS2.0 has yet to be used in more than a handful of games.
I found it rather interesting that the gaming industry itself harped on how essential it was to our gaming experience to complete PS2.0 and prepare the world to be shocked by its real-time wonders… Years have passed, and we find ourselves stranded within the desert that is PS1.x, completely devoid of PS2.0, much less 3.0.
My research started with rather brash and crude word searches on the net, which yielded several sites with the answers I needed about what pixel shaders do and what forms they take. PS1.x used 16-bit precision to calculate light and/or map changes on a per-pixel basis. At 16-bit depth, however, only so much data can be calculated per pixel, and advanced calculations are impossible (even though these calculations are far from simple). Hence the immediate need for PS2.0. As games slowly progress towards the PS2.0 line, we prepare ourselves for the 24-bit depth that is "perfection." However, within this 24-bit perfection we find our first bit of controversy.
Basically, what pixel shading does is place a specific map on another polygon or texture and give it instructions. These instructions can be as simple as bump mapping with ray-traced shadows (from light-source shadows), or as complex as a moving bump map that cycles and reflects everything around it (water). Pixel shading can also perform radiosity-type calculations which act like real-world lighting. Think, for example, of being in a room in your home with a sunbeam pouring through the window. When you place an orange piece of paper in the sunbeam, everything turns a shade of orange: the trim, the white wall, everything. This is the effect of the light grabbing the color of the paper and bouncing off with it.
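To make the "instructions per pixel" idea concrete, here is a minimal sketch in Python of the kind of arithmetic a pixel shader evaluates for simple diffuse (Lambert) lighting. Real shaders run on the GPU in HLSL or GLSL; this only illustrates the math, and the vectors and colors are made-up example values:

```python
# Per-pixel diffuse (Lambert) lighting: the kind of calculation a pixel
# shader runs once for every pixel it draws. Illustrative values only.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def shade(normal, light_dir, light_color, surface_color):
    """Diffuse term: max(0, N . L), tinted by light and surface colors."""
    n = normalize(normal)
    l = normalize(light_dir)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(lc * sc * n_dot_l for lc, sc in zip(light_color, surface_color))

# A bump-mapped surface normal lit by a slightly orange light -- the same
# "light picks up the paper's color and carries it" effect described above.
print(shade((0.2, 0.9, 0.1), (0.0, 1.0, 0.0), (1.0, 0.7, 0.4), (1.0, 1.0, 1.0)))
```

A real pixel shader evaluates something like this millions of times per frame, which is why the precision of that per-pixel arithmetic, the 16/24/32-bit question below, matters at all.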
The human eye has much difficulty discerning between 24-bit and 32-bit… and yes, as the graphics engineer that I am, I can tell you I can see the difference when I look very hard, but only in a single image, not an animation of frames. This "perfection" (which is undoubtedly not perfect) of 32-bit graphics is what PS3.0 would contain. Within this blurb of whimsical banter you will find almost nothing to follow, yet I have not begun to explain. Within these bits of data (16, 24, or 32) that are transferred through your video card's pixel pipelines are the raw code lines that speak "clarity." In short, the higher the number of bits per pixel, the clearer and richer the image, correct? Yes, to a point; though one would find it difficult to see the clear difference between two colors on the color wheel when they are 99.7% alike. This is where the controversy between 2.0 and 3.0 lies.
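The precision argument is easy to demonstrate. Below is a small sketch, assuming NumPy is available, that uses IEEE float16 versus float32 as a stand-in for the shader formats (actual 16/24/32-bit shader precision differs in detail), accumulating many small per-pixel contributions:

```python
# Accumulating many tiny contributions (think: light samples per pixel) at
# two float widths. Each 0.0004 step is below half the gap between adjacent
# float16 values near 1.0, so at 16-bit it is rounded away entirely.
import numpy as np

for dtype in (np.float16, np.float32):
    total = dtype(1.0)
    step = dtype(0.0004)
    for _ in range(1000):
        total = dtype(total + step)
    print(f"{np.dtype(dtype).name}: {float(total):.4f}  (exact answer: 1.4000)")
```

float16 reports 1.0000 while float32 reports 1.4000: exactly the kind of banding and lost subtlety the article attributes to lower shader precision.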
Advanced engineers who create these kinds of graphical environments for programmers to employ find flaws in their previous releases, as all humans make mistakes. It turns out that they only partially programmed 2.0 to do the various advanced tasks that 1.x could not, and worked feverishly to complete the code; hence, 3.0 was born. The problem: why sell a 2004 model when the 2003 has been marketed but hasn't sold a single unit? A possible answer is damage control. Basically, what's going on here is that the industry is trying to force the market curve to meet its needs and circumvent 2.0 as quickly as possible by pushing 3.0 out as a "perfect solution." There isn't any conspiracy or hidden agenda here; the programmers of PS apparently found multiple advances in given areas and are eager to get them out to save us all a little time and energy.
So why does this matter to you, the consumer? Well, there are two obvious points of view one could hold.

The first viewpoint is that you want the most advanced effects possible in your game, and if this means sacrificing several frames per second, so be it. You paid $500 for your video card and the powerful tools within it; by God, you want it to deliver! Pixel Shader 3.0 has all of the code of 2.0, plus some very meager performance tunes that help 2.0 instructions run a bit quicker. In running 3.0 effects, however, you will only lose frame rate and gain but a very small margin in visual clarity.

The second viewpoint is that you want raw performance, anything for that extra frame. You mortgaged your home and sold your car to purchase this card so that your games can look very good and play faster than a rocket. Many, many users fall into this category, as not all of us can afford $1,000 CPUs to back up our already cripplingly expensive video cards. In the current market, only one game utilizes PS3.0 (Far Cry), so in essence you really don't need to own a piece of hardware that supports this new feature. Purchasing a new video card does not necessarily mean that everything it can support will be utilized; in this case, your decision to purchase a new Radeon X800 or a GeForce 6800 falls into that category.

Whichever of these two camps you fall into is entirely up to you; in my opinion it is not going to make a bit of difference whether you have PS3.0 support or not… to each his own.
ATI Technologies has now confirmed that its new R4xx line will not have PS3.0 vertex or pixel shader units on the boards. Some may consider this a travesty, as Nvidia has already placed this functionality in its GeForce 6 line, but in all honesty I do not think this is going to be a major problem for ATI or the consumer base itself. Regardless, both top-end cards cost around $499 retail and both have their strengths; whether you opt for the GeForce or the Radeon really won't cause you traumatic loss or gain either way, as both are very powerful cards. Gaming companies have yet to develop games that efficiently and effectively use the PS2.0 system without causing dramatic drops in performance. Take Halo, for example: the game has very low frame rates within the corridors and hallways of the alien ship and in areas with highly reflective walls and bump mapping. In these areas, the new R420 and NV40 are truly needed for playability. Keep in mind that the industry is geared at satisfying the greatest number of customers at a given average level of hardware, which yields more sales.
This brings me to the point that this entire hiccup on the net is just the whining of top-of-the-line enthusiasts wanting to satisfy their number needs. The fact that the world's most popular discrete video card is a GeForce2 MX400 running on DX7 hardware (unable to run even PS1.0) keeps most gaming companies from coding for these advanced engines; they must stay on track with hardware T&L and non-volumetric fog. Consider that the majority of computers sold have integrated video solutions, as they cost very little to produce compared to their 256-bit discrete brothers. These video cards are often not even capable of decent frame rates in DX7 games, much less DX8 and DX9 games. Until the industry forces the consumer base up to DX9, most games will not support PS2.0.
As for PS3.0, the only use I can find for it right now is the slight efficiency gain it provides over 2.0 when running PS2.0 calculations. Unfortunately for the gaming community, only a handful of multimillion-dollar-budget titles will sport PS2.0 and 3.0 in the near future (Doom 3, HL2, etc.).
Honestly, many games that come out in the next year or so still will not employ the pixel shader engine, as pixel shading takes a lot of horsepower and users are more interested in smooth gameplay than looks. With the advent of the new top-of-the-line R420 and NV40 cards, we find ourselves on the very fringes of a graphics revolution. Corner-cutting techniques that yield excellent, and sometimes better, results than good old high polygon counts are already here. It is just a matter of time before we have the power to back these features up. The future is here, we can have but a taste of it, and it sure is sweet. Whether it be PS2.0 or 3.0 doesn't really matter right now; only after a couple more years will we worry about what version we are running, because the difference right now is next to nothing.
 