ATI, Nvidia, etc..... just paying for the name?

Hey Dale,

How are those Zalman 5.1 headphones? I was looking at them on Newegg and thinking about buying a pair. What's your review of them?
 
You people need to get your facts straight. The ATI cards can play those games just fine, they just can't have the specular yadda and soft shadow blah blah, and they will lose a few fps. BUT THEY WILL PLAY THEM FINE. ATI still has better AA and AF anyway.

So... you're saying that Shader 3.0 DOES improve performance? Or are you still dissing Nvidia... I'm not sure...

Can I please see a source for AA/AF being better on ATI? It would be good reading.

The simple fact: Nvidia owned this series of video cards, with the exception of perhaps the higher-end ones (which, from what I see, are at best equal). ATI will probably own the next round of video cards.

It's what I read, it's what I believe.
 
Nvidia did own this round, and that's because of PS 3.0. This is coming from an owner of an X850 XT.
 
They are a lot better than stereo I think, havoc, but then again that's only coming off of a reasonable Plantronics and an Altec Lansing set; I can't really compare them to one of those $150 DSP stereo sets. But the surround sound is awesome and I can locate sounds really easily in-game, which makes it easy to target enemies, etc. Bass is great as well, and on maximum volume they get pretty loud; you can hear them across the house. The titanium speaker units must be doing their job, very top quality, and I doubt they would ever break. While I think games sound awesome on them (you can hear very quiet sounds too), some have complained about how music sounds on them. Granted, some of those people are coming off of expensive DSP headphones. Personally I think the music sounds great after I changed a few of the equalizer settings and whatnot in Windows Media Player. (I also cranked the bass up to full just for fun!) I also enabled the CMSS option in my Audigy 2 ZS sound panel, which plays stereo sound across all the surround channels; this made the music sound a lot more full, clear, and loud. The microphone is top-notch too, and it doesn't need to be anywhere near your mouth.
The headphones are reasonably comfortable, I think. I have to shift the top part back and forth sometimes because there is a little discomfort from the band (which is padded, btw) if you wear it in the same spot for a few hours. The ear muffs are very comfortable on my ears, no discomfort there. But in warm conditions your ears will get warm after a few hours of wearing them (especially in a tense match when you're sweating already), so you have to vent them from time to time or have a fan blowing at you. Though I typically just take them off and wipe the sweat on my shirt when my ears get warm. Oh, and I forgot to mention: I think they look sweet AND pimp at the same time.
All in all, I think they are superior to stereo sets for locating enemies while gaming (FPS-style), and they are probably the best surround sound set out there for the money. For under $50 they are a great deal and you can't really go wrong, I say. Heck, in two years of gaming I broke like five stereo headsets; I've had these Zalmans for nine months now and they are still going strong and probably won't die on me for years yet.
And get them from here:
http://www.xoxide.com/za51suulgahe.html
And if you want the Zalman mic as well, there is a box at the bottom where you can add it to your order. This should be the best price you can find, and I have ordered from them many times. All transactions went well and shipping was just about on par with the egg.

Back on topic.
@fading theory

Some people seem to have misconceptions about Pixel Shader 3.0. Everyone is saying that it is essential and that all current ATI cards suck because they do not have it. But the fact is nothing uses it right now, so why are you complaining about your ATI card not having it? Gosh dhw, since you have an X850 XT I would think you'd be happy with it, as it is the second-best card on the market right now. Don't you enjoy it at all? Why are you even thinking about PS 3.0? There are only a couple of things that use it right now, and even those should still run fine.
So the argument primarily lies in future games that will utilize PS 3.0. Some people seem to think that ATI cards won't be able to run these games because they don't have PS 3.0. This is not true. In such games, Nvidia users with PS 3.0 hardware will be able to enable a few additional PS 3.0-only effects such as soft shadows and specular lighting. These are good for eye candy, but as one review I remember put it, they hurt your framerate, and the reviewers found that enabling them cost too many fps to be worthwhile. Besides the eye candy, PS 3.0 offers performance benefits as well. Based on that same review someone linked (I believe the test was on the new Splinter Cell), Nvidia owners running a game that makes heavy use of PS 3.0 shaders should gain around 10 fps on average. So again, ATI owners will still be able to run these games just fine, but with somewhat lower framerates and less eye candy available. Think of it like the Doom 3 situation: the PS 3.0-enabled Nvidia cards will simply outrun the ATI cards in such an instance.
The common misconception is that ATI cards won't be able to run PS 3.0 games at all, which is incorrect; for the (currently) few games scheduled to release with PS 3.0 support, ATI users can still play, and they can enjoy all the rest of the games even more. Also, on the AA/AF comment: the trend has typically been that when you start tossing in high levels of AA and AF, ATI cards suffer less of a performance hit than Nvidia cards do as you crank it up. Nothing major, but I do believe that is the case.
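For anyone wondering how a game actually decides which path you get, here is a rough sketch of the standard Direct3D 9 capability check (this is just my own illustration, not code from Splinter Cell or any shipping game). A Radeon X800/X850 reports Pixel Shader 2.0, so it simply gets the normal path; a GeForce 6 reports 3.0 and can unlock the extra effects.

// Minimal sketch: query the Direct3D 9 device caps and branch on the
// reported pixel shader version. Windows-only, link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        std::printf("Direct3D 9 is not available on this system\n");
        return 1;
    }

    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0)) {
        // e.g. GeForce 6 series: expose the PS 3.0-only eye candy
        // (soft shadows, extra specular effects) in the options menu.
        std::printf("PS 3.0 supported: extra effects available\n");
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
        // e.g. Radeon X800/X850: the game still runs fine on its
        // PS 2.0 path, just without the PS 3.0-only extras.
        std::printf("PS 2.0 path: game runs normally, no PS 3.0 extras\n");
    } else {
        std::printf("Older shader hardware\n");
    }

    d3d->Release();
    return 0;
}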

Anyway, I tried to clear a few things up. I'm not trying to flame anyone, and hopefully I won't receive any flames back. As a happy ATI customer, I just hate to see all these negative comments about them that are very exaggerated. They make great cards; they didn't really screw up and lose the whole generation or whatever. I think a lot of people take this stuff way too seriously. We're talking about very marginal differences here. Both video card companies make good cards, and they serve their purposes well. I don't think bashing ATI for not having PS 3.0 is a good idea. ATI still holds up as the best for D3D games, I think. In the AAO community, I'd estimate that approximately 90% of the competitive players have an ATI card. Most of them laugh at people with Nvidia cards and think ATI is the best.

And ATI is not losing this generation. For proof, I've got this article. :)
This is for the year 2004.

"Graphics chipmaker Nvidia saw its profits drop nearly 80 percent in the second quarter due to turbulence in the desktop market and increased competition.

The earnings, coupled with a downgrade Friday from Pacific Growth Equities, sent Nvidia's shares plunging. In midmorning trading on Friday, the company's stock was down $4.67, or 32 percent, to $9.89.

On Thursday, the Santa Clara, Calif.-based company reported earnings of $5.1 million, or 3 cents a share, compared with $24.2 million, or 14 cents a share, in the same period a year ago. Revenue was $456.1 million, slightly less than the $459.8 million the company reported during the same period last year.

Analysts expected the company to report revenue of $501 million and earnings of 15 cents a share.

"Q2 was challenging and disappointing, as the desktop (graphics processing unit) segment declined significantly as a result of several unusual market events," Jen-Hsun Huang, CEO of Nvidia, said in a statement.

In a conference call, Huang attributed the decline to market share losses and competition with archrival ATI Technologies.

Analysts said the company lost market share to both ATI and Intel. In a research note released Thursday, Jon Peddie Research stated that Intel's share in the graphics chip market rose from 33 percent in the first quarter to 37.7 percent, while Nvidia's share declined from 27.2 percent to 23.2 percent. ATI lost market share--but only marginally, sinking from 24 percent to 23.2 percent.

One of the slowest-growing sectors in graphics was the market for standalone graphics chips, the market Nvidia specializes in.

In the early part of the decade, Nvidia surpassed ATI to become the largest graphics provider in the world. In late 2002 and early 2003, however, Nvidia had to postpone a new line of chips at a time when ATI's designs were catching up in terms of performance. The two companies and Intel, which makes graphics chips that are integrated into chipsets, are the three dominant players in the market."
http://news.zdnet.com/2100-9584_22-5299219.html
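And just to double-check that "nearly 80 percent" figure myself, here is a throwaway bit of C++ that does the arithmetic on the numbers quoted above (these are the article's own figures, nothing new from me):

// Sanity-check the percentages quoted in the ZDNet article above.
#include <cstdio>

int main()
{
    // Q2 profit and revenue figures from the article, in millions of USD.
    const double profit_2004  = 5.1;
    const double profit_2003  = 24.2;
    const double revenue_2004 = 456.1;
    const double revenue_2003 = 459.8;

    // (24.2 - 5.1) / 24.2 is about 78.9%, i.e. "nearly 80 percent".
    std::printf("Profit decline: %.1f%%\n",
                (profit_2003 - profit_2004) / profit_2003 * 100.0);

    // About -0.8%, i.e. revenue was "slightly less" than a year earlier.
    std::printf("Revenue change: %.1f%%\n",
                (revenue_2004 - revenue_2003) / revenue_2003 * 100.0);
    return 0;
}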
 
Well, ATI works great for AAO, but of course that game isn't the most revolutionary in terms of graphics. That's like saying "the X800 is the best card in the world, it runs Quake 3 perfectly!" If I were you, I would quit basing my graphics card preference on old game engines. Look at how each side is handling the latest revolutionary engines such as Doom 3 and the Source engine.

I understand where you are coming from about PS 3.0 and few games using it today. The truth is, though: why pay $500 for a card that doesn't have PS 3.0 and doesn't have the eye candy when you can pay $300 for one that does and gets almost as good framerates in every game except CS:S/HL2 (where ATI has the edge)? Maybe if 3DMark05 is what's most important to you, you should go ATI, but as far as games go, Nvidia has the upper hand right now, although I think ATI will pwn this next generation.
 
dale5605 said:
You people need to get your facts straight. The ATI cards can play those games just fine, they just can't have the specular yadda and soft shadow blah blah, and they will lose a few fps. BUT THEY WILL PLAY THEM FINE. ATI still has better AA and AF anyway.

One more thing...you don't pay $400+ for a card to merely "play the games." The whole point of a graphics card that expensive is to play them with all the eye candy. If I wanted to "PLAY THEM FINE", I would go with something like a Radeon 9600XT, which can do just that.
 
beedubaya said:
One more thing...you don't pay $400+ for a card to merely "play the games." The whole point of a graphics card that expensive is to play them with all the eye candy. If I wanted to "PLAY THEM FINE", I would go with something like a Radeon 9600XT, which can do just that.
When buying a new computer as a gaming machine, it's best to spend about 25% to 30% of the total price on the video card.
Anything more, and the video card is going to be slowed down by the rest of the computer.
Anything less, and you will need to buy another video card shortly.
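Just to put numbers on that rule of thumb, here is a tiny example (the $1,500 total is a made-up figure, not anything quoted in this thread):

// Rough illustration of the "spend 25%-30% of the build price on the
// video card" rule of thumb from the post above.
#include <cstdio>

int main()
{
    const double total_price = 1500.0;        // hypothetical build price, USD
    const double low  = total_price * 0.25;   // lower end of the rule: $375
    const double high = total_price * 0.30;   // upper end of the rule: $450
    std::printf("Video card budget: $%.0f to $%.0f of a $%.0f build\n",
                low, high, total_price);
    return 0;
}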
 
Actually, beedubaya, AAO is not based on an old engine. The latest version (2.4) is the first game to be put on the new Unreal 2.5 engine, and after a couple more versions AAO will also be the first game on the Unreal 3.0 engine. So they are certainly not using outdated engines, and the graphics look absolutely amazing on the new levels. It's been a long time since you've seen it.

And I didn't mean play them fine like 9600 XT fine; better than that. Look at that Splinter Cell article. I couldn't even tell the difference between the eye-candy PS 3.0 Nvidia screenshot and the ATI screenshot, and the framerates were pretty close. This is what I mean about people like you exaggerating things. The X850 and X800 will have just as much eye candy as they have today, just not that little extra bit that PS 3.0 offers. Personally I think games look good enough as they are. If I had an Nvidia card and I was playing Splinter Cell, I doubt I would even enable specular lighting or soft shadows, to be perfectly honest; they said in the article that those cut the framerate down a good amount. And like I say, making a huge deal over current ATI cards not offering soft shadows and specular lighting is like making a huge deal over Nvidia not offering the anti-aliasing options that ATI has. All of these little eye candy things are minor anyway. I honestly can't even tell the difference in the screenshots; maybe very acute eyes can, but I still don't see why it's such a big fricking deal.
 