Anti-aliasing Bust


qblake

I have an ATi Radeon HD 5770 running with a quad-core Athlon on a 400W PSU (if you say the PSU can't run this configuration, you are very wrong). When I first got the card it ran perfectly fine; I played all my games on a 1080p TV with every setting maxed out. But a couple of months ago, I noticed that the anti-aliasing would only go up to 8x MSAA instead of 16x MSAA like it had before. Not that I have a problem with 8x, but I love quality, and I would love the quality of my games to go higher. Does anybody know how to get this back up to max AA?

Thanks all!

PS PORTAL 2 WILL BE THE MOST EPIC GAME EVER!!!
 
Uhh, you are very wrong, actually. While the card will run on 400W, several parts of your system may not be able to run at their full potential. You need at least 500W for that configuration. What make of PSU do you have? Full specs?
 
Agreed with MoM; I wouldn't recommend running a quad-core and a 5770 on a 400W PSU. But to your question: there should be an option in the Catalyst Control Center to override AA, or there might be a force option in-game.
 
I GOT ANTIALIASING UP TO 24x!!!

Full Specs:

AMD Athlon II X4 635

ATi Radeon HD 5770

2 HDDs

6GB DDR3 1066 RAM

2 ODDs

That's about it; I don't have a dedicated sound card or anything else.
 
An Athlon II X4 has about a 95W TDP, and a 5770 definitely uses under 100W.
I'd expect total system power consumption to be less than 220W under load, including drives and RAM.


Anyway, what AA does is smooth the edges of objects so that they don't look jagged.
I think there is such a thing as too much AA: it can cause unnecessary blurriness, and thin objects (like tree branches and fences) can often be blurred to the point that they become invisible.
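
Hardware MSAA only takes extra samples along polygon edges, but the averaging idea behind all AA is easy to see with a tiny supersampling toy. This Python sketch is purely illustrative (nothing in it comes from CCC or the driver): it renders a hard diagonal edge at one sample per pixel, then at four samples per pixel, and averages the extra samples down.

```python
# Toy supersampling AA (SSAA) demo; all values here are made up
# for illustration. Real MSAA is smarter and samples only at edges.

def render(width, height):
    """1-bit render of a diagonal edge: a pixel is lit if it sits below y = x."""
    return [[1 if y > x * height / width else 0 for x in range(width)]
            for y in range(height)]

def downsample(image, factor):
    """Average each factor x factor block into one pixel (a box filter)."""
    h, w = len(image) // factor, len(image[0]) // factor
    return [[sum(image[y * factor + dy][x * factor + dx]
                 for dy in range(factor) for dx in range(factor)) / factor**2
             for x in range(w)]
            for y in range(h)]

aliased = render(8, 8)                    # one sample per pixel: hard 0/1 staircase
smoothed = downsample(render(16, 16), 2)  # 4 samples per pixel, then averaged

for row in smoothed:
    print(" ".join(f"{v:.2f}" for v in row))  # edge pixels land between 0 and 1
```

Raising the sample count (16x, 24x) just refines those in-between values further, which is why the visual payoff shrinks quickly past 4x or 8x.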

I generally prefer to use about 4X AA even when performance is more than enough to run higher levels.

Instead of increasing the AA level, I'd rather use a different AA mode that provides better quality, such as edge-detect, which can be set in CCC.
Edge-detect is quite expensive compared to regular MSAA, but it is better at finding actual edges (hence the name), so textures stay sharper while more edges get anti-aliased.

Actually, edge-detect AA uses less VRAM but more shader time, so it benefits most in older games that are less shader-intensive.
 
Well, that's 220W plus all his other devices.

But with the CPU and GPU out of the way, everything else should easily be 100W or less combined. Provided it's a good-quality 400W PSU, I don't see it having any problem running that system.

One of the AnandTech writers is running a Core i7-965 and HD 5850 CrossFire on a 450W PSU, so you can get by with a lot less power than most people think.
 
"Well, that's 220W plus all his other devices."
It would be including his other devices.
Let's say all four cores and the GPU are at 100% load; that would be ~95W + ~65W = 160W (the 5770 is half a 5870, which draws about 120W in Crysis).
Add the RAM (10W), two hard drives (~7.5W each, if both are active), and, in the unlikely event that both optical drives are in use at the same time, another 12W.
If every device is being used at once, including the CPU and GPU, it may come close to 220W.
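
To make that arithmetic explicit, here's the same worst-case budget as a quick Python sketch; the wattage figures are the ballpark numbers quoted above, not measurements.

```python
# Rough simultaneous-load power budget, using the thread's ballpark figures.
draws_watts = {
    "Athlon II X4 635 (TDP)": 95,
    "HD 5770 (~half a 5870's ~120W draw)": 65,
    "6GB DDR3 RAM": 10,
    "2x HDD (~7.5W each, both active)": 15,
    "2x ODD (both active)": 12,
}

for part, watts in draws_watts.items():
    print(f"{part:<40} {watts:>4} W")
print(f"{'Estimated total':<40} {sum(draws_watts.values()):>4} W")
# ~197W before motherboard, fans, and PSU losses -- hence "close to 220W".
```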

The main benefits to having a quality power supply are:
* not likely to die
* better efficiency at varying loads
* cleaner power (no ripple or spikes, stable voltages)
 
I was always taught to get a higher wattage than is required, so if you want to upgrade, you have the room. I'm sure the 650W in my build is overdoing it, but at least I can keep it for a while.

Well, plus I'm overclocking my CPU. But I doubt he's doing that.
 