Performance: Consoles vs Computers

eddie06

Rather than posting a topic outlining the performance differences between these two gaming platforms, and rather than starting yet another consoles vs computers debate, my intention is to answer a question that has always baffled me... why do consoles so easily outperform computers? Why does it take a tremendously powerful computer to even compare with a console?

I mean, let's face it, the two most common consoles today are the PlayStation 2 and the original Xbox...

Fair enough... both still have games being developed for them (despite both being ageing consoles), and both have remarkable graphics considering their age...

Looking at the technical side of things, the PS2 has a 300MHz CPU and 32MB of RAM. Imagine if you built a PC with equivalent specs... fat chance it'd run anything newer than Unreal Tournament. Yet the PS2 still gets versions of the latest PC games, including Splinter Cell: Chaos Theory, Battlefield 2 and so on... and while the graphics have been toned down on these titles, I find it remarkable that they can run at all on the PS2...

Yet the computer I used to use, equipped with a 1.6GHz Athlon, 512MB of DDR RAM and a 128MB GeForce FX 5200 graphics card, couldn't even run Chaos Theory or Battlefield 2 at an acceptable frame rate on the lowest graphical settings, whereas the 300MHz CPU coupled with 32MB of RAM that we call the PS2 handles these games without a problem...

Look at Halo as well: the Xbox's 733MHz Pentium III / GeForce 3-class combo produces lovely pixel shader effects and silky smooth framerates, whereas the PC port of Halo stutters (even with pixel shaders disabled) on my old rig, which was much better equipped at 1.6GHz with a DX9 GeForce 5200 card...
Only now am I getting all of the nice effects in the games I've mentioned without sacrificing framerate... but that's because I forked out around $900 for an Athlon 64 with a GeForce 7600 GS...

Now I know computers are designed to multitask and run general applications, whereas consoles are designed for one specific purpose (games), but even so... how the **** can the OS bog down a computer to the extent that a 300MHz CPU (the PS2) runs games better than an Athlon XP 2000+ on Windows, equipped with much faster hardware? It just doesn't make sense.

If computers were as efficient as consoles, my mate's 300MHz Pentium II with a GeForce 2 MX would be all I'd need to play the latest games...

Comparing these two platforms side by side really makes me wonder... is Windows really any good for gaming at all? And is there a way, through a better-programmed OS, to extract more gaming performance out of a desktop PC? Surely, if an Xbox running at 733MHz can still run the latest games, then a Pentium III PC running the right software / OS could do the same... it's just a matter of better utilising the hardware... right?
 
Well, I didn't read all of your post because I'm just lazy this morning. There are some big reasons, though, to my knowledge, one being that the games are optimized EXACTLY for that hardware. For a computer that's not the case, because there is a wide variety of parts.

Secondly, the GPU and the CPU are made just for gaming, as is the RAM. The GPU is not a general-purpose graphics card, it's a gaming card, so a lot of the extra stuff that a 128MB PC video card has, or has to do, isn't there.

It also has to do with the OS; the OS on the PS2 is made just for gaming, unlike Windows.
 
Like aaronkupen said, the OS does have a LOT to do with it, as well as the way hardware instructions are used. For the Xbox, PlayStation and Nintendo consoles, the OS is geared for gaming. Nothing more. For a PC, the OS has a LOT of fluff added to it. Even if you're not running anything else at the same time, Windows is constantly running many OS-critical programs in the background, and plenty of other apps run there too. Most of these apps and OS-critical processes are constantly monitoring what the user is doing, which takes an enormous amount of resources. Remember, the OS also has to constantly monitor all hardware and ports as well.
 
Yeah, for sure, it's these points that spark some curiosity for me. When you weigh it up, performance-wise the Windows OS is (say) about 15% as efficient as a console for gaming. So when you break it down, 85% of the power is lost to functions other than games, and to the compatibility shortcomings between games and hardware. It'd be really nice if there were a way to temporarily "switch off" these extra features while playing games, or even better, if there were a gaming OS optimised (much like a console's OS / BIOS) for nothing but gaming... I'm surprised no one has thought of this yet... if such a product came out, many people would save thousands of $$$ in hardware upgrades...

edit:

Oh, and I just found out that the Xbox not only runs on a Pentium III processor / GeForce 3-class combo, but its OS is actually a modified version of Windows (according to Wikipedia). So it's pretty much a desktop PC (no surprise, look at the size) with a tweaked OS... if only such a thing existed for desktop PCs, imagine how much better games would run :p
 
1. The OSes on consoles take up far fewer resources.

2. The games don't have to talk to drivers; they can talk directly to the hardware (see the toy sketch after this list).

3. The architecture is completely different. For example, the PS2 does have a 300MHz processor, but it has 128-bit-wide registers, while most computers running XP work with 32 bits, so up to four times as much data can be processed per instruction (there's a rough SIMD illustration after this list too).

4. Hardware-level communication can take place; the CPU, for example, doesn't have to go through the OS before reaching the GPU.
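To picture points 2 and 4, here's a toy C sketch; none of the function names below are real APIs, they're made up purely for illustration. On a PC a draw call passes through the graphics API and a vendor driver before any hardware command is built, while a console game can, in effect, build the command itself, so there are simply fewer layers between the game and the GPU.

Code:
#include <stdio.h>

/* Stand-in for "the GPU accepts a command"; everything here is made up. */
static void gpu_push_command(int cmd, int vertex_count)
{
    printf("GPU got command %d for %d vertices\n", cmd, vertex_count);
}

/* PC-style path: game -> graphics API -> vendor driver -> hardware. */
static void driver_submit(int cmd, int vertex_count) { gpu_push_command(cmd, vertex_count); }
static void api_draw(int vertex_count)               { driver_submit(1, vertex_count); }
static void pc_game_draw(int vertex_count)           { api_draw(vertex_count); }

/* Console-style path: the game builds the command itself and hands it straight over. */
static void console_game_draw(int vertex_count)      { gpu_push_command(1, vertex_count); }

int main(void)
{
    pc_game_draw(300);      /* three layers stand between the game and the GPU */
    console_game_draw(300); /* one hop */
    return 0;
}

Each extra hop in the toy version is trivial, but in a real driver those layers do validation, state tracking and translation on every single call, which is where the per-call overhead comes from.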
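And here's a rough illustration of point 3, using x86 SSE intrinsics purely as an analogy (the PS2's Emotion Engine has its own 128-bit MIPS vector instructions, not SSE): with 128-bit registers, a single instruction can operate on four 32-bit floats at once.

Code:
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

int main(void)
{
    float a[4]   = { 1.0f,  2.0f,  3.0f,  4.0f};
    float b[4]   = {10.0f, 20.0f, 30.0f, 40.0f};
    float out[4];

    __m128 va   = _mm_loadu_ps(a);      /* load four floats into one 128-bit register */
    __m128 vb   = _mm_loadu_ps(b);
    __m128 vsum = _mm_add_ps(va, vb);   /* one instruction adds all four pairs */
    _mm_storeu_ps(out, vsum);

    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}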
 
The generic TV resolution is 800x600; the last generation of console games were designed around this, while computers would display twice that. Graphics are not just toned down for console games, they lose features altogether compared to their PC counterparts, such as draw distance, complex shadowing, particle effects and high-resolution texture detail.

The next-generation consoles seem to be a big upgrade, as they should be to keep up with the high-end gaming software that continues to come out. They will still lose out to the PC on things like sound quality and utility. I don't think consoles and PCs are very comparable, though I certainly enjoy some of the games that have come out for both.
 
Actually, I think the normal resolution on a non-HD TV is 640x480. Like previously stated, since its OS doesn't clog up the system's performance, a console runs more efficiently. They also use lower-quality textures so everything can fit in 32 or 64MB of RAM (just look at the Xbox version of DOOM 3, the textures are bad). You also have games that are optimized for the hardware. The Xbox uses a special GPU that sits somewhere between a GeForce 3 and a GeForce 4, so it's a custom GPU which can't be bought for a PC. They also use DX8, while many current PC games are running DX9. Most console games don't use AA, and they give horrid jaggies when paired with the low resolution.

One good example is how bad the textures are in the BF2 games on the consoles; the ground looks like crap (once again thanks to only 32MB of RAM). Another is that many games run somewhere between 25-40fps; HL2 on the Xbox was locked at 30fps (and that's with DX8, so no reflections and other DX9 goodies).

It's fairly simple:

*No OS getting in the way
*Dedicated hardware with optimized games
*Low resolution and low-res textures
*Most games are locked or run somewhere between 25-40fps


As for the PS3 being brilliant: yes, but it's as complicated as quantum physics.
 
NTSC (North America and others) TVs run at 640x480 resolution. The framerate they display is 29.97fps. Those are the specs the consoles have to target.
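For some rough numbers behind the resolution and framerate points above (the 1280x1024 figure is just a typical PC resolution of the time, picked as an example, not taken from anyone's post):

Code:
#include <stdio.h>

int main(void)
{
    long tv_pixels = 640L * 480L;     /* NTSC TV output                    */
    long pc_pixels = 1280L * 1024L;   /* common PC resolution (example)    */

    printf("TV frame:  %ld pixels\n", tv_pixels);
    printf("PC frame:  %ld pixels (%.1fx the work per frame)\n",
           pc_pixels, (double)pc_pixels / (double)tv_pixels);

    printf("Frame budget at 29.97 fps: %.1f ms\n", 1000.0 / 29.97);
    printf("Frame budget at 60 fps:    %.1f ms\n", 1000.0 / 60.0);
    return 0;
}

So a console pushing 640x480 at a locked ~30fps is doing roughly a quarter of the pixel work per frame, with about twice the time budget, compared to a PC running 1280x1024 and aiming for 60fps.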
 