Rather than posting up a topic outlining the performance differences between these two gaming platforms, and rather than starting yet another consoles vs computers debate, my intention is more along the lines of answering a question that has always baffled me... why do consoles so easily outperform computers? Why does it take a tremendously powerful computer to even compare with a console?
I mean, let's face it, the two most common consoles today are the PlayStation 2 and the original Xbox...
Fair enough... both still have games being developed for them (despite both being ageing consoles) and both have remarkable graphics considering their age...
Looking at the technical side of things, the PS2 has a roughly 300MHz CPU and 32MB of RAM. Imagine if you built a PC with comparable specs... fat chance it'd run anything newer than Unreal Tournament. Yet the PS2 still gets versions of the latest PC games, including Splinter Cell: Chaos Theory, Battlefield 2, etc.... and while the graphics have been toned down on these titles, I find it remarkable that they can run on the PS2 at all...
Yet the computer I used to use, equipped with a 1.6GHz Athlon, 512MB of DDR RAM and a 128MB GeForce FX 5200 graphics card, couldn't even run Chaos Theory or Battlefield 2 at an acceptable frame rate, even on the lowest graphical settings, whereas the 300MHz CPU coupled with 32MB of RAM that we technically call the PS2 can handle these games without a problem...
Look at Halo 1 as well: the Xbox's 733MHz Pentium III / GeForce 3 combo produces lovely pixel shader effects and silky-smooth framerates, whereas the PC port of Halo stutters (even with pixel shaders disabled) on my old rig, which was much better equipped with its 1.6GHz CPU and DX9-class GeForce FX 5200 card...
Only now am I getting all of the nice effects in the games I've mentioned without sacrificing framerate... but that's because I forked out around $900 for an Athlon 64 with a GeForce 7600GS...
Now I know computers are designed to multitask and run general applications, whereas consoles are built for one specific purpose (games), but even so... how the **** can the OS be bogging down a computer to the extent that a 300MHz CPU (PS2) runs games better than an Athlon XP 2000+ on Windows, equipped with much faster hardware? It just doesn't make sense.
If computers were just as efficient as consoles, my mate's 300MHz Pentium II with a GeForce 2 MX would be all I'd need to play the latest games...
Comparing these two platforms side by side really makes me wonder... is Windows really any good for gaming at all? And is there a way, through a better-programmed OS, to extract more gaming performance out of a desktop PC? Surely, if an Xbox running at 733MHz can still run the latest games, then a Pentium III PC running the right software/OS could do the same... it's just a matter of better utilising the computer hardware... right?
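Part of the answer is that console games talk to fixed, known hardware through a very thin software layer, while PC games go through the OS, a graphics API, and a driver, and every extra layer of indirection costs something. Here's a deliberately toy Python sketch (all function names are made up; this is not a real graphics benchmark) showing how the same trivial "work" gets measurably slower once you route it through a few wrapper layers:

```python
import timeit

# Toy illustration of call overhead: the same operation invoked directly
# vs. through several wrapper layers, loosely mimicking how a console
# game pokes the hardware directly while a PC game goes through
# app -> API -> driver before anything reaches the GPU.

def draw_direct(x):
    return x * 2  # pretend this is the actual hardware operation

def driver(x):
    return draw_direct(x)  # driver layer just forwards the call

def api(x):
    return driver(x)       # graphics API layer forwards again

def draw_via_layers(x):
    return api(x)          # the path a PC game's draw call takes

# Both paths produce the same result; only the cost differs.
direct_time = timeit.timeit(lambda: draw_direct(1), number=1_000_000)
layered_time = timeit.timeit(lambda: draw_via_layers(1), number=1_000_000)

print(f"direct:  {direct_time:.3f}s")
print(f"layered: {layered_time:.3f}s")
```

On any machine the layered path comes out consistently slower even though it computes the exact same thing. Real API and driver layers do far more per call than these empty wrappers (validation, state tracking, user/kernel transitions), which is why a fixed console with to-the-metal code can punch so far above its clock speed.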