No more e6600

Psh, you guys are lame. A bottleneck would only occur if you had, say, an Athlon 64 3000+ or something, and even that's pushing it. The E6300 is plenty fast.
 
An E6300, you say? Mind you, the E6400 isn't that much more. I'll probably end up going with that one.

Thanks all for your help
 
The General said:
Psh, you guys are lame. A bottleneck would only occur if you had, say, an Athlon 64 3000+ or something, and even that's pushing it. The E6300 is plenty fast.

The old X-Bit Labs benchmarks showed a 2.0GHz Athlon 64 being the bottleneck point for a 7800GT video card. See for yourself. An 8800GTX is quite a bit stronger than a 7800GT. If you take that into account, it doesn't seem unlikely that a 2.4GHz Athlon 64 or a 2.0GHz Core 2 Duo (roughly the same thing) would end up being the bottleneck.

I don't think that anyone really needs to worry about bottlenecks as long as they overclock, but if you don't, then you might want to consider where you spend your money.

And General, weren't you the one who had "Nope, no bottleneck here," on a macro or something? Why the change of heart?
 
My Athlon 64 @ 2.4GHz is the bottleneck in my system, and my X1800 GTO only runs with 16 pipes @ 700/700, which puts it around a GeForce 7900GT. So I wouldn't see my processor keeping up with an 8800GTX.

I would just get the e6300 and overclock.
 
If you go cheap on a processor and buy a $650 video card, the CPU will most definitely be your bottleneck.
I wouldn't go any cheaper if I were you.

For all intents and purposes, CPU bottlenecks only occur at low resolutions. I don't think that link proves much at all. More and more people are tending toward playing at 1600x1200, and I can almost guarantee all the differences in those benchmarks will smooth out at that resolution. Notice also that even an Athlon 3200+ scored almost the same as an FX-57; that's 2.0GHz versus 2.8GHz. Honestly, how does that tell us that an 8800GTX will be bottlenecked by an E6300?

Summary: at relatively low resolutions, performance is CPU-limited because the graphics card has more than enough power to run its "part of the bargain" at maximum speed, so differences in CPU speed will be noticed. Who complains about the difference between 105 and 200 fps???
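The resolution argument above can be sketched as a toy model (all numbers are made up for illustration, not benchmark data): the effective framerate is roughly capped by whichever of the CPU and GPU is slower, and the GPU's ceiling falls as the pixel count rises, so CPU differences only show at low resolutions.

```python
# Toy model: effective FPS is capped by the slower of CPU and GPU.
# The fill budget and CPU caps are illustrative, not measurements.

def gpu_fps(pixels, fill_budget=2e8):
    """Frames per second the GPU alone could draw at a given pixel count."""
    return fill_budget / pixels

def effective_fps(cpu_fps, width, height):
    # Whichever side is slower sets the framerate.
    return min(cpu_fps, gpu_fps(width * height))

for w, h in [(800, 600), (1280, 1024), (1600, 1200)]:
    slow = effective_fps(120, w, h)   # slower CPU caps at 120 fps
    fast = effective_fps(200, w, h)   # faster CPU caps at 200 fps
    print(f"{w}x{h}: slow CPU {slow:.0f} fps, fast CPU {fast:.0f} fps")
```

Under these assumed numbers the two CPUs differ at 800x600, the gap narrows at 1280x1024, and at 1600x1200 both systems hit the same GPU ceiling, which is the "smoothing out" described above.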

Edit: for a good laugh, here is the conclusion from the article that was linked as proof that there will be a bottleneck:

It is true: you don't need a high-end processor for real gaming with realistic settings and high image quality. The gaming performance will still be limited by the graphics card.
It is true that faster CPUs than those mentioned in the minimum system requirements do not really stimulate any significant fps rate increase. So, the slower processor models from the Pentium 4 and Athlon 64 processor families can cope easily with the latest generation 3D shooters.

And we're talking about the bottom of the next generation of processors.
 
Hmm, not quite. The benchmarks at the lower resolutions only show how future-proof a processor is.

CPU limitation isn't quite like what's been discussed in this thread. In any CPU-intensive zone in a game, that's where your framerate will drop.

Humor me: one system with a fast CPU and another with a slower CPU, both running the same game, both using the same video card.

For the majority of the game, the two systems will run at similar FPS, with the faster-CPU system winning maybe 5% of the time. But when you enter a CPU-intensive zone, that's when the CPU kicks in. A popular example is a major city in World of Warcraft: the CPU has to handle the networking and request the character models, not to mention the placement of the characters.

That's just one example of a CPU-intensive situation; there are others. Something that threw many hobbyists for a loop was the difference in performance seen with faster CPUs in Anandtech's Oblivion "Town" benchmarks. It was rather obvious, actually, that the CPU was having to calculate more AI, character placement, physics, and sound than in any game before that time.

True, CPU limitations become apparent at lower resolutions, but for different reasons. Those are more along the lines of the description of a GPU bottleneck.

It's those pesky "busy-CPU" times that you need a fast CPU for. Otherwise, you're fine with a slower one, as long as you can ignore those situations. I wouldn't want to be seen slowing down with an 8800GTX in my system, though; it's just not decent. :)
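The "busy-CPU zone" idea can be sketched with a per-frame model (illustrative millisecond figures, and ignoring CPU/GPU pipelining): a frame is ready only when both the CPU work and the GPU work are done, so the slower side sets the frame time.

```python
# Toy per-frame model: the slower of CPU work (game logic, AI, draw
# calls) and GPU work sets the frame time. Numbers are illustrative.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Quiet area: little CPU work, so the GPU dominates on both machines.
print(fps(cpu_ms=4.0, gpu_ms=8.0))    # fast CPU
print(fps(cpu_ms=7.0, gpu_ms=8.0))    # slow CPU: same framerate

# Busy city: CPU work balloons (AI, character placement), GPU unchanged.
print(fps(cpu_ms=9.0, gpu_ms=8.0))    # fast CPU dips slightly
print(fps(cpu_ms=18.0, gpu_ms=8.0))   # slow CPU roughly halves the fps
```

Under these assumed numbers the two machines are identical in quiet areas, and only the busy zone exposes the slower CPU, matching the World of Warcraft and Oblivion examples above.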
 
lol - The biggest bottleneck in any PC to date is software. Next in line is the hard drive, the bus, or the memory... The CPU is, 99% of the time, the fastest component in the PC. Most of the time it's waiting for other, slower components, like HDDs and memory - or the lack thereof.

The majority of CPUs in gaming PCs today will not bottleneck an 8800GTX. Maybe a low, low-end Celeron or a Sempron would.

But an Athlon, X2, P4, PD, CD, or CD2 at at least 1.8GHz will not bottleneck even the new 8800GTX. The GPU is only 575MHz on standard models. Even an Athlon at 1.8GHz is over 3 times faster than the GPU, and the GDDR4 on the card is even over 3 times faster.


CPU limitation isn't quite like what's been discussed in this thread. In any CPU-intensive zone in a game, that's where your framerate will drop.


The GPU isn't sitting there in "CPU-intensive" zones going, "Oh come on already... would you please hurry up and give us the friggin' data we need???" I can tell you that it's the CPU saying that to the HDD, memory, bus, and software.

The obvious differences between benchmarks of CPUs with the same video card show only one thing: that the faster CPUs produce better performance. That doesn't mean that the CPU is bottlenecking the GPU.

You want to dispute bottlenecks in games? Start with the HDD.

Games such as BF2 and Oblivion rely upon the virtual memory/pagefile to buffer the textures on the models. Windows defaulted the management of the pagefile to the applications, and the pagefile was 1028 MB initial and 2056 MB maximum.

In Battlefield 2, until the 1.4 patch, everyone had lag issues on the map "Daqing Oilfields" because of the object density and textures, high-end box or low-end box.

The fix was implemented in the 1.4 patch: it made the pagefile size "Custom" and opened the pagefile to 2056 MB initial, 3084 MB maximum.

So I'm sorry, but hard drives directly affect performance in a game, and if they are not up to the task, you will bottleneck.
 
HAVOC2k5 said:
But an Athlon, X2, P4, PD, CD, or CD2 at at least 1.8GHz will not bottleneck even the new 8800GTX. The GPU is only 575MHz on standard models. Even an Athlon at 1.8GHz is over 3 times faster than the GPU, and the GDDR4 on the card is even over 3 times faster.

Oh jeez, that's about the worst way possible to go about justifying your argument.

Let's see you get an A64 3000+ and try to run an 8800GTX on it, haha. That'd be amusing.

HAVOC2k5 said:
The GPU isn't sitting there in "CPU Intensive" zones going, "Oh come on already... would you please hurry up and give us the friggin' data we need???

Precisely. That's what the GPU is doing in low-res benchmarks, which is different from CPU bottlenecks at higher resolutions.

Basically, to answer your post: while it's obvious that things like HDDs, memory, etc. are bottlenecks in a system, adding a CPU limitation on top makes you lose additional performance. There isn't much you can do about those other bottlenecks; that's just how a system is designed. But adding another one, like a slow CPU, only hurts you more.
 