Conroe Reviews Released

Status
Not open for further replies.

Anyway, would you recommend buying the E6700 instead, even though it costs a bit more? I kind of want something that will still be good 2-3 years from now, while knowing that new technology is always coming out. I mean, look at my system: 2 years ago it might have been all that, but not now.
 
No, the E6700 has only a very minor advantage over an E6600, and even that is eliminated if you do some minor overclocking ;) .

BUT between the E6300/E6400 and the E6600 there are quite a few differences that make the extra money pay off [shared L2 cache technology, and such].
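The overclocking point above checks out with quick arithmetic. A minimal sketch, assuming the launch multipliers (9x for the E6600, 10x for the E6700, both on a nominally 266 MHz base clock) — those multipliers are my addition, not stated in the thread:

```python
# Hypothetical sketch: core clock = base FSB clock x multiplier.
# Multipliers (9x E6600, 10x E6700) are assumptions from Intel's launch lineup.
def core_clock_mhz(fsb_base_mhz, multiplier):
    """Effective core clock in MHz."""
    return fsb_base_mhz * multiplier

e6600_stock = core_clock_mhz(266, 9)   # 2394 MHz (~2.4 GHz)
e6700_stock = core_clock_mhz(266, 10)  # 2660 MHz (~2.66 GHz)

# A minor FSB bump on the E6600 (266 -> 296 MHz) already matches the E6700:
e6600_oc = core_clock_mhz(296, 9)      # 2664 MHz
```

(The real base clock is 266.67 MHz, which is where the 2.40/2.66 GHz marketing figures come from; the rounded 266 is used here to match the thread.)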

-Jo.
 
Given everything said above and the real game performance comparisons between AMD and Conroe... it all comes down to price, as both platforms are pretty much equal and it is the graphics card that makes the difference. I'll save my money, continue to run my AMD chip, and spend the money on the real difference... the video card. I'll wait for the G80 and the ATI equivalent. The CPU and mobo are going to wait for quite a while!
 
On a new system build one needs to go with Conroe, but to update a 939 or AM2 system one would just need a better CPU. I am going to build a fresh system, so Conroe looks really good to me. Plus, being able to OC it that much is pretty good.
 
Does anyone have a comprehensive comparison chart of the features of the Conroe models? I can't seem to find one...
 
No, but I know of some:

E6300 - 65nm technology, I think SSE/2/3 but probably not 4 [I think they keep that for the next models], 2x2MB of L2 cache.

E6600 - Shared Cache technology [CPUs connected on-die and not through the FSB anymore], 65nm, SSE4 included [for sure].

And then not many differences between the 6600 and 6700.

X6800 - Unlocked multiplier.

There are some more new techs in the 'Conroe', but those are applied to all of the models; those are the main differences [excluding clock speeds].
 
gaara said:
Because it's not realistic to expect the lowest spectrum of a previous family to meet the same standards as the highest end of the spectrum of a leading edge family

That's BS and you know it. If the CPU "doesn't really dictate the framerate at all," then it shouldn't matter what kind of CPU series it is.

Just look at the 40 FPS difference between the E6300 and X6800! Same bleeding-edge series, yet a very large difference in FPS.

Your idea is totally flawed, Gaara. I'm all open to new ideas, but that one just doesn't have any basis. Plus, you're apparently the only one who thinks that way, which doesn't exactly give it much merit.

Tox1cThreat said:
what FSB speed are you talking about? 1066 or 800?

Also, you can OC the FSB (depending on the chipset) independently of the RAM, so you can get even higher.

I'm looking at an E6300 right now, and I don't care if it's an Allendale or a cut-down Conroe; it still beats many of the processors in the AMD64 X2 series out now.


It's no secret or debate that currently (not future, etc.) Conroe is the winning processor.

The new debate should concentrate on what chipsets are the best to max out the performance of these chips.


(And what's G80?)

I'm talking about the 1066MHz FSB that all Core 2 Duos run on. And I know that you can overclock the FSB separately from the RAM, but setting a memory divider leads to a performance loss on an Intel system, due to the lack of an integrated memory controller. Regardless, if you notice how I worded it, I never said that the memory was what was holding it back. It is literally the chipset that cannot stand the strain of a 420MHz FSB, up from 266MHz.
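To sketch the FSB/RAM relationship being argued here: the 266 and 420 MHz base clocks come from the post, but the 0.8 divider ratio below is purely illustrative, not something anyone in the thread specified.

```python
# Rough sketch of why a memory divider comes into play when overclocking the
# FSB. DDR2 effective transfer rate = base clock x FSB:RAM ratio x 2 (DDR).
def ddr2_rate(fsb_base_mhz, fsb_to_ram_ratio):
    """Approximate DDR2 effective rate in MT/s."""
    return fsb_base_mhz * fsb_to_ram_ratio * 2

stock = ddr2_rate(266, 1.0)    # ~532, i.e. DDR2-533 at a 1:1 divider
oc_1to1 = ddr2_rate(420, 1.0)  # 840 -- 1:1 at a 420 MHz base would demand DDR2-840 RAM
oc_div = ddr2_rate(420, 0.8)   # 672 -- a lower (illustrative) divider keeps RAM near spec
```

This is why the divider exists at all: it lets the FSB climb while the RAM stays within its rated speed, at the cost the post describes on Intel platforms.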

G80 is the core name for GeForce 8.

HAVOC2k5 said:
I could never mistake that type of attitude. "I'm right, you're all wrong." I'll quote you 7 thousand times and twist your words and post endless graphs and drawings until I make you pee your pants in tears...

It's usually because I AM right, and others ARE wrong. Case in point: Gaara's current argument. Gaara, Nubius, etc. aren't all-knowing beings. I should know, I listened to them while learning. I believed them, but I went outside of this forum and did my own **** learning. Now I can come back and correct them. Something of a "pupil has surpassed the teacher" kind of thing going on. You'll all be better off knowing that they're sorts of radicals.

"Twist your words"? My dear friend, I only quote exactly what others post and reply from there. I have never vandalized a quote.

Prove to me how I'm wrong and I'll stop. :rolleyes:
 
Yoad said:
No, but I know of some:

E6300 - 65nm technology, I think SSE/2/3 but probably not 4 [I think they keep that for the next models], 2x2MB of L2 cache.

E6600 - Shared Cache technology [CPUs connected on-die and not through the FSB anymore], 65nm, SSE4 included [for sure].

And then not many differences between the 6600 and 6700.

X6800 - Unlocked multiplier.

There are some more new techs in the 'Conroe', but those are applied to all of the models; those are the main differences [excluding clock speeds].

All the Core 2 Duos (Conroe) have shared cache. The E6300/E6400 have 2MB of shared cache, and the E6600/E6700 and X6800 have 4MB of shared cache.
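For anyone still after a comparison chart, here is the lineup as a quick sketch. The cache sizes match the correction in this post; the clock speeds and multipliers are added from Intel's 2006 launch specs and are my assumptions, not figures from the thread:

```python
# Sketch of the Conroe lineup. Cache figures are from this post; clock speeds
# and multipliers are assumed from Intel's launch lineup, not from the thread.
conroe_lineup = {
    "E6300": {"shared_l2_mb": 2, "clock_ghz": 1.86, "multiplier": 7},
    "E6400": {"shared_l2_mb": 2, "clock_ghz": 2.13, "multiplier": 8},
    "E6600": {"shared_l2_mb": 4, "clock_ghz": 2.40, "multiplier": 9},
    "E6700": {"shared_l2_mb": 4, "clock_ghz": 2.66, "multiplier": 10},
    "X6800": {"shared_l2_mb": 4, "clock_ghz": 2.93, "multiplier": 11,
              "unlocked": True},
}
# All five run on the same 1066 MHz (quad-pumped 266 MHz) FSB.
```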
 
That's BS and you know it. If the CPU "doesn't really dictate the framerate at all," then it shouldn't matter what kind of CPU series it is.
OK, so you took what I said literally and went out of your way to find something that illustrates your point. Let's examine the scenario a bit more closely:

1) You picked a game using the Source engine, which is probably the most well-optimized engine ever built. If you're looking at an engine that isn't going to tax hardware, and the Source engine is not really taxing at all, you're going to have a much wider framerate spectrum: rather than framerates that vary from 0-60 FPS, the framerate gets as high as 200+. Therefore small 10% discrepancies start looking bigger than they really are, but they're still not very big.

2) The Source engine is also probably the most CPU-dependent engine built, as it has some of the more advanced AI out there and no doubt the most advanced physics available in an engine.
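The framerate-spectrum point in (1) is just percentage arithmetic, sketched here with illustrative numbers (the 10% gap is an example, not a measured figure):

```python
# The same relative gap looks much bigger in absolute FPS at high framerates:
# 10% of 200 FPS is 20 FPS, while 10% of 60 FPS is only 6 FPS.
def absolute_gap_fps(base_fps, gap_percent):
    """Convert a relative framerate gap into absolute FPS."""
    return base_fps * gap_percent / 100

high = absolute_gap_fps(200, 10)  # 20.0 FPS
low = absolute_gap_fps(60, 10)    # 6.0 FPS
```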

Again, I've been trying to answer your questions to clarify my position while you've been dodging mine. Now, once again: where is the bus that's causing the bottleneck, and how does one go about correcting it? You guys insist there's a bottleneck, but you don't even seem to know where it is. I can't exactly prove that you're wrong if you don't even tell me what you're talking about.
 