Use(less) 8-Core?

Thorax_the_Impaler

Hello everyone!

A while back I saw a thread on here about an AMD 8-core processor. I remember posting that I felt it would be overkill for just about anything it could be used for, since for most modern tasks a large amount of that processing power would go to waste; by that I mean it simply wouldn't be used. I don't entirely agree with that anymore. However, I do still stand by one thing: as of right now it would be unnecessary, and... well, that's really about it.

I understand that years ago people were probably saying the same thing about dual-core processors, but six-core chips aren't really popular yet, and I don't foresee eight cores becoming necessary anytime soon. No computer builder I know stresses the need for a six-core processor, and I don't think the majority of people would need eight cores.

What's your take on this? Personally, I think an eight-core processor is a great technological achievement, but it's highly unnecessary right now.
 
Let's put it this way: the only true 8-core CPUs are in servers, if that puts it into perspective at all.

I have a 3960X and, over the course of six months, ran it at stock, with turbo, at 4.2 GHz, and at 4.5 GHz. Due to problems I am now back on my i5 750 (stock). In general tasks and 99% of gaming there is absolutely no difference. Maybe one or two FPS in games that really lean on faster single-threaded performance.

What does this have to do with your topic? Well, for starters, the FX series isn't a true "8-core". There are 4 modules with two integer clusters (ICs) per module that share resources on the way in, split the thread, process it, then re-integrate; that's the simple version. Contrast that with a single Intel core, which doesn't share any such resources to process a thread. In reality it's a more physical version of Intel's virtual Hyper-Threading (one real core plus one virtual core). In other words, it's like having four people and, say, two one-armed men working: the job gets done, just only slightly faster. Hence the problems with the IMC and single-threaded performance.

Most applications rely heavily on single-threaded, per-core performance because they simply aren't optimized for more than one or two cores, and software patching (depending on the software) splits certain aspects off to other threads. That helps, but it's still the one-armed-man tactic, where most of the load still sits mainly on one or two cores.
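
To make the physical-versus-logical distinction concrete, here's a minimal Python sketch; it uses the third-party psutil package, which is just my pick for the example and not something mentioned above. On a Hyper-Threaded Intel chip the logical count comes out at double the physical count, and those extra "cores" only help if the software actually spawns enough threads.

```python
# Minimal sketch: report logical vs. physical core counts.
# Needs the third-party psutil package (pip install psutil).
import os
import psutil

logical = os.cpu_count()                    # hardware threads the OS schedules on
physical = psutil.cpu_count(logical=False)  # actual physical cores

print(f"Logical cores:  {logical}")
print(f"Physical cores: {physical}")

# On a Hyper-Threaded chip the logical count is double the physical count.
if logical and physical and logical > physical:
    print("Each physical core exposes more than one hardware thread.")
```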

But PP, get to the point. The point is, there simply isn't enough day-to-day software, or enough graphics engines, that fully utilize more than 2-4 cores, making even the one-armed-man (module) strategy pointless. Intel excels at single-threaded performance (cycles per clock, per core), giving its quads the upper hand and making 6- and 8-core chips pointless. The only programs that can really use that kind of processing power won't be run on your typical consumer desktop, which is why the real CPUs with more than 6 cores are left to servers and workstations. I explained in my own article why a fast, cheaper dual core will work perfectly for most if not all games for the next 5 years, give or take.
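
To put a rough number on those diminishing returns, here's a quick Amdahl's law sketch in Python; the 60% parallel fraction is purely an assumed figure for illustration, not a measurement of any particular game or application.

```python
# Back-of-the-envelope Amdahl's law: overall speedup is limited by the part
# of the work that stays single-threaded, no matter how many cores you add.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

P = 0.60  # assumed: 60% of the workload parallelizes (illustrative only)
for cores in (1, 2, 4, 6, 8):
    print(f"{cores} cores -> {amdahl_speedup(P, cores):.2f}x")

# With P = 0.60: 1 -> 1.00x, 2 -> 1.43x, 4 -> 1.82x, 6 -> 2.00x, 8 -> 2.11x.
# Going from 4 to 8 cores buys about 16%, while a faster core speeds up everything.
```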

We are at a point now where software is lagging far behind hardware, and games are even further behind that. We won't see game engines take real advantage of more cores or RAM until the new consoles come out. Even then, it's 50/50 whether that will actually carry over to the PC after the port (CryEngine 3, Frostbite 2, and Unreal Engine excluded). Professional software is taking advantage of more cores and RAM, but we won't see that trickle down to more consumer-oriented products until much later. Those two reasons are why I generally recommend an i3 3220, or even the Ivy Bridge Pentium, for most budget rigs. Also, CUDA and GPGPU acceleration, as they become more widely implemented, will further reduce the need for more cores, since more of the general processing gets done on the GPU, which is much more powerful.

So in other words, I'm with you: they're great if and when we need them, but they're pretty pointless as it stands.

As of right now, getting an SSD into every machine (and driving the $/GB down further) is more of a necessity than more CPU power, because storage is still the slowest part of your machine.
 
Well, let's be honest here: most modern day-to-day software doesn't, and probably never will, require a ridiculous amount of processing power. I like to think the majority of processor-demanding tasks these days come from gaming. And even then, as you said, in some setups the GPU will do more of the general processing, taking a load off the CPU, so to speak. Let's say that in the near future common software will take advantage of more cores and more RAM. Sure, tasks will get done faster, but I think that notion is driven by the desires of ignorant computer users who find comfort in the idea that their machine will have headroom to do useless things without affecting performance.

As an example, I strongly believe single-core setups are horrendously outdated, but with the right OS they work fine for people who just surf and stream YouTube. Hell, I had an old Dell Dimension desktop with a 2.8 GHz single-core processor and 1 GB of RAM running YLMF OS that I gave to a friend, and it works just fine for his streaming and such. I'll grant, however, that single-core technology is falling further behind every day, and someday soon it will be unusable.

I feel the bottom line is that a processor with eight decently clocked physical cores will not be needed for years. In consumer PCs the potential of six-core processors hasn't even been scratched yet, and no consumer OS I know of requires more than 2 GB of RAM and two cores of processing power. You are completely right that software is falling behind the advancement of modern hardware, but I firmly believe the lack of everyday uses for that hardware is what stuck its leg out and tripped it.
 
I would have to agree with this as well. Software just can't catch up.

Btw, is it true that AMD FX processors are not true 6- or 8-core processors? That turns me off a bit. I guess performance really comes down to the processor's architecture these days, not the number of cores. Same goes for GHz speeds.
 
I agree and disagree. I agree on the lack of software, as I mentioned myself, and on the ignorant users who feel they need 8 cores for future-proofing when in 5 years their GPU will be obsolete anyway. What I disagree with is the notion that single cores are still relevant for people who do basic tasks. I'd also say that everyday software has really only been held back by the mechanical HDD. Advancement in technology has always been driven by competition, and there's a serious lack of that as well.

Yes, it's true. Google the architecture of the AMD FX module: a 4-core is 2 modules, and an 8-core is just 4 modules. I explained this in simple terms in my last post, along with why AMD is lagging behind Intel. They use clock speed to compensate, which then increases TDP and heat, which people don't want.

I have a feeling that when software catches up, all of us will be playing the hardware catch-up game. It's a cycle I've been watching: software gets updated to support the newer features of advancing hardware, and then we want better hardware to further increase speed and productivity.
 
Gotcha. I'm still good for the future, then.
 
I wouldn't say so. The new consoles are still quad-based. Games will use up to 4 cores natively, and anything beyond that will be about the way it is now. Cycles per clock will always have the advantage over more cores plus GHz. Where you are still good for the future is that a modern CPU of today will keep handling future games fine, much like my old 6000+ does. The only thing we'll really need to concentrate on upgrading is the GPU, year to year.
 
I expected that for the GPU, and I'll always keep it in mind.
 
In all honesty, it's nice if you think of it this way: I would rather have to upgrade just my GPU to keep relative performance than the whole PC. A couple hundred every year on a GPU to stay up to date is about what somebody who eats a lot of fast food spends. I know that if my best friend put his fast-food fund into a GPU fund, he'd be able to afford a few 690s at the end of the year.

Since I was bored, I did the math. If you spent 5 bucks on lunch at work every day and put that into a GPU fund instead, you would have 1,680 bucks to spend on your PC.
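
For what it's worth, here's that back-of-the-envelope math as a tiny Python snippet; the 336 lunch days per year is just the assumption implied by the $1,680 figure, and a strict 5-day work week would land closer to $1,300.

```python
# Quick GPU-fund arithmetic (illustrative numbers only).
LUNCH_COST = 5      # dollars per skipped lunch
LUNCH_DAYS = 336    # assumed lunch days per year; this is what $1,680 implies

fund = LUNCH_COST * LUNCH_DAYS
print(f"${fund} toward the GPU fund")  # $1680

# With a strict 5-day work week (~260 workdays) it would be about $1,300 instead.
```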
 
I have to say, upgrading one part of your machine every year rather than the whole thing is a nice concept. Personally, I'd rather buy a GPU every year or so than buy a whole new setup. Though I've never met someone who eats fast food every day for lunch. lol
 