What You've Just Bought!

I recently modified the BIOS of my X79 board to accept and use newer UEFI-based cards (Pascal, Turing) as well as NVMe M.2 SSDs. When I upgraded from my 3960x in 2016, I only did so because I wanted to move to faster SSDs. Well, I'm going back. Not long ago I did a comparison between my 3700x, 3960x, and 7800x, which all had similar gaming performance; the only advantage the newer two had was AVX2 in benchmarks like 3DMark. I have a Samsung NVMe drive and an RTX 2080 working on the board now, and I plan to upgrade the CPU to a Xeon 1680-V2, an 8-core Ivy Bridge-E processor with an unlocked multiplier that's known to clock to 4.6GHz on an AIO or big air cooler. On my water loop I'm sure I can get 4.8GHz, which is where my 3960x sat 24/7 in my main rig for years. This will give me similar performance to the 7800x while also having 2 more cores. Hence the 32GB kit of 1866 DDR3.

My SB-E 8-core CPUs in my old server have gaming performance on par with first-gen Ryzen CPUs. The one I'll be getting, once overclocked, will more than likely match my 3700x locked at 4.3GHz in gaming, and only be slightly edged out by the 7800x because that chip can run at 5GHz all day long. Mind you, I'm running Super Ultrawide res, so raw CPU performance isn't required so much.
 
I was always told that Xeons were great for servers but crappy for gaming, and that you should go with a Core processor like an i7 or i9. For example, the 50k Mac Pro doesn't game very well, but for a server or for video editing it's great. That's also why no one games on the older Mac Pros.
 
Misinformation, generally. A Xeon is nothing more than a more rigorously binned sibling of the Core line. They aren't regarded as "gaming" chips because they usually have higher core counts and slower base clocks. Anything below the SB-E and IB-E lines will be even slower, even when overclocked, because the architecture is just generally too slow (like in that clapped-out old Mac Pro).
People only say "go Core i7 or i9" for gaming because for years AMD wasn't competitive, and the fanboys want to count that 5 extra FPS at 1080p as a win, when in reality the IPC of Zen 2 is ahead of Intel. If I could clock a Ryzen 3700x at 5GHz, it'd run circles around a 9900k. Clock-for-clock tests with both at 4GHz show this.
Back to Xeons: once you step out of that cloud nine of Intel fanboy misconception and compare things realistically, having a 5GHz clock doesn't matter much at the end of the day. We can compare my work Xeon vs my 3700x running the same card.
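The IPC-versus-clock tradeoff above can be sketched as back-of-envelope arithmetic. This is a rough single-thread model, and the IPC figures below are illustrative assumptions for the sake of the example, not measured benchmark data:

```python
# Rough single-thread throughput model: perf ~ IPC x clock (GHz).
# The IPC values here are illustrative assumptions, not measurements.
def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

# Hypothetical ~5% Zen 2 IPC lead at a 3700x-style 4.3GHz boost,
# versus a 9900k-class chip at 1.0 IPC and a 5GHz clock:
zen2 = relative_perf(1.05, 4.3)
coffee_lake = relative_perf(1.00, 5.0)

print(f"Zen 2 relative to 9900k: {zen2 / coffee_lake:.2f}")
# The ~700MHz clock deficit roughly cancels the IPC advantage,
# which is why the real-world gap ends up small.
```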
https://www.3dmark.com/spy/10397341
https://www.3dmark.com/spy/9655112

The graphics score being so similar means the gaming experience will also be similar. Now compare that to a mildly clocked 3960x from 2011 at 4GHz.
https://www.3dmark.com/spy/12629518

All 3 chips, spanning almost a decade, are going to give a similar experience, especially at 1440p where the machine is more GPU-bound.

That 1680-V2 I talked about is a 7-year-old chip with 8 cores like my 3700x, and I can almost guarantee that under water, clocked like my old 3960x, it's going to match the peak gaming performance of my shiny new Ryzen, even as a Xeon. The kicker: will I notice the difference? No, and most won't either. I built a rig for a friend that's sporting an E3 1240-V5, which is basically an equivalent to a 6700k. If he put a newer GPU in that machine, it'd run about as well as any other rig. When playing the Xeon game, it's all about paying attention to architecture, core count, and boost clocks. Some of them can even be overclocked.

Bottom line: CPU performance hasn't really evolved much in the past 10 years. It was stagnant for most of that time and then became a core-count versus clock-speed race, which doesn't benefit the average gamer much. It's why the 3300x is a slam dunk even as a quad-core with SMT. The funny part is that the guys running a 2600k clocked at 5GHz are still keeping pace with a 2020 chip, without DDR4 or PCI-E 4.

Edit:
After looking at some of these charts it appears I'm in the top 100 for a few in the 3700x category.
Top 20 for Port Royal lol.
https://www.3dmark.com/newsearch#ad...de=false&showInvalidResults=false&freeParams=
 
I simply don't trust AMD because I mostly buy used laptops, and all the cheap laptop makers would put cheap AMD CPUs in them that wouldn't outperform a Commodore 64, resulting in a piece of junk. I'm no gamer, but for everyday performance you will notice the difference between one of those older AMD CPUs and even a processor as crappy as a Pentium or a Core 2 Duo; the Intel ones win every time. The second thing about AMD is they are not as good for virtualization. I've just always seen AMD CPUs as strictly for gaming and nothing else.
 
I can agree on the laptops; I have 2 in my house now (dual-core APUs) and even made a video on them. Thing is, what we've dealt with isn't Zen architecture, and their newer stuff is crushing it in the mobile department. I've been running a 3700x for almost a year straight now, and not only does it perform as well as my older 7800x in gaming, it's been more stable, runs cooler, consumes less power, and obliterates my Intel in productivity.
As for virtualization, the big server farms would say otherwise. The move to Epyc happened for a reason. We're even moving to AMD where I work, starting with the mobile fleet in Q4 of this year.
 
I will forever be loyal to Intel. Intel has a larger share of consumer CPUs because they are frankly better. However, going into 2021/2022, possibly 2023, when Apple starts making their own CPUs, things might change for Intel.
 
Misconception. They're not better; they only have a higher clock speed, which doesn't last past the Thermal Velocity Boost limit. At the end of the day the difference is moot, so don't keep a blindfold on, because a company doesn't benefit you through fanboyism or brand loyalty. Learn to look at the picture as a whole rather than "who's better than whom at this minute." This is especially true if you plan to work in the industry. Remember that Intel only has dominance because AMD made the bad call of rolling with Bulldozer for a decade. AMD got cocky when they released the A64, and now Intel is in the same boat, not expecting AMD to come back as hard as they did. One key thing: look at whatever architecture Jim Keller worked on, and it's almost guaranteed to come out on top. If Intel hadn't sat on their asses and had perfected their 10nm process, Willow Cove would be out now destroying the competition. Complacency in dominance only leads to failure, like it did with AMD 15 years ago and like it is with Intel now.

Apple is only about a 4% share of Intel's market. Their goal is a total walled ecosystem, which won't mean much for the overall direction of the industry. Their chips are good, but they won't do much to take down the x86 dominance of the industry as a whole.
 