Time for CPU upgrade - Going back to AMD

It matters quite a bit in games, since single-threaded performance is the biggest CPU factor in gaming. Something like the i5 4440, which is the same price as the 8350, will run circles around it while consuming less power, and it has no need to overclock to keep up. Intel platforms also have more in the way of features, like native USB 3.0 ports, or more noticeably PCI-E 3.0, which is becoming a bigger deal now that things like storage are going PCI-E.

Thanks for the response. I do have USB 3.0 ports, but the PCI-E 2.0 limit was a small concern of mine. Looks like that 4440 chip is a bit outdated and a little more expensive, so maybe less future-proof, but it does outperform in the games I play. All the benchmarks I see show a stock 8350 hanging within a few fps of the highest-end Intel products, and I plan to overclock as far as she will go, as usual. Aren't games going to take better advantage of more cores as time goes on?

Anywho, it's up and running without a reinstall of Windows, surprisingly, and man did it wake my 660 Ti up! I can finally play BF4 on Ultra with probably a 50fps average (it dips to the 40s at the lowest I saw, but mostly stays around 60). I'll have to do a proper benchmark here soon.
 
The i5 4440 is 4th gen, which is current, and usually only about 10 bucks more expensive, while something like an H81 board will only run you about 50 bucks. Overall, getting the same features from an AMD setup takes a board that costs about 100 bucks or more, still without PCI-E 3.0. PCI-E 3.0 on the graphics card front isn't really necessary (in most situations), but everything is going PCI-E, which means you'll need more bandwidth to keep up. For instance, my 955BE plays games fine (so does my 940BE) but lacks the bandwidth to use my M.2 SSD, which is a bummer because I'd really like to use its speed during these tests I'm doing.
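To put rough numbers on the bandwidth point (these are the standard per-lane spec figures, not something from this thread), here's a quick sketch of what a 4-lane M.2 drive can pull on each generation:

```python
# Rough PCI-E bandwidth math (approximate spec figures after encoding overhead).
PCIE2_PER_LANE_MBS = 500   # PCI-E 2.0, ~500 MB/s per lane
PCIE3_PER_LANE_MBS = 985   # PCI-E 3.0, ~985 MB/s per lane
lanes = 4                  # a typical M.2 NVMe slot uses 4 lanes

print(f"PCI-E 2.0 x{lanes}: ~{PCIE2_PER_LANE_MBS * lanes} MB/s")  # ~2000 MB/s ceiling
print(f"PCI-E 3.0 x{lanes}: ~{PCIE3_PER_LANE_MBS * lanes} MB/s")  # ~3940 MB/s ceiling
```

A fast M.2 drive can run into that ~2 GB/s ceiling on a PCI-E 2.0 platform, which is exactly the bottleneck described above.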

On the other hand, the FX8350 is 3 years old, and the AM3+ platform is exceptionally dated. USB 3.0 was added functionality brought in by add-in chips, which add latency and driver headaches. Admittedly, the FX8350 and the better-bargain 8320 do quite fine in games; I've never claimed otherwise anywhere (for future reference). The problem is you're already talking about overclocking just to keep up. You get faster performance with the Intel at half the power consumption, with less heat, and without needing an aftermarket cooler to overclock. This has always been the flaw in the AMD logic: to push that thing anywhere near the same IPC, you're looking at spending at least 60 bucks on a cooler that can handle the TDP output of a decent OC. Now you're over the cost of the Intel platform to get less performance in game. That doesn't make much sense, does it? How about this: a 2014 review showing the performance difference, not the cherry-picked 8350 vs 3570k benchmarks that most people show.
Intel Core i5-4690K Review - Gaming Performance | bit-tech.net
Discrete GPU Gaming - Devil’s Canyon Review: Intel Core i7-4790K and i5-4690K

Shows a bit more of a difference, doesn't it? Not only that, but the Intel platform has better upgrade options. The 8350 is the end of the line.

As for games "taking better advantage of more cores," I'm assuming you're getting at DirectX 12. That's a clear no. First things first: developers have to build the game natively on an engine developed around the DX12_1 feature set to properly utilize the API. "Using more of the CPU," as it's vaguely put, really means the API takes overhead off the CPU and gives a more "bare metal" interface with the GPU (this idea has been around for a couple of years now and not really utilized anyway). That leaves more room on the CPU for things like AI. So far the only implementation I've seen properly executing this (and the only thing really shown online, at that) is simply sending more NPCs after the character for a fuller "experience," making the screen seem less empty. Games developed with this in mind won't be coming out until 2018 or so. We're looking at slow Windows 10 adoption rates and slow DX12 engine adoption, and games (or engines, for that matter) have never been quick to utilize newer tech properly. So don't expect to see this properly implemented for a while yet. Bottom line: you won't see significant increases in performance from a game merely patched for DX12 use.
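If it helps, here's a toy model of that "less overhead" idea (just illustrative Python with made-up per-call costs, not actual D3D12 code): in the old model one thread pays a heavy driver cost per draw call, while the DX12-style model lets several threads record cheap command lists, leaving CPU time free for things like AI:

```python
# Toy model of CPU-side draw-call overhead. All costs are assumed/illustrative.
DRAW_CALLS = 10_000
OLD_API_COST_US = 20  # assumed driver overhead per draw call, microseconds
DX12_COST_US = 4      # assumed command-list recording cost per call
THREADS = 4           # cores recording command lists in parallel

old_cpu_ms = DRAW_CALLS * OLD_API_COST_US / 1000
dx12_cpu_ms = DRAW_CALLS * DX12_COST_US / 1000 / THREADS

print(f"DX11-style, one thread:  ~{old_cpu_ms:.0f} ms of CPU per frame's draw calls")
print(f"DX12-style, {THREADS} threads: ~{dx12_cpu_ms:.0f} ms, the rest is free for AI, etc.")
```

The point being: the win is freed-up CPU time, not the game magically "using all eight cores" of an FX chip.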
 
The i5 4440 was $30 more after I get the rebate that was offered, but it has a locked multiplier, and I can overclock past its single-core performance with the AMD. The motherboards available were roughly the same price. Looks like it depends greatly on the game and resolution. I plan to go with a 4k monitor as well this year, and the AMD looked to outperform in that arena, even at stock speed. Is the Intel chip in this benchmark any good?

AMD FX-8350 powering GTX 780 SLI vs GTX 980 SLI at 4K - TweakTown's Tweakipedia
 
No you can't, because an overclock doesn't overcome the latency inherent in AMD's module design. Each module is essentially 2 gimped cores sharing the same resources, and that latency is what causes the lack of IPC. You'd have to be cranking past 5GHz to see similar figures at 1080p; more on resolution later. (Before an argument ensues: I have actually tested an 8320 under water with that same Gigabyte board against my own 3960x, and it took the 8320 a 4.7GHz clock to achieve similar numbers to my stock 3960x, which uses a 3.9GHz turbo, running matching cards.)
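For what it's worth, those two clocks imply a rough per-clock ratio (back-of-the-envelope math using only the numbers above; real IPC varies by workload, and the matching-clock helper here is purely hypothetical):

```python
# Back-of-the-envelope IPC comparison from the clocks quoted above:
# an 8320 @ 4.7 GHz roughly matched a 3960x @ 3.9 GHz turbo.
amd_clock_ghz = 4.7
intel_clock_ghz = 3.9

ipc_ratio = amd_clock_ghz / intel_clock_ghz
print(f"Implied Intel per-clock advantage: ~{ipc_ratio:.2f}x")  # ~1.21x

def fx_clock_to_match(intel_ghz, ratio=ipc_ratio):
    """Hypothetical helper: FX clock needed to match a given Intel clock."""
    return intel_ghz * ratio

print(f"To match an Intel chip at 4.4 GHz: ~{fx_clock_to_match(4.4):.1f} GHz on the FX")
```

Which lines up with the "cranking past 5GHz" point: matching a decent Intel overclock would take an FX clock the chip realistically can't sustain.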

As to the board and the i5 4440: an Asus H81 is 54 bucks, much less than the Gigabyte board you purchased. That makes up for the roughly $20 difference between the 4440 and the 8350. The i5 4440 is locked, sure, but it doesn't need to be overclocked at all. It doesn't take an expensive Intel board to get more features either, unless you want to go SLI or Crossfire, which wouldn't make much sense considering the whole point of the 8350 is budget, right? The rough totals are tallied below.
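Tallying the prices quoted in this thread (exact numbers vary by retailer and rebate; the $60 cooler figure is from my earlier post):

```python
# Platform cost comparison using the prices quoted in this thread.
intel_cpu_premium = 20  # 4440 over the 8350 (the OP saw ~$30 after rebate)
intel_board = 54        # Asus H81
amd_board = 100         # AM3+ board with comparable features, per the earlier post
oc_cooler = 60          # aftermarket cooler needed for a decent FX overclock

intel_extra = intel_cpu_premium + intel_board
amd_extra = amd_board + oc_cooler

print(f"Intel route (CPU premium + H81 board): ${intel_extra}")  # $74
print(f"AMD route (board + cooler for the OC): ${amd_extra}")    # $160
```

That's the "over the cost of the Intel platform" point from earlier, in dollar terms.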

That brings me back to resolution and the article you linked. Firstly, the higher the resolution, the more GPU-dependent frame rates are. Thing is, 4k in itself requires SLI or Crossfire just to get decent frames at medium-to-high settings on newer games, as indicated by the article you linked. 980 SLI isn't cheap, and I can tell you that on certain games 290x (which means 390x as well) can't handle the pressure. I have Titan X SLI and I have a hard time keeping my frames high in certain games at only 1440p on maxed settings; 4k is a whole different ball game, as the pixel math below shows. If you plan to get a 4k monitor just to play at a lower resolution, then you're essentially wasting your money. My simple advice: unless you do creative work that needs the bigger real estate 4k offers, don't hop on the early-adopter bandwagon.
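Here's the pixel math (standard resolution figures, nothing specific to this thread):

```python
# Pixel counts behind the resolution jump (standard figures).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x 1080p)")
# 4k pushes 4x the pixels of 1080p per frame, and 2.25x the pixels of 1440p.
```

Every one of those extra pixels has to be shaded every frame, which is why cards that are comfortable at 1440p fall over at 4k.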
Back to performance: the links I put in my last post give a more realistic representation of the difference between AMD and Intel at the mainstream resolution (1080p), which shows that something like the 4440 will have a very clear advantage. And while the articles I debunked earlier compared the 8350 to the Ivy Bridge 3570k, the 4690k in that review is a Haswell (Devil's Canyon) chip, which pushes the IPC gap that much further.

All in all, what's purchased is done, so at this point it doesn't matter much. Everything I've said here I've basically outlined in my article. What you have will do what you want; it's just that the "AMD is the cheaper route" justification most people use is moot now that lower-end i5s cost about the same and don't need to be overclocked to achieve the performance that's needed. Not to mention the cooling requirements, which also cost extra.
 
Okay, that makes a lot of sense; thanks for the information and advice. I've been out of the loop too long! Let me just say I believe you when you say I could have done better; I'm just picking your brain here.

That said, isn't that H81 motherboard micro-ATX with no option for SLI? That wouldn't have worked for me.

This was a fed-up-with-performance impulse buy, but I plan to make my first real investment in the 4k and GPU area this year. I never really wanted to spend money previously because any extra dough always goes to the fun car. But it's finally where I want it, so as long as it doesn't break I can have some fun dumping money into the gaming computer. 4k with high-end GPUs in SLI is my ideal setup. I didn't think I would need to go nuts on the CPU to make that work well, which the benchmarks I linked seem to show.
 
Well, as the benchmarks also show, high-end SLI still can't push 4k properly. Not even my Titan Xs. Actually, I combined my cards with my best friend's, and four Titan Xs on a 4.8GHz 4790k still couldn't properly push 4k. You'll need to wait until at least next year when Pascal comes out. I believe it'll actually take until 2017 or 2018, when Volta* releases, to properly push 4k with high settings and good frames. By then 4k prices will definitely have dropped substantially and it won't be such a waste of cash. Until then you can get a very nice 144Hz 1440p monitor, which a couple of 980s would run fine.

Seems you did the opposite I did. PC first, now car is getting money.

*Pascal and Volta are upcoming codenamed chips from Nvidia.
 
Just wanted to mention how ironic the timing is: AMD themselves used a 4790k in their Quantum project box. Just goes to show those FX chips aren't powerful enough.
 
Good news! My CPU cooler from the Q6600 fits the new CPU! It gave me a decent overclock on the Intel chip and kept things cool for years. Now it's time to see what it can do with the power-hungry AMD. So far it idles at 15C, and Prime95 small FFTs get it up to 49C.

Glad I found the AMD adapter for the cooler; I almost just bought a new one. No money wasted there, so more for the GPU budget.
 