Post your rigs here.

BTW, upgrading simply due to the love of computers is a good reason as long as it does not hurt the budget and life expenses. It's better than committing a crime, right?

If I'm upgrading, I'll do a core swap (CPU/mobo/sometimes RAM if needed), but unless something dies or I want something better (i.e. better cooling, a bigger HDD, etc.), I keep everything else.

I might look at the new Nvidia GPUs this time around, but unless they're significantly better than my 1080 Ti and the price tag doesn't approach a new car, I won't bother...

I thought you were gonna say something like GTX 970 or AMD RX 580.

If you're a 60 FPS gamer like myself, the 1080 Ti can hold its own for a long time. Going above high-quality grafix settings is hardly noticeable in gameplay. The 1080 Ti now supports ray tracing too (it has for a while), but whether you can notice it in gameplay compared to equivalent traditional rendering like rasterized reflections is also debatable. That only leaves one thing: DLSS, which, if done well, could make 1080p look almost like 4K.

Personally I wouldn't replace even a GTX 980 (non-Ti). I had to replace my GTX 680 because it couldn't hold its own anywhere near 1080p60 anymore, and on the 60" TV I use I'm not willing to play at 720p any longer. Try lowering the grafix a little. Some settings improve performance drastically without losing worthwhile quality.

CPU bottlenecking, or because of resolution in a CPU-demanding game? When I tested my own CPUs against each other I put my 2500k in there, and only one game dipped into the high 40s because I was benching everything at Ultra 1440p with a single RTX 2080. More than likely it just needed to be overclocked.

I upgrade what I need and only what I need, and usually it's a want rather than a pure need. For instance, if I move to the Ryzen 4000 series this winter I won't be upgrading my RAM or motherboard unless the Infinity Fabric ratio can handle past 1800 MHz at 1:1 (aka I won't go above 3600 unless I can run 3800 or 4000 at 1:1). I've had the same PSU since 2009, and I intend to upgrade it, but only to something I can keep another 10 years while this one moves down to another rig.
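
For anyone following the 1:1 talk, here's a rough back-of-the-envelope sketch of that arithmetic (assuming the usual convention that FCLK at a 1:1 ratio is half the DDR transfer rate; the speeds are just the ones mentioned above):

```python
# Rough sketch of the Infinity Fabric 1:1 arithmetic mentioned above.
# Assumption: at a 1:1 ratio, FCLK (MHz) = DDR transfer rate (MT/s) / 2.

def fclk_for_1to1(ddr_rate_mts: int) -> float:
    """FCLK needed to run a given DDR speed at a 1:1 ratio."""
    return ddr_rate_mts / 2

for ddr in (3600, 3800, 4000):
    fclk = fclk_for_1to1(ddr)
    note = "past the usual 1800 MHz" if fclk > 1800 else "within the usual 1800 MHz"
    print(f"DDR4-{ddr}: needs FCLK {fclk:.0f} MHz for 1:1 ({note})")
```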

Isn't "because of resolution in a CPU demanding game" basically CPU bottlenecking too? If a CPU is the bottleneck at any resolution, it will be so at any resolution and any setting that does not concern logic, I believe. I did OC the CPU to 4.4GHz from stock 3.4GHz but it did not help much. Newer games have more logical processing that affects FPS and older CPU's have slower IPC and less cores so over clocking won't help much. I'm talking CPU +7-8 years old. I believe 3-4 year-old CPU's are still good to go for 60 FPS. CPU's have tiers, true, but I'm talking gaming affordable tiers like the higher models of i5. But to be clear, and although CPU bottlenecking is not just this, the replaced CPU did jump near 99% so it's a clear culprit that it held GPU at usages like 50-60%. It could have been the RAM too, but my mobo can't support more than 2800MHz (new RAM is 4000MHz). I did study the case carefully. Many emulated games, Assassin's Creed Origins NFS Shift and some others caused this problem.

I might have calculated wrong, but I did my best in testing and asking around. Kudos for the efforts, right?
 
60?? Nope, most of the time I run uncapped FPS at 144Hz. I also have a 4K monitor that I sometimes play on, so that one is limited to 60Hz, but I still play with the best settings. Better performance does not always mean maxing everything out. But why limit myself when my current hardware can and will work hard while I play hard? :p

... and besides *IF* I do replace this GPU, I'll most likely pass it down to the kid, just like I did when I gave him the 980ti.
 
144Hz? Now that's the true PC master race! I'm just a 60Hz peasant and my RTX 2020 will have to last for at least 5 more years ;)
 
My monitors, including my new laptop's, are all 1080p. The biggest reason for building a new computer is some of the software I'll need for astrophotography processing.
 
@TechnoChicken Oh, that! Words like graphics, though, through, tough and rough I type as grafix, tho, thru, tuf and ruf in unofficial contexts to save time and effort. There was even a console officially named the TurboGrafx-16, where "Grafx" refers to graphics. But much obliged, really!

Say, are you 2K-and-above resolution fellas really aware of the difference compared to 1080p with edge-softening techniques? I also hear of people using those high resolutions on smaller monitors like 32". Or do you guys PC-game on 42"+ screens? I'm confused. I'm not aware of the difference other than in a few cases where I see aliasing on some far objects in games without good anti-aliasing, and that's on the 60" TV I'm using, mind you. Otherwise the image looks sharp and clean to me. Higher resolutions came out to keep up with increasing screen sizes, after all.
 
I am running three 24.5" 1080p 165Hz monitors (these) mounted on a 3+1 stand (here). I have an old NEC MultiSync LCD1970GX on the top mount that I use for desktop widgets and secondary/monitoring apps. My RX 5700 XT is enough to keep them busy.
 
Isn't "because of resolution in a CPU demanding game" basically CPU bottlenecking too? If a CPU is the bottleneck at any resolution, it will be so at any resolution and any setting that does not concern logic, I believe. I did OC the CPU to 4.4GHz from stock 3.4GHz but it did not help much. Newer games have more logical processing that affects FPS and older CPU's have slower IPC and less cores so over clocking won't help much. I'm talking CPU +7-8 years old. I believe 3-4 year-old CPU's are still good to go for 60 FPS. CPU's have tiers, true, but I'm talking gaming affordable tiers like the higher models of i5. But to be clear, and although CPU bottlenecking is not just this, the replaced CPU did jump near 99% so it's a clear culprit that it held GPU at usages like 50-60%. It could have been the RAM too, but my mobo can't support more than 2800MHz (new RAM is 4000MHz). I did study the case carefully. Many emulated games, Assassin's Creed Origins NFS Shift and some others caused this problem.

I might have calculated wrong, but I did my best in testing and asking around. Kudos for the efforts, right?
Yes and no. If your GPU is sitting at over 85% usage at said resolution, it's not a CPU bottleneck regardless. If the GPU is the bottleneck at a higher resolution, the CPU isn't the bottleneck anymore. That being said, I know how old that CPU is, and my 2500k is older: it's almost 11 years old and from the architecture prior to Ivy Bridge, which the 3570k is from. AssCreed Origins and Odyssey run terribly in general, so those aren't very good comparisons, and Shift, hell, I played that on a 940BE lol.

Take note, Intel hasn't changed architecture since Skylake, which is 5 years old, and an overclocked 2600k (Sandy Bridge) will game just like a 6700k. The bump you're noticing is pure clock speed and 2 more cores. I'm never one to say more power isn't better, especially if you've got the money to do the upgrade (and aren't a kid looking for a max-all-games streaming rig for 600 bucks); I'm only discussing it because I know that CPU would run things fine, much like my 2500k did. The only titles I saw a CPU bottleneck in were Modern Warfare and Tomb Raider. If the i5s had HT it wouldn't be an issue; that's why 6700k and 7700k CPUs still run fine today. The majority of the issue isn't so much the game using more cores, but more Windows and background tasks.
Here are some numbers, but I didn't complete the testing (I did Tomb Raider and CoD + synthetics but I moved before completing):
2500k stock

Metro Exodus (Extreme) - Avg 48.10 - Max 93.62 - Min 5.65
Far Cry 5 (Ultra) - Avg 87 - Max 109 - Min 66
Wildlands (Ultra) - Avg 56.23 - Max 62.07 - Min 37.38

3960x @ 4GHz

Metro Exodus (Extreme) - Avg 46.13 - Max 90.93 - Min 6.15
Far Cry 5 (Ultra) - Avg 87 - Max 117 - Min 68
Wildlands (Ultra preset) - 58.84

3700x @ 4.2GHz all-core

Metro Exodus (Extreme) - Avg 38.67 - Max 87.15 - Min 7.14
Far Cry 5 (Ultra) - Avg 107 - Max 120 - Min 92
Wildlands (Ultra) - Avg 59.58 - Max 67.33 - Min 49.85
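
As a rough sketch of the usage heuristic above (using the >85% GPU and ~99% CPU figures quoted in this thread, not universal thresholds), here's roughly what I mean:

```python
# Rough sketch of the usage heuristic discussed above, not anyone's actual tool:
# a GPU staying above ~85% usage means the CPU isn't the limiter; a GPU stuck
# around 50-60% while the CPU is pegged near 99% points at a CPU bottleneck.

def classify_bottleneck(avg_cpu_usage: float, avg_gpu_usage: float) -> str:
    """Classify a logged gaming session from average CPU/GPU utilization (0-100)."""
    if avg_gpu_usage >= 85:
        return "GPU-bound (or well balanced) - CPU is not the bottleneck"
    if avg_cpu_usage >= 95 and avg_gpu_usage <= 60:
        return "CPU bottleneck - GPU is being starved"
    return "inconclusive - check per-core usage, RAM speed, or frame-time logs"

# Example with the figures quoted earlier in the thread:
print(classify_bottleneck(avg_cpu_usage=99, avg_gpu_usage=55))  # CPU bottleneck
print(classify_bottleneck(avg_cpu_usage=70, avg_gpu_usage=97))  # GPU-bound
```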


144Hz? Now that's the true PC master race! I'm just a 60Hz peasant and my RTX 2020 will have to last for at least 5 more years ;)
I've been at 144Hz for 3 years now, about to jump to 240Hz.

Going above high-quality grafix settings is hardly noticeable in gameplay. The 1080 Ti now supports ray tracing too (it has for a while), but whether you can notice it in gameplay compared to equivalent traditional rendering like rasterized reflections is also debatable.
Whoa there, cowboy. I can tell textural detail differences and lighting differences between graphics modes in a lot of games. I can also tell the difference between prebaked lighting and dynamic lighting, like what people are complaining about in Halo Infinite. When ray tracing is implemented correctly there's also a big difference, like in Metro Exodus, where GI drastically changes the environment as you move. Little stuff like the BF5 reflections was just stupid and a horrible way to introduce "the next big thing". Basically Metro, Control, and Deliver Us The Moon are the only good titles showcasing this right now; the shadows in Modern Warfare were almost unnoticeable. Pascal does have support, but it reduces performance far more than on RTX cards and doesn't support DLSS to give that performance back. DLSS, even 2.0, is still trash to me. The performance uplift is great, but the reduction in quality along with the pop-in is still terrible. I'll give it another iteration and see how that is later.


Say, are you 2K-and-above resolution fellas really aware of the difference compared to 1080p with edge-softening techniques? I also hear of people using those high resolutions on smaller monitors like 32". Or do you guys PC-game on 42"+ screens? I'm confused. I'm not aware of the difference other than in a few cases where I see aliasing on some far objects in games without good anti-aliasing, and that's on the 60" TV I'm using, mind you. Otherwise the image looks sharp and clean to me. Higher resolutions came out to keep up with increasing screen sizes, after all.
Most of us are using regular PC monitors, like my 27" regular and super ultrawide. When you get into bigger screens and longer viewing distances, pixel density becomes less of an issue. To me, 1080p on anything, regardless of viewing distance or screen size, is just too blurry. Mind you, I have a 4K 28" monitor too.
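
If you want numbers for the pixel-density point, here's a quick back-of-the-envelope calculation using the screen sizes mentioned in this thread (16:9 panels assumed; standard diagonal-pixels-over-diagonal-inches PPI formula):

```python
import math

# Back-of-the-envelope pixel density for the screens mentioned in the thread.
# Assumes 16:9 panels; PPI = diagonal resolution in pixels / diagonal size in inches.

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

screens = [
    ("1080p @ 60\" TV", 1920, 1080, 60),
    ("1080p @ 27\" monitor", 1920, 1080, 27),
    ("1440p @ 27\" monitor", 2560, 1440, 27),
    ("4K @ 28\" monitor", 3840, 2160, 28),
]

for name, w, h, d in screens:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
# Roughly 37, 82, 109, and 157 PPI - which is why 1440p/4K still looks sharper
# on a desktop monitor at arm's length than 1080p does, even on a smaller panel.
```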
 