Random Chit Chat

This is a heavily OC'd 2500K @ 4.5GHz with 16GB 1866, a 240GB HyperX PCI-E SSD (pre-NVMe), and a pair of Titan Xp's (not in SLI, it's my other mining machine). Playing at 1440p it games great, like you'd expect from a card that sits above a 1080 Ti but below a 2080. With my usual main rig settings in CoD MW I get between 90 and 100fps. Halo about 150fps. RL pegged at 250. BF4-V 90-120fps. Problem is, in regular tasks within Windows you can definitely tell where it's aged compared to the newer stuff. When you go from buttery smooth transitions and having multiple things open without the CPU constantly pegged at 100%, to chugging when opening a browser or waiting on the start menu, it gets noticeable. In games that are more CPU bound (less FPS, more RPG/RTS) it gets more apparent on the gaming side too. The average framerates in the FPS games look fantastic on a sheet of numbers, but the frametimes spiking because CPU usage is so high lessens the experience with stutters. This is why I said native quad, because I feel if I had an i7 2600K or 2700K it wouldn't be such an issue.
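Just to put numbers on the "great averages, awful frametimes" point: here's a quick back-of-the-envelope sketch (made-up frametimes, not measured on either rig) of how a handful of CPU-bound spikes barely move the average FPS while tanking the 1% lows, which is the part you actually feel as stutter.

```python
# Hypothetical frame times (ms) for ~1 second of play: mostly ~8 ms (125 FPS)
# with three CPU-bound spikes mixed in. Numbers are illustrative only.
frametimes_ms = [8.0] * 117 + [45.0, 50.0, 60.0]

# Average FPS over the whole run.
avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# "1% low" FPS: average of the slowest 1% of frames, a common stutter metric.
slowest = sorted(frametimes_ms, reverse=True)
worst_1pct = slowest[:max(1, len(slowest) // 100)]
low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))

print(f"average FPS: {avg_fps:.0f}")    # ~110 FPS, looks great on a benchmark sheet
print(f"1% low FPS : {low_1pct_fps:.0f}")  # ~17 FPS, which is the stutter you feel
```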
Interesting! My other native non-threaded quad, the 3570K, only showed its age in some games regardless of the OC. Tried 4.5GHz, but somehow the same settings refused to stay stable after a couple of years, so I went down to 4GHz and noticed no difference in what mattered at the time, i.e. gaming. Only some later AAA games had trouble with it, like ME: Andromeda and Borderlands 3. Dark Souls 3 bottlenecked in only two spots the whole game. Average FPS was always good; it was the minimums that suffered. I game at 60FPS so the drops were not frequent, and I coped with it for some time.

The other interesting part is that in regular tasks I never had any problems with the SATA SSD. I open tons of browser tabs and windows full of photo and video thumbnails with no problems at all. Delays only happen with heavier everyday tasks, like navigating around the computer while file compression or media encoding is running.

I had no plans on upgrading back then, until games started behaving like that, which is when I SLI'ed my GTX 680 4GB (it was a bad experience, but enough to give clear closure since the 2nd card was second-hand and defective).

I only upgraded this 3570K to a 10600K (big jump) in mid 2020, alongside an RTX 2060. For a 1080p 60FPS player with DLSS, this rig will hold up for ages to come. Not many games intrigue me, so I don't expect much pressure on it.
 
Thank you PP for your condolences. I think so as well. I believe all vaccines are useless and that natural immunity is the reason most other viruses disappeared as well. That's just my opinion, and my observations are what influenced it.
Vaccines aren't useless. We don't have smallpox, measles, polio etc. for a reason. It's just the difference between a vaccine intended to properly eradicate an issue vs ones designed to make money. The covid vac is an mRNA vaccine and it does give you antibodies towards the virus. The problem being, the virus mutates and then that doesn't do jack all. You can get it and still spread it. You are not given a live version to actually help with immunity. That doesn't mean all live vaccinations are good either, just look at the flu vaccine. Covid in this country is being used to replace the flu because it makes more money, is good at controlling people, and because it's "free" you "have no reason not to get it". Since it doesn't actually get rid of it, it then becomes a revolving door of "more shots" and "boosters". It's the perfect scheme, because the flu was failing hard as a money maker since everybody had been questioning the validity of the flu shot for years.
 
I could feel the difference going from stock to 4.5, it was massive. When I'm at the hotel I'll be using a 60Hz screen, so I tested uncapped and capped. The frametime spikes were still there due to the CPU usage, but they weren't as prevalent as when running at 240Hz uncapped. I know for sure it's the CPU because my boy is running my old 3960X and I also have him at 4.5GHz. No frametime spikes in any games, even ones that heavily use the CPU, because it's a 6 core.
It's not so much the SSD but the combination of it all really. He's not bottlenecked by the CPU in games but is still running a SATA SSD. Things are noticeably slower for regular tasks comparing my machine to his. Since he doesn't know any better it doesn't affect him, but I do, so when I get on his machine to install or fix something I can notice the sluggishness. Faster RAM, faster caching, and NVMe SSDs do play a big part in this, as his sister's machine with an NVMe SSD and 16GB DDR4, but only a Pentium G4560, isn't as slow within Windows. Hell, my laptop is no slouch (64GB DDR4-2666, a 6-core 8th-gen Xeon that boosts to 4.1GHz, and a Quadro P2000) and I can still feel the difference between it and my main rig in terms of responsiveness. It's not as big of a difference compared to the 2500K machine though. I'm really curious to see if 2133 RAM and an NVMe drive would close the gap, but it's kinda a waste of money for an experiment lol.
I kinda miss SLI, but damn was it pointless.
 
I remember having 780Ti SLI. My FPS was higher, but so was the stutter. Oh boy, the stutter. Ended up disabling it for most games. On the plus side it did a great job at heating my room, had to turn the radiators off. These days I doubt I could be bothered with SLI even if it were still a thing. I just want a GPU that plays every game at 4K 120FPS with high ray tracing.

I played the Halo Infinite beta earlier and it maxes out my CPU so hard. But it seems to do that for everybody, even those with the latest and fastest Intel CPUs. Something's broken. Because of the CPU issues it feels horrible and stutters so bad you can barely play the game properly.
 
For a lot of games micro-stutter was directly related to CPU bottlenecking. I didn't understand this until much, much later in the game, but by then there were only two more gens of SLI before they killed it for good. DX12 did a good job of that too.
 