You know me, I typically do the same thing, but I went AMD in 2018 for a reason, and that was heat. My 7800X, with a cache OC to 8700K-level performance plus a core OC, drew around 320W (hence the old dual 360 setup and monoblock). Since I was lucky a decade ago and managed to get that 3960X, I wanted to stay HEDT "just because," but when they went mesh instead of ring it reduced gaming performance. So AMD gave me more cores for cheaper, better performance, and significantly less heat output from my CPU. Thursday last week I did a mild all-core OC of 5GHz P-cores and 4GHz E-cores, and just running a few Premiere Pro renders shot power usage over 200W, hitting 85°C. Decided it wasn't worth it and dropped it back.
I'm not sure why, but I've done "heat monster" SLI setups, and the room heat soak to me is never as bad as with CPU heat output. Scientifically, that makes zero sense, but with a 450W card my room doesn't get that warm unless my wife turns on her PC to game with me. That 7800X gets going and I start sweating with the AC on and the ceiling fan on. Again, makes zero literal sense, just my observation (all while she's under a blanket saying it's cold lol).