Potentially the longest thread in history...

I haven't seen anything on it, but tbh I haven't paid attention to much of anything because of Cyberpunk. Even if they did, it doesn't really matter. Their true colors were shown through their email to this fellow, and it paints the picture I was painting 2 years ago. They were trying to say "it's a thing" when in actuality they were doing nothing more than forcing it to be a thing so they could get a shit ton more cash off their cards through mere marketing. It's been 2 years and there are 21 games that support it. Only 7 of those make any real difference, and nowhere near enough to justify the performance cost. It also shows very clearly that they want influencers to do nothing but push their agenda, no matter how hard they backpedal on this. Jay's point will stand indefinitely: games only have RTX right now because Nvidia paid them to add it.

Even then, it's still not really enough to justify the performance hit. Honestly, the photorealistic streamed assets UE5 showcased on the PS5 are, IMO, going to be a much bigger game changer than raytracing is right now. People will concentrate on this way more because it's a much more noticeable difference without such a performance penalty, and I'd bet money right now that devs will jump on it immediately without a paid shove.

Oh yeah, I really want to see that in action. My guess is that it won't be as impressive once they try to incorporate it into an actual game that has a bunch of other things going on, such as game logic, AI, etc. Probably still going to be great, but a step down from what they show in the tech demo. Basically the same as all tech demos... they end up being closer to reality after 5 to 10 years.
 
From what I understand, it's a minimal performance difference because of how they're doing it. It's basically streaming photorealistic 8K textures directly off the drive, and in the PS5's case the decompression is done in hardware on the SSD. With PCs still being more powerful even at the CPU level, it's very easily doable. The framerate issues they were trying to cover up were from the lighting tech.
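To make that concrete, here's a rough Python sketch of the idea. The tile size, the zlib stand-in for the console's hardware decompressor, and the synthetic data are all just assumptions for illustration, not how any engine actually does it:

```python
# Sketch: a background thread streams compressed texture tiles while the
# "game loop" side keeps going. The decompress step is the part the PS5
# offloads to hardware; on PC it would burn CPU cycles instead.
import queue
import threading
import zlib

TILE_BYTES = 4 * 1024 * 1024                   # assumed 4 MiB raw tiles
# Fake "SSD contents": 16 independently compressed tiles.
tiles_on_disk = [zlib.compress(bytes(TILE_BYTES)) for _ in range(16)]
pending: queue.Queue = queue.Queue(maxsize=8)  # small in-flight buffer

def streamer():
    """Simulates reading compressed tiles off the drive."""
    for tile in tiles_on_disk:
        pending.put(tile)
    pending.put(None)                          # end-of-stream marker

def game_loop_side():
    """Consumes tiles; zlib here stands in for Kraken/BCPack decode."""
    count = 0
    while (tile := pending.get()) is not None:
        texels = zlib.decompress(tile)         # CPU cost the PS5 avoids
        count += 1                             # pretend we uploaded it to the GPU
    print(f"streamed {count} tiles, {count * TILE_BYTES // 2**20} MiB of texels")

threading.Thread(target=streamer, daemon=True).start()
game_loop_side()
```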
 
I'm wondering how the SSD tech is going to play out on PC if any PS5 games get ported over. If I remember correctly, the decompressor on the PS5's SSD is pretty powerful, and if it were replicated in software on a PC it would take a significant chunk of actual CPU power to achieve performance parity. But unless you have a 12- or 16-core Zen 3, you don't really have a whole lot more CPU power than a PS5 to spare for the decompression task.

Ah yeah, this was the quote from the PS5 architect: "PS5’s custom SSD decompressor can offer equivalent performance of up to an unbelievable nine Zen 2 cores in a regular CPU according to the console’s chief architect in his deep dive of the PS5’s tech."
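For what it's worth, that figure roughly checks out as a back-of-envelope calculation using the numbers from Cerny's talk (5.5 GB/s raw, around 8-9 GB/s typical after decompression). The ~1 GB/s-per-core software decode rate below is my assumption, not an official number:

```python
# Rough sanity check on the "nine Zen 2 cores" quote.
RAW_SSD_GBPS = 5.5          # PS5 raw read bandwidth (stated by Cerny)
EFFECTIVE_GBPS = 9.0        # stated "typical" output after decompression
PER_CORE_DECODE_GBPS = 1.0  # assumed software Kraken decode rate per core

cores_needed = EFFECTIVE_GBPS / PER_CORE_DECODE_GBPS
print(f"~{cores_needed:.0f} cores of software decode to keep up at peak")
# => ~9, which is where the figure comes from. Note it only applies while
#    the drive is being fully saturated, not to an average frame.
```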
 
It's mostly been proven to be marketing. I think any logically thinking PC person could deduce that if a chip like that were really that powerful, it wouldn't be cheap to produce for a console, OR it'd be used for multipurpose tasks in other areas. If it's a basic decompression ASIC that can be made cheaply, it'll find its way to PC fairly quickly if the need is really there, but I doubt it, since we can easily attain larger capacities for uncompressed data. In any case, the Xbox proves with current titles that there's no need and it's all fluff.

I have a serious feeling DirectStorage will become the standard for future titles based on current tech, and without compression on PCs. Since we have Phison E18 controllers coming with SSDs peaking over 7 GB/s raw now, it seems like a moot point. And hell, with the way things are, we won't see real titles using this for another 2-3 years anyway, giving ample time for it to come to our market. Nearing the end of the lifecycle for these consoles, PS5-exclusive titles will probably utilize what the hardware has better simply to gain more performance out of optimization, much like what we've seen with the PS4 Pro and Xbox One X.
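Rough numbers behind the "moot point" bit. The ~1.6x typical compression ratio is just inferred from Sony's own 5.5 vs 8-9 GB/s figures, so treat it as an assumption:

```python
# Uncompressed streaming on a fast PCIe 4.0 drive vs. the PS5 pushing
# compressed data through its hardware decompressor.
PC_RAW_GBPS = 7.0        # e.g. upcoming Phison E18 drives, sequential read
PS5_RAW_GBPS = 5.5       # PS5 raw read bandwidth
TYPICAL_RATIO = 1.6      # assumed average compression ratio

ps5_effective = PS5_RAW_GBPS * TYPICAL_RATIO
print(f"PC uncompressed : {PC_RAW_GBPS:.1f} GB/s of usable asset data")
print(f"PS5 compressed  : {ps5_effective:.1f} GB/s of usable asset data")
# Close enough that raw PC drives cover most of the gap; the remaining cost
# on PC is capacity (uncompressed installs are bigger) rather than bandwidth.
```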
 
I hope you're right. I would imagine it's just a semi-custom ASIC. A general compute chip capable of nine Zen 2 cores' worth of compute would cost as much as a 9-core Zen 2.

Basically the only reason I was even considering upgrading my mobo and CPU was to get one of these 7 GB/s PCIe 4.0 SSDs, but it doesn't seem worth it right now.
 
I would get a Sabrent Rocket 4 Plus when they release before flash prices skyrocket again.

Edit: Whoops, they're already out.
 
Not really, they still run in PCIe 3.0 mode; only sequential is capped. It hands all the Sammy SSDs an L outside of sequential too, which is the only reason I'm looking at replacing my tired 960 Evo with one. What I don't like is how much the 2TB outpaces the 1TB in 4K random performance: a whopping 300K IOPS difference in reads, whereas writes are matched.
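Quick math on why only sequential gets clipped on a gen 3 board. The 650K IOPS number below is just illustrative, not off a spec sheet:

```python
# Theoretical x4 payload bandwidth for PCIe gen 3 vs gen 4, and how little
# of it a 4K random workload actually needs.
def pcie_x4_gbps(gt_per_s: float) -> float:
    """x4 link, 128b/130b encoding: GT/s per lane -> GB/s of payload."""
    return gt_per_s * (128 / 130) / 8 * 4

gen3 = pcie_x4_gbps(8.0)    # ~3.9 GB/s
gen4 = pcie_x4_gbps(16.0)   # ~7.9 GB/s

iops = 650_000                    # assumed 4K random read rate
random_gbps = iops * 4096 / 1e9   # ~2.7 GB/s

print(f"PCIe 3.0 x4: {gen3:.1f} GB/s, PCIe 4.0 x4: {gen4:.1f} GB/s")
print(f"4K random at {iops} IOPS needs only ~{random_gbps:.1f} GB/s")
# Random performance fits comfortably inside a gen 3 link; only the
# 7 GB/s sequential number gets cut down.
```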
 
I'm gonna sound like an Apple shill, but that's par for the course for my posts on TF over the past decade, so I may as well continue the trend.

Ever since I got my M1 MB Air, other Intel laptops feel so shit. I have a pretty new, high-spec work laptop, an Intel i7 HP EliteBook with 16GB of RAM and an SSD. It is crap. It's sat on my desk *doing nothing* other than running a Gmail webpage, Google Chat, and an RDP session, and I can hear the fan on it. And it's pretty warm to the touch. And it doesn't even feel very quick when I try to do anything on it. My M1 Air feels way faster, is completely silent, and is only slightly warm to the touch.

Though all those negative points of my work laptop were true of my old Intel MBP, to be honest.

/rant
 