I can't figure this out for the life of me. Someone please help.
A 9-second simulation takes 20 hours to compute. Just assume its output is at 30 fps.
So how much faster would a PC have to be to run the same simulation in real time at 30 fps? I came to 242,000x faster, but I think I went very wrong because that seems like an absurd figure.
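For what it's worth, here's a minimal sanity-check sketch, assuming compute time scales linearly with machine speed; the 30 fps should cancel out, since both the 20-hour run and a real-time run produce the same number of frames, so only the wall-clock ratio matters:

```python
# Quick sanity check, assuming compute time scales linearly with machine speed.
# The 30 fps cancels out: both runs produce the same 9 s * 30 fps = 270 frames,
# so only the ratio of wall-clock times matters.

sim_seconds = 9                  # simulated (output) duration in seconds
compute_seconds = 20 * 3600      # 20 hours of wall-clock compute = 72,000 s

speedup = compute_seconds / sim_seconds
print(f"Required speedup: {speedup:,.0f}x")  # -> Required speedup: 8,000x
```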