It was running at 1033 MHz or some such.
Setting it to the X one didn't change anything... I guess because it was grayed out. Setting it to the D one runs it at 1804 MHz, which increased my temps by over 10°C.
Leaving the setting on "auto" but changing to 1600 and 3200 gives me +7°C or so on my CPU temps...
CPU-Z says it's running at 800.7 MHz or so (so ~1600 effective), with 9-9-9-24 timings at 1T... it was advertised as 7-7-7-24, though...
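For what it's worth, the 800.7 reading itself is expected: CPU-Z reports the base memory clock, and DDR ("double data rate") memory transfers data on both edges of each clock cycle, so the effective rate is double what's shown. A quick sketch of the arithmetic, using the number from the reading above:

```python
# DDR memory transfers on both the rising and falling clock edge,
# so effective transfer rate = 2 x the base clock CPU-Z reports.
base_clock_mhz = 800.7          # CPU-Z "DRAM Frequency" reading
effective_rate_mts = base_clock_mhz * 2

print(effective_rate_mts)       # -> 1601.4, i.e. the sticks are running at ~1600
```

So a DDR3-1600 kit showing "800 MHz" in CPU-Z is running at its rated speed; the timings (9-9-9-24 vs. the advertised 7-7-7-24) are the actual discrepancy here.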
Why does every little tool read my i7 wrong? It's been labeled a mobile processor and an E-something series... not one benchmark has listed the i7 920 as what I have...
And because I was feeling screenshot-happy:
This is what Fallout 3 looked like on the X700 card (roughly): http://i154.photobucket.com/albums/s...9-45-28-50.png
and THIS is what it looks like now: http://i154.photobucket.com/albums/s...9-22-15-33.png
Just look at the difference: lots of grass, detailed tree branches in the distance, fewer "chop" lines from AA, sampling rates way up, the ground itself looks better, and the lighting is wonderfully realistic. I must say, it makes this look like a whole new game compared to what I'm used to (which I expected).