Okay, 1: the human eye can perceive far more than 60fps, but monitor refresh rates, usually 60-75Hz, limit the screen to actually displaying 60-75 frames per second... the rest gets dumped. And 2: that 30fps is going to make a HUGE difference in future games when that 320MB can't keep up... DX10 performance already shows the 8800GTS 640MB pwns the 320MB (in DX10 benches, at least).
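To make point 1 concrete, here's a rough sketch (my own made-up example with a fixed 60Hz refresh, ignoring vsync and driver subtleties): the screen can only show one frame per refresh cycle, so anything rendered above the refresh rate never reaches your eyes.

    # Rough sketch of why extra rendered frames never reach the screen.
    # Assumes a fixed refresh rate; real vsync/driver behaviour is more subtle.

    def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
        """The monitor shows at most one frame per refresh cycle,
        so anything rendered above the refresh rate is discarded."""
        return min(rendered_fps, refresh_hz)

    for fps in (30, 60, 75, 120, 200):
        shown = displayed_fps(fps, refresh_hz=60)
        dropped = fps - shown
        print(f"{fps:>3} fps rendered -> {shown:.0f} shown, {dropped:.0f} dumped per second")

So on a 60Hz monitor, 200fps and 60fps look identical on screen; the extra 140 frames per second are just thrown away.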
Now I'm not saying the 8800GTS is a bad card. I've owned an 8800GTS 640MB myself, and it did in fact play games flawlessly... but as someone who needs future games to run well, I wouldn't go for the 320MB. The way Crysis is running for me on this 2900XT/Pro, I'm not satisfied! So I'm hoping it's just the beta's fault... but I'm going to go CrossFire anyway, lol.
I've been on every side of the graphics card field: low end (7300GT), mid range (8600GTS), and high end (X1950XTX, 8800GTS 640MB, and now the 2900XT/Pro). I'd say I'm pretty knowledgeable about graphics cards in some way, from actual hands-on experience, and not just from basing it off some stupid websites.
But if I were to suggest anything, it would be the 2900Pro, as stated earlier. These things overclock easily to 2900XT speeds for less than $300! I've never been so satisfied with a graphics card.
And as for games not using more than 128MB... HA, you make me laugh, that is hilarious. Oblivion uses well over 500MB of textures. Let me take you over to Xtreme Systems and I'll show you many games, under different circumstances, that use far more than 128MB (and as a side note, I'm not saying the amount of RAM is going to make a huge difference, only that games can go far beyond 128MB of usage). And the RAM on the 8800GTX and Ultra can clock far beyond that of an 8800GTS.
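Just to show how fast texture memory adds up, here's a back-of-the-envelope sketch (hypothetical asset counts I'm making up, ignoring DXT compression and mipmaps, not actual Oblivion numbers): a handful of big uncompressed textures already blows past 128MB on its own.

    # Back-of-the-envelope texture memory math (hypothetical asset counts,
    # ignores compression and mipmaps): a few large textures exceed 128MB.

    def texture_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
        """Uncompressed size of one texture in megabytes."""
        return width * height * bytes_per_pixel / (1024 ** 2)

    count = 10
    per_texture = texture_mb(2048, 2048)   # 16 MB each, uncompressed
    print(f"{count} x 2048x2048 textures = {count * per_texture:.0f} MB")   # 160 MB

And that's only ten textures; a full game scene keeps far more than that resident.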