Hi, I was wondering: when video card memory speed is quoted, is it the actual clock speed, or does it figure in double data rate? For example, on almost all computers the main system memory speed is doubled to represent DDR (i.e., "400 MHz" is actually a 200 MHz clock with two transfers per clock cycle). I was wondering if video card memory is quoted the same way. Thanks for any help.
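
Just to be clear about the doubling I mean, here's a quick sketch (the function name and numbers are only my own example):

```python
# DDR transfers data on both the rising and falling clock edges,
# so the quoted "effective" rate is twice the real clock frequency.
def effective_rate_mhz(actual_clock_mhz: float, transfers_per_cycle: int = 2) -> float:
    """Effective (marketed) data rate given the real clock speed."""
    return actual_clock_mhz * transfers_per_cycle

# Example: "DDR-400" system memory really runs on a 200 MHz clock.
print(effective_rate_mhz(200))  # 400.0
```

So my question is whether a card advertised at, say, 500 MHz memory is a real 500 MHz clock or a 250 MHz clock counted twice.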