Watch Dogs specs leaked

I would wait to see some benches on the 290X first. Try to remember, AMD drivers are still AMD drivers.

Well of course. I'm just assuming it'll live up to its 1.1 TFLOP advantage over the Titan. If it flops, I'll just get a 780.

What I need is a metric **** tonne of VRAM, at least 4GB; I'm hoping for some 6GB R9 290Xs further down the line (or even 8GB :D ). I'm basically upgrading purely for GTAV on PC, knowing that most games after that will be less demanding anyway. I'm also hyped for Titanfall on PC, though that doesn't look all that demanding. If they give GTAV the graphical goodies they gave Max Payne 3, and pair that with an open world with epic draw distances, it's going to munch major amounts of VRAM.

Even unmodded GTA4 at max uses 1.9GB of VRAM, and it looks like crap.

As for Watch Dogs, I think that'll eat VRAM too. Open world games tend to, in my experience.
 
I need to do some research 'cause I can't for the life of me figure out why a 512-bit memory bus means 6GB isn't possible.

I've found out there won't be a 6GB version because there are no extra spaces for more VRAM modules on the board. So all they can do is increase chip density, taking it from 4GB to 8GB; it's nothing to do with the 512-bit bus restricting it to 6GB. I'm interested now, so some research and maths is in order.
 
The way it works is quite simple: the bus width pretty much controls the number of memory chips that can be used on the card.
A GPU with a 256-bit bus has a minimum of 8 memory chips, since each memory chip has a 32-bit wide bus.
Some cards with double the memory amount will have 16 memory chips.
The 8 extra chips, usually on the back of the card, share the bus with the 8 chips soldered to the opposite (front) side of the card.

So: 192-bit → 1.5GB or 3GB, 384-bit → 3GB or 6GB
256-bit → 2GB or 4GB, 512-bit → 4GB or 8GB
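
To sanity-check the maths, here's a quick Python sketch. This is just my own illustration, not anything off a spec sheet; it assumes each GDDR5 chip occupies a 32-bit slice of the bus and comes in 2Gb (256MB) or 4Gb (512MB) densities:

```python
# Back-of-the-envelope VRAM maths (my assumptions, not official specs):
# each GDDR5 chip sits on a 32-bit slice of the bus, and common chip
# densities are 2 Gb (256 MB) and 4 Gb (512 MB).

CHIP_BUS_WIDTH = 32           # bits of the bus each memory chip occupies
DENSITIES_MB = (256, 512)     # per-chip capacities to try, in MB

def vram_options(bus_width_bits, clamshell=False):
    """Return the possible total VRAM (in GB) for a given bus width.

    clamshell=True doubles the chip count by pairing each front-side
    chip with a second chip on the back of the board sharing its bus.
    """
    chips = bus_width_bits // CHIP_BUS_WIDTH
    if clamshell:
        chips *= 2
    return [chips * density / 1024 for density in DENSITIES_MB]

for bus in (192, 256, 384, 512):
    print(f"{bus}-bit: {vram_options(bus)} GB "
          f"(clamshell: {vram_options(bus, clamshell=True)} GB)")
```

Running it reproduces the pairs above; the clamshell numbers are the doubled-memory variants with chips on both sides of the board.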

There are deviations from this, but those are usually the rules, so you don't end up with a weird-ass configuration like the 192-bit 2GB 660 Ti. From what I can tell, it's a "special" card: the extra half gig hangs off one memory controller using double-density chips, so it's accessed over a narrower slice of the bus and runs slower. Don't quote me on that though.

So technically, yes, there could be a 6GB card, but with a 512-bit bus the odds are slim: 6GB spread across 16 chips works out to 384MB per chip, which isn't a standard GDDR5 density.
 
Mantle has failed before it even launched, IMO. The Xbox One does not support Mantle, and since there isn't 100% support across all platforms, there won't be a large developer group behind it.

Xbox One Doesn't Support AMD Mantle API - techPowerUp! Forums

Next up, TrueAudio :)

Not necessarily dead. DICE are supporting it, which means the biggest PC game of the year in the form of BF4, and future sure hits such as Command & Conquer, Star Wars Battlefront, and Mirror's Edge 2 can all use it. Also bear in mind DICE are EA's go-to developer for cool tech, e.g. Frostbite 2, which is now the foundation of many of EA's games. It's not an illogical step to assume DICE have integrated Mantle nicely into the workflow of the Frostbite engine. It may not be hard at all for other EA developers to support it.

I don't know why you'd want Mantle and TrueAudio to be unsuccessful, other than to prove you are right. It'd be a shame if either of these, particularly Mantle, flopped.

Besides, just because the Xbox One doesn't support Mantle officially doesn't mean its low-level API and Mantle aren't incredibly similar, which, considering they are both made by AMD and both target GCN architecture, is pretty likely.
 
At the end, now you're just copying and pasting basically what I'm saying. Yes, they are similar, but the kernel has to allow it, which it won't. It's like saying I have a DX10 GPU in a Windows XP machine: sure, it could easily work, but Microsoft has to allow the support, which obviously they didn't. It's still an API in the end.

Which brings me to: just because EA/DICE are using it for BF4 (which is not the game of the year, and cannot be in any way) in Frostbite 3 doesn't mean their other titles will use this tech. Take note, they are patching it in post-launch, and I believe this is only happening because BF4 is an AMD Gaming Evolved title. If it wasn't, I don't believe they would give a rat's ass, considering the newer consoles won't be utilizing the tech. Not only that, but EA/DICE are just one developer out of many, and it takes many to make a proprietary tech (I'll cover that later) catch on.

Which leads me to: why do I want them to flop? Because I hate brand-specific ****, and because proprietary tech like the DX API has held us back quite a bit. It forces us in directions we might not otherwise want to go. For instance, DX10 and the move to Vista. Want DX10? Move to the ****ty OS. Don't want it? Don't play Crysis with DX10 graphics. Want to leave XP for the better Windows 7? Gotta ditch that 300 dollar sound card and the EAX 5.0 that was the **** back in '06. My bad, brah.
Mantle and TrueAudio are not proprietary in the same way as CUDA/PhysX, BUT Nvidia will never incorporate GCN (which Mantle requires), or AMD's DSP, onto their cards. What does that mean? Guys like myself (even though I'm on the fence about the 290X) won't have the tech, nor will other Nvidia consumers, just like how AMD users don't get PhysX. And if they did, I have a feeling it would run ****ty either way, for reasons we can all work out for ourselves.
I would rather see open tech that works on any hardware take off, such as OpenGL, or a physics API like Havok becoming the standard. Simply because I'm sick and tired of being tied down on such an open platform. I have a PC, not a console.

Edit: I'll also add this. I know that Mantle was developed specifically for PC, as a low-level API that manages hardware directly for smoother access to resources. Thing is, it's one extra step in the porting process, and AMD is claiming it'll make porting from console to PC easier. But PC is the lead development platform for most AAA studios. That being said, why would they want to make things more difficult for themselves by implementing extra **** when the move to x86 hardware just made things a whole lot easier? If I was a developer, I would not jump on this short-lived bandwagon.
 
But open **** is nowhere to be seen. Nvidia and AMD each having their own proprietary low-level API is better than both having neither, IMO.
 
No, proprietary **** has been holding the graphics scene back since OpenGL goofed and DX won the API war back in the day. **** has been so stagnant due to proprietary technology and APIs. We need to be rid of DX and go OpenGL.
 