AMD shows off 3.0GHz Phenom

Status
Not open for further replies.
Infinite is just a concept. You can't actually get infinite anything.

The only reason infinite exists (as a concept) is because theoretically there is no such thing as a "highest number" either.

Case in point: what's the highest number you can think of?
What if you add 1 to that number?

:rolleyes: You remind me of that annoying smarta$$ friend we all have who hangs on people's every word and overanalyzes way beyond the context in which it was used.

I'm simply saying that pretty soon the pressure will be on GAME developers rather than hardware developers, because hardware power will become a non-issue. We will need less power, we will have excessive speeds, we will have excessive space, we will have low heat. The PC's performance will surpass our demands and we will begin to let our imaginations free-roam to utilize it. Game graphics will be "perfected", no longer just "improved". You'll see more pressure for better stories, more concepts, a more enthralling experience overall.
 
All we really need is a unified language for the GPUs... the technology is already out there to put them in ZIF sockets.

As components shrink, begin to run cooler, and become less power-hungry, the feasibility of doing something like this climbs higher and higher.
GPU technology is much more likely to change than CPU technology.

CPUs might have some instruction sets added from time to time, but generally most changes are about making them faster and more efficient.

GPUs have to keep up with many new technologies and advances in graphics (including new DirectX versions), on top of getting faster.

While I do like the idea of ZIF-socket GPUs, I think it would be much harder to accomplish.
On the other hand, I'd hate for people who do actually know what they're talking about to use words or phrases that simply don't apply (e.g. "my hard drive crashed" when it is actually the software).
 
It just makes you sound really literal and uptight, that's all. I mean, so what if it didn't "crash" or "experience a radical deceleration causing structural damage to the device"? He's still screwed, and everyone knows what he meant.

So when is the soonest we will see something "next gen" from AMD on Newegg? Late August, with Barcelona?

FYI: "New AMD roadmap reveals 16-core chips" (PC Advisor)

Amazingly clear layout of this "blizzard" of codenames.
 
It appears that we're making two different kinds of argument here. Mine is an objective stance, and yours is a subjective one.

It will be this year, I know that much.
 

But the focus Intel and AMD are pushing is that we need to keep heat and energy levels DOWN. And these things ARE getting a lot faster. I don't believe it's a matter of raw speed so much as just having "a lot of computing" to get done. It's easier for the CPU makers to create a lot of cores that work together to tackle a huge workload. If the program can be subdivided, like a game with sound/physics/AI/etc. each on a different core, then it's all good. But if the program is just one measly process that you need done lightning fast, then yes, you are in trouble.
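The subdivision idea above can be sketched in a few lines of Python. This is my own toy illustration, not code from any actual engine, and the subsystem names are made up: each independent subsystem gets its own worker, while a single serial task can only ever occupy one core.

```python
from concurrent.futures import ThreadPoolExecutor

def run_subsystem(name):
    # Stand-in for one frame's worth of work for a subsystem.
    total = sum(i * i for i in range(100_000))
    return name, total

# Independent chunks of work can be farmed out in parallel...
subsystems = ["sound", "physics", "ai", "rendering"]
with ThreadPoolExecutor() as pool:
    results = dict(pool.map(run_subsystem, subsystems))

print(sorted(results))  # -> ['ai', 'physics', 'rendering', 'sound']

# ...but a single dependent chain like this has to run serially,
# no matter how many cores you have.
serial = run_subsystem("one_measly_process")
```

(Caveat: CPython threads share one core for pure-Python work because of the GIL, so a real engine would use processes or native threads; the structure of the idea is the same either way.)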
 
Ooh, that's a good idea.
 
It's not exactly my idea. I'm pretty sure Crytek will be doing that with the quad cores; I've heard Yuri say it a few times. But 16 cores from AMD? What, are they crazy? I never really thought about it the way that article did before. We are not seeing a further push for raw speed, just more horsepower and torque. We've gone from racecars to Mack trucks.
 
I don't think CPUs are really comparable to automobiles.

Reminds me of that Bill Gates vs General Motors article.
 