Intel bans Nvidia

Status
Not open for further replies.
^^Larrabee being used exclusively for ray tracing was just a rumor. It can use ray tracing but it will also use rasterization like every other graphics card.

Also ray tracing is a technique for generating an image by tracing the path of light through pixels in an image plane. It does not mean the cpu does all the work.
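To make that definition concrete, here's a minimal, purely illustrative Python sketch of the technique: one ray is cast from the eye through each pixel of the image plane and tested against a single sphere. The scene, the sphere position, and the character-based "shading" are all made up for the example.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance to the nearest sphere hit, or None for a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    """Trace one ray per pixel toward a sphere at (0, 0, 3), radius 1."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel onto an image plane at z = 1 in front of the eye.
            px = 2 * (x + 0.5) / width - 1
            py = 1 - 2 * (y + 0.5) / height
            norm = math.sqrt(px * px + py * py + 1)
            direction = (px / norm, py / norm, 1 / norm)
            hit = intersect_sphere((0, 0, 0), direction, (0, 0, 3), 1.0)
            row.append('#' if hit is not None else '.')
        image.append(''.join(row))
    return image

for line in render(24, 12):
    print(line)
```

Note that nothing in the sketch says *what* hardware runs it; that's the point of calling it a technique.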
 
It is a technique, yeah, but the load is dropped on the CPU; the only thing a graphics card is needed for is sending the display to the monitor, which integrated graphics can do.

And if you read the link I posted, there's a reason they had to use 16 cores to be able to play it. Notice how they didn't post what GPU they used: that's ray tracing, the CPUs were used to make the image.
 
I thought everyone was going the other way with this, i.e. CPUs are going to become a thing of the past and GPUs will do nearly all the work.
 
Ray tracing has nothing to do with the CPU. It is a technique, just like rasterization or vectorization.

Ray tracing (graphics) - Wikipedia, the free encyclopedia


The name describes what it does. Puddle Jumper is right on this one.

I know that it is a technique, but that doesn't mean the CPU doesn't do the work.
If you actually read the link I posted earlier, you'd know what I'm talking about.

Intel Converts ET: Quake Wars To Ray-tracing - Tom's Hardware
Intel demonstrated ET: Quake Wars running in basic HD (720p) resolution, which is, according to our knowledge, the first time the company was able to render the game using a standard video resolution instead of 1024 x 1024 or 512 x 512 pixels. Seeing ETQW running at 14-29 frames per second in 1280x720 has brought up our hopes for Intel's CPU architecture, since we do not believe that CPUs would deliver similar performance when rasterizing graphics. For the record, the demonstration ran on a 16-core (4 socket, 4 core) Tigerton system running at 2.93 GHz.
I got that scrap game with my 8800 GT, and it is not a demanding game; I was able to max it out even with a s939 4400+ and only a gig of RAM at 1680 x 1050.
Notice how they said nothing about a GPU being used, just how many cores the CPUs had and what system. If you can find another reason why a 16-core system would only be getting 14-29 frames at 720p resolution, I'd like to hear it...
 
Ray tracing is done by CPUs at the moment only because GPU APIs like DirectX and OpenGL are not written for ray tracing.

A GPU is massively better than a CPU for tasks like ray tracing because it is built for parallel processing: the RV670 with 320 stream processors, the G92 with 128 highly clocked stream processors, etc.

It's just that rewriting an entire API would be a LOT of work for everyone; it will be done one day, but not yet.
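One way to see why ray tracing maps so well onto parallel hardware like stream processors: each pixel's ray is completely independent of every other pixel's. A small Python sketch (the `shade` function is a made-up stand-in for real per-ray work, and threads stand in for GPU stream processors) shows that the result is identical however the pixels are divided among workers:

```python
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    """Stand-in per-pixel work: each call depends only on its own inputs."""
    x, y = pixel
    return (x * 31 + y * 17) % 256  # hypothetical brightness function

def render_serial(width, height):
    """Shade every pixel one after another."""
    return [shade((x, y)) for y in range(height) for x in range(width)]

def render_parallel(width, height, workers=4):
    """Shade the same pixels split across several workers."""
    pixels = [(x, y) for y in range(height) for x in range(width)]
    # Because no pixel reads another pixel's result, the work can be
    # divided among any number of workers without changing the output.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade, pixels))

assert render_serial(64, 48) == render_parallel(64, 48)
```

Rasterization has more shared state (depth buffers, triangle setup), which is one reason exposing ray tracing through today's APIs isn't a free lunch.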
 
I think that GPUs are far better suited to ray tracing; Intel is trying to do it on an x86 CPU through software. x86 is extremely general and hasn't really changed aside from getting faster.
Like I said, GPUs are just massively parallel processors. They also specialise in doing one thing: processing graphics.
Not only that, but GPU companies specifically design GPUs so new technologies can be added onto them to improve the way graphics are rendered. It happens every time a new DirectX revision comes out, and sometimes even when there isn't one (new modes of anti-aliasing, etc.).
x86 might have had a few instruction sets added, but they're there to increase performance on things the x86 processor is already capable of doing anyway.

GPUs are adaptable, yet specialised...
 
Yup, with ATI's GPGPU efforts and Nvidia's CUDA we are getting closer to implementing other methods like ray tracing, and even going the other way and adapting some x86 instructions :)
 
Nvidia doesn't have an x86 license, and unless they acquire VIA (which might not be out of the question) that's not likely to change. Now that AMD owns ATI, though, ATI GPUs could probably get some x86 support fairly easily.

But I think the best idea is to make hardware specifically for ray tracing, rather than just writing software to do it on x86. You could optimise it much further that way.

Intel's doing it in x86 because it's easy for them to do it that way, and because they see an opportunity to take market share from Nvidia and ATI by trying to be the only ones with this technology. They went to Microsoft first to make a standard while specifically excluding Nvidia and ATI.

I think adding ray tracing support is a good idea, but I think the way Intel is doing it is questionable...
 
One time I heard a techie at a LAN describe GPUs and CPUs to me like this:

CPU cores are like Einstein. Four of them working together can do a lot of "smart" work.

But GPUs are like 300 high-school students working together in the same room. Not as "smart", but together they can get a lot done.

Basically like apok said:
GPUs are adaptable, yet specialised...

Ray tracing excites me... if you ever see a ray-traced image, it's freaking amazing. I can just imagine Half-Life 6 with ray tracing ;D
 