Intel Reveals Details Of Larrabee Graphics Chip.


maroon1

Intel reveals details of Larrabee graphics chip




Source: Intel reveals details of Larrabee graphics chip | News | Custom PC
Larrabee has been the subject of much debate since Intel's CEO, Paul Otellini, casually said that it would move the company 'into discrete graphics' at the Intel Developer Forum last year. Until now, all we knew was that it was going to feature multiple x86 cores, and many assumed that it would push ray tracing into the mainstream. However, Intel has now finally given us some more details on the mysterious new chip.

Larrabee will feature many IA++ (Intel Architecture, essentially x86) cores, and Intel claims it will scale to teraflops of processing power, while its new vector instruction set handles both floating-point and integer calculations. As well as this, Intel says the chip will feature a new cache architecture, although no details of this have been given yet.
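(For a sense of what a vector instruction set handling both floating-point and integer work means on an x86 core, here's a tiny sketch using today's 128-bit SSE2 intrinsics. This is purely illustrative: Larrabee's own vector unit is expected to be much wider, and Intel hasn't published its instruction set.)

/* Minimal sketch of x86 SIMD handling both float and integer data.
 * Uses today's 128-bit SSE2 intrinsics purely for illustration;
 * nothing here is Larrabee-specific. */
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stdio.h>

int main(void)
{
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {0.5f, 0.5f, 0.5f, 0.5f};
    int   c[4] = {10, 20, 30, 40};
    int   d[4] = {1, 1, 1, 1};
    float fsum[4];
    int   isum[4];

    /* Four single-precision adds in one instruction. */
    __m128 vf = _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b));
    _mm_storeu_ps(fsum, vf);

    /* Four 32-bit integer adds in one instruction. */
    __m128i vi = _mm_add_epi32(_mm_loadu_si128((__m128i *)c),
                               _mm_loadu_si128((__m128i *)d));
    _mm_storeu_si128((__m128i *)isum, vi);

    printf("float: %.1f %.1f %.1f %.1f\n", fsum[0], fsum[1], fsum[2], fsum[3]);
    printf("int:   %d %d %d %d\n", isum[0], isum[1], isum[2], isum[3]);
    return 0;
}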

What's interesting is that Intel sees Larrabee as a strong alternative to today's traditional GPUs. In a presentation slide shown at a US press conference yesterday, Intel listed 'triangle/rasterisation' and 'rigid pipeline architecture' as problems with today's GPUs. In contrast, it listed 'life-like rendering, e.g. global illumination' as a benefit of Larrabee.

Considering that global illumination is a part of DirectX 10.1, and is supported by ATI's latest Radeon HD 3000-series GPUs (which also use multiple stream processors rather than traditional pixel pipelines), you could think of this as pretty rich. However, the fact that Larrabee's cores are based on Intel's x86 architecture with a new vector processing unit, rather than being simple scalar stream processors, could mean the chip is capable of some impressive calculations.
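(To make the rasterisation-versus-ray-casting contrast concrete: the basic building block of a ray-traced or global-illumination style renderer is a per-ray intersection test like the sketch below. Plain C, written here only as an illustration; the function name and layout are mine, not anything from Intel's presentation.)

/* Illustrative ray/sphere intersection test - the basic building block
 * of ray-traced or global-illumination rendering. Returns 1 and writes
 * the hit distance if the ray (origin o, normalised direction d) hits
 * the sphere; returns 0 otherwise. */
#include <math.h>

int ray_sphere_hit(const float o[3], const float d[3],
                   const float centre[3], float radius, float *t_hit)
{
    float oc[3] = { o[0] - centre[0], o[1] - centre[1], o[2] - centre[2] };
    float b = oc[0]*d[0] + oc[1]*d[1] + oc[2]*d[2];              /* dot(oc, d) */
    float c = oc[0]*oc[0] + oc[1]*oc[1] + oc[2]*oc[2] - radius*radius;
    float disc = b*b - c;                                        /* discriminant */

    if (disc < 0.0f)
        return 0;                   /* ray misses the sphere */

    float t = -b - sqrtf(disc);     /* nearest intersection */
    if (t < 0.0f)
        t = -b + sqrtf(disc);       /* ray origin is inside the sphere */
    if (t < 0.0f)
        return 0;                   /* sphere is behind the ray */

    *t_hit = t;
    return 1;
}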

One example is physics: Intel claims that current mainstream graphics cards are 'inefficient for non-graphics computing' such as this. Intel sees the programmable and ubiquitous nature of the x86 cores as a big benefit of Larrabee over traditional GPUs, although the company also says that Larrabee will work with DirectX and OpenGL, so it will still need to perform traditional rasterisation in games.
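(As a rough illustration of the 'non-graphics computing' Intel is talking about, here's a toy particle-physics step written as ordinary C with an OpenMP pragma to spread the loop across cores. The struct and function names are made up for the example; the point is simply that code like this runs on general-purpose x86 cores without being rewritten as a GPU shader.)

/* Toy physics step: integrate n particles under gravity.
 * Ordinary C with an OpenMP pragma to spread the loop across cores -
 * purely illustrative of how non-graphics work maps onto many
 * general-purpose x86 cores; nothing Larrabee-specific. */
#include <stddef.h>

typedef struct { float x, y, z, vx, vy, vz; } particle;

void step_particles(particle *p, size_t n, float dt)
{
    const float g = -9.81f;              /* gravity, m/s^2 */

    #pragma omp parallel for
    for (long i = 0; i < (long)n; ++i) {
        p[i].vy += g * dt;               /* explicit Euler velocity update */
        p[i].x  += p[i].vx * dt;         /* position update */
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}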

There's still a lot that we don't know about Larrabee, but it's now clear that Intel is taking the gaming graphics market very seriously, and that it plans on shaking up the traditional GPU architecture. Could Intel take on Nvidia and ATI in the graphics business, and are current GPUs too limited? Let us know your thoughts.




 
This is SO exciting.

We're reaching a limit with what we can do with DirectX and how we've been developing games, so this is a big deal. Ray tracing would be fantastic.

Imagine playing a game that looks as good as Beowulf.
 
I can see Larrabee being a total failure and a great success at the same time,

like the X2900 XT 1GB, lol. Amazing 3DMark06 scores but bad game performance, although with the money Intel has I'm sure the drivers will be amazing. I see this being a very, very good card, one that will force a revolution in the GPU industry if it's as powerful as they claim.
 
Well, knowing how enthusiastic Intel is about the future of ray tracing, I look at this more as a step in hardware towards ray tracing becoming incorporated into video games at some point. Not necessarily games being rendered using only ray tracing, but it being used at times.
 
I'm excited to see what happens with this. Better graphics are always a good thing.
It's about time we saw something different....
 
I agree. Having a hybrid CPU/GPU someday is a big step in the right direction, IMO. MUCH faster. I really don't know much about GPU architecture, heat issues, VRAM issues, and all the other things that would come with it, but you'd think we're talking higher clocks and more speed from integrating the two. I can see heat being a nightmare, though.
 
AMD, think fast.....

and think of something more than knocking off one core of a defective Phenom for the ultimate TRI CORE!....

Yeah, cool idea, but I dunno. I think what they should do before anything else is flip the **** cards over so you can see the top, because you could have the sickest gfx card design ever and never get to see it.... Also, not on topic at all, but they need to come up with a new mobo design...
 