Linked by Thom Holwerda on Fri 8th Dec 2006 19:59 UTC, submitted by borker
Intel A shadowy organization called Larrabee Development Group has set a most ambitious goal: unseating AMD/ATI and Nvidia as the largest producers of high-end graphics chips. And it might just succeed. It may seem unfathomable for an unknown such as Larrabee to knock off two of the most powerful processor companies. That is, until you realize that Larrabee is little more than a weak disguise for Intel. The chip giant has, in fact, ramped up its graphics efforts in recent weeks to create a product code-named Larrabee that is theoretically capable of besting AMD and Nvidia's best kit.
yes!
by helf on Fri 8th Dec 2006 20:30 UTC
helf
Member since:
2005-07-06

let the graphics wars begin anew!

Reply Score: 5

Ray Tracing
by FunkyELF on Fri 8th Dec 2006 20:39 UTC
FunkyELF
Member since:
2006-07-26

I thought the next generation of graphics was going to be ray traced.
With ray tracing, wouldn't you get all of the effects that are 'hacked' into current graphics hardware for free?

Reply Score: 2

RE: Ray Tracing
by tomcat on Fri 8th Dec 2006 20:50 UTC in reply to "Ray Tracing"
tomcat Member since:
2006-01-06

Ray tracing is computationally very expensive, even if you do most of it in hardware.

Reply Score: 2
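
To put rough numbers on "expensive" (my own back-of-envelope figures, not from the thread), even a modest real-time target multiplies out quickly:

    1024 x 768 pixels x 60 frames/s            ~= 4.7 * 10^7 primary rays/s
    x ~10 secondary rays per pixel
      (shadows, reflections, refraction)       ~= 5 * 10^8 rays/s
    x dozens of ray-triangle intersection
      tests per ray through the scene          ~= 10^10 tests/s

Rasterizers avoid most of that work by never asking a global visibility question per pixel, which is why they were so far ahead on raw frame rates in 2006.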

RE[2]: Ray Tracing
by archiesteel on Fri 8th Dec 2006 21:50 UTC in reply to "RE: Ray Tracing"
archiesteel Member since:
2005-07-02

I have to agree with tomcat here: it's not worth even trying to do ray tracing in real time. There are many tricks you can use (such as bump- and normal-mapping) to achieve better graphics, but ray tracing isn't one of them.

Reply Score: 2

RE[3]: Ray Tracing
by npang on Sat 9th Dec 2006 05:06 UTC in reply to "RE[2]: Ray Tracing"
npang Member since:
2006-11-26

Ray tracing is parallel-processing friendly. Imagine a card manufacturer embedding two or more cores dedicated to tracing rays into graphics cards.

Reply Score: 1
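
A minimal sketch of why this parallelizes so naturally: every pixel's primary ray is independent of every other, so a data-parallel device can simply assign one ray per thread. The sketch below uses the CUDA C extensions linked later in this thread; the hard-coded single-sphere scene and every name in it are illustrative assumptions, not anyone's shipping product.

    __global__ void trace(unsigned char *image, int width, int height)
    {
        // One thread per pixel: no ordering, no shared state, no rasterizer tricks.
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height) return;

        // Primary ray from a pinhole camera at the origin through pixel (x, y).
        float dx = (x - width * 0.5f) / width;
        float dy = (y - height * 0.5f) / height;
        float dz = 1.0f;
        float len = sqrtf(dx * dx + dy * dy + dz * dz);
        dx /= len; dy /= len; dz /= len;

        // Intersect one sphere: center (0, 0, 5), radius 1. With a unit-length
        // direction, the hit condition is a quadratic t^2 + b*t + c = 0 in the
        // ray parameter t; only the discriminant matters for hit/miss.
        float cz = -5.0f;                 // ray origin minus sphere center (z only)
        float b = 2.0f * (dz * cz);
        float c = cz * cz - 1.0f;
        float disc = b * b - 4.0f * c;

        // Shade: white where the ray hits the sphere, black elsewhere.
        image[y * width + x] = (disc >= 0.0f) ? 255 : 0;
    }

Launching that with, say, one 16x16 thread block per image tile puts thousands of rays in flight at once. The hard parts of a real tracer are secondary rays and traversing an acceleration structure for real scenes, not the parallelism itself.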

RE[4]: Ray Tracing
by Arakon on Sat 9th Dec 2006 06:57 UTC in reply to "RE[3]: Ray Tracing"
Arakon Member since:
2005-07-06

Or while we're at it, let's just imagine a Beowulf cluster of these....

Um, no. Imagine, imagine, imagine... it's like Hollywood magic!

Ray tracing is expensive, and the hardware required to do it in real time at a performance level acceptable to a gamer is even more so. Adding more cores increases complexity and the cost to manufacture the item. The goal of a business is not to produce the best product on the planet that no one will buy because it costs too much (see SGI workstations for more information).

They want to make the best product they can, as cheaply as they can, so that the average consumer will *buy* it. It's all about volume these days. Margins are very tight in the hardware business, so I doubt you'll see anyone making the "real-time ray tracing" leap for a while yet. We still have a few more (a dozen, maybe?) generations of hardware to go through before we get hardware that's affordable and capable of that.

Reply Score: 1

RE[5]: Ray Tracing
by GatoLoko on Sat 9th Dec 2006 22:45 UTC in reply to "RE[4]: Ray Tracing"
GatoLoko Member since:
2005-11-13

The SaarCOR project is developing real-time ray tracing hardware, and doing it pretty well and fast. On the web you can see scenes from Quake ray traced at 15 FPS on one of their first test cards (the card they made in 2004): www.saarcor.de

Reply Score: 1

Hmm
by Ultimatebadass on Fri 8th Dec 2006 20:40 UTC
Ultimatebadass
Member since:
2006-01-08

It would be really nice if they came out with something that's at least competitive against Nvidia/ATI chips. More competition = lower prices ;)

Reply Score: 2

I'll buy one..
by tux68 on Fri 8th Dec 2006 20:41 UTC
tux68
Member since:
2006-10-24

How long does it take to go from hiring engineers to shipping a product I can go out and buy? If they offer open source drivers like they already do for their other products, it can't happen fast enough for my liking.

Reply Score: 5

tomcat
Member since:
2006-01-06

Desiring to knock off NVidia and ATI isn't the same thing as doing it. Intel has thrown lots of dollars at this problem in the past and, so far, it hasn't been successful. As they say, I'll believe it when I see it.

Reply Score: 5

abraxas Member since:
2005-07-07

Desiring to knock off NVidia and ATI isn't the same thing as doing it. Intel has thrown lots of dollars at this problem in the past and, so far, it hasn't been successful. As they say, I'll believe it when I see it.

Intel has never had a competitor like this before, one that offers both high-end CPUs and high-end graphics cards. In fact, Intel never wanted to make high-end graphics cards before; they were content offering integrated solutions. Lately we've seen Intel get more serious about their graphics chipsets, and it seems this project is an extension of that. Intel has a lot of money and a lot of engineers. If they really want to make a high-end graphics card, they will.

Reply Score: 3

tomcat Member since:
2006-01-06

Intel has a lot of money and a lot of engineers. If they really want to make a high-end graphics card, they will.

Sure, but ATI and NVidia aren't exactly just going to sit still while it happens. All of them are throwing heavy-duty R&D dollars at the problem of building faster cards. Plus, ATI & NVidia have been working with Microsoft for years on graphics card technology. To some extent, they have been helping to drive the technology forward, so it will take a concerted effort on Intel's part to match that kind of focus.

Reply Score: 1

abraxas Member since:
2005-07-07

Sure, but ATI and NVidia aren't exactly just going to sit still while it happens. All of them are throwing heavy-duty R&D dollars at the problem of building faster cards. Plus, ATI & NVidia have been working with Microsoft for years on graphics card technology. To some extent, they have been helping to drive the technology forward, so it will take a concerted effort on Intel's part to match that kind of focus.

Intel has been working with Microsoft for years, and they have a lot more money, especially for R&D. I'm not saying Intel is going to take on ATI and Nvidia overnight, but they have the resources to do it.

Reply Score: 1

JimBroad Member since:
2006-06-29

"In fact Intel never wanted to make high end graphics cards before..."

umm... remember this garbage?
http://en.wikipedia.org/wiki/Intel740

Definitely not the first time they've tried to squeeze into this market. It sounds like they might be a little more intent on making a decent showing this time around, though.

Reply Score: 1

If it's cheap?
by ronaldst on Fri 8th Dec 2006 21:08 UTC
ronaldst
Member since:
2005-06-29

I am all for it.

Reply Score: 1

Interesting
by tmack on Fri 8th Dec 2006 21:38 UTC
tmack
Member since:
2006-04-11

If Intel is as open-source friendly with their next-gen graphics technology as they are with their current technology, I can't wait to see Nvidia/ATI put in their place.

I have a GeForce 8800 GTX. It won't work with my Dell 30" FP because Nvidia's Unix drivers don't support dual-link TMDS on the G80 GPU yet.

If this were open source, I wouldn't have had a $600 graphics card sitting in its box for over a month.

I like Nvidia better than ATI, but Nvidia is merely the lesser of two evils.

Reply Score: 2

RE: Interesting
by Wintermute on Fri 8th Dec 2006 23:59 UTC in reply to "Interesting"
Wintermute Member since:
2005-07-30

Dude, why did you buy a card like that without doing your research? I mean, why would you want to put yourself in a position where your $600 graphics card doesn't support your platform of choice?

It doesn't take much to check the Nvidia site to see if they have drivers for *nix...

I would understand if you were complaining about performance being worse under Linux with SLI, or something of that sort, but this was your own fault.

Reply Score: 1

RE[2]: Interesting
by tmack on Sat 9th Dec 2006 01:19 UTC in reply to "RE: Interesting"
tmack Member since:
2006-04-11

Dear Genius,

Please review the following info from their driver:

Beta Driver - Linux Display Driver x86

Version: 1.0-9742
Operating System: Linux x86
Release Date: November 8, 2006
BETA Driver

Release Highlights

* Adds support for GeForce 8800 GTX and GeForce 8800 GTS GPUs.

Yet with dual-link TMDS, all you get is a black screen (one of the Nvidia devs later revealed they have this in their bug tracker as a known bug).

This card in combination with a dual-link TMDS monitor is a pretty rare setup, and they never mentioned in their driver documentation that this was an issue.

The Nvidia devs told me this will be fixed in the next release. It's just too bad that it's closed source, so I have to wait until it's financially prudent for Nvidia to fix their damn product.

Edited 2006-12-09 01:22

Reply Score: 2

Also
by tmack on Fri 8th Dec 2006 21:40 UTC
tmack
Member since:
2006-04-11

I'd like to add that if Intel provides open source drivers for THEIR "good" video hardware, it could possibly drive Nvidia and ATI to do the same.

Reply Score: 2

RE: Also
by archiesteel on Fri 8th Dec 2006 21:53 UTC in reply to "Also"
archiesteel Member since:
2005-07-02

I'd like to add that if Intel provides open source drivers for THEIR "good" video hardware, it could possibly drive Nvidia and ATI to do the same.

Indeed. I do believe that this is Intel's plan - after all, it has nothing to lose (and everything to gain) by open-sourcing its drivers in the current context.

Note, however, that there are persistent rumors that AMD will open-source the ATI drivers as well, which would leave only Nvidia with closed-source drivers.

Reply Score: 3

RE[2]: Also
by Flatland_Spider on Sat 9th Dec 2006 12:16 UTC in reply to "RE: Also"
Flatland_Spider Member since:
2006-09-01

I hope that's the case, as I have a FreeBSD install that is languishing because of the ATI X1600 that's installed in it.

Reply Score: 1

time for nvidia and ati to bleed
by jango on Fri 8th Dec 2006 22:49 UTC
jango
Member since:
2006-11-22

Nvidia and ATI (ATI more so) have both punished Linux and FOSS. Now that Intel is open source, it seems it's payback time. PS: sooner or later, ATI and Nvidia will have to open their drivers.


Wow, tech is really heating up: Microsoft is running scared, Intel is pushing forward, Oracle is invading. What a year!

I, for one, welcome this.

Reply Score: 1

It makes sense
by Wes Felter on Fri 8th Dec 2006 22:54 UTC
Wes Felter
Member since:
2005-11-15

GPUs used to be very different from CPUs, but recent GPUs are starting to look like multicore processors with a little graphics-specific logic added on. Intel's decades of experience in processor design can now be applied to graphics.

Reply Score: 2

RE: It makes sense
by makc on Sat 9th Dec 2006 11:09 UTC in reply to "It makes sense"
makc Member since:
2006-01-11

Actually, GPUs *are* multicore processors. Let's say they are somewhat "RISC" ;)

And I'd add: http://developer.nvidia.com/object/cuda.html

It's a C extension framework for general-purpose programming on the GPU. It looks quite a bit more comfortable than writing shaders and reinventing addressing schemes for GPGPU ;)

Reply Score: 1
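
For readers who haven't seen it, here is a rough sketch of what that CUDA model looks like in practice: ordinary C with a kernel qualifier and a launch syntax, instead of shaders and textures. This is an illustrative example only (the classic scaled vector addition), not code from the linked page; error checking and data initialization are omitted.

    #include <cuda_runtime.h>

    // Kernel: plain C with a __global__ qualifier; each thread handles one element.
    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        const int n = 1 << 20;
        float *x, *y;
        cudaMalloc((void **)&x, n * sizeof(float));   // device allocations
        cudaMalloc((void **)&y, n * sizeof(float));
        // (filling x and y with data via cudaMemcpy omitted for brevity)

        // Launch: <<<blocks, threads per block>>> replaces the shader/draw-call dance.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        cudaFree(x);
        cudaFree(y);
        return 0;
    }

Compare that with encoding the same computation as a fragment shader over a texture: the addressing schemes the parent comment mentions simply disappear.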

not suprised
by poundsmack on Sat 9th Dec 2006 02:25 UTC
poundsmack
Member since:
2005-07-13

Intel wants to offer customers the complete solution. Personally, I think they should have partnered with Nvidia, but it's cheaper for Intel to do it this way. ATI, Nvidia, S3, and Intel: let the games begin....

Reply Score: 1

How to unseat ATI and NVIDIA 101
by jo42 on Mon 11th Dec 2006 16:25 UTC
jo42
Member since:
2006-02-20

In order to do this, you need a fan-boy base. To generate a fan-boy base, you need over-clockable hardware and more FPS than anybody else.

That's it.

Reply Score: 1