Linked by Thom Holwerda on Thu 26th Oct 2006 21:01 UTC
AMD If there was any doubt that CPU maker AMD's principal reason for acquiring graphics chip powerhouse ATI was to build a mobile computing platform that would rival Intel's Centrino, it was erased this morning with AMD's announcement of a platform project it's currently calling 'Fusion'.
Deja Vu
by linxdev on Thu 26th Oct 2006 21:15 UTC
linxdev
Member since:
2006-10-26

Is this not what we had before the GPU? A regular CPU that was the GPU. ;)

Reply Score: 3

RE: Deja Vu
by jziegler on Thu 26th Oct 2006 23:02 UTC in reply to "Deja Vu"
jziegler Member since:
2005-07-14

Fashion/history repeats itself. ;)

But I guess you always had some extra chip to do the low-level signalling and synchronization on the output port - DVI / HDMI these days, NTSC / PAL previously.

Reply Score: 1

RE: Deja Vu
by Morin on Fri 27th Oct 2006 10:37 UTC in reply to "Deja Vu"
Morin Member since:
2005-12-31

> Is this not what we had before the GPU? A regular CPU that was the GPU. ;)

--> deleted my reply because I realized I was wrong ;)

Edited 2006-10-27 10:41

Reply Score: 2

RE: Deja Vu
by Terracotta on Fri 27th Oct 2006 10:38 UTC in reply to "Deja Vu"
Terracotta Member since:
2005-08-15

No, the CPU used to render the graphics; now there's going to be a graphics processor on board the CPU, meaning specialised hardware for specialised tasks, connected to the CPU. This can be applied to more than just graphics as well.

Reply Score: 1

RE[2]: Deja Vu
by someone on Fri 27th Oct 2006 13:00 UTC in reply to "Deja Vu"
someone Member since:
2006-01-12

> Is this not what we had before the GPU? A regular CPU that was the GPU. ;)

Not quite.

Even with SIMD instructions, the CPU still can't match the GPU's power at doing the same calculations on large amounts of data. However, GPUs handle branching very poorly. The combination will allow better communication between the two (most likely in the form of coprocessors) and allow both to handle the type of processing they were optimized for.
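
(Not from the article, just my rough sketch of the split being described: a uniform per-element loop is the kind of work SIMD/GPU-style hardware eats up, while data-dependent control flow is where a conventional CPU still wins. The function names below are made up for illustration.)

/* Illustration only, not AMD's design. */
#include <stddef.h>

/* GPU/SIMD-friendly: the same multiply-add applied to every element,
 * no branches, so many elements can be processed in lockstep. */
void scale_and_offset(float *v, size_t n, float scale, float offset)
{
    for (size_t i = 0; i < n; i++)
        v[i] = v[i] * scale + offset;
}

/* CPU-friendly: control flow depends on the data itself, which breaks
 * lockstep execution but is cheap for a branch-predicting CPU. */
size_t count_until_negative(const float *v, size_t n)
{
    size_t i = 0;
    while (i < n && v[i] >= 0.0f)
        i++;
    return i;
}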

Edited 2006-10-27 13:11

Reply Score: 1

RE: Deja Vu
by phoenix on Fri 27th Oct 2006 18:57 UTC in reply to "Deja Vu"
phoenix Member since:
2005-07-11

No, this is what we had back when Cyrix released the MediaGX chip: a CPU, a GPU, a soundcard, all on one die, in one socket on the motherboard.

Only now we're going to have a modern, fast CPU with a modern, fast GPU on one die, in one socket on the motherboard: something that hopefully won't suck nearly as much as the MediaGX did when it was released.

Reply Score: 1

It will benefit everyone in the long run
by Moulinneuf on Thu 26th Oct 2006 21:40 UTC
Moulinneuf
Member since:
2005-07-06

The AMD-ATI merger will force Intel to improve its graphics solution, and Nvidia making a CPU would force ATI and Intel into some much-needed competition.

I just wonder why Nvidia doesn't go after VIA (CPU + chipset).

Let's hope they will be willing to fight on price too.

Reply Score: 3

cptnapalm Member since:
2006-08-09

Nvidia do appear to be set to make their own CPU already. There was an article somewhere, Slashdot maybe, about how they would most likely make an x86 processor, but they may have some hurdles in front of them called Intel and AMD.

Reply Score: 1

Moulinneuf Member since:
2005-07-06

"they may have some hurdles in front of them called Intel and AMD."

That's why I mentioned VIA; it's not so much the technology that will be the biggest hurdle IMO, but the patents.

Where AMD-ATI will have an edge is with the Imageon chipset and Radeon® Xpress; a mix of all of those with a CPU could bring a Centrino killer.

Imagine a chipset that does videoconferencing, picture taking, and TV broadcasting, with decent PC-level graphics and a fast, efficient, power-saving CPU, all integrated.

Can't wait to see what Fusion will be.

Reply Score: 3

The beer hunter
by Sphinx on Thu 26th Oct 2006 22:53 UTC
Sphinx
Member since:
2005-07-09

That's what it's all about, one chip.

Reply Score: 1

All-in-one chipset from AMD
by egon_spengler on Fri 27th Oct 2006 02:26 UTC
egon_spengler
Member since:
2005-11-20

I seem to recall a HORRID Cyrix chip that did this several years ago: the MediaGX. System performance dropped to its knees under any sort of video load whatsoever. Will this be any better?

Edited 2006-10-27 02:31

Reply Score: 1

RE: All-in-one chipset from AMD
by cptnapalm on Fri 27th Oct 2006 02:49 UTC in reply to "All-in-one chipset from AMD"
cptnapalm Member since:
2006-08-09

Cyrix just sucked. Not much more commentary is necessary than that.

Reply Score: 2

RE: All-in-one chipset from AMD
by MamiyaOtaru on Fri 27th Oct 2006 03:26 UTC in reply to "All-in-one chipset from AMD"
MamiyaOtaru Member since:
2005-11-11

Its descendant, AMD's Geode, is still with us today doing its thing in the embedded space.

Combining the two could be interesting for laptops, but I don't think it will catch on for performance machines. The graphics portion would presumably share system memory, instead of using GDDR3/GDDR4 like today's discrete cards.

Reply Score: 2

RE: All-in-one chipset from AMD
by flywheel on Fri 27th Oct 2006 07:17 UTC in reply to "All-in-one chipset from AMD"
flywheel Member since:
2005-12-28

Well, for some obscure reason Cyrix chose to use the 5x86 core (a downsized 6x86) for the MediaGX.

It wasn't especially powerful, but not as bad as you describe compared to its competitors, which were the cheap i486s and later the WinChip (funnily enough, VIA later bought the WinChip, and Centaur still develops it under the C5/C7 name).

National Semiconductor sold most of Cyrix to VIA, but held on to the MediaGX for the WebPAD (which Microsoft later "invented" under the Tablet name). Later the MediaGX was sold to AMD, which used it for the Geode line.
But most Geode products today use a modified K7 core.

Reply Score: 3

Run hot
by Xaero_Vincent on Fri 27th Oct 2006 04:08 UTC
Xaero_Vincent
Member since:
2006-08-18

Yeah, but having a powerful GPU and CPU integrated into one die concentrates more heat. I bet these chips will need liquid-cooling units or massive fans with radiator heat sinks.

Reply Score: 2

RE: Run hot
by Morin on Fri 27th Oct 2006 10:40 UTC in reply to "Run hot"
Morin Member since:
2005-12-31

> Yeah but having a powerful GPU and CPU integrated into
> one die centralizes more heat. I bet these chips will
> need liquid cooling units or massive fans with radiator
> heat sinks.

AMD will probably not use the fastest generation of both CPU and GPU together. My guess is that they'll build low-power versions of both and use the combo for laptops.

Reply Score: 2

Uhh
by WereCatf on Fri 27th Oct 2006 06:45 UTC
WereCatf
Member since:
2006-02-15

It seems most of the readers here misunderstood the point of the article: they are talking about taking the parallelism technology used in GPUs as an example and trying to apply it to CPUs. As far as I understood, they don't actually mean to create a CPU with an integrated GPU.

Reply Score: 1

RE: Uhh
by cyrilleberger on Fri 27th Oct 2006 07:01 UTC in reply to "Uhh"
cyrilleberger Member since:
2006-02-01

If you read carefully, it clearly states that instead of multiplying logical cores inside the CPU, they will put in a GPU unit: "In other words, AMD's future 'Fusion' platform won't just be an Athlon and a Radeon sharing the same silicon."

Reply Score: 2

This IS interesting!
by B. Janssen on Fri 27th Oct 2006 07:59 UTC
B. Janssen
Member since:
2006-10-11

If the article is right, this could be very interesting. CPUs and GPUs are both very fast in their respective fields of expertise, and any chip that includes both techniques and is able to determine at run-time which code segments to pipe into which processing unit, while maintaining code integrity, will probably benefit hugely.

There are problems, however. Heat has already been mentioned here; the potential switch to more advanced programming techniques, e.g. functional programming, could be a huge financial investment; and finally, any chip that is able to determine what to do with a given code segment probably has to look at the code before executing it, which sounds like double work. Nonetheless, it is interesting.

Reply Score: 1

cptnapalm
Member since:
2006-08-09

One thing which I am concerned about is whether or not AMD will make *nix drivers a priority. I've had so many problems with ATI graphics chips that I am very wary of building or buying a machine with one. If they don't, then I'll simply have to steer clear of AMD CPUs. And that kind of sucks.

Reply Score: 3

Not a good sign.
by mickrussom on Fri 27th Oct 2006 09:41 UTC
mickrussom
Member since:
2006-05-13

With Intel's Conroe and Woodcrest leading the spec.org pack, the quad-core QX6700 out now, and K8L still a long way off, it seems that Intel simply has to put the memory controller on-chip and AMD is more or less obsolete.

Intel had the money and the marketing engine to stay profitable and alive while AMD had the lead. The reverse will NOT be true.

Also, ATI's video drivers are bad, so I don't use their products either. Seems that AMD and ATI are destined to screw up.

AMD: focus on beating Conroe. The E8000 is a 1333 MHz FSB, 3.33 GHz dual-core due out soon. Beat it. Or die.

Reply Score: 3

MMX ultra?
by miro on Fri 27th Oct 2006 11:33 UTC
miro
Member since:
2005-07-13

I mean, a long time ago (the Pentium MMX?) CPUs got SIMD instructions. And they were designed for graphics operations (like add + clamp, e.g. 200 + 200 = 255).

Having 8 cores with only 2 threads running won't be any faster than having only 2 cores. BUT if somebody can figure out how to effectively determine that some code is a loop that can be SIMD-ed, having 8 cores would be a lot faster.

However, I think that only the compiler is able to "effectively" determine what can be SIMD-ed, not the CPU.
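
(Just to illustrate the add + clamp example above: a minimal sketch of my own, not anything from the article, using the SSE2 saturating-add intrinsic, which does exactly that clamping on 16 bytes at once.)

/* 200 + 200 saturates to 255 instead of wrapping around to 144. */
#include <emmintrin.h>
#include <stdio.h>

int main(void)
{
    unsigned char a[16], b[16], out[16];
    for (int i = 0; i < 16; i++) { a[i] = 200; b[i] = 200; }

    __m128i va = _mm_loadu_si128((const __m128i *)a);
    __m128i vb = _mm_loadu_si128((const __m128i *)b);
    /* _mm_adds_epu8: add 16 unsigned bytes at once, clamping at 255. */
    __m128i vsum = _mm_adds_epu8(va, vb);
    _mm_storeu_si128((__m128i *)out, vsum);

    printf("%d\n", out[0]); /* prints 255, not the wrapped result 144 */
    return 0;
}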

Reply Score: 2

Why Nvidia wouldn't buy VIA
by poundsmack on Fri 27th Oct 2006 16:01 UTC
poundsmack
Member since:
2005-07-13

OK, VIA and Nvidia have totally different goals in mind. VIA is for embedded stuff or small, low-power, low-heat computers, not powerhouse gaming machines. I know, one of my little toys runs off a C7; it's nice, but without customizing Windows down to minimal RAM usage it would be no good for real games. Also, VIA has everything it needs to be a complete platform: its own network stuff, sound, and video (S3), among its other components.

Reply Score: 1