Linked by Thom Holwerda on Mon 6th Nov 2017 15:31 UTC
Intel

Well, this is the kind of news you don't hear every day: Intel and AMD are teaming up to develop a processor that combines an Intel CPU with an AMD GPU. From Intel's press release:

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD's Radeon Technologies Group - all in a single processor package.

It’s a prime example of hardware and software innovations intersecting to create something amazing that fills a unique market gap. Helping to deliver on our vision for this new class of product, we worked with the team at AMD’s Radeon Technologies Group. In close collaboration, we designed a new semi-custom graphics chip, which means this is also a great example of how we can compete and work together, ultimately delivering innovation that is good for consumers.

This is the first partnership between these two sworn rivals in several decades, and that alone makes it quite notable. I didn't really know whether to put this in the Intel or AMD category, but I chose Intel because it appears above AMD in our list (which isn't alphabetical because reasons).

RE[6]: Not April Fool's?
by tylerdurden on Wed 8th Nov 2017 21:43 UTC in reply to "RE[5]: Not April Fool's?"

Nvidia had a nice business going with the nForce chipsets for Intel during the Core days. They were basically the best chipsets for SLI (obviously), although they had some reliability issues.

Yeah, I think Intel got very protective of x86 by the mid-90s. I was a kid back then, but I remember how they basically introduced the Pentium name so they could trademark the product line after the 486, since plain numbers couldn't be trademarked. And at some point they also ended up kicking any third parties off their sockets.

I don't know if ISAs were commonly protected by copyright back in the 70s/80s; that was probably the reason for the number of x86 cloners back then?


RE[7]: Not April Fool's?
by zima on Thu 9th Nov 2017 23:35 UTC in reply to "RE[6]: Not April Fool's?"

Hm, for some reason I forgot about nForce for Core / SLI. Perhaps because ultimately it was a fairly niche product (so it probably didn't hurt Nvidia that much when Intel blocked it) - essentially, a halo product.

And I thought we already established that there were plenty of x86 cloners also, or perhaps especially, in the 90s (with independent designs, some even more advanced than Intel's; IIRC NexGen was the first to translate x86 instructions into internal RISC-like operations)... ;)


RE[8]: Not April Fool's?
by tylerdurden on Fri 10th Nov 2017 03:17 UTC in reply to "RE[7]: Not April Fool's?"

Actually, losing the chipset business hurt Nvidia plenty. Part of the settlement Nvidia got from Intel was used to cover those losses.

The translation into micro-instructions is really more of an out-of-order thing; Intel did it too, starting with the Pentium Pro in the mid-90s.
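As a rough sketch (mine, not from the thread; all names are invented), here is how a read-modify-write x86 instruction like add [rbx], rax might be cracked into three RISC-like micro-ops, in the spirit of NexGen's Nx586 or the Pentium Pro:

#include <stdio.h>

/* Hypothetical sketch: cracking one CISC read-modify-write
 * instruction into RISC-like micro-ops. All names invented. */

typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;

typedef struct {
    uop_kind kind;
    const char *dst, *src; /* register, temporary, or memory operand */
} uop;

/* Decode "add [rbx], rax" into three simple micro-ops that an
 * out-of-order scheduler can track and issue independently. */
static int decode_add_mem_reg(uop out[3]) {
    out[0] = (uop){ UOP_LOAD,  "tmp0",  "[rbx]" }; /* tmp0 = mem[rbx] */
    out[1] = (uop){ UOP_ADD,   "tmp0",  "rax"   }; /* tmp0 += rax     */
    out[2] = (uop){ UOP_STORE, "[rbx]", "tmp0"  }; /* mem[rbx] = tmp0 */
    return 3;
}

int main(void) {
    static const char *names[] = { "load", "add", "store" };
    uop uops[3];
    int n = decode_add_mem_reg(uops);
    for (int i = 0; i < n; i++)
        printf("uop %d: %-5s %s, %s\n", i, names[uops[i].kind],
               uops[i].dst, uops[i].src);
    return 0;
}

The point is just that the memory access and the ALU work become separate, uniform operations, which is what makes aggressive out-of-order scheduling tractable.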
