A post by Torvalds on the linux-kernel mailing list suggests that Intel should be more than a little ashamed of itself when it announced its 64-bit extensions last week. Torvalds was replying to a post which asked whether there was any difference between X86_64 and X86-64. He said the real name for the instruction set should be X86-64, and always has been. Torvalds said he was “a bit disgusted” at Intel for not even mentioning AMD in its documentation or its press releases. Read more at the Inquirer.
http://www.theinquirer.net/?article=14301
Can someone give me more detailed info on this, please? To be more precise, I need details about “Intel’s announcement of its 64-bit extensions last week.” Somehow I missed it.
Also, I need more info on Intel’s IA32 (or was it IA64?) platform. I am sort of confused. What did Intel announce last week, and was it server-based or consumer-based? So Intel based its 64-bit instruction set on AMD’s?
http://www.osnews.com/story.php?news_id=6052
This is Intel architecture and instructions with 64-bit registers, that’s all. It has the same instructions and operation codes. It is 100% Intel.
Intel should have given credit to AMD and should have mentioned AMD’s name because they essentially borrowed the idea from AMD.
If now they try to act as if it’s their innovation, then it’s plainly cheap.
No matter how many years AMD spent trying to catch up with Intel, today it has proved that it has the technical brains to make Intel follow it, too.
Nadav, even if they’re Intel’s opcodes, the design is AMD’s. Intel was pushing Itanium, which has proved to be a failure, at least for the time being, so all credit goes to AMD for taking the lead and providing customers a cheaper and faster solution.
That any post by Linus is taken out of a mailing list and reported on the web. Linus wants to be just another engineer, which is why he handed off maintainership to Andrew Morton. Leave the man alone; let him work in peace. He has better things to do than sit there and get mauled with inquiries and responses to his online posts. By the way, I do agree with him in this case. Intel tho is a company that screws the Linux community over every chance they get.
> Intel tho is a company that screws the Linux community over every chance they get.
Are you referring to the WLAN (radio) chips inside “Centrino” laptops?
> get mauled with inquiries and responses to his online posts.
If you don’t want that, think before you post.
“Intel tho is a company that screws the Linux community over every chance they get.”
I seem to remember Intel financially supporting some open source projects.
I never really see AMD doing anything for Linux. They seem to get the support of the overlapping Linux/AMD audience based on their (AMD’s) technology alone.
1) The Intel x86_64 instruction set is the same as the AMD x86-64 set.
2) Intel’s design is 100% AMD compatible.
3) AMD has always advertised themselves as being Intel compatible and gave credit to Intel for the instruction set.
4) Intel is now AMD compatible, and they don’t want to acknowledge that they are now cloning the AMD chip.
AMD generates almost the same amount of zealotry as Linux. But why?? Because it’s cheap and zealots can afford AMD, but not Intel? This is pathetic.
Wow! Grade-two logic: “if you like _____, then you’re ____.” In this case, we see this synaptically challenged poster saying exactly this – to paraphrase: “If you like AMD, then you are a zealot (meant in a derogatory context)” and “If you don’t like intel, then you are pathetic.”
From a business perspective, it makes no sense to purchase a product that performs less and costs more, unless the manufacturer is more stable, offers a more comprehensive support package, or offers kickbacks. Neither stability nor support are different in dealing with AMD and intel.
From a technical standpoint, AMD is the clear winner (a 2.4GHz AMD Athlon 64 CPU vs a 3.2GHz intel Pentium 4 CPU): http://www.tomshardware.com/cpu/20040106/athlon64_3400-05.html
What other perspectives are left? Perhaps just slicker advertising campaigns.
Sorry for being a bit off topic but
> “I never really see AMD doing anything for Linux. They seem to get the support of the overlapping Linux/AMD audience based on their (AMD’s) technology alone.”
Check out http://www.x86-64.org/ – “This site is dedicated to porting open source software to the AMD64 architecture, including: GNU/Linux, FreeBSD and NetBSD.”
Set up by AMD, SuSE and others.
LOL Bagdadbob, from your post:
“From a technical standpoint, AMD is the clear winner (a 2.4GHz AMD Athlon 64 CPU vs a 3.2GHz intel Pentium 4 CPU): http://www.tomshardware.com/cpu/20040106/athlon64_3400-05.html ”
Ok, I read the review, and here is the actual conclusion (you conveniently only linked to the results for first-person shooters):
“That leaves us with a clear description of the Athlon64 3400+: it’s a top quality CPU that’s especially suitable for games and that also lives up to its model name – albeit only in this category. At the end of the day, it still lags slightly behind the Pentium 4, a deficit that the 64-bit architecture could compensate for in the medium term, however.”
Hardly a “clear winner”. Now, I’m a Mac and Unix user, so I’m not a fanboi either way on this issue, but I believe it’s precisely this type of selective memory and promotion that separates the “zealots” from the enthusiasts.
Tom’s Hardware is an Intel-oriented site.
Intel’s original platform began (AFAIK) with the 8086/8088 processors. The 80286, 80386, and 80486 processors continued to build on this platform. Eventually, the 80x86 line was rebranded Pentium by Intel as a means of differentiating between the various clones which were made by AMD and Cyrix. Sometime thereafter followed the Pentium II and III chips. AMD began its own rebranding with its K5 (Pentium-level performance), K6 (Pentium II-level performance), and K7 (Pentium III-level performance) chips, the last of which was called the Athlon.
As these microprocessors evolved, they adopted new instructions, but they also remained compatible with earlier chips’ instructions. Thus, it was traditional for developers to call this platform the i386 platform since the 80386 was the earliest chip in this line to implement functions required for modern software. Somewhere in the chip wars, Intel decided to rebrand the i386 architecture to IA32, which means Intel Architecture 32-bit even though both Intel and its competitors were implementing it. (Gotta love that marketing department.)
Around the time when the Pentium IV and Athlon XP chips were being introduced, Intel and AMD were working on 64-bit chip designs. (It’s common to start designing a new chip years before it comes to market because of all the tedious work involved.) While AMD’s Athlon 64 design implemented 64-bit instructions as an extension to the existing instruction set, Intel’s Itanium adopted both a radically different design philosophy (VLIW?) and a completely incompatible instruction set. Just to make things more confusing, Intel rebranded the Itanium architecture IA64 even though it had no direct evolution from or compatibility with IA32.
Both Intel and AMD secured separate deals with Microsoft for 64-bit native versions of Windows for their chips, but Intel’s design fizzled for a variety of reasons, the most important being that Itanium had no compatibility with IA32 software; programs had to be either ported or run at greatly decreased performance via emulation. Not only do emulated programs run with a substantially lower number of instructions per clock cycle, but Itanium chips also run at a substantially lower clock than their IA32 counterparts, which meant that a shiny new $900 Itanium would run IA32 software about as well as a 5-year-old chip that you could pick up on eBay for $20-$30. Ugh.
One of the reasons that Windows is so popular is the large variety of software that runs under it. Windows with Itanium would have been a complete and utter disaster in the desktop market. (To be fair, Itanium does better in the server and workstation markets.) AMD’s Athlon 64 offered a much more practical solution: it runs existing software with no penalty – in fact, it actually runs them faster than the 32-bit-only Athlon XP’s due to other improvements – but it also requires no work on the part of third-party application developers when the chip is eventually used under a 64-bit OS. The only things that need to be done to port Windows XP to AMD64 are to rewrite some assembly in the OS, make the high-level code 64-bit clean, and then rewrite the drivers.
(The drivers are about the only thing that needs to be done yet, and you should see Windows XP 64-bit for AMD64 on store shelves at the end of this year.)
Intel did exactly what I thought they would do; they simply waited while AMD worked on its 64-bit extensions and then jumped in at the last moment when it was clear that AMD had something useful. What Linus is complaining about is that Intel rebranded AMD’s technology as IA32e and then left AMD’s name completely out of the docs. That would be like distributing the Linux kernel as your own work with no mention of the original authors.
From the consumer’s point of view, this is even more frustrating. Had AMD not developed their 64-bit extensions, we would have been stuck with either a terribly inconvenient move to Itanium now or a somewhat less inconvenient move some time in the fairly distant future. And I seriously doubt that Intel would have been motivated to increase the performance of its 32-bit line so drastically in the past few years if AMD had not created the Athlon. It’s silly for Intel to treat AMD as just a little parasitic clone maker when their chips are such great competition.
The man expresses himself…
I don’t think it’s weird that Intel likes complicated designs and such, like the Itanium (or the x86, for that matter!). Have a look at this interesting link, which describes Intel’s work on an object-oriented CPU that they wanted to be their flagship product:
http://www3.sk.sympatico.ca/jbayko/cpu7.html
Crazy, I tell ya!
Personally I am interested in the potential of MISC/stack computers, they are nice and simple compared to RISC and CISC register machines (though the stack is implemented in registers for speed, like a circular queue). Tiny codesize too because operands are implicit.
http://www.ultratechnology.com/
There’s an interesting read in the Stack Computers: The New Wave book about a bloody fast Lisp implementation where the lists were executable data structures – how cool is that? There’s also a hint that they might be excellent for running functional languages in general; this would be a good thing to research further.
I think it’s pretty funny what AMD are doing, Intel are probably screaming to get away from x86… haha…
</randomness>
“This is Intel architecture and instructions with 64-bit registers, that’s all. It has the same instructions and operation codes. It is 100% Intel.”
Absolutely wrong. It has 16 registers instead of 8, which must be accounted for when compiling. As for the instruction set, I do believe AMD has added some things in with the Athlon that still appear in AMD-64.
LOL LOL “anonymous”. My point wasn’t that Intel even approaches AMD as far as price and performance. My point was, as you so succinctly put it, that the original poster “backed up the truth with a crap link”. That is a prime example of zealotry in a post protesting the very opposite: choosing only the evidence that supports your position, regardless of whether that piece of evidence actually makes your case.
I don’t know which chip is better for the average user, and as I don’t own either I don’t really care (by the way, I have an MS in computer science from Stanford and work in an AI lab), but I was pointing out that this guy, who was hell-bent on proving his point, didn’t even care what the evidence he posted actually said; he just chose to put his spin on it.
“still doesn’t invalidate the truth”….
I’ll take your word for it anonymous, but it doesn’t change the truth that the guy who whined “If you like AMD, then you are a zealot” pulled whatever he could out of his ass for the sake of supporting AMD. And I’m not sure what that has to do with how much I know about computers.
If Intel failed to make its Itanium competitive, how is this suddenly AMD’s fault? By your logic, everyone should have jumped to BeOS while it was still incomplete because it had the potential to be much better than Windows. The world does not have that patience. Besides, there is a chip that is as good as, if not much better designed than, the Itanium. It’s called the POWER4. It’s made by IBM, and its derivative is used in Apple’s G5 workstations.
That people should think AMD’s acknowledgment of Intel compatibility was not a marketing ploy. At the time, Intel had a complete stranglehold on the x86 market; being compatible with the Intel instruction set was a HUGE reason for people to consider using AMD processors. AMD advertising themselves as Intel compatible was done solely in their own self-interest.
No such reason exists for Intel to use AMD compatibility as a marketing tool, therefore it hasn’t been done. I don’t understand why this should come as a shock.
***
Off topic, but I think it’s kinda wrong to extrapolate Torvalds’ position from a single, possibly off-the-cuff, remark on the kernel mailing list. And in any case, just how does Torvalds’ opinion actually matter in this instance? Would we give credence to RMS’ views if he started blathering on about gun control? I fail to see how commentary on this issue from a party not representing either AMD or Intel is newsworthy. If it was John Doe and not Torvalds, this article wouldn’t be here.
This is so true. My comp-architecture instructor just told the class the other day that Intel was making an innovation breakthrough by introducing the 64-bit instruction set. I was floored. He never mentioned AMD once!
Intel positions Xeon vs. Opteron… but I, like other REASONABLE people, know that there isn’t really ANY competition from Xeon… just Intel’s clout in the marketplace.
Imagine this… AMD on top and Intel trailing behind…
AMD uses insurmountable funds to STIFLE its younger counterpart… even flat out takes credit for an instruction set that “Intel” MADE UP… (btw this is JUST a SKIT)
Now, do you think that Intel would ONLY use clock speed if they were in the position that I have “IMAGINED” for them?
hell-no!
But because they ARE the one with ALL the money… that makes them confident and monopolistic (MS?). Some companies WON’T EVEN TOUCH AMD’s superior products… sure, Intel HAD a GREAT CPU, but they just killed the P3 for clock speed.
AMD would rather have performance.
Anyone who has info on why anyone else should even consider Prescott or others (Pentium M is OK…), speak.
And maybe I’m just the stupidest person I know?
If it was John Doe and not Torvalds this article wouldn’t be here.
Is this meant to be surprising? He’s worked for a CPU manufacturer and he started one of the most popular operating systems in the world, which runs on the architectures that are being debated. I can see how you think his opinion isn’t relevant to this, oh yes.
That Intel didn’t give credit where credit was due.
This is just wrong!
It doesn’t matter if it was a “no brainer” or not; you should ALWAYS give credit where credit is due!
You should NEVER claim someone else’s work as your own!
Like I said, it doesn’t matter how small the work actually was: Intel implemented AMD64 extensions and called them their own, and that’s wrong.
Happens all the time, doesn’t it? A smaller company catches a bigger company with its pants down and gains market share, and the big company comes out with the same thing a year or two later. Of course they claim everything is new and improved and not-at-all like their competitors’.

Not as blatant as this, though. They changed x86-64 to x86_64? I would like to have been at the meeting when they thought that up.
Not a big deal though, but when I go 64 bit, I’ll get AMD.
Looks like there’s a new leader in the CPU industry.
…why Linus would even expect Intel to give any positive ink to AMD. What he calls petty is simple business practice. When threatened by another company’s innovation, corporations introduce their own response, and it would be stupid to praise or credit a competitor. I’ve yet to see a press release that says:
“We have this new, super cool thing, but wanted to make sure we point out that we are only doing this because that other company did it first, so as not to appear petty. If you want to buy from the folks who came up with a similar idea first, make sure you don’t give us your business.”
Yes, it’s crazy to think that anyone might have some morals in this day and age.
I don’t think he was expecting the press release to mention AMD at the top, but just somewhere would’ve been at least decent of them.
ok.
Your second post has clarified things. For a moment, I thought that you were a zealot calling out a zealot.
The teapot calling the kettle black.
But from your post I can tell that you are a rational person.
I hereby retract my biting sarcasm and insults 😉
I think we see eye to eye on the Intel/AMD thing.
No doubt the competition has heated up (again). For instance, I know that the P4 out-renders AMD in 3ds Max, so to my friends who are animators using Max, I recommend the P4.
I also have friends who are gamers. I recommend they go with AMD64.
Then I have friends who don’t like messing with computers but want to author DVDs and movies.
Of course, I tell them to go Mac.
Looks like there’s a new leader in the CPU industry.
Leader? To my knowledge, AMD’s market share is still below 10%. How can you be a leader when most people are not even bothering to see what you offer?
AMD have won a battle… but they haven’t won the war. They still have a lot of work to do before claiming the PC market crown.
Personally, when I’ll go 64-bit, I won’t choose an architecture from the 70s, but hey, that’s just me. I like AMD (I only have AMD-based computer here) but I don’t think they’re developing the best innovation since sliced bread.
Leader? To my knowledge, AMD’s market share is still below 10%
I think he meant that AMD is now the technology leader when it comes to consumer CPUs. And actually, this is the first time that Intel has adopted AMD technology (and not the other way round).
“Is this meant to be surprising? He’s worked for a CPU manufacturer and he started one of the most popular operating systems in the world, which runs on the architectures that are being debated. I can see how you think his opinion isn’t relevant to this, oh yes.”
Hmm, well, since Transmeta chose to give no credit to Intel for being the originator of the x86 instruction set (check their website and technical docs if you don’t believe me), his position is actually surprising.
Either way, I still see no reason why Torvalds’ opinion on whether or not Intel should credit AMD in their documentation is any more relevant than that of a bloke you met in the pub.
I don’t think it’s the first time but it still doesn’t make them the technology leader to me. For example, I doubt developers would jump on 3DNow v3 even if AMD were pushing it in their chips. However, SSE3 found in the Prescott will probably get widely adopted… That is a sign of true leadership.
To me, Intel’s move to x86-64 is not because of its superiority, but rather because of its backward compatibility. Okay, the Itanium is far from being the best CPU out there, but the technology was better than x86. The implementation wasn’t great, though.
Anyway, to be back on topic, I’m not sure if Intel should have credited AMD. I don’t remember the opposite for SSE or the x86 instruction set, but I must admit that I don’t read their documentation every day.
Wrawrat: Marketing/backward compatibility; yup. Intel is probably not happy about the money they invested in the Itanium, only to have to use what is essentially x86-64. I’m still searching for the Cringely column where he describes this as only a temporary setback/solution for Intel, and in what manner. I think it involved adapting code morphing for the Itanium, or rather adapting the Itanium for it.
I used to love Intel. But of late I’ve been disgusted by some of their actions too. I wish AMD would contribute a lot to GCC, just for the sake of killing ICC once and for all.
If they do that, I can bet you all Linux users will be purchasing AMD chips exclusively. I check the GCC mailing list frequently, and sadly, I hardly see any contributions from AMD. Or, maybe they do contribute, but they use their personal emails.
After all, Linux always happens to be the first operating system that runs on their new processors. Windows gets supported two years later, as usual.
From a business perspective, it makes no sense to purchase a product that performs less and costs more, unless the manufacturer is more stable, offers a more comprehensive support package, or offers kickbacks. Neither stability nor support are different in dealing with AMD and intel.
AMD based platforms have historically – but particularly since the introduction of the K7/Athlon family – had motherboard compatibility and reliability problems. Most of these problems are not the fault of AMD per se, but caused by poor quality motherboard chipsets (mostly from VIA) and the inevitable QA corner-cutting that comes from targeting a more budget-oriented marketplace. Note that while similar problems exist on intel based platforms using these same chipsets, at least buyers in that market had the alternative of competitively priced and vastly superior products using intel manufactured chipsets (eg: the BX chipset, which was unarguably the best chipset to have on a motherboard in its day and _still_ being used in new products 6 – 12 months after being EOLed).
This is probably the biggest single reason AMD have made few inroads outside of the consumer market, where reliability takes third place to – firstly – cost, and secondly, performance.
Personally I have an Athlon in my gaming machine, because when the weird little problems start popping up (that always seem to be rectified by the latest set of chipset drivers (and their accompanying hacks and bug workarounds) they only impact on the time I can spend playing games. However, I wouldn’t want those same types of issues popping up in my servers. Even my home server is only ever based around intel CPUs and chipsets because I’ve simply had *way* too many problems trying to get non-intel based motherboards to work reliably with maxed out memory and/or PCI slots.
Hopefully, the newer Opteron and co. CPUs, which integrate some chipset functionality onto the CPU itself, will help to rectify the reliability problems that plague the AMD based platform. We shall see.
However, to say “from a business perspective, it makes no sense to purchase a product that performs less and costs more” while completely ignoring some of the most important aspects of a platform to a business – stability, reliability and compatibility – indicates either a great deal of naivety or a lack of experience in dealing with non-trivial environments.
Platform stability is king. Potential for lost revenue due to downtime and/or troubleshooting is orders of magnitude greater than potential lost revenue due to marginally lower performance or marginally higher (initial) hardware costs.
I think what those who do not believe x86_64 is an embarrassment to Intel missed was several public statements in which Intel insisted that 64-bit processors should be relegated to the server realm (i.e. Itanium) and were not needed by ordinary desktop users. Intel repeatedly insisted upon this, and claimed they would not develop a 64-bit desktop processor. Fast forward to 2004, and they do a complete 180 and announce support for a 64-bit processor for desktop systems. By taking such a strong stance then completely reversing their opinion, they demonstrate uncertainty about their processor roadmap and consequently the future of the x86 platform as a whole.
I believe Intel finally decided that offering an x86_64 processor would not hurt Itanium sales, although Opteron is certain to damage the sales of low-end IA64 servers and is likely to make IA64 workstations completely unmarketable.
Has AMD ever credited Intel with anything? Not trying to be flamebait; I honestly don’t know…
I used to love Intel. But of late I’ve been disgusted by some of their actions too. I wish AMD would contribute a lot to GCC, just for the sake of killing ICC once and for all.
Care to say why? ICC seems nice to me. I don’t see why everybody should blindly support GCC just because it’s open-source.
If they do that, I can bet you all Linux users will be purchasing AMD chips exclusively. I check the GCC mailing list frequently, and sadly, I hardly see any contributions from AMD. Or, maybe they do contribute, but they use their personal emails.
Again, care to say why? I don’t see why people would suddenly rush to AMD. I use AMD products, but not because they’re the underdog or because I don’t want to support an “evil” corporation. AMD would be as “evil” as Intel if they were in the opposite situation, anyway.
After all, Linux always happens to be the first operating system that runs on their new processors. Windows gets supported two years later, as usual.
All the previous Athlon processors have been supported by MS Windows since they came out… It’s true that we’re still waiting for an AMD64 version of Windows and that Linux already supports it, but I heard the support isn’t top notch. Many applications are unstable on it…
[i]I think what those who do not believe x86_64 is an embarrassment to Intel missed was several public statements in which Intel insisted that 64-bit processors should be relegated to the server realm (i.e. Itanium) and were not needed by ordinary desktop users. Intel repeatedly insisted upon this, and claimed they would not develop a 64-bit desktop processor. Fast forward to 2004, they do a complete 180 and announce support for a 64-bit processor for desktop systems. By taking such a strong stance then completely reversing their opinion, they demonstrate uncertainty about their processor roadmap and consequently the future of the x86 platform as a whole.[/i]
Uh, saying they’ve done a complete 180 is a bit extreme. It’s pretty clear they’ve been forced into this position by the market and are not doing so voluntarily. Also, it’s not like intel have got the men in bunny suits doing the hard sell on their version of x86-64, it’s something they’re supporting because the customers are (supposedly) asking for it.
Personally I’m inclined to agree with intel – the advantages of x86-64 over regular ia32 to the average punter are minuscule, where they even exist at all. Intel’s position does not seem to have changed enough from this philosophy to justify the term “complete 180” (if it had, they’d be targeting Itanium at the desktop).
Prescott and Nocona are almost the same; the only differences are in the cache size and the bus speed. Nocona is for servers and Prescott for desktops, as far as I know. What is the IA32 platform? Is this for servers too?
The real issue was not about saying nice things about your competitors in your press releases just for the heck of it; the issue was about taking someone else’s hard work and claiming it as your own, after fudging some of the terms (x86-64 vs. x86_64). That’s childish and unethical.
What, you don’t think so? How about I take one of your posts, change a word here and there, and then repost it to message boards all over the Internet under my screen name without mentioning you? Wouldn’t like that, eh? Gee, I wonder why…
It was fairly obvious that Intel would jump on board with AMD at some point, but I must admit that I expected Intel to react with more maturity than their fanboys; it’s too bad I was wrong. 😛
> Wrawrat: Anyway, to be back on topic, I’m not sure if Intel should had credited AMD. I don’t remember the opposite for SSE or the x86 instruction set but I must admit that I don’t read their documentation every day.
IA32e means “Intel Architecture 32-bit extended to 64-bit”. Intel’s rebranding effort is rather surprising given that it comes off their failure to develop a viable 64-bit instruction set for the desktop. I don’t see AMD rebranding SSE as ASSE, though of course there might be more than one reason for that. 😉
> Alex (The Original): Prescott and Nocona are almost the same; the only differences are in the cache size and the bus speed. Nocona is for servers and Prescott for desktops, as far as I know. What is the IA32 platform? Is this for servers too?
The IA32 platform is the same platform that modern x86-based computers use, and the software is the same. Workstations and low-end servers use IA32 in addition to a ton of desktop PCs.
There was a lawsuit between AMD and Intel a few years ago over the Intel architecture. As a result, AMD got a stack of cash and Intel got the right to use some of AMD’s tech. AFAIK, this includes the AMD64 architecture.
On the other hand, Intel doesn’t have the right to use any of AMD’s trademarks, including “AMD”, “AMD64” and “X86-64”. In order to even mention AMD or their work in their documents and press releases, they would have had to get permission from AMD, right?
This may just be an example of silly business laws. Not that I wouldn’t believe malice or stupidity…
Uh, saying they’ve done a complete 180 is a bit extreme. It’s pretty clear they’ve been forced into this position by the market and are not doing so voluntarily.
In other words, Intel attempted to dictate what the market wanted, and the market disagreed.
Personally I’m inclined to agree with intel – the advantages of x86-64 over regular ia32 to the average punter are minuscule
Not really, considering existing IA32 applications see a number of speedups when executed on top of an x86-64 kernel, especially in terms of VFS operations. Virtually every operating system now features a 64-bit VFS, and most implement fully 64-bit filesystems underneath. There’s also the performance increase accrued through the presence of additional general purpose registers, which speeds overall kernel operations. Furthermore, there are a host of other benefits only available with a 64-bit VMM, such as drastically simpler prebinding implementations and memory mapped I/O support for large files (which was typically limited to 2GB on IA32).
x86-64 solves a number of fundamental constraints in the IA32 architecture… it’s not simply the addition of support for 64-bit integer/address operations, which are arguably only useful in a handful of cases.
Intel’s position does not seem to have changed enough from this philosophy to justify the term “complete 180” (if it had, they’d be targeting Itanium at the desktop).
You can nitpick semantically if you want, but the point is Intel flip-flopped on their position.
All they had to say was that the new architecture was/will be compatible with the current AMD chips.
The biggest reason for this is mostly to assure everyone that there isn’t going to be yet another technology split over 64 bit extensions.
They didn’t have to cede credit to AMD or admit they were playing catch-up. There is a difference between spin and credibility.
That answers my question. I appreciate your reply.
I don’t think it’s the first time but it still doesn’t make them the technology leader to me. For example, I doubt developers would jump on 3DNow v3 even if AMD were pushing it in their chips. However, SSE3 found in the Prescott will probably get widely adopted… That is a sign of true leadership.
AMD has specified the standard for the next generation of the x86 architecture, and Intel is being forced to adopt it. Yet the point you broach is extensions? AMD has garnered considerable developer support for x86-64/AMD64, from such big name players as Microsoft and Oracle.
…the Itanium is far from being the best CPU out there, but the technology was better than x86. The implementation wasn’t great, though.
The Itanium addresses an entirely different problem domain than x86 processors. There is a small amount of overlap between Opteron and Itanium in the low-end server/high performance workstation space, but neither of these are the Itanium’s primary market. x86-64 is the next step in the evolution of the x86 architecture, and for the first time it’s been defined by AMD instead of Intel. AMD is now not only in control of x86’s future, but they’re once again the performance leader in the x86 realm.
In other words, Intel attempted to dictate what the market wanted, and the market disagreed.
No more than anyone else who tries to introduce a product “dictates to the market”.
Not really, considering existing IA32 applications see a number of speedups when executed on top of an x86-64 kernel […]
And how many of these advantages are going to be apparent without requiring new applications and/or operating systems? How many of them are going to be noticeable to the typical web-browsing and emailing user? What sort of performance increase is going to occur?
You can nitpick semantically if you want, but the point is Intel flip-flopped on their position.
You and I obviously have vastly different definitions of “flip flopped”. Microsoft GPLing Windows would be a “flip flop”. The Left suddenly advocating capital punishment would be a “flip flop”. Intel introducing an x86-64 compatible CPU and clearly *not* targeting it at the home (or even typical business) market is not even close to being a “flip flop” of their previous position.
http://www.intel.com/technology/64bitextensions/faq.htm
Q9: Is it possible to write software that will run on Intel’s processors with 64-bit extension technology, and AMD’s 64-bit capable processors?
A9: With both companies designing entirely different architectures, the question is whether the operating system and software ported to each processor will run on the other processor, and the answer is yes in most cases. However, Intel processors support additional features, like the SSE3 instructions and Hyper-Threading Technology, which are not supported on non-Intel platforms. As such, we believe developers will achieve maximum performance and stability by designing specifically for Intel architectures and by taking advantage of Intel’s breadth of software tools and enabling services.
Entirely different architectures? I like how Intel’s chips have extra “additional features”, as though none of their competitors have similar things.
And how many of these advantages are going to be apparent without requiring new applications and/or operating systems? How many of them are going to be noticeable to the typical web-browsing and emailing user? What sort of performance increase is going to occur?
Most of the things that Bascule mentioned will actually be used automatically by modern software without a recompile.
As for real performance increases, I can’t give you any numbers, but extending the architecture to 64 bits makes more performance enhancements possible at this stage than simply throwing more transistors at the 32-bit core.
Do you honestly think that AMD (and now Intel) would waste the extra transistors enabled by the new CPU processes? If they could get more speed by staying 32-bit they would, but they know that in the future it’s possible to get more speed by extending the architecture this way instead of keeping it all 32-bit.
For the people who only browse the web and email, they don’t need a new computer; any one made after 1998 is more than good enough. (You could use an even older computer if you really wanted to.)
Intel introducing an x86-64 compatible CPU and clearly *not* targeting it at the home (or even typical business) market is not even close to being a “flip flop” of their previous position.
The first step for Intel is always the server space. They have operated that way for years. Remember the Pentium Pro? That was for servers only, but the Pentium II, which is for “consumers”, is a Pentium Pro with MMX. It’s a total flip flop because their previous position was that they were never going to extend x86 to 64 bits and that the future of 64-bit was with Itanium.
They were trying to force people to move to Itanium, but now they are retreating on that position.
What surprises and saddens me is that so many people tout the line: “well, this isn’t wrong/immoral/unethical/petty, it’s just the way business/marketing works”.
This is probably the same line of reasoning as the CEOs at Enron and Worldcom. I don’t see why people think that businesses are somehow held to a different set of moral and ethical standards. If a student can get kicked out of a university for plagiarizing, then why isn’t it morally unacceptable for a company not to give credit where credit is due? More to the point, why do so many consumers not care?
Do consumers turn a blind eye as long as they get the best deal/performance/value for their money? Me personally, I buy AMD precisely because their ethical business standards are higher than Intel’s (though they aren’t perfect either I have to admit). So even if AMD processors cost more or perform worse, I will buy them. Why? Because I happen to believe in principles and ethics, and I wish to reward the company that also believes in them.
It’s funny how many IT people complain that they lost jobs because their company started outsourcing to foreign companies. And yet, when it comes down to it, how many of those same people are going to pay $10 for an American product when they can get the same or nearly the same quality from Mexico for $6? They fail to see the irony and hypocrisy of their own actions. Moral of the story? The bottom line is about more than “business as usual” or price-to-performance value.
Most of the things that Bascule mentioned will actually be used automatically by modern software without a recompile.
As for real performance increases, I can’t give you any numbers, but extending the architecture to 64 bits makes more performance enhancements possible at this stage than simply throwing more transistors at the 32-bit core.
However, the whole point here is that a) more performance for most people is hardly a necessity and b) if the performance gains are anything less than around 20% they’re probably going to be imperceptible anyway.
Do you honestly think that AMD (and now Intel) would waste the extra transistors enabled by the new CPU processes? If they could get more speed by staying 32-bit they would, but they know that in the future it’s possible to get more speed by extending the architecture this way instead of keeping it all 32-bit.
I made no comments about this topic. These are not the straw men you are looking for.
The first step for Intel is always the server space. They have operated that way for years. Remember the Pentium Pro? That was for servers only, but the Pentium II, which is for “consumers”, is a Pentium Pro with MMX. It’s a total flip flop because their previous position was that they were never going to extend x86 to 64 bits and that the future of 64-bit was with Itanium.
Like I said, you and I obviously have vastly different definitions of “flip flop”. My understanding of Intel’s position has always been that 64-bit computing was for high-end servers and workstations, not desktops and regular servers, and, similarly, that the Itanium and the P4/Xeon/Celeron targeted these two different market segments. It seems to me that expanding the 64-bit plan to encompass mid-range servers and workstations by adding it to some parts of their x86 line is a deviation in their philosophy – hardly a “complete 180”.
If they’d announced they were immediately replacing all shipping IA-32 products and moving everything from the Celeron up to x86-64, *that* would qualify as a “complete 180”.
I suspect you’ll also find Intel still considers the future of 64-bit to be Itanium, and considers this little more than an ultimately dead-end, legacy-supporting interim step (like Windows 9x was for Microsoft). That may change over time – not even behemoths like Intel can exert much direct influence on a market this size.
I do have a question though – if Intel’s behaviour qualifies as a “flip flop”, then how would you describe something that really was a massive change in direction and philosophy – something like, say, Debian including and endorsing proprietary software in its distribution?
What is your point? You continue to nitpick at semantics in addition to giving misinformation.
First, Intel is not only launching a server processor with 64-bit extensions (codenamed Nocona) but will also be launching 64-bit Prescotts (see http://www.arstechnica.com/news/posts/1077046290.html and http://news.com.com/2100-1006-5160169.html)
Intel has reversed their position from their previous statements. By holding out for so long, they’ve allowed their competitor AMD not only to define the standard for 64-bit x86, but also to usurp their position as performance leader.
AMD has specified the standard for the next generation of the x86 architecture, and Intel is being forced to adopt it. Yet the point you broach is extensions? AMD has garnered considerable developer support for x86-64/AMD64, from such big name players as Microsoft and Oracle.
To me, AMD64 is basically just an extension of the x86 architecture. AMD might have gained considerable developer support, but I’m pretty sure the world won’t massively move to x86-64 until Intel does. I give credit to AMD for their recent innovations, but sadly they still have a reputation to build. If only they were developing their own chipset…
The Itanium addresses an entirely different problem domain than x86 processors. There is a small amount of overlap between Opteron and Itanium in the low-end server/high performance workstation space, but neither of these are the Itanium’s primary market.
I know that, but Intel eventually wanted to push the Itanium technology on desktops.
x86-64 is the next step in the evolution of the x86 architecture, and for the first time it’s been defined by AMD instead of Intel. AMD is now not only in control of x86’s future, but they’re once again the performance leader in the x86 realm.
Well, it’s not like x86’s future was bright…
You should be careful with “performance leader” as both CPUs are comparable, depending on the task. AMD is the clear winner in performance for the MHz, though.
Linus Torvalds talking about hardware? Why are we even listening to this guy? This is like Torvalds talking about Osama and nukes. Please tell the guy to shut his mouth.
The hardware community calls the extensions x86-64 (x86_64). Most people don’t even know AMD’s CPUs are compatible, because Intel let them use x86. AMD doesn’t say “Intel x86”; they say x86. Anybody reading the documentation should know the extensions are AMD’s. Otherwise, what are you doing?
Anybody read the kernel changelog lately?
I’ve noticed a lot of @intel.com contributions…
What is your point?
That the term “complete 180” is hyperbole.
You continue to nitpick at semantics in addition to giving misinformation.
I’d call the main point of the comment I replied to a touch more than “semantics”.
I have given no misinformation.
First, Intel is not only launching a server processor with 64-bit extensions (codenamed Nocona) but will also be launching 64-bit Prescotts (see http://www.arstechnica.com/news/posts/1077046290.html and http://news.com.com/2100-1006-5160169.html)
Intel has reversed their position from their previous statements.
Then:
64 bit is for servers and high end workstations. Buy an Itanium.
Now:
64 bit is for servers and high end workstations. Buy an Itanium. Or, if you’d prefer, we’re gritting our teeth and selling 64 bit Xeons and P4s.
I’m just not seeing a complete reversal in opinion there.
As I said, if they’d suddenly replaced their entire product line with x86-64 CPUs and were actively marketing them at everyone from Grandma on up, I’d agree with the “complete 180” comment.
By holding out for so long, they’ve allowed their competitor AMD not only to define the standard for 64-bit x86, but also to usurp their position as performance leader.
AMD have “usurped” Intel’s position as performance leader numerous times over the years. It’s hardly the first time, and I doubt it will be the last.
This is a copy of what I sent to the Inquirer yesterday:
Credit should be given where deserved. Besides, nothing makes me feel better than reading David-and-Goliath stories – especially when Goliath is also simply too rich.
Intel probably doesn’t care. Otherwise, the worry could be alienation – you don’t want the warm air between Intel and Linux replaced by cold, or Intel befriending Linux’s enemy even more.
Hopefully this will help AMD shed their inferior image, so the battle can continue with no weapons other than price.