The CEO of chip maker Advanced Micro Devices Inc. is stepping down.
Hector Ruiz had been just the second person to lead AMD after company founder Jerry Sanders. He’ll be replaced by the chip maker’s No. 2 executive, Dirk Meyer.
First: Buy ATI
Second: AMD struggles
Third: if they go down, they take ATI with them.
Thank you AMD!
I would miss AMD much more than I would miss ATI.
Not much information in the article. Hopefully, now that he’s been given the chance, Meyer has some ideas to help AMD.
Ruiz hasn’t been entirely removed. He is still on the board as executive chairman. And as AMD’s former number two, Meyer must have been at least as involved as Ruiz in setting the company’s path over the past few years. So this smacks of a board circling the wagons against predators and bankers rather than a company with the confidence to make a fresh start with fresh blood – or a board closing ranks because they’re sitting on a nearly done deal we don’t know about. Considering the scale of AMD’s losses at the moment and the lack of any clear AMD Intel-beater on the horizon, one wonders how much worse things have to get.
Mr. Sanders made Mr Ruiz the heir apparent of AMD. Obviously he was well qualified.
from wikipedia: “Ruiz earned a B.S. and M.S. in electrical engineering from the University of Texas at Austin in 1968 and 1970 respectively and a Ph.D. in electrical engineering from Rice University in 1973. He worked at Texas Instruments for six years and Motorola for 22 years, rising to become president of Motorola’s Semiconductor Products Sector before being recruited in 2000 by AMD founder Jerry Sanders to serve as AMD’s president and chief operating officer, and to become heir apparent to lead the company upon Sanders’ retirement.”
Edited 2008-07-18 04:40 UTC
I wish my CV read like that
I am sorry but your quote doesn’t mean anything.
It doesn’t mean much if you are a qualified engineer or a Nobel prize winner, or have an IQ of 180.
You can still suck as a CEO.
Really, it doesn’t suggest at all that the best person has been chosen for the job.
I was only pointing out that Mr Sanders made Mr Ruiz the heir apparent of AMD.
Well, the $2.3 million a year AMD won’t have to shell out anymore should free up some cash.
In a few days, we’ll be hearing about how well AMD is doing financially.
AMD will be around for many years to come.
They were nearly off the cliff back in 1990. They managed to copy the i386, acquired another CPU manufacturer, and started stealing market share from Intel. And back then they were only a $1B company; today they’re $6B. Don’t count them out yet…
Anyone recall Sanders?
I’ll happily buy another AMD processor. There’s something about Intel I do not like.. Maybe it’s that stupid sound they play 5 times in their ads.
Dun.. dun dun dun daaa
Gives me the shits.
It is sad to read news like this. I read somewhere that Hector Ruiz said in a statement that “Dirk is a gifted leader who possesses the right skills and experience to continue driving AMD and the industry forward in new, compelling directions,”. I sure hope Ruiz is right because replacing him with Meyer is just window dressing unless Meyer takes the opportunity to change the company’s business model. It is time for AMD to realize that, even though it has the best engineering team in the world, parroting Intel’s x86 technology is a losing proposition. Nobody can beat a behemoth like Intel playing Intel’s own game in Intel’s own backyard.
Now that the industry is transitioning away from sequential computing toward massive parallelism, AMD has the opportunity of a lifetime to take the bull by the horns and lead the world into the next era of computing. Intel is making a grave mistake by adopting multithreading as their parallel programming model. AMD must not make the same mistake. There is an infinitely better way to design and program multicore processors that does not involve threads at all. To find out why multithreading is not part of the future of computing, read ‘Parallel Computing: Why the Future Is non-Algorithmic’:
http://rebelscience.blogspot.com/2008/05/parallel-computing-why-fut…
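To ground the terminology in this exchange, here is a minimal, hypothetical sketch (the functions and structure are mine, not from the linked post) contrasting the two models being argued about: shared-state multithreading guarded by locks, versus a message-passing style in which workers communicate only through a queue:

```python
# Hypothetical illustration: two ways to parallelise a sum.
import threading
import queue

def threaded_sum(values, workers=4):
    """Style 1: shared mutable state, guarded by an explicit lock."""
    total = [0]
    lock = threading.Lock()

    def work(chunk):
        s = sum(chunk)
        with lock:  # correctness hinges on remembering this lock everywhere
            total[0] += s

    size = (len(values) + workers - 1) // workers
    threads = [threading.Thread(target=work, args=(values[i:i + size],))
               for i in range(0, len(values), size)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total[0]

def pipeline_sum(values, workers=4):
    """Style 2: no shared state; each worker's result flows through a queue."""
    results = queue.Queue()
    size = (len(values) + workers - 1) // workers
    chunks = [values[i:i + size] for i in range(0, len(values), size)]

    def work(chunk):
        results.put(sum(chunk))  # the queue is the only communication channel

    threads = [threading.Thread(target=work, args=(c,)) for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(results.get() for _ in chunks)
```

Both versions still run on OS threads underneath; the disagreement above is about the programming model exposed to the programmer (the linked post argues for going further still, to a signal-driven model with no threads at all), not about the hardware.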
Edited 2008-07-18 00:35 UTC
maybe if people had actually picked up amd 64…
that wasn’t parroting, it’s just ignored because windows is still floating around on 32bit. why would you abandon x86 when nearly all computers in home use are running it?
IMO, the old computers will not die overnight but they will be supplanted by a new type of machine built for super fast parallelism and super complex programs that do not fail. The x86 single and multicore multithreaded machines (together with all the legacy OSes from the last century) will eventually go the way of the dinosaurs, the buggy whip and the slide rule. AMD should let Intel hang on to those while it is forging a new market worthy of the 21st century. Intel can’t do it because Intel is too married to the old stuff.
um, the slide rule is still alive and well
Yeah, some of us (myself included) are still in love with the beauty, power and elegance of the slide rule. Still, I would say that the slide rule is more like the old radio vacuum tubes. It’s still around but it does not dominate the market like it used to.
Actually slide rules are alive and well in aviation. Flight ‘computers’ are circular slide rules used for calculating time, fuel loads etc. They sacrifice high precision for total reliability, low cost, speed and simplicity.
Recall why Itanium sank?
I honestly hope that is not your blog, because that dude over there clearly has no clue about what he’s talking about. He doesn’t understand the basic premise of an algorithm and takes great offense at anyone pointing it out to him.
His general hostility and disdain for anything in the academic world does him no favours. And ranting and raving about the cult of the Turing machine, while at the same time not understanding what algorithm means … sorry but such people are not to be taken seriously.
You took offence, didn’t you? That proves his point about academia’s thin skin, IMO. Unfortunately for the Turing machine cult and the computer science community in general, his ideas are being taken seriously by a lot of people in the industry. Why? Because he is right about parallel programming, that’s why. Your point about his use of the term algorithm is lame. It is a matter of definitions. He uses the original definition. Whether or not it’s the definition that you choose has little to do with his argument about parallel programming, about which you obviously have nothing interesting to say.
“His” point? When people start talking in the 3rd person, you know they’re screwed up somehow.
Academics rarely have thin skin. Quite the contrary: if you have ever spent any time in an academic institution surrounded by brilliant and cocky people, you develop thick skin very quickly as you struggle to argue your piece while others attempt to pick holes in your arguments.
You criticise the cult of the Turing Machine (lol, first time I’ve heard that phrase) and yet you fail to provide a solid alternative. Your posts spend far more time ranting about the Turing machine than expounding what this amazing UBM actually is. Where are the theorems, axioms, proofs, etc.? Such things are not just for the stuffy academics; they are necessary if anyone is going to make head or tail of what you are writing about, and they are also important for anybody who wishes to implement your ideas. Lacking those, it’s just hot air.
I picked out your definition of algorithm. One, misusing commonly understood terms annoys people and two, it makes it a lot harder to follow what you are saying. How are you going to implement a graph without an algorithm (i.e. a sequence of instructions)? Move the bits of information by sheer force of will?
Having your ideas rejected can be a very discouraging experience, especially if you’ve spent 10 years trying to advocate it. However, in those 10 years it doesn’t look like you’ve developed your COSA idea beyond the vague hand waving that I see on your blog. You might think that you’re a Galileo Galilei, but in reality you’re coming across more like a Frank Chu.
Yeah, you’re offended all right. See you around.
Some people are technically competent and have no vision, nor can they crack the whip to drive a vision–it’s hard to crack a whip when the vision is hazy.
I’m looking forward to seeing this company streamline and extend into areas it now can reach with the merger of ATi.
With AMD’s seventh consecutive quarterly loss, Mr Ruiz should have stepped down sooner. I hope AMD can get out of this mess.
In India Intel rules… in spite of that, I purchased an AMD Turion notebook. I’m glad I did. AMD rocks!!! I’ve helped 2 more people buy AMD laptops and they are happy too. In India AMD is cheaper than Intel.
Using AMD I have not seen any difference from Intel, apart from cost.
I wonder if this would be the moment to buy a good deal of AMD stock…
Only if you mean to buy them as a long-term investment; they won’t suddenly get 20% more market share overnight
But I for one am looking to invest, banking on the NEXT cpu generation beating intel’s
as with any stock, it’s a gamble!! If only I had the money in the 90s to buy Apple stock like I wanted to…
I know the stocks won’t jump up suddenly… still, I’m willing to believe that AMD will stop losing money eventually.
They’re in the situation they’re in because they took their eye off the ball and excessively focused on the desktop and server when the growth area was low powered components for the laptop/notebook market.
Take a look at their product offerings, they have no competitor to the Centrino for example; the whole point of purchasing ATI was to allow them to supply the whole widget – which they’ve failed to do.
They need to come up with a Server, Workstation, Desktop and Laptop platform strategy, where the complete widget is provided from the top down. Stop having three different wireless companies on offer; choose one and exploit the economies of scale to reduce costs.
Sell off all the fabs, they’re a white elephant – let those who are experts at running fabrication plants do just that. There is no benefit to owning fabs, to be honest, and given how long it takes to transition from one generation to the next, no time would be lost in the process.
Start producing some quality drivers, because lord knows the ATI ones aren’t too pretty at the moment. And please, stop creating chipsets that require special drivers; conform to open standards and they shouldn’t be required. Stop trying to be a special snowflake; it’s not impressing anyone.
Edited 2008-07-18 18:30 UTC
Except that if they sell their fabs, their CPUs, which are already made on an older process than Intel’s, will be made on an even “worse” process: as a fab owner they can tweak the process to fit their CPUs however they wish (well, however they’re able to); as a fab client they wouldn’t have that option, so their CPUs would become even less competitive with Intel’s.
That said, they may have no choice: AMD cleverly exploited Intel’s failures (no 64-bit for x86 and a misguided focus on clock frequency) to work around the fact that Intel has bigger fabs with a better process, but what can they do now that Intel is no longer making (big) mistakes?
There are still technological shifts, such as going 3D, that may change the balance of power, but being a fab client instead of owning fabs doesn’t make it any easier to stay at the forefront of progress.
For the longest time I was an exclusive supporter of AMD (and ATI, for that matter). Everything worked great and I was rather happy. Then a few years back I moved from Windows to Linux, and that made all the difference. When I go out to buy a computer, if it’s all Intel I don’t have to worry about drivers for the wireless card or the video card, because for the most part Intel has opened them. I might at some point move back to buying ATI cards, but as it stands right now Nvidia still has much better Linux drivers.