Linked by MOS6510 on Fri 17th May 2013 22:22 UTC
Hardware, Embedded Systems "It is good for programmers to understand what goes on inside a processor. The CPU is at the heart of our career. What goes on inside the CPU? How long does it take for one instruction to run? What does it mean when a new CPU has a 12-stage pipeline, or 18-stage pipeline, or even a 'deep' 31-stage pipeline? Programs generally treat the CPU as a black box. Instructions go into the box in order, instructions come out of the box in order, and some processing magic happens inside. As a programmer, it is useful to learn what happens inside the box. This is especially true if you will be working on tasks like program optimization. If you don't know what is going on inside the CPU, how can you optimize for it? This article is about what goes on inside the x86 processor's deep pipeline."
Thread beginning with comment 562188
RE[10]: Comment by Drumhellar
by theosib on Mon 20th May 2013 23:55 UTC in reply to "RE[9]: Comment by Drumhellar"

I'm going to slightly abuse the term "NP-hard" here, but anyhow, the search space for place and route is such that even if you had the optimal solution, you would not be able to verify that it was optimal. Immense doesn't even begin to describe the complexity of automatic circuit layout. Oh, and humans still do better: Bulldozer performed about 20% below par for its process technology because they didn't bother to have humans go back in and hand-optimize the critical circuits.
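Just to give a sense of the scale, here's a quick back-of-the-envelope sketch in Python (the cell counts are toy numbers, purely for illustration) of how fast the naive placement search space blows up before routing even enters the picture:

import math

# Toy illustration: assigning n cells to n legal sites already gives n!
# candidate placements, before routing is even considered.
# The cell counts below are made up; real designs have millions of cells.
for n in (10, 100, 1000):
    log10_placements = math.lgamma(n + 1) / math.log(10)  # log10(n!)
    print(f"{n} cells -> about 10^{int(log10_placements)} placements")

At a thousand cells you are already past 10^2500 candidates, so exhaustively confirming that any particular layout is the best one is simply not on the table.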


RE[11]: Comment by Drumhellar
by Alfman on Tue 21st May 2013 00:54 in reply to "RE[10]: Comment by Drumhellar"

theosib,

"I'm going to slightly abuse the term 'NP-hard' here, but anyhow, the search space for place and route is such that even if you had the optimal solution, you would not be able to verify that it was optimal."

Why is this abusing NP-hard? I think it is (or can be reduced to) an NP-hard problem.


"Immense doesn't even begin to describe the complexity of automatic circuit layout. Oh, and humans still do better: Bulldozer performed about 20% below par for its process technology because they didn't bother to have humans go back in and hand-optimize the critical circuits."

I had to look it up; you must be referring to this:
https://en.wikipedia.org/wiki/Bulldozer_%28processor%29

How do you arrive at the 20% figure? I have no idea what kind of algorithmic strategy was used to design Bulldozer, but in any case, coming in only 20% below par still sounds impressive to me. We're on the cusp of computers being able to beat humans in many specialized domains; it's only a matter of time before they can beat humans at circuit optimization.

Being NP-hard means the tools may never find the "best" solution, but as long as they can find incrementally better ones with each automated generation, that's quite amazing.
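To make that concrete, here's a toy hill-climbing placer sketch in Python (the one-dimensional "wirelength" cost and the random cell swaps are made up purely for illustration). It never has to prove anything optimal; it only has to beat its own previous best:

import random

def wirelength(placement, nets):
    # Crude half-perimeter-style estimate: span of each net's cell positions
    return sum(max(placement[c] for c in net) - min(placement[c] for c in net)
               for net in nets)

def improve(placement, nets, generations=10000):
    best = wirelength(placement, nets)
    for _ in range(generations):
        a, b = random.sample(range(len(placement)), 2)
        placement[a], placement[b] = placement[b], placement[a]   # try a swap
        cost = wirelength(placement, nets)
        if cost < best:
            best = cost                                           # keep it
        else:
            placement[a], placement[b] = placement[b], placement[a]  # undo
    return placement, best

# Hypothetical example: 8 cells on a 1-D row of sites, 3 nets among them
placement = list(range(8))          # placement[i] = site index of cell i
nets = [(0, 5), (1, 6, 7), (2, 3, 4)]
random.shuffle(placement)
final, cost = improve(placement, nets)
print("final wirelength estimate:", cost)

Real tools obviously use far smarter moves and cost models, but the accept-if-better loop is what I mean by incremental improvement.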


Does your university do work on active research projects? Sometimes it seems like being a professor could be a lot of fun for professors who get to make a living overseeing cutting-edge research projects. That's assuming the distracting "diploma mill" responsibilities don't get in the way, haha.


RE[12]: Comment by Drumhellar
by theosib on Tue 21st May 2013 02:37 in reply to "RE[11]: Comment by Drumhellar"

Although I have teaching responsibilities (that I take very seriously), my primary job is research. And doing better on automated synthesis is one of the projects I'm working on.

As for the 20%, this is where I got the figure from:
http://www.xbitlabs.com/news/cpu/display/20111013232215_Ex_AMD_Engi...
