Linked by Thom Holwerda on Fri 15th Feb 2013 10:40 UTC
General Development "Since I left my job at Amazon I have spent a lot of time reading great source code. Having exhausted the insanely good id Software pool, the next thing to read was one of the greatest games of all time: Duke Nukem 3D and the engine powering it, named 'Build'. It turned out to be a difficult experience: the engine delivered great value and ranked high in terms of speed, stability, and memory consumption, but my enthusiasm met source code that was controversial in terms of organization, best practices, and comments/documentation. This reading session taught me a lot about code legacy and what helps software live long." Hail to the king, baby.
RE[9]: Code Review
by Alfman on Sat 16th Feb 2013 12:51 UTC in reply to "RE[8]: Code Review"

moondevil,

"As I said it all depends how complex it is the processor you are trying to target."

I hear that, but most compilers don't attempt to optimize for all the things you're talking about, and even when they do, they still need your help with the right flags and runtime profiling. Some of them still do a poor job: for example, we tested GCC's SIMD auto-vectorization last year and it handled it badly. It would be neat to try that again and see whether it has improved. All too often, though, people simply assume compiler output is optimal without running any benchmarks. That may be fine for their purposes, but it's definitely wrong to make assertions about it when they're not the ones who know much about optimizing in the first place.
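For anyone who wants to repeat that kind of experiment, a minimal test looks roughly like this. The saxpy name and the exact flags are just my own illustration, not anything from our tests; on GCC releases of this era you'd ask the vectorizer to report what it did with -ftree-vectorizer-verbose (newer releases moved to -fopt-info-vec):

/* vec_test.c - a simple loop the auto-vectorizer ought to handle */
#include <stddef.h>

void saxpy(float *restrict y, const float *restrict x, float a, size_t n)
{
    /* restrict promises the compiler x and y don't alias, which is
       often the difference between vectorized and scalar output */
    for (size_t i = 0; i < n; i++)
        y[i] += a * x[i];
}

Then compile with something like:

gcc -std=c99 -O3 -march=native -ftree-vectorizer-verbose=2 -c vec_test.c

and check the vectorizer report (and the generated assembly via objdump -d) to see whether the loop actually got SIMD instructions, rather than taking it on faith.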

At times I'm able to beat GCC's output; I'd rate my skill as above average, but you'd be right to say it's usually too much work for too little gain. It's rarely worthwhile, especially when non-portable code is the result.
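As a sketch of what I mean by non-portable, here's the same kind of loop hand-written with SSE intrinsics (again my own example, and x86-only by construction):

/* saxpy_sse - hand-vectorized with SSE intrinsics, x86 only */
#include <stddef.h>
#include <xmmintrin.h>

void saxpy_sse(float *y, const float *x, float a, size_t n)
{
    __m128 va = _mm_set1_ps(a);      /* broadcast a into all 4 lanes */
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {     /* process 4 floats per iteration */
        __m128 vx = _mm_loadu_ps(x + i);
        __m128 vy = _mm_loadu_ps(y + i);
        vy = _mm_add_ps(vy, _mm_mul_ps(va, vx));
        _mm_storeu_ps(y + i, vy);
    }
    for (; i < n; i++)               /* scalar tail for leftover elements */
        y[i] += a * x[i];
}

When the compiler leaves the loop scalar, something along these lines can win the benchmark, but now the code only builds on x86, which is exactly the portability tradeoff I'm talking about.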
