Linked by Thom Holwerda on Fri 28th Sep 2012 21:51 UTC, submitted by MOS6510
General Development "When I started writing programs in the late 80s it was pretty primitive and required a lot of study and skill. I was a young kid doing this stuff; the adults at that time had it even worse, and some of them did start in the punch card era. This was back when programmers really had to earn their keep, and us newer generations are losing appreciation for that. A generation or two ago they may have been better coders than us. More importantly, they were better craftsmen, and we need to think about that." I'm no programmer, but I do understand that the current crop of programmers could learn a whole lot from older generations. I'm not going to burn my fingers on whether they were better programmers or not, but I do believe they have a far greater understanding of the actual workings of a computer. Does the average 'app developer' have any clue whatsoever about low-level code, let alone something like assembly?
Thread beginning with comment 537025
Richard Dale
Member since:
2005-07-22

Programmers used to have to be more precise simply because a run was a big deal. You didn't have your own machine at your beck and call. Instead, you shared a machine and computer time was a scarce resource (it cost money or it took you a long wait in line to get your program on the machine).

Also, back in the punch card era, it wasn't as convenient to alter your code as it is today.

So I don't know whether old programmers were "better," but I do believe they had to code more carefully and rely much more on desk-checking than simply running a program over and over to eliminate error.


Yes, you're right. That's how it was when I started programming in the late 1970s. If you didn't have your own computer or a terminal on your desk, just coding sheets and punched cards, then you spent a lot of the day reading code and running programs in your head. As it happens, that is still an essential skill today. Being able to visualise a program in your head is one of the things that separates an expert programmer from a poor one.

In fact, the reason programming hasn't actually changed much is that you mostly do it in your head. It's a bonus if you can edit code easily or get more than two compiles a day, but it doesn't much change what programming feels like.

I started at University using a teletype with an interactive language, which was really very similar to a Ruby or Python console that we use today. When I started my first job in 1978, it wasn't interactive like that, though: everything was batch processing, and even developing online applications wasn't done interactively.

To me, what has changed is that a programmer today can teach themselves new languages and skills because we have the internet and personal computers. We can learn more by collaborating with other people to write Free Software on GitHub. In the 1970s when I started, you couldn't do that. If you wanted to learn something new you could buy a book on 'Jackson Structured Programming', but you would need your employer to send you on a course if you wanted to learn a new programming language. We are much more in control of our careers than we were then, in my opinion.

The article seems to believe that because programming used to be harder, it somehow attracted better people. I don't see how that follows at all. There were just as many poor programmers 30 years ago as there are today.

Reply Parent Score: 3