Linked by Thom Holwerda on Fri 28th Sep 2012 21:51 UTC, submitted by MOS6510
General Development
"When I started writing programs in the late 80s it was pretty primitive and required a lot of study and skill. I was a young kid doing this stuff, the adults at that time had it even worse and some of them did start in the punch card era. This was back when programmers really had to earn their keep, and us newer generations are losing appreciation for that. A generation or two ago they may have been better coders than us. More importantly they were better craftsmen, and we need to think about that." I'm no programmer, but I do understand that the current crop of programmers could learn a whole lot from older generations. I'm not going to burn my fingers on whether they were better programmers or not, but I do believe they have a far greater understanding of the actual workings of a computer. Does the average 'app developer' have any clue whatsoever about low-level code, let alone something like assembly?
Old Programmers had to write precise code
by benali72 on Sat 29th Sep 2012 16:05 UTC
benali72 Member since:
2008-05-03

Programmers used to have to be more precise simply because a run was a big deal. You didn't have your own machine at your beck and call. Instead, you shared a machine and computer time was a scarce resource (it cost money or it took you a long wait in line to get your program on the machine).

Also back in the punch card era, it wasn't so convenient to alter your code as it is today.

So I don't know whether old programmers were "better," but I do believe they had to code more carefully and rely much more on desk-checking than simply running a program over and over to eliminate error.

Reply Score: 5

ssokolow Member since:
2010-01-21

Makes sense. Necessity and perfectionism both tend to result in more cautious coding. For example, I'm an admittedly very overworked perfectionist, and the more time I can find, the more unit tests and auditing I do.

Once I finish my current project to the point where I can make the code public without worrying about schema migration issues, I'm planning to go back and give all my older little utilities a full audit and unit test suite.

(I may be auditing my own code, but the methodical approach I use for writing comprehensive unit tests does a pretty good job of doing double duty as an effective self-audit.)
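Something like this toy example (Python's standard unittest module, with a made-up parse_port helper) is what I mean - each test pins down one documented behaviour, which forces me to re-read what the code actually promises:

    import unittest

    def parse_port(value):
        """Parse a TCP port number from a string; raise ValueError if out of range."""
        port = int(value)
        if not 0 < port < 65536:
            raise ValueError("port out of range: %d" % port)
        return port

    class ParsePortTest(unittest.TestCase):
        def test_accepts_valid_port(self):
            self.assertEqual(parse_port("8080"), 8080)

        def test_rejects_out_of_range(self):
            with self.assertRaises(ValueError):
                parse_port("70000")

        def test_rejects_non_numeric(self):
            with self.assertRaises(ValueError):
                parse_port("not a number")

    if __name__ == "__main__":
        unittest.main()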

Edited 2012-09-29 16:28 UTC

Reply Parent Score: 2

Richard Dale Member since:
2005-07-22

> Programmers used to have to be more precise simply because a run was a big deal. You didn't have your own machine at your beck and call. Instead, you shared a machine and computer time was a scarce resource (it cost money or it took you a long wait in line to get your program on the machine).

> Also back in the punch card era, it wasn't so convenient to alter your code as it is today.

> So I don't know whether old programmers were "better," but I do believe they had to code more carefully and rely much more on desk-checking than simply running a program over and over to eliminate error.


Yes, you're right. That's how it was when I started programming in the late 1970s. If you didn't have your own computer or a terminal on your desk, just coding sheets and punched cards, then you spent a lot of the day reading code and running programs in your head. As it happens, that is still an essential skill today. Being able to visualise a program in your head is one of the things that separates an expert programmer from a poor one.
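To give a trivial, made-up illustration of what running a program in your head looks like:

    # Sum of squares of 1..4, desk-checked before ever running it:
    total = 0
    for n in range(1, 5):
        total += n * n
    # Trace by hand: n=1, total=1; n=2, total=5; n=3, total=14; n=4, total=30.
    print(total)  # expect 30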

In fact, the reason programming hasn't actually changed much is that you mostly do it in your head. It's a bonus if you can edit code easily or get more than two compiles a day, but it doesn't change much about what it feels like to program.

I started at university using a teletype with an interactive language, which was really very similar to the Ruby or Python consoles we use today. When I started my first job in 1978, it wasn't interactive like that, though - everything was batch processing, and even developing online applications wasn't done interactively.

To me, what has changed is that a programmer today can teach themselves new languages and skills, because we have the internet and personal computers. We can learn more by collaborating with other people to write Free Software on GitHub. In the 1970s, when I started, you couldn't do that. If you wanted to learn something new you could buy a book on 'Jackson Structured Programming', but you would need your employer to send you on a course if you wanted to learn a new programming language. We are much more in control of our careers than we were then, in my opinion.

The article seems to believe that because programming used to be harder, it somehow attracted better people. I don't see how that follows at all. There were just as many poor programmers 30 years ago as there are today.

Reply Parent Score: 3

Doc Pain Member since:
2006-10-08

> Programmers used to have to be more precise simply because a run was a big deal.


Programming began in the head, not on the terminal's keyboard (if you had one). I think there was more emphasis on the "pre-coding work". Today you don't need that approach anymore, as "trial & error" is inexpensive - all on company time. :-)

> You didn't have your own machine at your beck and call.


You could be happy if you even got to see the machine you were working on (or for) once in your life. :-)

> Instead, you shared a machine and computer time was a scarce resource (it cost money or it took you a long wait in line to get your program on the machine).


So now you know where "accounting" of computer resources (CPU time, storage, hardcopy) originates. UNIX and Linux systems still have this functionality built in.
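As a small modern illustration (a Python sketch using the standard resource module on a Unix-like system - not the full System V accounting subsystem), a process can still ask the kernel what it has been charged for:

    import resource

    # Do some work worth being billed for.
    total = sum(i * i for i in range(1000000))

    usage = resource.getrusage(resource.RUSAGE_SELF)
    print("user CPU seconds:  ", usage.ru_utime)
    print("system CPU seconds:", usage.ru_stime)
    print("max resident set:  ", usage.ru_maxrss)  # kilobytes on Linux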

I remember my CS professor stating: "'Trial and error' is not a programming concept!" :-)

> Also back in the punch card era, it wasn't so convenient to alter your code as it is today.


For those not familiar with this important era of computing, I suggest reading "Programming with Punched Cards" by Dale Fisk:

http://www.columbia.edu/cu/computinghistory/fisk.pdf

It's funnier than you may think, and as a historical side note, it depicts the role of women in IT back when IT wasn't actually called IT (but data processing).

> So I don't know whether old programmers were "better," but I do believe they had to code more carefully and rely much more on desk-checking than simply running a program over and over to eliminate error.


I admit it's hard to compare in terms of better or worse. At the very least it's very different, even though some basic elements have been shared throughout IT history. Those who are willing to read, to learn, to think and to experiment will always be superior to the "code monkeys" and "typing & clicking drones" found in so many parts of corporate IT.

Reply Parent Score: 3

lucas_maximus Member since:
2009-08-18

I think visualising the problem first is what separates the good from the bad developers.

I am trying to get our junior developer to break down problems into a set of steps. I sat him down with me and went through what I was doing and why.

He just wrote everything down, completely missing the overall point.

While I rarely write much pseudo-code anymore, I normally start with a diagram, a set of rough steps, or something similar to get the problem well understood in my head.
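For example (a contrived Python sketch, nothing from our actual codebase), the rough steps can go in as comments first and only then get filled in:

    def dedupe_customers(records):
        # Step 1: normalise the field we match on (case, surrounding whitespace).
        # Step 2: group records by the normalised email address.
        # Step 3: within each group, keep the most recently updated record.
        newest_by_email = {}
        for rec in records:
            email = rec["email"].strip().lower()
            current = newest_by_email.get(email)
            if current is None or rec["updated"] > current["updated"]:
                newest_by_email[email] = rec
        return list(newest_by_email.values())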

Edited 2012-09-30 13:28 UTC

Reply Parent Score: 3