Linked by Howard Fosdick on Fri 13th Apr 2012 20:21 UTC
In the News: Six-month-old web site Codecademy claims you can learn programming through its online tutorials. The free modules on JavaScript are now available, and the site also allows anyone to post their own programming courses. The site has good funding, but the question is: can you really learn programming this way? One blogger enthuses that Codecademy's approach "looks like the future of learning to me," while another slams it, saying "Seriously? Wow, bull**** badging and sh**ty pedagogy wins the day in ed-tech investing." What do you think?
School gives a larger package
by Neolander on Sat 14th Apr 2012 10:36 UTC

I have mostly learned programming through books and tutorials on the web, so I guess I can try to give a balanced view of it. Be warned, though: this will be a bit long.

The main advantage of learning programming at school, as I see it, is that you will also acquire a lot of other knowledge that may prove useful when you take part in a programming project. Like how to manage large projects, how and when to delegate work to other people in a team of programmers, how to design good GUIs, and so on...

This does not happen when you learn programming on your own. Most programming courses will teach you how to translate thoughts into code and how to avoid common mistakes, and nothing else. Now, learning, say, UI design is not terribly difficult in itself if you have a good book or website at hand. The difficulty lies in figuring out that you need to learn it before you have messed up pretty badly. And the thousands of crappy GUIs out there teach us that this is not easy.

This weakness of self-taught programming, however, also becomes a strength if it is managed properly. Computer science degrees, like most other degrees, tend to include everything and the kitchen sink, with few ways to focus on what you're interested in until very late in the teaching process. That is a good thing when you don't yet have a clear view of where you're heading, like many young college students. HR people also like it, because it allows them to put a clear label on you without being familiar with your area of knowledge. *But* when you know very well where you're heading and what you want to learn, this inflexible and highly generalist system becomes a PITA very quickly.

Let's say that, like me, you hate and despise computer networks and web programming with a passion, and that neither your job nor your hobbies force you to learn about them instead of leaving them to skilled professionals of that realm. Isn't it great to be able to skip them entirely and focus on the subjects that you like? In my case, that means OS theory and the design and implementation of good code and user interfaces. Now, try to find a CS course that lets you do that. I don't know of any myself.

In some cases, CS courses can even completely fail to achieve your goals. As an example, more and more schools choose to teach very high-level languages like Java or Python on the grounds that this is what is most likely to be useful in today's jobs. By doing this, they conveniently ignore the fact that while it is fairly easy to switch from a low-level language to a high-level one, the reverse process is less pleasant than being trepanned by a zombie mad scientist while a legion of rats is eating your legs. Programmers who were taught Java in college only to face Motorola 6809 assembly at work, I feel for you.
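To make that gap concrete, here is a minimal C sketch (C standing in for the low-level side; real 6809 assembly would be harsher still) of something a Java programmer gets for free from ArrayList and the garbage collector: a growable list of integers. Everything here is illustrative, not taken from any particular course.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* In Java: List<Integer> list = new ArrayList<>(); list.add(x);
           In C you own every byte: allocation, growth, and release. */
        size_t capacity = 4, count = 0;
        int *values = malloc(capacity * sizeof *values);
        if (!values) return 1;              /* allocation can fail; check it */

        for (int i = 0; i < 100; i++) {
            if (count == capacity) {        /* grow the buffer by hand */
                capacity *= 2;
                int *grown = realloc(values, capacity * sizeof *values);
                if (!grown) { free(values); return 1; }
                values = grown;
            }
            values[count++] = i * i;        /* store the squares 0..99^2 */
        }

        printf("stored %zu squares, last = %d\n", count, values[count - 1]);
        free(values);                       /* no GC: forget this and you leak */
        return 0;
    }

None of that bookkeeping even exists as a concept if all you have ever written is Java, which is roughly why the high-to-low transition hurts so much more than the other direction.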

---

To sum it up: programming is not very hard to learn without a teacher (unlike, say, foreign languages). But one has to remember that writing good software involves more skills than just coding mastery, and to acquire these skills as needed. In contrast, academic courses will teach you pretty much everything you need, but along with a bunch of other boring stuff which you may well know you won't need later. If you choose to go the self-learning way, I suggest you take a look at the CS programmes of a few colleges to get a broad view of what you should learn, so that you don't overlook anything important.

And for the mandatory OS metaphor: academic CS courses are like Mac OS X (a bloated mess of features, most of which you won't need, but in which you can expect to find pretty much everything you want, in a cohesive package that has been tested by many people), whereas self-teaching is more like Arch Linux (a minimalist base to which you only add what you need, but which requires more time and willpower to get fully working, and may well turn out to be an unusable FrankenOS in the end).

