Linked by Thom Holwerda on Thu 27th Dec 2012 10:19 UTC, submitted by anonymous
General Development "Computers are ubiquitous in modern life. They offer us portals to information and entertainment, and they handle the complex tasks needed to keep many facets of modern society running smoothly. Chances are, there is not a single person in Ars' readership whose day-to-day existence doesn't rely on computers in one manner or another. Despite this, very few people know how computers actually do the things that they do. How does one go from what is really nothing more than a collection - a very large collection, mind you - of switches to the things we see powering the modern world?"
Programming for all
by kwan_e on Thu 27th Dec 2012 11:08 UTC
kwan_e
Member since:
2007-02-18

Unlike a lot of programmers who prefer programming to remain a secret magical art, I think it will be a fact of life in the next 50 years that programming will just be something people do at a basic level as part of everyday life. My guess is that we'll end up with something like the world of Neal Stephenson's "Diamond Age", in which your average first-world citizen has technology that can build physical stuff. We have 3D printers becoming a lot more affordable, for example.

I don't think teaching a programming language should be the centre of "writing for computers". People would do better to learn programming through understanding algorithms and structural design. Most programming languages today, just like "flat design", are mostly superfluous, and any special "features" are just fads that gain prominence by being different from the past rather than by introducing new ways to think about design.

Reply Score: 3

RE: Programming for all
by woegjiub on Thu 27th Dec 2012 12:15 UTC in reply to "Programming for all"
woegjiub Member since:
2008-11-25

This seems naive. The average person is far less intelligent than most intellectuals actually realise.

It is not purely due to poor teaching that first year university programming courses have immense failure rates.
It does seem to be beyond most people.

Reply Score: 9

RE[2]: Programming for all
by kwan_e on Thu 27th Dec 2012 12:36 UTC in reply to "RE: Programming for all"
kwan_e Member since:
2007-02-18

This seems naive. The average person is far less intelligent than most intellectuals actually realise.

It is not purely due to poor teaching that first year university programming courses have immense failure rates.
It does seem to be beyond most people.


Thereby completely missing the point of my comment.

Thinking algorithmically and structurally should come before learning programming languages. A lot of first year courses assume people already know how to think properly and that programming is a matter of writing code.

What I suggest is that it shouldn't be taught at university first but in high schools and possibly earlier.

Using university failure rates as proof is lazy, and is probably indicative of a mind not suited for good programming either. Programming requires foresight, hindsight and lateral thinking.

Reply Score: 3

RE[3]: Programming for all
by woegjiub on Thu 27th Dec 2012 13:00 UTC in reply to "RE[2]: Programming for all"
woegjiub Member since:
2008-11-25

I had the same experience in high school comp science as well; students would completely fail to see obvious connections, just as they do in mathematics.

With still higher-level languages, this problem will be lessened, but there does seem to be a lack of desire, or perhaps an inability, to think logically in what may be the majority of people.

If they can't/won't learn algebra, how are they to learn coding?

Reply Score: 6

RE[4]: Programming for all
by kwan_e on Thu 27th Dec 2012 13:16 UTC in reply to "RE[3]: Programming for all"
kwan_e Member since:
2007-02-18

I had the same experience in high school comp science as well; students would completely fail to see obvious connections, just as they do in mathematics.

With still higher-level languages, this problem will be lessened, but there does seem to be a lack of desire, or perhaps an inability, to think logically in what may be the majority of people.

If they can't/won't learn algebra, how are they to learn coding?


Organizations like the Khan Academy show that children of all stripes are willing to learn algebra, and can, given the right teaching environment.

I don't think thinking logically is as fundamental to programming as thinking algorithmically is. I've known many intelligent people, much more intelligent than me, but they can't program for shit. Logic is a red herring in programming and is really only a problem "in the small". Programming happens "in the large".

Reply Score: 0

RE[5]: Programming for all
by woegjiub on Thu 27th Dec 2012 13:29 UTC in reply to "RE[4]: Programming for all"
woegjiub Member since:
2008-11-25

Organizations like the Khan Academy show that children of all stripes are willing to learn algebra, and can, given the right teaching environment.

I don't think thinking logically is as fundamental to programming as thinking algorithmically is. I've known many intelligent people, much more intelligent than me, but they can't program for shit. Logic is a red herring in programming and is really only a problem "in the small". Programming happens "in the large".


Then, if this particular art is lost on the intelligent, and requires more algorithmic thinking, how is that to come about?
More importantly, how do you make people actually *want* to program?
Why is it that most are happy to use computers for leisure, but are repulsed by the notion of understanding them more deeply?

Just like the other sciences, there seems to be a significant desire to avoid anything to do with actual analytical thinking.

Reply Score: 4

RE[6]: Programming for all
by Drumhellar on Thu 27th Dec 2012 21:05 UTC in reply to "RE[5]: Programming for all"
Drumhellar Member since:
2005-07-12

Just like the other sciences, there seems to be a significant desire to avoid anything to do with actual analytical thinking.


Well, thinking is an expensive activity, in terms of energy requirements.

Reply Score: 3

RE[3]: Programming for all
by Kochise on Thu 27th Dec 2012 13:43 UTC in reply to "RE[2]: Programming for all"
Kochise Member since:
2006-03-03

Otherwise we would all be fluent lispers...

Kochise

Reply Score: 3

RE[3]: Programming for all
by hhas on Fri 28th Dec 2012 20:08 UTC in reply to "RE[2]: Programming for all"
hhas Member since:
2006-11-28

Thinking algorithmically and structurally should come before learning programming languages.


Bingo! A point utterly missed by the article author and subsequent Ars commenters, who are all too busy debating whether new programmers should be taught Python/Java/C/C++ first.

The huge irony is that Seymour Papert was successfully doing exactly this with 6 and 7 year-olds several decades before the average Arsian was even assembled. LOGO was never about teaching kids to program: it was about teaching them to think, and how to learn, and how to learn how to learn. Gaining the ability to assemble useful programs along the way was merely a side-benefit.

The wonderful thing about LOGO was that it avoided all of the mindless bureaucracy and special-case behaviors so rampant in 'popular' languages. Learners weren't misled into believing type declarations, memory management, conditional and loop statements, and other such tedious details were what programming was fundamentally about. Directing attention to those is like teaching a student every single irregular verb in the English language before explaining what a verb actually is, or demonstrating how the vast majority of logical (regular) verbs operate.

Being essentially a better-looking Lisp, LOGO was incredibly parsimonious: the only core features were words and values, and everything else was expressed in terms of those structures. Thus abstraction, which is the real key to becoming a programmer, is naturally the second or third thing taught: it's simply a matter of defining new words in addition to the words already provided by the language. Neither was making mistakes seen as something to be ashamed of: instead, it was part of the natural learning process: write some words, run them, figure out what's not working and fix it (i.e. debugging), and learn from the whole experience.
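
To make that concrete, here is a rough sketch of the same idea (not actual LOGO: it uses Python's standard turtle module, itself a descendant of Logo's turtle graphics, and the "words" square and flower are just made-up illustrations):

import turtle

def square(t, size):
    # A new "word": a square is four forward moves with quarter turns.
    for _ in range(4):
        t.forward(size)
        t.left(90)

def flower(t, size, petals=12):
    # Another "word", defined entirely in terms of the one above.
    for _ in range(petals):
        square(t, size)
        t.left(360 / petals)

if __name__ == "__main__":
    t = turtle.Turtle()
    flower(t, 80)   # draw twelve rotated squares
    turtle.done()

The point is the shape of the exercise: a handful of primitives, and everything else is new words defined in terms of old ones, tested by running them and fixing what doesn't work.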

Papert ultimately failed, of course, but not due to flaws in his core tools or techniques. Rather, his objectives were undermined and ultimately buried by the heinous politics of education: technophobic teachers fearful on one side; programming priesthood threatened on the other. Programming became an elitist course for special students only; computer education in general degenerated into poor-quality ICT training churning out third-rate Office monkeys.


Programming requires foresight, hindsight and lateral thinking.


Remarkably rare qualities in the modern profession, alas. Probably not aided by the silent degeneration of Computer Science into Software Engineering and from there to bottom-of-the-barrel Java diploma mills, but that's another rant...

Reply Score: 4

RE[4]: Programming for all
by Zifre on Fri 28th Dec 2012 22:31 UTC in reply to "RE[3]: Programming for all"
Zifre Member since:
2009-10-04

As someone who learned Logo at the age of 8, I can attest to this.

Reply Score: 2

RE[2]: Programming for all
by renox on Thu 27th Dec 2012 13:44 UTC in reply to "RE: Programming for all"
renox Member since:
2005-07-06

This seems naive. The average person is far less intelligent than most intellectuals actually realise.

It is not purely due to poor teaching that first year university programming courses have immense failure rates.
It does seem to be beyond most people.


In France, first-year university classes are *big*, whereas high school classes had 30-50 people; students aren't supervised like they were before, they're living alone for the first time, etc. In these conditions the high failure rate has nothing to do with intelligence and more to do with a lack of self-discipline/maturity.

Reply Score: 3

RE[3]: Programming for all
by Earl C Pottinger on Thu 27th Dec 2012 19:10 UTC in reply to "RE[2]: Programming for all"
Earl C Pottinger Member since:
2008-07-12

But the same shortcomings that make them fail that first year of college/university become the same problems once they try to write a complex program, no matter what the language/environment.

Code that does real work tends to be complex, and even simpler programs still need the programmer to consider how to handle things/events when something goes wrong with inputs/hardware/communications.

Reply Score: 3

RE[3]: Programming for all
by unclefester on Fri 28th Dec 2012 06:35 UTC in reply to "RE[2]: Programming for all"
unclefester Member since:
2007-01-13

In France, first-year university classes are *big*, whereas high school classes had 30-50 people; students aren't supervised like they were before, they're living alone for the first time, etc. In these conditions the high failure rate has nothing to do with intelligence and more to do with a lack of self-discipline/maturity.


A guy at my school began his Australian undergraduate medical degree at 15. Despite being in the 99.9th percentile he failed every subject in first year due to immaturity. Luckily he was allowed to re-enroll after two years.

Edited 2012-12-28 06:35 UTC

Reply Score: 2

RE: Programming for all
by karunko on Sat 29th Dec 2012 15:29 UTC in reply to "Programming for all"
karunko Member since:
2008-10-28

I think it will be a fact of life in the next 50 years that programming will just be something people do at a basic level as part of everyday life.

With smartphones, tablets and app stores that let you click and install software to your heart's (and wallet's) content, I seriously doubt it.

If anything, it's far more likely that programming will increasingly become the domain of professionals and serious hobbyists. Think about it: in the late 70s and early 80s you switched on your computer and were greeted by the BASIC prompt that encouraged you to explore and try things out. Before that, everything was even harder because you had to buy a kit and actually assemble it yourself -- and decode the LEDs that served as the display and go to the User Group meeting and trade software (that you had to write) with fellow enthusiasts.

Of course programming is a lot easier now, with plenty of languages to choose from, IDEs, online tutorials and whatnot. However, smartphones, tablets and walled gardens are creating a generation of pampered users. Not to be insulting, but when everything you need is usually one or two clicks/taps away, I can't see a lot of people getting interested in programming just out of sheer curiosity.


RT.

Edited 2012-12-29 15:31 UTC

Reply Score: 3

The magic of the 20th century
by AlephZero on Thu 27th Dec 2012 13:09 UTC
AlephZero
Member since:
2011-07-12

I sometimes like to think that the "art" of programming resembles some kind of "magic", and that we, the programmers, are the ones who (try to) master the ability to think in programming terms and "bend reality" to what we desire.

Reply Score: 2

What is vs what could be vs what will be
by kwan_e on Thu 27th Dec 2012 14:12 UTC
kwan_e
Member since:
2007-02-18

I think the point is being missed: whether it's a good idea or not, it's probably going to happen.

Congratulations to all those telling it how it is, but that is next to useless in predicting what will be. Even if most people are psychologically not disposed towards maths and the sciences, the fact is that most people in the first world today know more about basic maths and the sciences, and are more literate, than people just over a hundred years ago. Average intelligence increases.

No, not everyone is going to be a programming genius. That's not my argument at all. My argument is that, for good or bad, basic understanding of programming will be expected. Just like basic maths and basic literacy.

One reason I mentioned previously: the potential availability of private manufacturing, as in "Diamond Age".

Another reason is the trend towards automation in all physical jobs and the outsourcing of all other menial low skill jobs. Pretty soon, "entry level" jobs will be about being able to program basic automation of tasks and maintenance.

I think it's going to happen, and either society keeps up by updating the education system, or it risks widespread unemployment and unrest.

Reply Score: 3

Alfman Member since:
2011-01-28

kwan_e,

Our culture is openly embracing technology as a way of life. I'll side with you here: for me there's no doubt that kids are smart enough to learn how to program it (given the proper educational foundations, which are by no means a given).

However there are some roadblocks too. Children are being introduced to technology as fashionable bling instead of programmable tools. Worse still, today's popular consumer devices are becoming *less* programmable than their predecessors, which are threatening to displace open computing technologies at home.

Looking past these roadblocks, I have to wonder if there's any need for a significant percentage of the population to know programming. What would that get us? If half the population could program, wouldn't most of them be overqualified for the menial jobs they end up getting? Many of us are already overqualified today, meaning our advanced degrees are not being put to great use.

Reply Score: 2

kwan_e Member since:
2007-02-18

Looking past these roadblocks, I have to wonder if there's any need for a significant percentage of the population to know programming. What would that get us? If half the population could program, wouldn't most of them be overqualified for the menial jobs they end up getting? Many of us are already overqualified today, meaning our advanced degrees are not being put to great use.


I have addressed that problem specifically. Menial jobs are getting automated - slowly for now, but it's happening and can only accelerate.

Warehousing is obviously becoming automated (e.g. Amazon's robotic warehouses with robots zipping around at 40km/h). Industrial manufacturing is getting more automated. Shopping centres are moving towards self-service, and more and more people are just ordering things over the internet (the dotcom dream wasn't dead, just resting). Google is developing driverless automobiles. Roomba. The list goes on.

Today's devices aren't too programmable, but as we can see, things like the iPad and Android are able to make the possibility of programming available to a wider group of people, but that's beside the point. Programming will become a menial job.

I'm not saying the average person will write in Java or C++ or C# or one of the functional languages. There will probably be less powerful, domain-specific languages that are easy enough for programming to be common knowledge, like maths is today.

Reply Score: 2

Alfman Member since:
2011-01-28

kwan_e,

"I have addressed that problem specifically. Menial jobs are getting automated - slowly for now, but it's happening and can only accelerate."

Yes and no. The price of robotics obviously has to continue to drop for them to become more prevalent. In theory we might get rid of most jobs and have robots to do all the work. Some might even consider it a utopia. However if we don't reform our current economic models, it might easily result in mass joblessness. The thing with robots is that production can scale WITHOUT creating enough new jobs to replace those that had been laid off.

For example, a highly successful robotics company might eventually employ 100K engineers to build machines which will do the menial work of 50M people.

There's certainly no need for 50M engineers, and even if we pretend there is, there would not be enough money to pay all of them good wages.


"Today's device aren't too programmable, but as we can see, things like the iPad and Android are able to make the possibility of programming available to a wider group of people but that's beside the point. Programming will become a menial job."

Can the iPad be programmed without a computer?
Can an Android device?

Reply Score: 2

kwan_e Member since:
2007-02-18

kwan_e,

"I have addressed that problem specifically. Menial jobs are getting automated - slowly for now, but it's happening and can only accelerate."

Yes and no. The price of robotics obviously has to continue to drop for them to become more prevalent. In theory we might get rid of most jobs and have robots to do all the work. Some might even consider it a utopia. However if we don't reform our current economic models, it might easily result in mass joblessness. The thing with robots is that production can scale WITHOUT creating enough new jobs to replace those that had been laid off.


You think employers care, or the government cares? They're going to push for this no matter how many people lose those jobs. They'll just redefine unemployment yet again.

Yes, you are right: it's going to require reforming current economic models, especially employment models. But employers don't care. They would always get rid of the human element for cheaper, non-unionized labour if they could. They haven't cared in the past when the higher-ups made a bad decision and covered it up by laying off tens of thousands of low-level workers.

For example, a highly successful robotics company might eventually employ 100K engineers to build machines which will do the menial work of 50M people.

There's certainly no need for 50M engineers, and even if we pretend there is, there would not be enough money to pay all of them good wages.


I think one of the solutions has to be a rotational workforce. We have to be done with the idea that everyone has to have a job every day of the year and that welfare is bad. You can't force people to find jobs that don't exist, and you can't force employers to create jobs when they don't need them or can't afford them.

This leaves us in a situation where the only jobs left are the highly skilled jobs that are too difficult to automate.

I personally don't have a problem with welfare, but a lot of people do, so why not cut people's working year short and have workers essentially do shifts of a few months at a time? They'll still be "earning their keep". Robots aren't going to complain about how they have to work while others are on welfare, are they? ;)

"Today's device aren't too programmable, but as we can see, things like the iPad and Android are able to make the possibility of programming available to a wider group of people but that's beside the point. Programming will become a menial job."

Can the iPad be programmed without a computer?
Can an Android device?


What does that matter? I'm talking about what could potentially happen 50 years in the future. It's obviously part of a trend.

Reply Score: 2

Alfman Member since:
2011-01-28

kwan_e,

"You think employers care, or the government cares? They're going to push for this no matter how many people lose those jobs. They'll just redefine unemployment yet again."

If we're really conceiving of doing away with an employment-based society through obsolescence, then we as a society really should strongly reconsider the very existence of for-profit corporations as well. Because if we really do end up with machines taking the majority of jobs (a bit of a stretch, but I'm willing to roll with it), the means of production will no longer be dependent upon ordinary people as employees, and there'll be no corporate ladders to climb either. You'll either be an owner or you're not, and there will be very few opportunities to transition from one to the other because most people will have nowhere to work. Since work would mostly not exist, working would become something people do for their own pride & entertainment.

Under such circumstances, society would probably be better off transitioning to public ownership, where the technology exists to serve the general public rather than private profit-based interests, which would likely have collapsed into a handful of all-powerful oligopolies.



"I think one of the solutions has to be a rotational workforce. We have to be done with the idea that everyone has to have a job every day of the year and that welfare is bad. You can't force people to find jobs that don't exist, and you can't force employers to create jobs when they don't need them or can't afford them."

That's a logical solution to unemployment, especially considering how employees are working longer hours each year. Within the past decade, US law was changed to specifically exclude IT workers from federal overtime pay requirements so that businesses are legally entitled to demand longer hours from us with zero additional pay (forget time and a half). So we're kind of moving in the opposite direction.

"What does that matter? I'm talking about potential 50 years in the future. It's obviously part of a trend."

I'm a bit confused... it matters because you brought them up as examples of that trend "...the iPad and Android are able to make the possibility of programming available to a wider group of people..." I find them ironic choices for illustrating the point because technology could be less user accessible in the future.


Incidentally, the FSF just sent an email about its campaign to fight restricted boot devices, if anybody's interested:

http://www.fsf.org/campaigns/secure-boot-vs-restricted-boot/2012-ap...

Edited 2012-12-29 04:49 UTC

Reply Score: 2

kwan_e Member since:
2007-02-18

That's a logical solution to unemployment, especially considering how employees are working longer hours each year. Within the past decade, US law was changed to specifically exclude IT workers from federal overtime pay requirements so that businesses are legally entitled to demand longer hours from us with zero additional pay (forget time and a half). So we're kind of moving in the opposite direction.


I think the problem you highlight is actually exacerbated by certain IT jobs being considered above "entry level", if not "elite". IT administration is kind of like the janitorial equivalent in the eyes of the corporate types, but it requires a great amount of training and time. The sooner those IT jobs no longer require university degrees, the better.

With enough momentum, IT jobs can become unionized again. Employers will just have to suck it.

"What does that matter? I'm talking about potential 50 years in the future. It's obviously part of a trend."

I'm a bit confused... it matters because you brought them up as examples of that trend "...the iPad and Android are able to make the possibility of programming available to a wider group of people..." I find them ironic choices for illustrating the point because technology could be less user accessible in the future.


The devices themselves may be less user accessible, but the trend I'm talking about is programming itself being available to people without going to university. As I understand it, the iPad and Android created a market for programmers that doesn't require university degrees or established companies.

Yes, most apps are of poor quality, but it doesn't matter. The opportunity and the market are now there, and no matter how many restrictions are put in place, you can't deny that programming itself is being opened up and your average student will start seeing programming as a required basic skill.

Incidentally, the FSF just sent an email about its campaign to fight restricted boot devices, if anybody's interested:

http://www.fsf.org/campaigns/secure-boot-vs-restricted-boot/2012-ap...


Uh oh, cue the "RMS is a fanatic" slogans.

Reply Score: 2

Alfman Member since:
2011-01-28

kwan_e,

"IT administration is kind of like the janitorial equivalent in the eyes of the corporate types, but it requires a great amount of training and time. The sooner those IT jobs no longer require university degrees, the better."

I'd say that's already the case. When institutions are pumping out so many professional degrees per year, they become requirements for jobs which previously did not require them. Back in the '90s, employers would hire anyone who was able to do IT administration regardless of degrees, since most candidates didn't have one. I believe the higher degree requirements today are a result of supply and demand rather than the increasing difficulty of the work. If the supply were to increase substantially as you predict, then won't most employers just add more requirements to filter them out?



"The devices themselves may be less user accessible, but the trend I'm talking about is programming itself being available to people without going to university."

Ok I see, they created new markets, and hence new openings for programmers.

"Yes, most apps are of poor quality, but it doesn't matter. The opportunity and market is now there, and no matter how many restrictions are put in place, you can't deny that programming itself is being opened up."

I dunno, it's still an incredibly ironic example to me; I'd have picked the Raspberry Pi or its ilk, since it doesn't run a walled garden.

Reply Score: 2

TM99 Member since:
2012-08-26

I think the point is being missed: whether it's a good idea or not, it's probably going to happen.

No, not everyone is going to be a programming genius. That's not my argument at all. My argument is that, for good or bad, basic understanding of programming will be expected. Just like basic maths and basic literacy.


Your idealism just doesn't match with the reality of the last 40 years of computing.

Yes, in my generation, everyone who used a computer had to learn at least the basics of programming or they simply couldn't use the device. This might have carried on into the early 1990s. But then it started changing.

Kids don't need to know, nor will they ever need to know, computer programming. They will learn, as they do now, how to use their devices like a car or an appliance. They will learn how to download songs via iTunes 16. They will learn how to do a PowerPoint presentation in Office 27.

Computing is moving towards greater and greater levels of lock-down and vertical walled gardens where two major companies, Apple & Microsoft, will control the hardware & the content, oh I mean software. Linux, even though I use it and love it, is an afterthought for most people. Android is terrific and can offer a higher level of customization; however, few ever root their devices other than to simply load a game that won't otherwise run on their older model.

As to your points, the level of basic maths is atrocious compared to previous generations, at least in America. In part, this is because of technology. Slide rules gave way to calculators, which have given way to computers. Cashiers rarely calculate change in their heads when their POS tells them exactly how much to give back.

The same holds true for basic literacy. As the son of two university professors of English, I am definitely aware of the changes here. No one writes letters anymore, and rarely do they even write a full email. It is about texting, texting, and more texting. Have you seen the new Shakespeare transliteration done in 'text speak'? Wow, is all I can say, just fucking wow!

Higher levels of automation lead to lower levels of intelligent & creative use of the technology. The same is true for highly specialized technology. When radios ran on tubes, more 'users' could and did fix and augment their devices. As radios became increasingly specialized with ICs and transistors, fewer 'users' could or would even attempt to fix their devices. A great example of this is looking at weaving and textile mills at the birth of the industrial age. Previously, weavers had higher levels of training and education, including extensive internships or apprenticeships. They developed high levels of skill and creativity. Then things became automated. Large mills replaced the small tailors and weavers. Trying to say that the men, women, and children who ran those machines were more intelligent and equally skilled at weaving, sewing, or creating textiles is ridiculous. They simply were not. The same is holding true wherever computers and computing automation have taken over.

Public education, at least in America, is trending downwards in its level of intelligence and academic challenge, not only in the sciences but also the arts. I was an academically gifted high school student. I had math classes available from algebra 1 through algebra 3 & trigonometry through pre-calculus and calculus 1. By the time I reached college, I was ready for higher-level math classes even if they were not a part of my major. I get college interns and graduate students at my workplace today, many of whom never got beyond algebra 2 in high school. Frankly, it shows in their lower level of technical skills and critical thinking compared with those of us from a previous generation who supervise them.

Can society change this downwards trend? Will they? If history teaches us anything, then the answer is usually no.

Reply Score: 3

kwan_e Member since:
2007-02-18

"I think the point is being missed: whether it's a good a idea or not, it's probably going to happen.


Your idealism just doesn't match with the reality of the last 40 years of computing.
"

I posited in my opening line that the point would be missed, and your opening shows you've completely missed it, just as that line continues to predict.

"IT'S GOING TO HAPPEN" is not idealism. At best, it's a prediction, one way or another. A prediction is not an ideal.

We can all be old men decrying falling standards and how the past was better and everything is worse. Strange how the best times coincide with our developmental years, or a short time after, and everything since is the work of the devil...

"Our earth is degenerate in these latter days. There are signs that the world is speedily coming to an end. Bribery and corruption are common."

Reply Score: 1

TM99 Member since:
2012-08-26

Horseshit.

All you can provide is a tired, trite response about 'being old men'.

It just happens that my developing years, as you put it, coincided with the development of computers, which required much more programming knowledge in general from the user, whether we went on to become bankers, professors, or computer scientists. That is not the case today for those in their developmental years. The only ones getting, or even requiring, that kind of knowledge are those who intend to go into the field.

I didn't miss your point.

Your point was a flawed prediction and very much 'idealistic'. It involves ideas that just don't jibe with the reality of the fields you were making predictions about.

I stated that your prediction was wrong. I then produced arguments to back it up. Address those or bow out of the discussion.

Reply Score: 2

kwan_e Member since:
2007-02-18

I then produced arguments to back it up. Address those or bow out of the discussion.


No, you didn't. You said exactly what other commenters have already said, which I have already addressed, and which continues to be ignored.

You have made the exact same flawed point that another commenter already has, over and over again. Don't be so up yourself as to think you had an original point that I didn't already address.

Address my points or piss off.

Reply Score: 1

TM99 Member since:
2012-08-26

You are attempting to mix economic arguments with political idealism, education reform, and predictions about events 50 years into the future concerning programming.

You want to predict the future? Look at the past in that particular field or arena and conservatively estimate probabilities no more than five years out.

Otherwise you are just in a fucking fantasy world.

Obviously you were not trained well in critical thinking or in argumentation as you have not addressed any of the replies that address various flaws in your ideas, your logic, or your arguments.

I did address your points so this discussion with you is at a close. Enjoy your day.

Reply Score: 2

kwan_e Member since:
2007-02-18

Otherwise you are just in a fucking fantasy world.


Grow up. This argument is not that important.

You complain about me not addressing your points, but neither have you addressed mine. You and the other one just throw around words like "naive" or "idealism" or "fantasy" as if they're arguments.

CASE IN POINT:

Admiral Grace Hopper invented the concept of human-readable programming languages when the top scientists at the time were convinced it was impossible.

You and the other one are making the same kind of arguments those old fogeys made. You cannot deny this. Human-readable languages were "naive", "idealistic" or "fantasy".

History is on my side.

Reply Score: 1

TM99 Member since:
2012-08-26

The article is about programming for beginners. You extrapolate that out and 'predict' that within 50 years we will all be programmers even when we aren't computer scientists or IT professionals.

I and others point out that this is a bad prediction, and explain why. You counter with political idealism. You counter with economic pipe dreams, like some Star Trekian utopia where we no longer work jobs or spend money.

Now you bring up a complete non sequitur about some admiral inventing a readable programming language. What does that have to do with the price of tea in China? Nothing. It is only relevant to computer programmers, not to the average worker in other fields, whether professional white-collar types or blue-collar drones.

History is not on your side in this argument about computer programming and the masses. The men and women who invented the foundational languages of programming today are dying off. My generation, which grew up using these languages in order to use our computers, is beginning to exit the workforce. The younger generations are not learning more and more programming languages. They are learning fewer. Only nerds, geeks, and hobbyists are playing with these languages. They aren't being used in fields other than IT and computer science, or only very rarely.

I am not an IT professional. I do, however, know some programming languages and use them daily in my work and teaching. I have graduate students today whom I inform that, if they are serious about doing psychological research, it behooves them to learn the R programming language. Is there excitement? Have any of them even learned a whit about computer programming like I did? No. The usual response is "Is there an app for that for my iPad?"

This is not going to change now that computers have evolved to a point where it is all point-and-click on big shiny pads or little gadgets with a million and one apps for everything from how many times you picked your nose today to a library for your MP3s. My best friend and I wrote our own fucking library application in high school, in Apple Pascal, so we could catalog our LP collection.

Sorry, but that is the reality outside of OSNews and the IT geek and hobbyist worlds for the current generations.

Reply Score: 2

zima Member since:
2005-07-06

Can society change this downwards trend? Will they? If history teaches us anything, then the answer is usually no.

But that's how progress works; it's not a "downward trend". We devise new ways to augment our bodies, including new prostheses of the mind (like books replacing the memorisation of everything), so that some of us can move on to new challenges.

Reply Score: 2

Comment by Luminair
by Luminair on Thu 27th Dec 2012 14:53 UTC
Luminair
Member since:
2007-03-30

Programming for all is a good concept. Inclusion is good, exclusion is bad.

If you want to talk about the future, talk about whether the march of great mature open source software will kill the need for employing so many programmers. If that happens, what next? When good software is unchanging and ubiquitous like the microwaves and toaster oven designs that seem unchanged in ten years, will the new goal of programming be... to make programming different so new types of people can participate? Will everyone use the same twitter client for 50 years like they and their parents ate Cheerios? Or will it be like fashion, and the next great thing for the individual isn't just done by a professional somewhere, but also by little girls after Christmas, thanks to the bedazzler or nail polish patterns someone kindly invented so those other than the professionals could participate in creation too?

Just sayin

Reply Score: 4

RE: Comment by Luminair
by kwan_e on Fri 28th Dec 2012 01:19 UTC in reply to "Comment by Luminair"
kwan_e Member since:
2007-02-18

talk about whether the march of great mature open source software will kill the need for employing so many programmers.


It probably won't. Not because of great mature open source software, anyway. Great mature open source software, at least today, seems to create great business opportunities, and seems to have resulted in more people being employed to program than it has driven out of programming.

If that happens, what next? When good software is unchanging and ubiquitous like the microwaves and toaster oven designs that seem unchanged in ten years, will the new goal of programming be... to make programming different so new types of people can participate?


Yes. People seem to have missed my point on that. For whatever reason (possibly a lack of foresight and imagination), they seem to think I'm arguing that people in the future would need to learn programming at the systems level as a basic requirement.

We've seen how SGML-based and HTML-like languages changed how accessible programming computers is to the average user. Yes, it resulted in Geocities homepages for pets in the beginning, but that would be a ridiculous counter-argument. At the very least, you can't stop people from making what they want.

Will everyone use the same twitter client for 50 years like they and their parents ate Cheerios?


No. People haven't even used the same social networking sites for 10 years.

Or will it be like fashion, and the next great thing for the individual isn't just done by a professional somewhere, but also by little girls after Christmas, thanks to the bedazzler or nail polish patterns someone kindly invented so those other than the professionals could participate in creation too?


Yes. It only seems like a short time ago that even knowing how to use computers was considered nerdy and uncool. Now even your average bimbo has an iPad.

Of course, we also need to remember Admiral Grace Hopper. The computer scientists at the time not only thought that compilers were impossible, they thought opening up computers to a greater audience would RUIN EVERYTHING. What they didn't anticipate was that computers and their programming models would themselves change to adapt to people.

It's funny to see the same macho types today trying to fence off programming for the "elites", not knowing they are just repeating history with their short-sightedness and pretend-OCD.

Just sayin


Nobody is ever "just sayin". If they were, they wouldn't need to end with it.

Just breathin

Reply Score: 3