InfoWorld’s Peter Wayner reports on once-niche programming languages gaining mindshare among enterprise developers for their unique ability to solve increasingly common problems. From Python to R to Erlang, each is increasingly viewed as an essential tool for prototyping on the Web, hacking big data sets, quick predictive modeling, powering NoSQL experiments, and unlocking the massive parallelism of today’s GPUs.
The language that exists between /* and */ (or whatever denotes a comment for your favorite language)
Oh, I don’t know… it’s certainly possible to overdo the comments. Used to know a guy who’d average two lines of comment to every one line of code. That might not sound like a bad thing, but it gets to the point where you can’t see the code for the comments…
True, verbosity in any language is not good – a few good comments definitely beat a pile of obvious ones (like, getFirstName function returns the first name…)
What we really need in computer programming is a bit more science and a LOT less marketing these days. We need folks with common sense, the ability to stand up and call bulls*** to all the crap folks pull in meetings and political agenda moves in projects. The language matters, sure. But the first language to master is the one vocalized between members of a team, the customers, and of course the documentation folks. Compiling, ya right, how about ‘lost in translation’.
Aah, R. It’s a very useful language for statistical analysis and the like, and has some downright neat language features for messing around with data from CSV files. Annoyingly, it also has some of the least friendly data type handling I’ve dealt with recently.
(What did you say that function wanted again? A list of data frames containing matrices? Sure, why not. AARGH. And finding out exactly what you have is a bit more ambiguous than I prefer it.)
Edited 2010-10-25 22:07 UTC
R is pretty nice, you can really produce some great quality plots.
A bit surprised that Haskell wasn’t included.
Haskell has excellent support for things like parallel-processing. Add to that the elegance of the language and libraries like Parsec (for parsing), and it’s no surprise to see the strongly-increasing interest in Haskell.
I’ve used a very wide variety of languages: Fortran, Lisp, Mathematica, C/C++/Objective-C, Smalltalk, C#/Java, Python and I can tell you without question, the ugliest “language” I’ve ever encountered is matlab.
Currently, I’m forced to use matlab because my advisor has a bunch of code written in it.
The reason I put “language” in quotes is that I’m not even sure it’s valid to call matlab a language.
This language really shows its history: in the late ’70s, Cleve Moler (then a professor at the University of New Mexico) thought fortran was too hard for his students, so he wrote a simple wrapper that gave them access to LINPACK and EISPACK. Matlab is about the most schizophrenic “language” out there. They took some of the worst ideas from C and grafted array syntax on top. The “language” is also extremely limiting: no higher-order functions, every function needs to be in a separate file, and it’s freaking nasty to interface with C.
The thing is, there is absolutely nothing you can do in matlab that you can’t do a 1000x cleaner in Python.
And on top of all this, matlab is like $2000 a freaking seat!!!
Oh what I wouldn’t give to be able to use Python instead of matlab.
“The thing is, there is absolutely nothing you can do in matlab that you can’t do a 1000x cleaner in Python. ”
Good luck replicating the matlab toolboxes in Python…
BTW, for academic licenses, Matlab is ridiculously cheap.
Technically you’re correct: MATLAB is not a general-purpose language. But for its target applications, especially prototyping and algorithmic proof-of-concept work, MATLAB is bar none. Honestly, it sounds like you’re suffering from “jack of all trades, master of none” syndrome.
scipy is getting there slowly but surely:
http://www.scipy.org
Python is definitely getting there, if not there already. The problem is convincing people like your advisor, when the only “language” they know is matlab.
I use scilab. Why is it not included?
Not really, I just think matlab is a fugly language to work in. R, on the other hand, has a clean design – a nice language. Mathematica is truly a work of art (yes, I have written very very complex things in Mathematica). My couple of gripes with Mathematica are the licensing and the fact that it is cumbersome to run without the GUI.
Python is a nice clean and fairly complete language, it is really easy to express complex concepts in it. Matlab, unless you have something that can be beaten into a linear system, forget it.
” Matlab, unless you have something that can be beaten into a linear system, forget it.”
Huh? Apparently the MATrix in MATLAB was not clue enough?
Honestly, I have yet to find a system that does what MATLAB does for its intended applications, period. Your gripes seem to be based around purely qualitative arguments, which are a minefield when it comes to discussing technical issues.
Go and try to build a prototype for a control system gathering data from multiple sensors for publication in Python, and then get back to me… 🙂
MATLAB makes sense for people who are interested in getting results quickly, if you can afford to build everything from scratch… sure Python may be a good alternative. But you might as well do it in C.
Maybe the OP’s application is not one of the intended ones, who knows? Maybe his advisor has only Matlab as a hammer, so uses it for every nail? We all probably know somebody who does ridiculous things in Excel because they don’t know about sed or a programming language.
I don’t believe that you can effectively discuss anything in a sufficiently complex technical system without using qualitative arguments, for the same reason I don’t believe command economies work: imperfect people, imperfect information. The machine to human brain interface is qualitative.
Question I have for you is what in the freaking world does your advisor have over you to get you to do matlab work for him/her? Don’t you have standards? Are you really going to be able to stand yourself at the end of your degree (oh, wait, maybe this is it, your advisor has a degree over your head?) knowing you spent most of your brilliance using matlab, maintaining someone else’s code? Sheeze, this is frightening to hear. Get a life, get another advisor, get someone that is actually going to help you with a job in a market that is ever shrinking (even more so regarding matlab).
I fully understand the supervisor!!!
The supervisor obviously doesn’t have a degree in programming and doesn’t want to be a programmer!!! Matlab is just a tool he needs to solve his problems (math, data compression, etc.), for example solving a set of differential equations. If someone values programming skills that much but isn’t interested in solving that set of differential equations faster and better (in the mathematical sense, not the better-OO sense), then that person is obviously in the wrong place.
Also, someone trying to come up with a better way of solving some math (or other) problem usually isn’t interested in writing maintainable, readable code that might be sold one day. It’s like building a Formula 1 car <http://www.formula1.com>: it’s needed for only one or two races, and afterwards another car is rebuilt from scratch. A race car is never intended to go into production and be sold in millions of units to the general public.
I love Python and I write a lot of Python code, but I write only Matlab code for solving my statistical modeling problems, because Python+SciPy+NumPy+etc. are just not good enough for serious math. There are a lot of Matlab toolboxes that simply do not exist in Python. Also, matrices in Matlab behave exactly as I would expect from my math classes, not from my programming classes.
Bullshit. It may have its flaws (1-based arrays.. wtf?), but matlab really has no competitors for matrix manipulation.
And you’re just plain wrong about higher order functions and putting every function in a separate file (although that is true if you want it to be a public function).
I think you just don’t know enough matlab.
Try doing something like this in python. I guarantee it will take more than 4 lines.
A = rand(5); % Create a random 5×5 matrix.
B = A * A.’; % Multiply by its transpose.
C = sum(B, 2); % Sum rows.
D = (A * diag(C)).^2 % Multiply each column by the row sum, then square each element.
I really wish it had 0-based arrays though. The number of times I have to do mod(x-1, N)+1 is silly.
import numpy as np
a = np.random.rand(5, 5)          # rand(5): a random 5x5 matrix
b = np.dot(a, a.T)                # B = A * A.'
c = b.sum(axis=1)                 # sum(B, 2): row sums (B is symmetric, so the axis doesn’t actually matter here)
d = np.dot(a, np.diag(c)) ** 2    # (A * diag(C)).^2
OK, so four lines, plus one import.
Note to prospective hecklers – I learned numpy 15 minutes ago and haven’t thought about linear algebra in years. I am sure there is a better way to do this.
“I’ve used a very wide variety of languages: Fortran, Lisp, Mathematica, C/C++/Objective-C, Smalltalk, C#/Java, Python and I can tell you without question, the ugliest “language” I’ve ever encountered is matlab. “
The aesthetic qualities of a language are of course entirely subjective. But I think you’d be very much in a minority if you claim Mathematica is a less “ugly” language than Matlab.
“The reason I put “language” in quotes is I’m not even sure its valid calling matlab a language. “
Why wouldn’t it be? Matlab makes a very good stab at being both a procedural and object-oriented language. The OOP capabilities in particular have been great ever since R2008a.
“Matlab is about the most schizophrenic “languages” out there. They took some of the worst ideas from C and grafted array syntax on top of it.”
What, pray tell, would those worst ideas be?
“The “language” is also so limiting, no higher order functions, every function needs to be in a separate file, freaking nasty to interface with C. “
If you’re having difficulty interoperating with C from Matlab, you’re doing it wrong. Same goes for higher-order functions. Try ‘doc feval’ for hints.
“The thing is, there is absolutely nothing you can do in matlab that you can’t do a 1000x cleaner in Python. “
Of course there is. The real value in Matlab lies in its toolboxes and its ability to leverage existing C/C++/Fortran code in a numerical environment. Good luck trying to implement something such as, say, the statistics/bioinformatics/financial/wavelets/etc toolboxes in Python with anything like the performance and clarity you’ll get from Matlab. Ditto for any interfaces to libraries like NAG, which are of absolutely critical importance to most of the people who use Matlab.
Python is a nice language. It definitely has its uses. But despite what some claim, things like NumPy/SciPy are not (and in all likelihood never will be) an acceptable replacement for Matlab in anything other than hobbyist projects. Who wants to dick about with the horror that is NumPy arrays when you can code the same thing more quickly, easily, and with higher performance in Matlab?
While I think all of them are pretty nice, I also think they have all had their shiny moments and already reached their peak. All of them still have some rough edges that will most likely vanish if steady development continues. Maybe except for Python, but not if one thinks about Python 3.
Would be nice if they could build a common platform, so they could join forces/manpower in some areas. Something like Parrot.
It’s certainly odd, their idea of “niche languages on the rise”. Python and Ruby have been around for ages, and are pretty well mainstream these days. Matlab and R – they’re a little more ‘niche’ in the sense that they’re fairly specialised, but they’re pretty much standard tools for those who work in that area. And COBOL being on the rise, really? No question, there’s still demand for it as a skill, but it’s hardly a major growth area.
On the whole list, the only OO language is Python. And to tell the truth, Python is loved mostly by people who tend to program in an imperative style. Most pieces of python code could easily be confused with (late-eighties / early-nineties) basic.
I have always strongly disliked OO, especially fundamentalist OO (everything must be part of an object – java). I was losing all hope of seeing other areas of programming move forward; the OO black hole seemed to eat everything… but in the last few years functional languages (Erlang, F# (OCaml), Haskell) have started to make some noise. Other trendy languages, like Python or Lua, are also more permissive and interesting on the non-OO front.
Let’s hope this trend continues, and we see real improvements in areas like parallelization, expressiveness, optimization and debugging.
Agreed — imperative programming is much more natural than OOP for data reduction. Data doesn’t “do” anything, stuff gets done to it.
Not sure about functional programming, as I’m not sure if every data reduction can simply be expressed as a transformation. And then, you have tasks like plotting/visualization which would be cumbersome to handle as “side effects.”
A reduction is a transformation. There are standard functions in most functional languages for data reduction. I find functional languages much better suited to complex data handling than imperative languages.
Some functional languages (e.g., SML, F# and Scala) treat i/o such as graphics and file access as side effects, so they work pretty much as they do in imperative languages. Other languages (like Haskell and Clean) are purely functional and handle i/o by forcing sequence on i/o operations through monads or linear types. These take some getting used to if you come from an imperative world. IIRC, Erlang uses CSP-style message passing for i/o. This, IMO, is a very natural way of doing it.
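For what it’s worth, the reduction-as-transformation point is easy to show even in plain Python (a toy sketch with made-up data): a fold combines elements with a pure function, and nothing is ever mutated along the way.

```python
from functools import reduce

readings = [3.2, 4.1, 2.7, 5.0]

# A reduction is a fold: combine elements with a pure two-argument function.
total = reduce(lambda acc, x: acc + x, readings, 0.0)

# Transformations compose without side effects; the reduction comes last.
rms = (reduce(lambda acc, x: acc + x, (r * r for r in readings), 0.0)
       / len(readings)) ** 0.5
```

The same shape works for any data reduction: map/filter to transform, then fold to collapse.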
OOP is imperative programming.
Is it? Several languages describe themselves as “functional and OO”, and there are even a few ones who embrace “declarative and OO” at the same time.
I guess that when I think of imperative programming, I am really thinking of languages that allow applying different algorithms to several datatypes. Like “save this table to this file” or “sort using the second field of the table”, instead of having a specific method for each datatype, repeated/inherited ad infinitum.
OO is all about having a bunch of objects communicating with each other and mutating their states accordingly. This is strictly an imperative model.
On top of that basic concept you can have techniques for data abstraction and encapsulation, code reuse, etc. They can be applicable to functional programming, and that’s probably what the authors of these languages had in mind.
BTW, OO is about “nouns”; it is perfect for modeling complex data but less so for composing algorithms (you can black-box them, but composing is difficult: it requires exposing fragile, mutation-based interfaces). OO programs get a bit unwieldy when there is a lot of interaction (especially concurrent) between objects. Consider data mutation the modern world’s “goto” problem – it’s now the main factor limiting the scalability of large systems.
The solution is to remove or reduce mutation by either employing a specialized framework (SQL, Rails, QGraphicsView etc.) that does the dirty work behind the scenes, or to employ a language that promotes the use of immutable data (like Haskell or Clojure) and composability of algorithms in general. There is no language that could replace C++ or Java now but it’s worth watching this space.
You are correct.
Perhaps I should have said procedural programming?
The downfall of OO is really about the need for an indicative mood in computer languages. OO tried but failed. Declarative languages also had the idea in their head but also failed.
Maybe I’m opening a can of worms here, but what exactly do you mean by ‘indicative mood’?
Ruby and JavaScript are certainly both OO. And there is even an OO version of COBOL, I believe, although I’ve no idea if anyone actually uses it.
I don’t think OO is dead at all. Good luck writing GUI applications with a non-OO toolkit.
Of course, Javascript, Python, Ruby and nearly all languages are still OO. I was talking about not being “strict OO” like Java.
Python is used for imperative-style programs very often. The same is true for Javascript; in fact it’s even more common there.
Most of the Javascript code I have seen looks like an amateur basic program made in the 80’s. I really think the popularity of Javascript comes from that aspect of the language: it’s so easy to produce a snippet of code that does something useful… without having to model a set of object abstractions unrelated to the task at hand.
BTW, I never said OO is dead. I am not so optimistic!
Object Oriented languages are almost always imperative…
I know of only a small handful of functional languages with object-oriented features. In fact, one of them is javascript, which is equally an object-oriented, an imperative, and a higher-order functional programming language.
I forgot about the GUI comment.
When I started programming, few people had access to OO languages or even books about that thing.
All the GUIs available for the home user were programmed in procedural/imperative style, most of them in assembler.
And you did not need any “luck” to do a GUI system. You had to be smart enough to abstract a lot, but the same is true when trying to do so in OO style.
These days I keep remembering a little generic GUI that was released for the C64 (not GEOS) and probably took only a few KB of ROM. It was included in the Final Cartridge 3, a pirating tool for software “backup”:
http://www.youtube.com/watch?v=ZrAh-hBSBA8
There is a window example at 2:00, and a text editor with proportional fonts at 1:15 (it was more powerful than it looks in the video).
Grab the old ‘software tools’ book and find a recent implementation of RatFor, and go to town. Man, if you want to live in the code, and not the data then do something functional and reusable like RatFor.
“not rising” does not only imply “falling”. “Not rising” may also be “stagnating”.
Lets hope this trend away from OO languages continues indeed.
I’m not saying OO is evil, there are some domains where it makes a lot of sense, perhaps business applications, possibly operating systems.
But there are a lot of domains where OO makes no sense. I’m a physicist, and most physical simulations, at least to me, make a lot more sense expressed in a functional manner, with languages like Haskell or Mathematica, and to a certain degree Python (Python does have some functional features, enough to make it useful IMO).
Basically, some things I like in a high-level language are easy binding to C/C++ for the numerically intensive bits, and an expressive language with functional features. Matlab is a nightmare to interface with C, and Mathematica is not much better. Python and Haskell, on the other hand, are very easy to interface with C.
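As a rough illustration of that functional flavor in Python (a toy Euler integrator for a harmonic oscillator; the function names are invented for the example), the time step is a pure function from state to state, which makes it easy to test in isolation or to hand the inner loop off to C later:

```python
def euler_step(state, dt=0.01, k=1.0):
    """Pure time step for a unit-mass harmonic oscillator: old state in, new state out."""
    x, v = state
    return (x + v * dt, v - k * x * dt)

def simulate(state, steps, dt=0.01):
    # Iterating a pure function; no global state is touched anywhere.
    for _ in range(steps):
        state = euler_step(state, dt)
    return state

x, v = simulate((1.0, 0.0), steps=100)   # integrate to t = 1.0
```

Because `euler_step` has no side effects, swapping it for a Runge-Kutta step (or a C implementation) changes one function and nothing else.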
Um, no, Ruby is OO.
Perhaps the problem isn’t so much OOP as its evangelists. There’s no problem with everything being an object, so much as being forced to write code in what people consider “proper” OOP style (often in a high-mutation style, where accessors are mandatory even for simple data structures, like linked list items).
There’s no reason why OOP can’t be used in a functional style (immutable objects) or work nicely with other styles of coding (smalltalk, for example, is a lot more “functional” than python). Languages like F# and Scala show that these approaches don’t have to be at odds with fp.
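A quick Python sketch of that “OOP in a functional style” point (the class and method names are made up for the example): the object is immutable, and “modifying” it just returns a new one.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Point:
    x: float
    y: float

    def moved(self, dx, dy):
        # No mutation: build and return a fresh Point, leaving self untouched.
        return replace(self, x=self.x + dx, y=self.y + dy)

p = Point(1.0, 2.0)
q = p.moved(0.5, -1.0)   # p is still Point(1.0, 2.0)
```

No accessors are mandated anywhere, and trying `p.x = 5.0` raises an error, which is exactly the guarantee functional code wants.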
That’s not to say that OOP is worthless. http://www.cs.utexas.edu/~wcook/Drafts/2009/essay.pdf
is a good essay on the comparison of OOP with more traditional abstraction techniques, although it perhaps requires a bit of functional programming knowledge.
I never really understood what “everything is an object” means. In fact, I share with Stepanov the opinion that it means nothing at all.
I can understand, however, “everything must be declared inside the scope of a class, and used through instanced objects”. And I don’t like it. It’s unproductive, and push people to produce inefficient, long and brittle code.
If I want (for example) to implement an algorithm that takes a raw data set and produces a quantized result, why should I wrap it in a class? And why should it work only on a few datatypes/classes, which must make explicit reference to it through inheritance?
I would like to see programming languages advancing on that kind of task, producing generic code that molds itself to different data instead of the class/inheritance mess.
But please not in the path of the C++ templates; I mean solutions that can be read/used/compiled by humans.
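To make that concrete, here’s a toy Python sketch (the quantizer and its step size are invented for the example): one generic function that accepts any iterable of numbers, with no class to subclass and no interface to declare.

```python
def quantize(data, step=0.5):
    """Round every value to the nearest multiple of `step`; works on any iterable."""
    return [round(x / step) * step for x in data]

quantize([0.1, 0.6, 1.3])            # a list
quantize((0.1, 0.6, 1.3))            # a tuple
quantize(x / 10 for x in range(4))   # even a generator
```

The caller’s datatypes never “make explicit reference” to the algorithm; they only have to be iterable, which is the opposite of the inheritance approach.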
I view “everything is an object” to mean that everything (or at least most things) is first-class: functions, methods, classes, etc. can all be passed around and thus used much like “normal” objects. Smalltalk/ruby especially exemplify this; Java not so much (no first-class functions, only reflection hacks).
I agree that not everything needs to be done through instanced objects, etc. However, it’s nice to have OOP or something similar (haskell typeclass, etc.) when you do want it.
Inheritance has its ugly parts, especially in Java, but I don’t view it as fundamental to OOP (or even classes, really). As Alan Kay said, it’s all about the messages; OOP allows an object to be used elegantly without regards to its actual implementation.
The author probably wanted to stress the growing importance of scripting and domain-specific languages. Fine with that.
But why wrap this simple statement in a “N things you must know before you die” template? The selection is far from complete, unbiased, or even interesting.
I used to write in COBOL. 20 years ago. Our backup library management software was written in COBOL, as was our course management software.
Most of those seem right to me. I don’t know much about R except that it was an offshoot of S … I thought it was heavily biased toward math…
Right now our shop develops mostly in Java, some Ruby, a little smattering of Python and PHP. Primarily Java.
So many lost languages that were very nice to program in that I miss: BLISS, Ada, even Pascal and Fortran. Some BASIC languages (compiled) were nice as well.
My current, favorite languages are Python and Groovy. I am still waiting on Perl 6.
Are we sure that Ada is “lost”? True, it has always had its niche and never turned mainstream but maybe counting it among the lost languages is a little exaggerated?
Also, syntactically it spawned a descendant in VHDL. But unfortunately VHDL is generally less popular than the more “c-like” and more fractured Verilog.
I have heard that Verilog’s popularity is mostly in the US, and even there VHDL has some popularity in universities. Correct me if I’m wrong (this not being first hand experience).
That’s probably quite accurate. I was taught VHDL at uni. I have only ever looked at one or two projects in Verilog, but on the net most available sources tend to be Verilog.
True… well lost to me anyway. I’d love to have a reason to use it again.
Funny. “From Ruby to Erlang” becomes: “From Python to R to Erlang”. Some anti-Ruby bias here
“Programming languages on the rise: COBOL”
I have always feared the 21st century was going to be like that
*sigh*
I was surprised to see COBOL. While it’s true that people have been saying it’s dead for a while and it is still with us (albeit in a slowly decaying existence), I seriously doubt it is “on the rise”.
There are errors and misrepresentations in the article that wouldn’t be there if he had even wikipedia’d some of the topics. Also, half the technologies are pretty much where they were last year, the other half are on the decline if anything.
Where is clojure? scala? lua?
Where is http://cobra-language.com/ ?