Functions are powerful building blocks of computer programs. They let you break code down into simpler, more manageable steps, and they let you split programs into reusable parts that can be shared both within a program and across programs. In this article, learn how to create new functions at runtime based on templates, how to create functions that are configurable at runtime using function parameters, and how the Scheme language makes these techniques natural.
Not that the readership here wouldn’t know this already, but other functional languages like ML (in its Standard ML of New Jersey or OCaml variants) and Haskell also offer higher-order functions. Microsoft even has a .NET version of Caml called F#.
I’m glad to see an article on them. Now that I’m used to higher-order functions, I find myself wanting them in whatever language I’m programming in.
F# is still in “research” at Microsoft:
http://research.microsoft.com/projects/ilx/fsharp.aspx
and:
“F# is a research project designed to exploit the potential of the .NET platform to further the long-standing goals of the functional programming community. The aim is to prove that it is feasible and useful to implement ML-like languages for use on the .NET Framework. There are no current plans to commercialize F#”
What a pity …
OTOH, the original OCaml implementation offers very fast natively compiled code (www.ocaml.org).
Functional languages still don’t compete with mainstream 3GLs in adoption; however, they have great potential …
I hope to read more articles on functional programming!
regards,
Steven.
Concur.
Recommend http://lambda-the-ultimate.org/
for good programming-language discussion. Slightly academic, but a good complement to this site, IMHO.
The article doesn’t really give an example of how to build a function at runtime. It only shows how to return pre-configured versions of a function built at compile-time.
To make it clearer, all the examples in the article look like this:
(lambda (param1 param2) (op param1 param2))
The function body is always fixed at compile-time, never determined at run-time in these examples. It is only configured through its enclosing environment. Does anyone know an example where a function is really built at run-time… maybe a function that builds functions? It would look a bit like this (but this code doesn’t work and I’m too illiterate in Scheme to make it work):
(define build-function
  (lambda (params body)
    (lambda params body)))
I may be wrong, but if I understood what you said correctly, perhaps you should work on implementing a layer on top of something like “FastMathParser” from http://www.codeproject.com/cpp if you want it in C++.
In Scheme you can already do that.
“In Scheme you can already do that.”
Does anyone here know a way to actually build new functions at runtime (without cheats like generating C code and then calling gcc, or doing on-the-fly compilation like advanced VMs)?
The article’s examples show how to use closures, and how to use functions in a closure’s local environment to fake a function written from the ground up.
Is there a language where you can actually “merge” sub-function calls to create a function, or a “compose” operator that does actual code rewriting to elaborate composed functions?
Are some Scheme or Lisp compilers able to do such optimisations?
(… well, I’m about to implement this in my own PL, so I’m deeply interested …)
Calling the compiler at runtime isn’t “cheating”, but a prerequisite for creating functions at runtime. The reason it seems like a cheat is that in most mainstream languages, the compiler is divorced from the runtime, and you have to go to awkward lengths to use the compiler at runtime. It also doesn’t help that representing program code in most languages is a painful affair. In Lisp, neither of these things holds true. The compiler is integrated with the runtime, and representing code is easy.
Let me give you an example in Common Lisp:
(defun make-lst (op)
  (list 'lambda '(x) (list op 'x 4)))

(let ((lst (make-lst '+)))
  (compile 'add-four lst)
  (format t "~A" (add-four 4)))
This will result in a function ADD-FOUR that adds 4 to its argument. The key is the COMPILE function, which takes as its parameters a symbol (the name of the new function) and a list (an s-expression specifying the code of the new function). That list can be *any* list, generated however you want. So you could generate the list at runtime, as I have done here. Of course, here I use a very simple list generator that could probably be emulated by closures, but in real life you could use a complex function that, for example, parsed an XML description to generate a function. The nifty thing is that the result will be a full-fledged function, compiled to machine code (if your compiler has that feature). So this technique is usable for stuff like analyzing an input data set and generating a customized inner loop to process it.
I think function building happens when a function refers to local variables outside its own body, for example:
(defun make-adder (x)
  (lambda (y) (+ x y)))
You can also build the function body using operators like cons, and then run eval on it.
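For instance, a minimal sketch in Scheme (assuming an implementation whose eval accepts (interaction-environment), as most do):

(define (build-adder n)
  ;; Assemble a lambda expression as an ordinary list at run time,
  ;; then eval it into a real procedure.
  (eval (list 'lambda '(y) (list '+ n 'y))
        (interaction-environment)))

(define add-four (build-adder 4))
(add-four 3) ; => 7

Here the body list is built fresh on each call, so the code itself, not just a captured environment, is determined at run time.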
“Slightly academic, but a good complement to this site, IMHO.”
Thanks for the URL,
BTW: nothing wrong with being academic 😉
Your example again uses compile-time-fixed code, but the hint about ‘eval’ was also what I had in mind. I suppose generating functions at run-time is really centered around eval, not around lambda (or maybe around that ‘compile’ that Rayiner mentioned).
I’m curious: aside from ‘compile’ or ‘eval’, what other method for generating functions at runtime can you envision? ‘Compile’ really is as general as it gets.
My code is fixed, but it can be run in different “hidden” contexts (different values of “x”) that are constant and local to this function.
The returned function is not just a reference to the code, but actually a _combination_ of code and hidden parameter. This combination is regarded as a new function (another value of the parameter would produce a different function), thus we are building functions at runtime.
If you look at the “Building functions at runtime” part of the article, you will see that it talks exactly about this kind of stuff.
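To make that concrete, a quick sketch in Scheme (reusing the make-adder idea from above):

(define (make-adder x)
  (lambda (y) (+ x y)))

;; One piece of code, two captured environments: each call to
;; make-adder yields a distinct function value.
(define add3 (make-adder 3))
(define add5 (make-adder 5))
(add3 10) ; => 13
(add5 10) ; => 15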
“My other car is a cdr.”
Seriously… Scheme and its ilk are a great academic exercise, but there are numerous good reasons that nearly all practical programming is done in object-oriented languages.
When I was a student at Rice University, there was a prof named Matthias who was obsessed with Scheme and thought the entire world should be using it and that C/C++ was the devil. It never seemed to bother him that there were no good debugging tools for Scheme, that its performance was terrible, and that it couldn’t easily manage the complexity of any task larger than the academic exercises that we were asked to implement with it.
After looking at that URL, do I really need to comment on the post?
Reading some of Paul Graham’s essays might help you understand his obsession:
http://www.paulgraham.com/articles.html
I think (memory is so fragile) that this is the one http://www.paulgraham.com/hundred.html
Seriously… Scheme and its ilk are a great academic exercise, but there are numerous good reasons that nearly all practical programming is done in object-oriented languages.
*Snicker*
It never seemed to bother him that there were no good debugging tools for Scheme
Define “good debugging tool”? An extended version of GDB is a common tool for Scheme implementations that can’t afford any better. World-class Lisp implementations like Allegro have debugging facilities that C/C++ can only dream of. What C++ debugger has restartable exceptions? What C++ debugger allows you to redefine a class and continue where you left off?
that its performance was terrible
I hope this was in the early 1980s, before there were good Scheme compilers. Compilers like Bigloo and SBCL generate code that has excellent performance.
and that it couldn’t easily manage the complexity
This is the only point that even comes close to being accurate, but it’s still deeply misguided. It’s highly likely that the Scheme implementation you used was designed for teaching, not real use. Scheme implementations designed for real use (like Bigloo) have lots of features (IDE, packages, objects, etc.) for working on professional code. Complaining that Scheme isn’t enterprise-ready because you used “My First Scheme Compiler” is like complaining that C isn’t enterprise-ready because you used FBCC (http://fabrice.bellard.free.fr/fbcc.html).
I believe he was talking about Scheme. Allegro and SBCL are Common Lisp implementations, and overall, Common Lisp is a far better language than Scheme. For example, in my 4th-year computer science class we’re using Scheme with an idiotic textbook (“Essentials of Programming Languages”). If I hadn’t already known Common Lisp, the combination of Scheme’s limitations and that book would’ve turned me off Lisp-like languages, for sure.
Seriously… Scheme and its ilk are a great academic exercise, but there are numerous good reasons that nearly all practical programming is done in object-oriented languages.
I’m sorry, but this argument just doesn’t make any sense. Object-oriented programming is actually a subset of programming with higher-order functions. Objects can easily be created in higher-order languages by building a function (or special form, if the language allows them) that maps symbols to functions. In fact, Common Lisp has one of the most powerful object systems around. As for your comment about Lisp languages not being able to handle large problems, there are numerous counterexamples where Lisp allowed companies to outperform their competitors by building enormous applications (for instance Yahoo! Store and Orbitz.com).
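A minimal sketch of that idea in Scheme, using a hypothetical make-counter (assuming an implementation that provides error, as most do):

(define (make-counter)
  ;; State lives in the closure's environment; the returned
  ;; function dispatches on a message symbol.
  (let ((count 0))
    (lambda (msg)
      (case msg
        ((increment!) (set! count (+ count 1)) count)
        ((value) count)
        (else (error "unknown message" msg))))))

(define c (make-counter))
(c 'increment!) ; => 1
(c 'value)      ; => 1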
I’d agree, but it was the “Scheme and its ilk” that signaled to me that he meant Lisp too. At the end of the day, the point remains that not all Schemes are created equal, and it’s silly to judge Scheme based on a teaching implementation.
Don’t feed the trolls. CL vs Scheme is an interminable flamewar, EoPL is a very nice book, and if people really cared about speed then Perl and Python would never have become popular.
Is on-the-fly compilation cheating?
Eval and compile are available both in Scheme and in scripting languages (Python, …).
The ‘special’ syntax of Scheme/Lisp allows both easier run-time generation of arbitrary source code and an easier implementation of the eval function.
I’m not sure it is practical to make intensive use of compile in an infix-syntax language with a big compiler overhead (both for parsing and for generating code; I know there are hacks in Python to generate a parse tree).
In my own language, I will implement metaprogramming by on-the-fly transformation of the VM code, to implement things like compose operators ( f ^ g -> f(g(x)) ), currying, and merging of sub-function calls. I think that is sufficient for any kind of metaprogramming without being bothered with the parse tree or calling the compiler.
Some of these transformations could be made automatically by a clever optimising VM.
“It never seemed to bother him that there were no good debugging tools for Scheme, that its performance was terrible,”
I don’t know about Scheme (though Bigloo must be an option).
Anyway: have a look at Objective Caml (www.ocaml.org), which performs very well (natively compiled).
Ondrej:
> My code is fixed, but it can be run in different “hidden”
> contexts (different values of “x”) that are constant and
> local to this function.
>
> The returned function is not just reference to the code, but
> actually _combination_ of code and hidden parameter. This
> combination is regarded as new function (another value of
> the parameter would produce different function), thus we are
> building functions at runtime.
Yes, it seems I confused building functions with building code at runtime.
Rayiner:
> I’m curious: aside from ‘compile’ or ‘eval’, what other
> method for generating functions at runtime can you
> envision? ‘Compile’ really is as general as it gets.
I was arguing the same thing: That ‘compile’ and ‘eval’ really build code at run-time (thus provide full flexibility), while ‘lambda’ is only a run-time hook for static code.
In my own language, I will implement metaprogramming by on-the-fly transformation of the VM code, to implement things like compose operators ( f ^ g -> f(g(x)) ), currying, and merging of sub-function calls. I think that is sufficient for any kind of metaprogramming without being bothered with the parse tree or calling the compiler.
I’m curious what you mean by “merging of sub-function calls”. Do you mean on-the-fly inlining? E.g., if f(x) = x + 2 and g(x) = x * 2, then f^g(x) = x*2 + 2? Curry and compose can be powerful, but they are not a fully general replacement for COMPILE, except in the case where everything in your language can be represented as a function call. In that case, however, you end up with a tree of sub-functions, which is equivalent to an s-expression.
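For reference, a basic compose in Scheme (just a sketch; it builds the closure, it does not inline anything):

(define (compose f g)
  ;; Returns a new function computing f(g(x)).
  (lambda (x) (f (g x))))

(define f (lambda (x) (+ x 2)))
(define g (lambda (x) (* x 2)))
((compose f g) 5) ; => 12, i.e. (5*2)+2

The result is a genuine new function value, but its body is fixed code that merely calls the captured f and g; no code rewriting has happened.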
Here is the kind of thing I’d like to do (with a somewhat different syntax).
If you want to build inlined power(n) functions:
F.merge_sub_calls inlines calls to functions accessed in the enclosing “local” context without altering the calls to object member functions. Many other tricks could be done.
local Power := lambda(n) {
  if (n=1) return lambda(x) { return x };
  local I;
  local Z := lambda(x) { return x*x };
  for (I:=2; I<n; I++) {
    Z := lambda(x) { return x*Z(x) };
    Z := Z.merge_sub_calls();
  }
  return Z;
}
Square:=Power(2);
Cubic :=Power(3);
Cubic(3) shall yield 27, from an inlined function whose bytecode should resemble “lambda(x){ return x*x*x; }”.
For implementing some generics, customized functions can be generated on the fly along with the (run-time) generation of classes.
I have already implemented “string.eval” and “string.compile”, but I’m in no way satisfied with that way of assembling bits of source code in strings; it seems as dumb as the C preprocessor.
Everything is a function call, except program flow statements (do, if, for, return, …), which can be transformed into function calls (à la Smalltalk):
FUNCTION_IFELSE ::= lambda(x,y,z) { if (x) y() else z(); } and then inlined back with the merge_sub tool.
A VM code optimiser can transform the code but usually calls to external functions cannot be optimised out.
Such VM code transformations could probably be implemented in languages like Python, Ruby, …
( Of course, the Lisp way is the most elegant for such things, but, well, preferring infix syntax to S-exprs is not a lethal sin )
Maybe I’ve missed an important problem?
Ah, I see what you’re saying. There isn’t a problem, I was just pointing out that merge_sub_calls() ends up being equivalent to sexprs. The power example you mentioned above could be transformed into a DO loop appending * elements to a list, and then a COMPILE on the resulting list. In both cases, the compiler ends up with a tree of function calls from which code is generated. I can understand the affinity for infix syntax, though. You might be interested in checking out this paper: http://people.csail.mit.edu/people/jrb/Projects/dexprs.pdf
The paper discusses implementing Lisp-style macros in an infix language, and comes up with the idea of using ‘code shapes’ to do it. It might give you some ideas for syntactic sugar to make it easier to represent more complex functions.
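To make the loop-plus-COMPILE idea above concrete, here is a sketch in Scheme, using eval in place of Common Lisp’s COMPILE (assuming interaction-environment is available):

(define (make-power n)
  ;; Build the body (* x (* x ... x)) in a loop, then turn the
  ;; assembled list into a real procedure.
  (let loop ((i 1) (body 'x))
    (if (>= i n)
        (eval (list 'lambda '(x) body)
              (interaction-environment))
        (loop (+ i 1) (list '* 'x body)))))

(define cubic (make-power 3))
(cubic 3) ; => 27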
Languages like MetaOCaml and MetaML can provide runtime code generation (and they’re safely typed, if you like that sort of thing) by annotating the code with staging constructs.
An example using MetaOCaml, the power function:
let rec pow x n =
  match compare n 0 with
    -1 -> invalid_arg "negative power in pow"
  | 0 -> 1
  | _ -> if n mod 2 = 0 then
           (pow (x*x) (n/2))
         else
           (x * pow x (n-1))
Now add staging constructs, and modify the code a bit to express domain specific knowledge (in this simple case, using (let y = x * x in y * y) instead of (x * x) * (x * x)).
let pow' x n =
  let rec staged x x' n =
    match compare n 0 with
      -1 -> invalid_arg "negative power in pow"
    | 0 -> x'
    | _ -> if n mod 2 = 0 then
             .<let y = .~x * .~x in
               .~(staged .<y>. x' (n/2))>.
           else
             .<.~(staged x .<.~x * .~x'>. (n-1))>. in
  staged x .<1>. n

let power15 = .<fun x -> .~(pow' .<x>. 15)>.

let _ = Printf.printf "%d\n" ((.! power15) 2)
Run that, and power15 ends up as this code value:
val power15 : ('a, int -> int) code =
  .<fun x_2 ->
      let y_3 = (x_2 * x_2) in
      let y_4 = (y_3 * y_3) in
      let y_5 = (y_4 * y_4) in (y_5 * (y_4 * (y_3 * (x_2 * 1))))>.
Then just use the run (.!) staging construct to get a function, pass in 2 and voila! 32768. What could be simpler?
Thank you for these enriching comments and advice.
I think that doing ‘my’ sort of VM code manipulation could be interesting, but I’m not sure precisely what each metaprogramming system call should do, nor what kind of library can be written on top of them.
I’m still wondering whether Macroprogramming is as useful as Metaprogramming (or can they be the same thing?).
I’d like to do stealth metaprogramming (!), that is, the software continuously generates functions, rather than having a powerful macro system which is used only at “compile” time (which doesn’t mean anything, of course, for the Lisp family).
As I understand the problem, implementing generic function customisation as Macroprogramming would generate functions when the SW is compiled (C++ templates) or during startup. Using Metaprogramming, the customised functions can be generated at run time (C# style?).
Of course, my example was silly [partly] because generating that kind of thing is more a Macroprogramming task than a Metaprogramming factory.
(The ultimate goal of SW development is to make the computer program itself 😉)
( BTW, sorry for my poor English )