Originally published in Queue vol. 9, no. 9

Robert C. Seacord - Uninitialized Reads
Understanding the proposed revisions to the C language

Carlos Baquero, Nuno Preguiça - Why Logical Clocks are Easy
Sometimes all you need is the right language.

Erik Meijer, Kevin Millikin, Gilad Bracha - Spicing Up Dart with Side Effects
A set of extensions to the Dart programming language, designed to support asynchrony and generator functions

Dave Long - META II: Digital Vellum in the Digital Scriptorium
Revisiting Schorre's 1962 compiler-compiler


(newest first)

Displaying the 10 most recent comments.

Bilyan Borisov | Fri, 24 Jan 2014 15:01:30 UTC

Great article! In my opinion, most of the critics of OCaml in the comments have overlooked the fact that the language can be compiled not only to bytecode but also to native code. In the long run, a compiled application will generally be faster than an equivalent Python, Ruby, or Perl script.

Conciseness aside (even though I totally agree that functional languages, with OCaml as an example, are much more succinct than Java/C++), static typing and type inference are enormously beneficial even for small and moderately sized projects. Again, this is all personal preference, but a dynamically typed language will 'let you do nasty things' like:

>>> 4 == None
False

which evaluates to False in Python. Since this is syntactically valid, it gives rise to a ton of bugs that only semantic testing can catch.
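The pitfall is easy to reproduce. Here is a small Python sketch (the `lookup_age` function and its data are illustrative, not from the article) showing how a `None` that leaks into an `==` comparison fails silently instead of loudly:

```python
def lookup_age(records, name):
    # dict.get returns None when the key is missing --
    # no error is raised, the absence just flows onward.
    return records.get(name)

records = {"ada": 36}

# The missing key yields None, and comparing None with an int
# silently evaluates to False instead of raising a type error.
age = lookup_age(records, "grace")
print(age == 36)   # False -- indistinguishable from a genuine mismatch
print(4 == None)   # False -- syntactically fine, semantically suspect
```

A static checker such as mypy would type this function as returning `Optional[int]`, forcing the caller to handle the `None` case before the comparison rather than after the bug surfaces.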

Therefore, even if we assume that Python is as succinct as OCaml (or any other statically typed, compiled functional language), it will always be slower and will leave a larger margin for introducing bugs in huge codebases. However, as the author suggests, Python does have one very serious advantage, namely its extensive set of libraries, though I personally think they give the inexperienced programmer a false sense of awesomeness, since you get the illusion of being able to do a lot without much effort.

At the end of the day, languages are still tools, and we still love them and defend them dearly, so I guess anything goes in the 'which programming language is the best' discussion.

Kevin Kinnell | Thu, 11 Oct 2012 19:59:40 UTC

"First class" is a CS definition, and according to that definition CL has first-class functions. I got it wrong because I didn't really say what I meant.

I should have said "easily usable first-class functions."

Sorry about that.

Jesse Talbutt | Sun, 19 Aug 2012 16:10:16 UTC

Unless the robot in question is the one in the first scene of Robocop. Then I'd probably use a more strongly typed language.

Jesse Talbutt | Sun, 19 Aug 2012 15:59:42 UTC

Common LISP does, in fact, have first-class functions. It's one of the sine qua nons of the language; I'm not sure how anyone could get that wrong.

The main difference between LISP and Scheme is that LISP uses separate namespaces for its functions and its variables, an abstraction that's both incredibly powerful and incredibly dangerous: all it takes is one "clever" code-writer using that feature to crack open a closure and you get unpredictable side effects. I read this article wondering "if they're going to use OCaml, why not just go whole hog and use LISP," but I see the issue now: OCaml is like a LISP where you sacrifice power that you'll probably never use in exchange for safety, likely at a premium when you're dealing with large financial transactions in real time instead of, say, creating a natural-language DSL for robots.

Kevin Kinnell | Mon, 11 Jun 2012 23:17:33 UTC

Drat. I meant "last few paragraphs." C'est l'écriture.

Kevin Kinnell | Mon, 11 Jun 2012 23:03:35 UTC

The readability of any code is related to knowledge of 1) the purpose of the code, 2) the language the code is written in, and 3) -- and please pardon the pun -- the knowledge of the coder.

One can argue that literate programming obviates (1) but what it actually seems to do in practice is to add a highly-technical natural-language narrative that wrecks the flow of the program. It becomes something like reading a legal brief. Fine -- if you're a lawyer, with extensive knowledge of the arcane (meta) language of law. My apologies, Dr. Knuth, but there it is. The actual cure for (1) is a combination of a good description of the purpose of the code, and good comments.

(3) is more or less a combination of innate ability and experience, whether we're talking about the writer or the reader of the code. I'd bet NO CODE reads like a natural language, but certainly there isn't any that reads like English. If you think it does, it's because you have ability and experience. Arguably, the process of "speaking" in a programming language seems to mirror the process of learning a natural language. In particular, the whole gamut of "native speakers" is there -- you can have babies talking to babies, babies talking to adults, well educated adults talking to less educated adults, etc. It's quite a bit of fun to observe. The knowledge of the coders, both the reader and the writer, is a big part of readability. Anybody who can make a language that somehow skirts this is, in my opinion, some sort of supernatural being.

That leaves (2), the language the code is written in. The closest-to-objective view of that is, I think, how quickly a reader can grasp the overall purpose of a coding sequence, and integrate that into the flow of a program. In other words, can the code be "grokked" easily?

You can create a language that makes this almost impossible to achieve -- APL comes to mind. No one could argue that APL isn't concise, but gaining the experience necessary to read APL like a natural language is probably denied to most ordinary mortals by their finite lifespans. I doubt that anyone can grok APL, even if they wrote the code themselves.

At the other extreme there are COBOL and Java (one programmer's opinion, of course). If you can actually grok code written in either of these you can make a pile of money, but you have sold your soul.

Lisps tend to be interrupted by their closing parentheses, but you can learn not to see them. Lisps also use a REPL, giving them the benefit of incremental coding. Unfortunately, except for Scheme dialects, functions aren't first-class in Lisps. Lisp can be grokked, but it's missing some expressive power, especially built-programmable types and type checking, making computation with types a huge burden on the programmer.

Perl goes beyond Lisp in concision, and its scope control is as good as it gets. But Perl has a steep learning curve for fluency, its functions aren't really first-class, it lacks a usable REPL, and it has no automated type checking.

OCaml manages to be concise, allows computation with types, makes functions first-class objects, has a visually clean syntax, has a REPL, does not require "purity" (thus allowing coder-readable access to the real world) and seems to be quite easy to learn--can you imagine a trading company requiring its traders to learn any of the languages in the last paragraph?

Well expressed OCaml code is easy to grok.

I think Dr. Minsky makes his case.

Fredrik Skeel Løkke | Tue, 13 Mar 2012 18:25:46 UTC

It would be wonderful to hear how the language plays out in the context of testing, especially tests that involve stubbing out external resources.

Rowan | Fri, 09 Mar 2012 05:41:37 UTC

@name on "more practical": Actually, monadic effects and the like are quite common in OCaml. In fact, about a month before you posted your comment, Yaron (the author) announced a new monadic concurrency library called Async: https://ocaml.janestreet.com/?q=node/100

For unintended side-effects, it's more a library issue than a language issue: it's pretty easy in OCaml to encapsulate effects in a monadic library, and then only use the monadic versions, banning other uses or considering them unsafe in a similar sense to Haskell's "unsafePerformIO".
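The encapsulation described here can be sketched generically. Below is a minimal, hypothetical effect wrapper in Python; it is not Async's actual API, and every name in it (`Effect`, `pure`, `write_line`) is invented for illustration. The point is the shape of the discipline: side effects stay inert inside a value until explicitly run, and `bind` is the only way to sequence them.

```python
class Effect:
    """A deferred side effect: nothing happens until .run() is called."""
    def __init__(self, thunk):
        self._thunk = thunk

    def run(self):
        return self._thunk()

    def bind(self, f):
        # Sequence two effects: run self, feed its result to f,
        # then run the effect that f returns.
        return Effect(lambda: f(self.run()).run())

def pure(value):
    # Lift a plain value into the effect type without doing anything.
    return Effect(lambda: value)

log = []
def write_line(s):
    return Effect(lambda: log.append(s))

# Building the program performs no effects yet...
program = pure("hello").bind(lambda s: write_line(s.upper()))
assert log == []
# ...only running it does.
program.run()
assert log == ["HELLO"]
```

With only the monadic versions exported from a library, any direct, unwrapped effect can be treated as unsafe by convention, which is the OCaml analogue of Haskell's "unsafePerformIO" boundary that the comment mentions.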

To me the main reason I tend not to use Haskell except for small projects is that I've hit the unintended side-effect of "allocating massive amounts of memory and thrashing or crashing" and sometimes been unable to resolve it without digging deep into the source code of multiple libraries written by different authors. This is improving in Haskell, but I'd still trust OCaml much more for critical systems like the one described.

gkannan | Thu, 12 Jan 2012 15:57:31 UTC

Excellent article.

name | Thu, 22 Dec 2011 22:27:19 UTC

"more practical". Wow, those are weasel words if ever I heard them.

Haskell is way "more practical" (you see what I did there?) for me than O'Caml. O'Caml is great, but it doesn't even try to prevent unintended side-effects. (The whole Applicative/Semiring/Monoid/Monad/MonadT is insanely useful and rewarding if you just try it "fo' realz".)


© 2018 ACM, Inc. All Rights Reserved.