
What’s Wrong With Ruby?
Matthew Huntbach takes a long hard look at the coolest language on the planet and is distinctly unimpressed by what he sees…

16 March 2007

by Matthew Huntbach

Tim Sweeney’s talk The Next Mainstream Programming Language (PowerPoint PPT) is in many ways an antidote to the recent Ruby hype. Tim calls for the use of stronger types to ensure program reliability. He praises the academically-developed Haskell functional programming language. He raises concurrency as a feature which must be tackled in the next big programming language, using a better model than the shared state with threads and mutual exclusion devices used by Java - and by Ruby - which haven’t changed since the 1960s.

As a major player in the computer games development world, Tim Sweeney has to be listened to with respect, but it is not surprising that he, rather than a proponent of Ruby, was asked to speak at the POPL (Principles of Programming Languages) conference in 2006. POPL is the premier conference for academic programming language theoreticians: types, concurrency and programming languages like Haskell with a strong background in discrete mathematics are what these people are about. Tim Sweeney’s words are sweet music to their ears. Ruby, and the whole scripting language phenomenon, is a slap in the face.

Ruby – The Sales Pitch

I confess, as an academic whose main research interest has been programming languages, to having been taken aback by the rise of scripting languages. We have been used to complaining that the programming languages we had developed were so much better than the mainstream programming languages, and that the only reasons they hadn’t taken off were that they weren’t developed and promoted by big companies, and that the inherent conservatism of industry restricted commercial programming to tried and trusted languages.

Yet a whole stream of new languages (Perl, PHP, Python and now Ruby), each initiated by one person as a back-room project, has been adopted for serious use and achieved many thousands of users. Unlike academically-derived languages, they have given the impression of being thrown together to meet a need, without any strong underlying theoretical basis. This is, however, not entirely fair: Python was heavily influenced by the functional and object-oriented languages, and Ruby claims to be purely object-oriented.

Clearly the rise of web programming has created a niche for simple text-manipulation languages, with Perl and PHP starting with few pretensions to be anything else. Python and Ruby, however, have been more ambitious. While originating in the scripting language niche, their supporters have proposed them as general purpose programming languages, claiming them to be more elegant, and better suited both for introducing programming to novices and for developing fairly large-scale programs, than what has become the leading mainstream language, Java.

I had been meaning for some time to learn a scripting language seriously, both for practical purposes and to further my interest in programming languages. What put me off was that every time I had been convinced to start on one, another making still bigger claims came along. The latest of these, Ruby, is promoted with some fairly outrageous claims: in the sales pitch for "Programming Ruby: The Pragmatic Programmer’s Guide" (the ‘pickaxe’ book), it is described as ‘the finest and most useful language available today’, and this is mild compared to some of the hype. Yet there are people whose views I have respected in the past going for it. I have therefore bitten the bullet and taken the time to teach myself at least the basics of Ruby.

Not Elegant – Just ad hoc

I have a great aversion to hype, having seen a lot of it during my time as a computer scientist. As an undergraduate in the 1970s, I was exposed to Prolog, taught it by its originators, and assured it was a revolutionary language: so much easier to use than conventional programming languages, almost like using a human language, since it was based on human-derived logic rather than on the workings of a machine. Personally, I did appreciate Prolog, but it was a minority taste. It was a humbling experience, in later years, having to teach it to a new generation of undergraduates, almost all of whom found it extremely difficult to use.

I have lived through the rise of Java, which at its birth was hailed in the sort of language now used to describe Ruby: claims that it would make the process of programming so much easier by eliminating many of the complexities of earlier languages, and by using a programming model so close to human thinking that programs written in it could be understood almost as if they were English. Java has not made teaching programming any easier. Get together any group of university Computer Science teachers, and the chances are you will find them moaning about how difficult it is to get many of their students to grips with even the simplest practical programming exercise. The recent blog explosion on the subject of Why can’t programmers program? points to a problem we Computer Science academics have long known about, and have covered up either by setting the pass bar low or by using the excuse "well, these days there’s a lot more to Computer Science than just coding". Programming is a skill a few take to naturally but which many, despite the keenness for the technology shown by signing up for a Computer Science degree, struggle with and never master. The many attempts to introduce it in new ways, with different languages or different paradigms, have only convinced me, since none of them has shown conspicuous success, that the programming language used is not the issue. I don’t accept the Ruby claim that throwing away types and their supporting syntax will make programming much easier to learn.

As Fred Brooks put it, there is no silver bullet: "no inventions that will do for software productivity, reliability, and simplicity what electronics, transistors, and large-scale integration did for computer hardware". A telling point is that the first proposed "silver bullet" he mentions is the then widely touted language Ada, whose impact proved rather limited. Claims that Ruby is such a different language from the others, and so natural to use, that it will magically increase productivity sound hollow when the same has been said about so many previous programming languages.

A major reason for Ada’s lack of impact was that it was developed just before the object-oriented revolution, and it was C++ which came to be the dominant mainstream language as a result of its pragmatic introduction of object-oriented features on top of C. Java cleaned up C++ by removing the low-level aspects inherited from C. It is often forgotten that Java also made automatic garbage collection mainstream, along with the insistence that an extensive code library supplied with the language be considered an integral part of it. These are now taken for granted in Ruby, and it is unlikely a new general purpose language would be introduced without them.

If I was put off Ruby by the hype, I was put off more by the many cutesy introductory tutorials I encountered when trying to get into it. Why’s (Poignant) Guide is a particularly horrid example, but there are many others. Sorry, but if I’m getting into a new language, I don’t want to be patronised in this way. I don’t want someone chatting away to me and telling me how "cool" it all is (I’ve lived long enough as a computer programmer to know it’ll never really be "cool" to be one). I just want the straight facts, plainly put. However, that’s just me, and if there weren’t plenty of people who disagree, the "Head First" books would never have sold so well.

Having ploughed through several tutorials, I did not find Ruby particularly "elegant", or its syntax particularly obvious. Much of it seemed ad hoc, thrown together, particularly where there were several different ways of doing something and the philosophy of the language seemed to be to provide all of them. I did not find its constructs as intuitive and natural as claimed; trying out simple coding examples proved as frustrating, when things didn’t do what I’d supposed they would, as it has with other languages I’ve taught myself from tutorials.

I’ll give just a couple of examples. In Ruby, if a is [1,2,3] and b is [10,20,30], then a+b is [1,2,3,10,20,30]. Why not [11,22,33], which Tim Sweeney in an earlier article suggests "intuitively makes sense"? And what does a[x,y] mean when a is a two-dimensional (that is, nested) array? Not what I first supposed from other languages (for that, Ruby wants a[x][y]), nor even what I then thought (for that, a[x..y]); it is in fact a slice, y elements starting at index x.
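To pin down what those expressions actually do, here is a sketch of an interactive session (the array values are my own; the comments state Ruby’s actual behaviour):

    a = [1, 2, 3]
    b = [10, 20, 30]
    a + b       # => [1, 2, 3, 10, 20, 30]   concatenation, not element-wise addition

    c = [0, 10, 20, 30, 40, 50]
    c[1, 3]     # => [10, 20, 30]    a slice: 3 elements starting at index 1
    c[1..3]     # => [10, 20, 30]    a range: indices 1 to 3 inclusive

    m = [[1, 2], [3, 4]]             # a nested ("two-dimensional") array
    m[1][0]     # => 3               element access needs chained indexing
    m[0, 1]     # => [[1, 2]]        m[x,y] is still a slice of the outer array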

Untrue To Type

The point about these relatively trivial examples (many more could be produced) is that what is obvious and intuitive to one person may not be to another. This has been brought home to me through my experience of teaching novices to program in a variety of programming languages. It is surprising just how many understandings of a language construct a novice can build up which differ from the correct one. I do not think Ruby has the answer to this; indeed, in its rather rich variety of constructs it has plenty to mislead the novice. My take on some of its "clever" or "elegant" features is: "how could I teach this to a bunch of first year undergraduates?". I know from experience it would be painful.

More than this, the introductory tutorials to Ruby which I’ve seen make the age-old mistake of assuming that learning the language syntax is learning to program. In my experience, learning to program is much more about being able to see patterns and abstractions than it is about mastering programming language syntax. Some people seem to have a natural ability for this; some don’t. Some never really grasp why a local variable in a procedure call does not have the same value as a variable of the same name in the code which called it; a great many never really grasp recursion. I do not see anything in Ruby which makes this sort of issue any easier. In fact, I see aspects which would add to the confusion. The single optional block parameter to every Ruby method, which many of its proponents seem to find particularly "elegant" (I see it as an ugly way of doing what is done more powerfully and elegantly with higher-order functions in the functional languages), would, I think, be a source of intense confusion to many novices.
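To show what is at stake, here is a minimal sketch (the method names are my own, invented for illustration): every Ruby method may implicitly receive one block, invoked with yield, while anything beyond that single block requires explicit lambda objects, which is where the comparison with higher-order functions bites.

    # Every method may take one implicit block, invoked with `yield`.
    def twice
      yield
      yield
    end
    twice { puts "hello" }              # prints "hello" two times

    # Passing more than one function-like argument needs explicit lambdas,
    # much as higher-order functions are passed in functional languages.
    def compose(f, g)
      lambda { |x| f.call(g.call(x)) }
    end
    inc    = lambda { |x| x + 1 }
    double = lambda { |x| x * 2 }
    puts compose(inc, double).call(5)   # => 11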

Ruby’s dynamic typing is put forward as more of this elegance. I can take it or leave it. Yes, it gives a superficially cleaner appearance to code by cutting out all the syntax required to maintain static types, and required for compile-time checks to ensure that methods are only called on objects that can accept them. But who really uses a variable or argument without some idea of its type? I see types as useful language-supported comments, a guide to clear thinking. My advice to novices is always to start by writing a method’s header, with its argument and return types; that is the first step to understanding what it does. When I look at old code, the types are a good guide to start working it out. I’d hate to have to debug a large piece of typeless legacy code; that wouldn’t be "fun", no matter how much fun it was when its original author picked up Ruby and threw away types.
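A minimal sketch of the trade-off, with a method and data invented for illustration: the untyped header says nothing about what its argument should hold, and the mistake only surfaces at run time, when the missing method is finally called.

    # The header gives no clue that `items` is expected to hold strings.
    def total_length(items)
      items.inject(0) { |sum, s| sum + s.length }
    end

    puts total_length(["ab", "cde"])   # => 5
    puts total_length([1, 2, 3])       # NoMethodError at run time:
                                       # integers have no `length` method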

Ruby’s dynamic class modification also strikes me as something with the potential to produce hugely complex and incomprehensible code. If I am tracking down the source of some buggy behaviour, I might start by looking for the class of the object whose method call led to the bug. The lack of types in Ruby would make that more difficult, but on top of that I have to face the possibility that the method may have been redefined somewhere else during code execution, maybe for the whole class, maybe for the individual object. There isn’t one class definition I can reliably say completely defines an object’s behaviour. There is growing criticism of code inheritance in Java, with recommendations that we stick to interface inheritance and otherwise use composition and delegation for clearer and more reliable code. How very much worse this problem is with what can so easily be done in Ruby. Sure, the ability to modify code dynamically can be valuable, but it should be done with caution and clearly flagged up. The casual way it can be done in Ruby worries me: again, it may be "fun" now, but how about debugging code that uses it in a few years’ time?
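A sketch of the kind of thing that worries me, with a class invented for illustration: any code executed at any point may reopen a class for every instance, or patch one individual object, so no single definition pins down behaviour.

    class Account                   # the original definition, in one file...
      def balance; 100; end
    end

    acct = Account.new
    puts acct.balance               # => 100

    class Account                   # ...reopened anywhere else, affecting
      def balance; -1; end          # every instance of the class
    end
    puts acct.balance               # => -1

    def acct.balance; 42; end       # ...or redefined for this one object only
    puts acct.balance               # => 42
    puts Account.new.balance        # => -1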

When The Revolution Comes…

Still, as many people whose opinion I have found sound in the past have taken to Ruby, I have to concede it has something going for it. A big part of that is clearly Rails. I have deliberately avoided mentioning it in this article but, looking at it, it seems a useful tool which does a lot of the donkey-work of building web-based applications, and I can see the place of Ruby on Rails. I mentioned Tim Sweeney’s talk at the start as a reminder that many of us are doing coding which is not primarily web-based, and our needs may differ. This needs noting in a world where many new programmers have grown up with the web and tend to think of programming purely in terms of building web applications.

The other issue is the over-maturity of Java. As a simple language in which I first learnt to appreciate the object-oriented paradigm, Java was fun. It was also good, after years in which we seemed to be in bitter disagreement, to see both industry and academia converging on a programming language both could agree was useful - no more did I have to try and explain to students why I was teaching them to program in a language they never saw in the job ads.

However, these days when I look at the Java section in the bookshop, there is little I understand. There are huge numbers of add-on libraries, each of which has its justification, but each developed on a learning curve and so not quite right. Even the core APIs are bloated, since backwards compatibility means one cannot throw away one’s first solution to a problem once it has become an integral part of the language, even when developing and using it has led to a better solution. It seems inevitable that once a language becomes widely used as a general purpose language, it is pushed in directions it isn’t suited to, builds up unnecessary complexity through accretions, and the urge to throw it away and start again grows stronger. A bit like any large software system which has served its time.

Despite what I’ve written above, I didn’t find Ruby horrible. If I needed to use a scripting language, it’s the one I would now use. It’s cleaner than Perl, and I like its syntax better than Python’s. Its object-based nature works for me: object-orientation is not the silver bullet, but it is a better abstraction for managing complexity and modelling things than anything else we have. Ruby has also picked up some useful aspects of functional programming and put them into a language whose ability to handle interaction between stateful objects makes it more practical than pure functional programming (which to me is still real elegance, but of an infuriating kind). But it isn’t as revolutionary or as cool as its advocates suppose.

To me, the real revolution in programming languages will come when we have one which breaks away from the single-processor sequential computer. There is some truth in the claim that Ruby doesn’t really give us anything that wasn’t there long ago in Lisp and Smalltalk, but they weren’t bad languages.


Matthew Huntbach is a Lecturer in Computer Science at Queen Mary, University of London. During his eighteen years working there, he has taught introductory and intermediate programming in a variety of programming languages. His research interest is in the development of practical programming languages which are inherently concurrent. He has a DPhil in Artificial Intelligence from the University of Sussex.