My Journey Through the World of Programming Languages

2008-04-07

My journey through the world of programming languages began in 1987 with a blinking cursor on the black-and-white screen of an Atari 1040 ST. After a few hours of playing with the GFA BASIC interpreter, I was hooked. The graphical capabilities of the Atari made it possible to program Mandelbrot fractals, the Towers of Hanoi, the Breakout game, and all those things with which newbie programmers like to entertain themselves.

Quite a few of these programs looked peculiarly similar to what people programmed ten years later, when the first Java applets appeared. But I am getting ahead of myself. Back in 1987, BASIC was the beginners' language. The GFA BASIC dialect was considered quite modern at the time, since it didn't have line numbers and was, at least in principle, a full-featured procedural language. Yet it was still a toy. After about six months I moved on to more ambitious projects and realised that I had outgrown BASIC. Someone gave me a copy of a C compiler, so I started learning the C language.

This turned out to be a good decision, because I was able to use C throughout the first five years of my career. I found Kernighan and Ritchie style C to be conceptually very close to the GFA BASIC I had started with, except for pointers, which were completely new to me. The study of C led me to Unix, and I began writing clones of Unix tools and utilities for my own use. This was the late 80s, before the GNU and Linux phenomena took off.

One such project was a text editor that I enhanced with optimised scrolling routines written in 386 assembly language. I wrote the editor after I had exchanged my Atari for a PC. Within a few months I had a number of common Unix tools and a nice text editor at my disposal, all usable under MS-DOS. Then I read Andrew Tanenbaum's Minix book and got into systems programming. I wrote a micro-kernel task scheduler for the 386 in assembly. Multi-tasking was a fascinating thing that seemed out of reach for the average personal computer. At the time, I briefly considered expanding the micro-kernel into a more complete OS by adding memory management and file management. However, I soon realised the immensity of this task. I had just started studying informatics, and I figured that I wouldn't be able to accomplish it while still attending lectures and doing homework.

At university, we were taught Pascal as a "first" programming language and Lisp as a second. Pascal was very easy, of course; it seemed like a verbose dialect of C. Lisp, on the other hand, I found quite repulsive. I could appreciate the underlying mathematical idea, the lambda calculus, but the syntax was just awful. I believe it was IEEE Scheme. The language seemed great for graph-theoretical problems, but unsuitable for expressing common algorithms in a natural way. In other words, I found it to be a language for eggheads.

At the time, the imperative programming paradigm was predominant. It seemed the best way to get things done, as development tools and libraries for imperative procedural languages were readily available. The next language I learned at university was Modula-2, which I thought of as an elaboration of Pascal with an emphasis on data abstraction and encapsulation. From Modula-2 I learned the importance of encapsulation. Although I never used it for practical applications, I was able to apply its conceptual foundations in my day-to-day C programming.

After university, I worked in systems programming, designing and implementing drivers for a company that manufactured proprietary hardware. I then moved to another company working in machine translation and computer-based training. After five years of coding in C, I thought it was time for a change. This was the early nineties, so I turned my attention to application programming with RAD tools, which had just hit the market. I learned SQL inside out and created data-driven programs. Visual Basic 3.0 was the killer application of 1993, as it made the construction of Windows GUIs extremely easy, and I was able to build on my prior BASIC experience. Customers liked the productivity that comes with RAD.

After about a year, I dropped VB in favour of Delphi, which was superior for this purpose. Again I could build on previous experience, this time with Pascal. I learned the rudiments of object-oriented programming with Object Pascal, which is odd given that C++ would have been the more natural path to object orientation after many years of C. However, Object Pascal taught me proper componentisation. This was the mid-nineties, and a lot of amazing things were happening in the IT industry. The most important change was the commercial breakthrough of the Internet; almost simultaneously, the Linux phenomenon happened. The industry boomed and technological progress was fast-paced. The Internet connected everybody everywhere, and Linux brought corporate computing horsepower to the desktop.

As a result of these changes, I began coding HTML in 1996 and learned JavaScript and Perl in 1997. The next year brought even more change, as I decided to gear my business towards web development. Perl seemed like an idiosyncratic Unix solution born out of necessity. It was certainly practical for server-side programming, but it was also rather painful and hackish. Fortunately, PHP appeared at around the same time and offered a much cleaner solution for server programming.

Soon I found myself programming web applications in PHP most of the time. LAMP-based applications exploded across the Internet between 1998 and 2003. During this period, I also learned the rudiments of Java, C++, and C#, and was responsible for managing projects implemented in all of these languages. Object-oriented programming had become the mainstream paradigm in the late nineties, and I decided that I needed to take on one of these languages more seriously.

The obvious choice was Java, since it was general purpose but still very strong in the field of web development. So I immersed myself fully in Java as the language made the transition from 1.4 to 1.5. At that point, Java was already mature and mainstream. To a latecomer, the platform seemed huge, certainly larger than anything I had looked at before, including .NET; the sheer number of APIs was just unbelievable. Getting up to speed required a sustained two-year effort, during which I read a shelf of Java books and moved from trivial programming exercises to small projects and then to larger ones. Since the mid-2000s, Java has been my mainstay.

There are two reasons why I like Java. First, there is a fantastic ecosystem connected to the platform, ranging from best-of-breed IDEs, VMs, and application servers to a gazillion libraries and frameworks, and (almost) everything is free. Second, Java is extremely scalable and robust. It is neither the purest object-oriented language nor the richest, but it is probably the one language that turns average programmers into software engineers. I would argue that this is because of the high level of standardisation and the endorsement of best practices in the Java community.

I know that quite a few people dispute that. However, there's a reason why universities teach Java to freshmen and why corporations use Java for enterprise development: it offers the largest and possibly the most robust platform for developing industrial-strength software. Of course, not everything is hunky-dory in the Java department. The main problem, as I perceive it, is the language itself: it's aging. Although (or perhaps because) it forces programmers to write tidy code and relinquish dirty C tricks, it tends to be tedious, involving generous amounts of boilerplate code. It also lacks good paradigms for fine-grained concurrency control.
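To illustrate what I mean by boilerplate, here is a sketch (a toy class of my own invention, not taken from any real project) of what even a trivial immutable value requires in Java 1.5:

```java
// A toy immutable value class. In Java 1.5, even something this small
// needs a constructor, getters, equals, hashCode, and toString,
// all written by hand.
public class Point {
    private final int x;
    private final int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    public int getX() { return x; }
    public int getY() { return y; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    @Override
    public int hashCode() { return 31 * x + y; }

    @Override
    public String toString() { return "Point(" + x + ", " + y + ")"; }
}
```

In a language with first-class support for such value types, those thirty-odd lines shrink to roughly one.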

Fortunately, in the Scala language I have discovered a possible solution to these problems. At this point (early 2008) I haven't yet done any larger projects in Scala, but my eagerness to do so is growing. Adding the functional paradigm to my set of programming instruments has been very beneficial. It even carries over into my Java work, since it has changed the way I phrase algorithms in Java. The only negative effect is that learning Scala has made the limitations of the Java language more evident and thus more painful. While functional programming will probably grow in the near future, Java has such a strong position that it won't just fade away. Many large systems have been created in Java, so there will be maintenance work for decades to come. Meanwhile, it will be interesting to see how fast the industry embraces functional programming.
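As an aside, here is a sketch of the kind of functional phrasing I mean, in plain Java 1.5. The tiny `Function` interface and the `map` helper are my own illustrative inventions, not part of any standard library:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// A hand-rolled function object, standing in for the first-class
// functions that Scala offers natively. (Illustrative only.)
interface Function<A, B> {
    B apply(A a);
}

public class FunctionalStyle {
    // Functional phrasing: build a fresh list rather than mutating the input.
    static <A, B> List<B> map(List<A> xs, Function<A, B> f) {
        List<B> result = new ArrayList<B>();
        for (A x : xs) {
            result.add(f.apply(x));
        }
        return result;
    }

    // A function that multiplies its argument by a fixed factor.
    static Function<Integer, Integer> times(final int k) {
        return new Function<Integer, Integer>() {
            public Integer apply(Integer n) { return n * k; }
        };
    }

    public static void main(String[] args) {
        List<Integer> xs = Arrays.asList(1, 2, 3);
        System.out.println(map(xs, times(2))); // prints [2, 4, 6]
    }
}
```

The anonymous inner class is exactly the sort of ceremony that Scala's closures do away with, which is why the Java version feels painful once you have seen the alternative.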