Natty Unity UI

So, the Natty Narwhal 11.04 release of Ubuntu has finally arrived, entering the Linux stage with fanfare. Many oohs and aahs were heard throughout the blogosphere during the past few months, and it seems that outcries alternated with songs of praise. Canonical's new user interface called Unity was described as a "dramatic new look", an "aggressive change", as "revolutionary", "a breath of fresh air", and "a blight on the Linux OS". Frankly, I cannot understand what all the fuss is about. Yes, the desktop looks a bit different, but hardly different in a revolutionary way. There's a new strip of launcher icons on the left side of the desktop (called the dock), the bottom panel is missing, and the top panel isn't a conventional GNOME panel, but a menu bar. Not exactly what I would call cataclysmic changes in the world of computing.

With the latest Ubuntu version, the Linux desktop looks even more Mac-ish, if you ask me. I admit that it took me a few days to get used to it, but I like most of the ideas that went into the Unity shell, so I've decided to keep it. Having the launcher on the left side frees up vertical space, which is a good idea, because most modern monitors come in 16:9 widescreen format. The launcher dock also doubles as a window switcher and indicator. Displaying the application menus in the top panel will probably meet with resistance from Windows, KDE, and GNOME users, or at least break with tradition. It saves vertical space, however, at the expense of longer mouse trails from the application window.

Another Unity innovation is the "dash" (another D-word), a search window that lets you find applications or documents. It comes in the same bright-on-dark jewel-case appearance as the other Unity components, and it locates less frequently used programs or files by displaying incremental search results for the characters typed into the search field. I find this far easier than opening nested menus to start applications. A nice improvement. The workspace switcher and panel indicators are likewise felicitous adaptations of tried and tested UI concepts.

Unity still has a few rough edges, though. The most obvious one is the unspeakable clunkiness of the default 64px launcher icons, which look inappropriate on any type of screen, unless you intend to operate a touchscreen with protective gloves on. Fortunately, the icon size can be reduced to 32px using the Compiz Config Settings Manager, which obviously lets you display twice as many launcher icons in the strip, uhm, I mean dock. Furthermore, I am not sure if application menus really belong in the global top panel. Finally, it isn't yet possible to start multiple instances of an application from the dock, for example terminal windows or editors. A special operation such as Shift+Click on a program icon would be handy for this purpose.
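If you prefer the command line over the CCSM dialogues, the same setting can be changed directly. The gconf key path below is an assumption based on where the Compiz Unity plugin kept its options in Natty, so verify it on your system before relying on it:

```shell
# Key path assumed, not guaranteed; Unity's Compiz plugin options lived
# under /apps/compiz-1 in gconf on Natty. Inspect the actual tree with:
#   gconftool-2 --recursive-list /apps/compiz-1/plugins/unityshell
gconftool-2 --type int --set \
    /apps/compiz-1/plugins/unityshell/screen0/options/icon_size 32
```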

I had also grown quite fond of the GNOME weather panel indicator, which is missing from the Unity panel. I found myself looking at the weather panel more often than at the thermometers in my house. This can be fixed as well by installing an additional program package called indicator-weather from a PPA:

sudo add-apt-repository ppa:weather-indicator-team/ppa
sudo apt-get update
sudo apt-get install indicator-weather

In technical terms, Unity is far less "revolutionary" than most people think. Although it replaces the GNOME shell, it is still firmly embedded in the GNOME desktop environment, and it is designed to be used with GTK+ desktop applications. Unity is implemented as a plugin for Compiz, the same window manager that previous Ubuntu versions already used. Unity does not provide its own file manager, but uses the well-tried Nautilus program for file system presentation and file operations.

If, as an Ubuntu user, you don't like Unity, it is very easy to revert to the old GNOME 2.x shell. Just select "Ubuntu Classic Desktop" from the drop-down box at the bottom of the login screen. The computer remembers the setting, so you have to change this option only once. It is even possible to use GNOME 3 with Natty Narwhal, although this requires installing additional software, because GNOME 3 is not included by default. If you want to try out or use GNOME 3, try these commands:

sudo add-apt-repository ppa:gnome3-team/gnome3
sudo apt-get update
sudo apt-get dist-upgrade
sudo apt-get install gnome-shell

Onward To Lucid Lynx

Ubuntu 10.04 alias Lucid Lynx has arrived, and because this is a long-term support version, many users are bound to upgrade within the next few weeks. It seems the GUI people at Canonical were unusually daring this time. Not only is this the first Ubuntu version that sports a graphical interface that is NOT BROWN (shock!), but the window control buttons are on the wrong side, namely on the left (double shock!). Apparently, Mac OS X Leopard stood godfather here. Well, I am not going to get used to window controls on the left side, so I applied a quick fix which is amply documented on the Internet, as many people seem to feel the same way. Otherwise, the new look is a welcome change, as the permutations of brown and orange seemed to have been exhausted.
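The quick fix in question is a one-liner; this assumes Lucid's default GNOME 2/Metacity setup, where the button order is stored in a gconf key:

```shell
# Restore the traditional right-hand window controls under Metacity.
gconftool-2 --type string --set /apps/metacity/general/button_layout \
    "menu:minimize,maximize,close"
```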

The only thing that turned out to be slightly trickier was the Tomcat upgrade to 6.0.24. A surreptitious installation of Apache 2 (the purpose of which eluded me) took possession of port 80, which on my machine was previously occupied by the system-wide Tomcat installation. This was rather easy to solve with the command sudo update-rc.d -f apache2 remove, which disables Apache on boot. It turned out, however, that the application launcher jsvc was removed in Ubuntu 10.04. Since Tomcat previously used jsvc to bind to privileged ports, Tomcat was no longer able to bind to port 80. I solved this by setting the AUTHBIND variable in /etc/default/tomcat to 'yes'. After that, Tomcat started up on port 80 without complaints.
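As a sketch, the change amounts to the following; note that the config filename and service name vary between Tomcat packages (assumed here), and that the authbind package itself must be installed for the setting to take effect:

```shell
# Allow Tomcat to bind to ports below 1024 via authbind.
# File and service names are assumptions; adjust to your Tomcat package.
sudo apt-get install authbind
sudo sed -i 's/^#\?AUTHBIND=.*/AUTHBIND=yes/' /etc/default/tomcat
sudo /etc/init.d/tomcat restart
```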

[Image: Ubuntu 10.04 default theme]

During the upgrade, the system politely asked whether to replace or keep manually changed system configuration files. I chose to replace most files, because the upgrade manager is kind enough to create a copy of the existing configuration with the *.dpkg-old extension during the upgrade. That way I was able to diff the configuration files later and incorporate any customisations into the new files. This method is superior to keeping the old files, because it keeps the configuration files in sync with the latest program versions, though of course it takes a bit of work to manually diff and patch those files if you happen to have numerous customisations. Alternatively, you can keep the old files and then diff and patch the new files created by the upgrade manager with the *.dist extension. In summary, the upgrade was painless and took less than 90 minutes per machine.
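The diffing routine can be sketched as follows; the demo below works on a throwaway directory instead of the real /etc, so it is safe to try:

```shell
# Review upgrade-replaced configs: diff each *.dpkg-old backup against
# its new counterpart so customisations can be merged back by hand.
# Demonstrated on /tmp/etc-demo rather than /etc.
mkdir -p /tmp/etc-demo
printf 'Port 8080\nUser demo\n' > /tmp/etc-demo/server.conf.dpkg-old  # old, customised
printf 'Port 80\nUser demo\n'   > /tmp/etc-demo/server.conf           # new default
find /tmp/etc-demo -name '*.dpkg-old' | while read -r old; do
    diff -u "$old" "${old%.dpkg-old}" || true   # diff exits 1 when files differ
done
```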

Ubuntu Newbie Tips

I've been using Linux on servers in various flavours since 1997, but I am relatively new to Ubuntu and have just started using it as a desktop OS. Despite some installation problems, the overall experience was very positive. I had made earlier attempts to switch over to Linux, but for one reason or another these were thwarted, mostly because of the professional necessity of testing software under Windows. Since I am now working on cross-platform applications, that particular constraint has evaporated. I spend most of my day developing software and writing documentation. Before installing Ubuntu, I was slightly concerned that there would be a temporary decrease in productivity due to having to learn new software. This concern turned out to be largely unfounded.

Most of the key applications like Eclipse, Firefox, Thunderbird, and OpenOffice work exactly the same under Linux as they do under Windows. The only major change was replacing Notepad++ (which only runs on Windows) with vi/vim. These editors are suitable for programming in situations where you don't want to fire up an IDE. Furthermore, I have made some customisations to ease the transition, which I'd like to share with you. If you are new to Linux, you might find one or another of them useful for your own work. The following list is by no means exhaustive, just a number of things I stumbled across during my first two weeks with desktop Ubuntu.

Repositories and download servers
Ubuntu maintains software packages with the Synaptic package manager. Because as a new user you are likely to use this tool frequently, one of the most useful first steps is to optimise it, which means defining the repositories and the download server. Choose System/Administration/Software Sources from the main menu. In the first tab, "Ubuntu Software", select the four items marked "main", "universe", "restricted" and "multiverse" for the widest choice of software packages. Next, optimise the download server. I wasted a whole day downloading the 9.04->9.10 update because of a slow server. Ubuntu can find the fastest server for you: select "Other…" in the "Download from" drop-down box. A dialogue with a list of servers appears on screen. Click on "Select Best Server" to let Ubuntu test all available servers for their response time and pick the fastest one.
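The same repository selection can also be made without the GUI, by making sure the deb lines in /etc/apt/sources.list name all four components. The mirror URL and release name ("karmic" for 9.10) below are examples, not prescriptions:

```shell
# Enable main, restricted, universe and multiverse in one deb line.
# Mirror and release name are placeholders; match them to your installation.
echo 'deb http://archive.ubuntu.com/ubuntu karmic main restricted universe multiverse' \
    | sudo tee -a /etc/apt/sources.list
sudo apt-get update
```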

Keyboard and language customisations
If, like me, you frequently type text in different languages, chances are that the default language and keyboard settings will not suit you. Fortunately, Ubuntu is easy to configure for international use, possibly even superior to Windows in this regard. First, I added Thai language support in System/Administration/Language Support. Then I configured two additional keyboard layouts, German and Thai, in System/Preferences/Keyboard/Layout. As I am using a Thai/English keyboard, I have to remember the German key mapping by heart, which is only of limited use. On Windows, I was used to producing international characters by typing ALT+num key sequences. On Linux, this is even easier thanks to the concept of the compose key. In the keyboard layout dialogue, click on "Layout Options", which presents a number of intricate keyboard customisation options. Click on "Compose key position" and pick a key, for instance "Right Alt". Now you can use this key to compose international characters. For example, type right Alt, double quotation mark, and the letter 'u' to produce the German umlaut 'ü'. Type right Alt, backtick, and the letter 'a' to produce 'à' with an accent grave. Voilà!
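The compose key can also be set from a terminal for the current X session (it does not persist across logins, which is what the dialogue setting is for):

```shell
# Make Right Alt the compose key for this X session only.
setxkbmap -option compose:ralt
# Verify that the option is active:
setxkbmap -query | grep compose
```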

Customising Nautilus
Nautilus is the Linux/Gnome equivalent of Windows Explorer. In fact, I find it superior to the latter, because it supports protocols for remote access (such as ftp/sftp) and offers better search capability and better support for compressed files. If you prefer to work with a GUI rather than the command line, you will probably want to customise Nautilus in some way. The most obvious candidates for customisation are file associations. These can be defined by right-clicking on a file, selecting "Properties" from the context menu and switching to the "Open With" tab in the property dialogue. Here you can define alternative applications for opening a file, as well as the default application that is started upon double-click. If you need even more customisation options, install the package named "nautilus-actions". It lets you define custom actions for file entries in Nautilus, which can be incorporated into the context menu. Predefined Nautilus extensions (aka shell extensions) for various file display and transformation purposes are also available.

Command line and terminal customisations
Ubuntu comes with bash (the Bourne-again shell) and the Gnome Terminal as command line defaults. These are fine for me. However, there is one feature I found missing in the terminal application: it is not possible to search the output buffer. For example, when I run applications that produce a large amount of diagnostic output, there is no intuitive way to search through this data, other than piping it into a command like "less". I have found a little program named "screen" which appears to solve this problem. After "screen" is started, virtual sessions can be created within the same terminal window, each with its own searchable buffer. "Screen" involves remembering some arcane keyboard commands, but it's the best I could find so far. Another command line annoyance is that the "vi" editor runs in vi-compatible mode by default, which makes the cursor keys produce stray characters in insert mode; in other words, the cursor keys are broken. There is an easy fix, however: put a file named .vimrc in your home directory that contains a single line saying "set nocompatible", and the cursor keys will work again.
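Both fixes fit in a couple of lines; the screen key bindings listed below are the stock defaults:

```shell
# Fix vim's cursor keys by disabling vi-compatible mode.
echo 'set nocompatible' > ~/.vimrc

# Searching the scrollback in GNU screen (default bindings):
#   Ctrl-a [   enter copy/scrollback mode
#   /pattern   search forward; ?pattern searches backward
#   Esc        leave copy mode
```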

Backup and antivirus software
Surprisingly, neither backup nor antivirus software is included in the default Ubuntu installation. Although viruses are probably not an immediate threat on a Linux system, I would rather not breed any of them on my machine. There is the open source package ClamAV as well as a number of free-for-private-use commercial offerings for Linux. I am still evaluating antivirus software; so far I have found ClamAV and AVG quite usable, but not quite as convenient as under Windows. Backup software is an absolute necessity in my opinion, and I am surprised that it isn't part of the original Ubuntu installation. Of course, individual backup needs differ, but a simple mirroring and archiving facility is probably required for even the most basic usage. Initially, I planned to hack together a script based on rsync for that purpose, but I have found something much nicer. The "backintime" package lets you create incremental backups with great ease and minimal storage requirements. Backintime revolves around the concept of snapshots; it is a GUI framework for rsync, diff, and cron. I highly recommend it.

Second thoughts about desktop Linux

One of the best things about Linux is that it is free, and the saying goes: "Don't look a gift horse in the mouth." However, there are times when one must break with tradition. If you are evaluating Linux for home or business use, you must ask yourself whether Linux provides features and usability as good as those of commercial operating systems. Unfortunately, there is no simple answer to this question. Although Linux is very successful in the server market, it has not yet made a comparable splash in the desktop market. Here are some reasons why this might be so. Disclaimer: to prevent possible misunderstandings, I don't have any affiliations with Microsoft; Linux has been installed at my office since 1996; I was one of the founding members of the Thai Linux User Group (TLUG); and I have written a series of Linux articles for an English language newspaper.

Antiquated software concepts

One of the first criticisms of the Linux OS was brought forward by the man who inspired it, Andrew Tanenbaum, the author of the MINIX OS on which early versions of Linux were based. Tanenbaum described the Linux kernel design as backward-oriented, because it is a monolithic kernel (like the Unices of the 1970s) instead of a microkernel. A monolithic kernel lacks modularisation; this makes the kernel more vulnerable to internal faults and complicates updates and extensions. This was an early design weakness. Early Linux distributions had additional problems, such as filesystem limitations, rudimentary support for PnP, lack of scalability and other shortcomings. Many of these limitations have since been resolved, but the design of the Linux kernel still has a distinctive 1970s flair. Most of the concepts that underlie Unix (and thus Linux) are more than 30 years old.
Lack Of User Friendliness

In the 1970s, the term "user-friendliness" described the attitude of computer administrators rather than the behaviour of operating systems. Unix certainly kept with this user-hostile tradition, and unfortunately, so did Linux. Things began to change with the advent of desktop managers, which provide graphical user interfaces that allow the user to interact with the system using graphical elements and a mouse. The two most popular Linux desktop managers are KDE and Gnome. Both are based on the X Window System, the lower-level graphics engine, which is fairly complex in itself. KDE and Gnome are actually more than mere desktop managers: they are collections of applications, including browsers, calculators, organisers, mail clients and other programs which together provide standard PC functionality.

[Figure: Market share of desktop managers]

While KDE and Gnome improved the user-friendliness of Linux a great deal, there are still serious problems. The first is that there are two desktop managers, which means that users have to learn two different GUIs instead of one. The second is that not all applications make use of them: some applications use other GUIs, and others don't have a GUI at all. There are a number of things you can do only with the command line interface, and whenever you have to use it, user friendliness goes out of the window.

Usability

User-friendliness is actually just one aspect of usability. The concept of usability is more comprehensive, as it is concerned with the overall efficiency with which users employ a tool (such as an operating system) to accomplish particular tasks. A number of studies have been conducted on Linux usability; their metrics as well as their results vary greatly.
It should be noted, however, that even those studies which are generally favourable towards the Linux OS bring forward some criticisms. A case in point is the study by Relevantive AG, which considers the usability of Linux "to be nearly equal to Windows XP", but also includes the following paragraph in the executive summary: "The study also reveals significant problems that are connected with Linux as a desktop system. This mainly consists of the poor wording of programs and interfaces, the lack, at times, of clarity and structure of the desktop interface as well as the menus, and poor system feedback." Inconsistency and poor user interface design have long been a problem in the Linux world, and the reason is obvious: Linux applications have been developed by a large, disparate group of volunteer programmers, and the work of different groups is rarely coordinated in a structured manner. Lack of usability engineering and testing is thus a direct result of the absence of centralised management. Fortunately, the KDE and Gnome teams have recognised this problem and are beginning to address it.

Inflexible and complicated file access control

If you use Linux on your home PC, file access control probably doesn't matter to you. However, it definitely matters on multi-user systems in business environments. Unfortunately, Linux has inherited the ghastly Unix file permission system, which forces you to learn octal numbers and has brought despair to many system administrators. For example, you must concern yourself with esoteric questions, such as what happens if a file has read/execute permission while the containing directory has only read permission. A great number of security problems on Linux systems are caused by incorrect or inappropriate permission settings that escaped the eye of the administrator. What is more, Unix file permissions don't allow you to assign multiple users and groups to file system resources.
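For readers unfamiliar with the octal notation, a quick illustration using a throwaway file in /tmp:

```shell
# Each octal digit covers owner/group/other; within a digit r=4, w=2, x=1.
touch /tmp/report.txt
chmod 640 /tmp/report.txt    # 6 = rw- for the owner, 4 = r-- for the group,
                             # 0 = --- for everyone else
ls -l /tmp/report.txt        # permission string reads -rw-r-----
```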
In practice, this means you must treat groups as roles and assign individual users to these groups. If there are two groups who should have access to a resource, you must define a third group containing the union of both. In short: it's an administration nightmare, especially on systems with a large number of users. To make things worse, the "root" user can access all file resources regardless of permissions. Since system administration requires "root" privileges, your system administrator will have access to all resources, including payrolls, balance sheets, and whatever other confidential information may be stored on the Linux computer.

Higher system maintenance and administration costs

One staple argument that Microsoft keeps throwing at Linux is that its total cost of ownership (TCO) is higher than that of Windows. Microsoft tried to prove its case, quite unconvincingly, by citing Microsoft-sponsored studies. Is this just FUD, or is there a grain of truth in it? Well, it depends. If you install a Linux distribution such as SuSE and are happy with the out-of-the-box functionality, then there are virtually no maintenance costs. In real life, however, this is hardly ever the case. Normally you want to customise and configure the computer for a particular purpose, and you might also want to install additional software packages and update older programs. Customisation and configuration are easy as long as you can use a GUI, such as SuSE's YaST program, to do the job. Likewise, program maintenance is easy as long as you can find the right RPM package. But this is not always the case: frequently, administrators must resort to editing configuration files or installing software packages manually. For example, if you can't find a vendor-specific binary RPM for your software update, you might have to install the software from a source distribution.
This requires a C compiler, some basic system administration knowledge, and considerably more time, especially if things go wrong. On average, system configuration and maintenance tasks therefore take longer to accomplish on Linux, which may indeed result in a higher TCO if the amount of maintenance is high. Let's assume that an in-house Linux administrator costs $35 per hour (which is conservative) and that a Windows volume license costs $105 a piece. Then, if the administrator spends more than three hours ($105 / $35 per hour) longer on updates and maintenance per machine within the lifespan of a Linux installation (say, two years), the license cost advantage is forfeited. Companies that want to deploy Linux should therefore take a close look at the planned usage and maintenance requirements. The impact of administration and maintenance costs on TCO is quite high for a small installation, but decreases with a growing number of machines.

Little standardisation among Linux distributions

Linux has inherited a problem that has plagued the Unix world for decades: a lack of standardisation. This is actually a natural outcome of the history of Linux, since the system is designed for maximum configurability and is used and developed by various groups with disparate goals. The problem lies with the multitude of distributions rather than with the kernel. Linux is packaged by different vendors catering to different target groups, ranging from beginners and commercial users to developers and expert users. This results in subtle (or less subtle) differences in system configuration, file system structure, and administration tools, all of which can be pretty frustrating for an IT administrator who has to manage different Linux distributions. Of course, it is also confusing for users who switch between distributions.

[Figure: Market share of Linux distributions]

An average Linux distribution comes with several thousand software packages. This gives users a tremendous amount of choice. In principle, choice is a good thing: users can select the packages that best match their requirements. But sometimes choice is a bad thing. First, the time spent on evaluating software products is unproductive time.
Second, having multiple packages that do the same thing is unnecessary and, worse, may cause compatibility problems. Third, it is better to use one program that fulfils all requirements than two or three different ones, each of which fulfils part of the requirements. Fourth, the time spent on maintenance and updates is proportional to the number of deployed software packages.

Applications are difficult to install and maintain

The complaint "difficult to install" was uttered by many early Linux users in the 1990s. Fortunately, things have improved in the meantime, and most Linux distributions are now as easy to install as the Windows OS. Generally this is also true for applications, albeit not all of them. Frequently it is necessary to install applications from a *.tar.gz source distribution, for example when you install very new versions or exotic software for which no installer or RPM is available. Software installed in this manner is difficult to maintain, because it often cannot be removed or updated automatically. This means that whenever you install from source, you must keep track of all the application files and dependencies yourself, which again adds to maintenance overhead and TCO.

Low backward compatibility

Have you ever tried to install new software on a two or three year old Linux installation? Chances are that it won't work. The most common problem is that the new software requires up-to-date libraries, or an updated version of another RPM package on which it depends. Under these circumstances, you must update parts of the system before you can install the new software. This scenario becomes more likely the older your Linux installation is. It is partly due to the "release early, release often" practice in the open source world and partly due to the idiosyncrasies of Linux distributions.
Ideally, you would have to update a Linux installation at least every six months, but this would of course increase the TCO to an intolerable level. In a commercial environment, people often go with the same OS installation for years, simply because it is too disruptive and too expensive to keep upgrading the OS. When running Linux, this often means you can use only those applications that the initial installation supports.

Not all applications run on Linux

Finally, when switching to Linux it is important to make sure that all applications that you or your company need are available under Linux. Unfortunately, this is not always the case. Although one can hardly reproach Linux for the fact that not all application developers support the platform, it is a make-or-break criterion. If you depend on a certain piece of software and it doesn't run on Linux, you could still run it using a compatibility layer such as Wine or CrossOver, but this carries a high price in complexity and performance. The likelihood that an application doesn't run under Linux is somewhat higher in the graphics, engineering, and games sectors. However, more and more software makers are adopting a cross-platform strategy and making their products available for Linux.