Natty Unity UI

So, the Natty Narwhal 11.04 release of Ubuntu has finally arrived, entering the Linux stage with a fanfare. Many oohs and aahs were heard throughout the blogosphere during the past few months, and outcries seemed to alternate with songs of praise. Canonical's new user interface, called Unity, was described as a "dramatic new look", an "aggressive change", as "revolutionary", "a breath of fresh air", and "a blight on the Linux OS". Frankly, I cannot understand what all the fuss is about. Yes, the desktop looks a bit different, but hardly different in a revolutionary way. There's a new strip of launcher icons on the left side of the desktop (called the dock), the bottom panel is gone, and the top panel isn't a conventional GNOME panel but a menu bar. Not exactly what I would call cataclysmic changes in the world of computing.

With the latest Ubuntu version, the Linux desktop looks even more Mac-ish, if you ask me. I admit that it took me a few days to get used to it, but I like most of the ideas that went into the Unity shell, so I've decided to keep it. Having the launcher on the left side frees up vertical space, which is a good idea because most modern monitors come in 16:9 widescreen format. The launcher dock also doubles as a window switcher and indicator. Displaying the application menus in the top panel will probably meet with resistance from Windows, KDE, and GNOME users, or at least break with tradition. It saves vertical space, however, at the expense of longer mouse trails from the application window.

Another Unity innovation is the "dash" (another D-word), a search window that lets you find applications or documents. It comes in the same bright-on-dark jewel-case appearance as the other Unity components, and it locates less frequently used programs or files by displaying incremental search results for the characters typed into the search field. I find this much easier than opening nested menus to start applications, and clearly superior. A nice improvement. The workspace switcher and panel indicators are likewise felicitous adaptations of tried and tested UI concepts.

Unity still has a few rough edges, though. The most obvious one is the unspeakable clunkiness of the default 64px launcher icons, which look inappropriate on any type of screen, unless you intend to operate a touchscreen with protective gloves on. Fortunately, the icon size can be reduced to 32px using the Compiz Config Settings Manager, which obviously lets you display twice as many launcher icons in the strip, uhm, I mean dock. Furthermore, I am not sure whether application menus really belong in the global top panel. Finally, it isn't yet possible to start multiple instances of an application from the dock, for example terminal windows or editors. A special operation such as Shift+Click on a program icon would be handy for this purpose.

I had also grown quite fond of the GNOME weather panel indicator, which is missing from the Unity panel. I found myself looking at the weather panel more often than at the thermometers in my house. This can be fixed as well by installing an additional package called indicator-weather from a PPA:

sudo add-apt-repository ppa:weather-indicator-team/ppa
sudo apt-get update
sudo apt-get install indicator-weather

In technical terms, Unity is far less "revolutionary" than most people think. Although it replaces the GNOME shell, it is still firmly embedded in the GNOME desktop environment and is designed to be used with GTK+ desktop applications. Unity is implemented as a plugin for Compiz, the same compositing window manager that was already used by previous Ubuntu versions. Unity does not provide its own file manager, but uses the well-tried Nautilus program for file system presentation and file operations.

If, as an Ubuntu user, you don't like Unity, it is very easy to revert to the old GNOME 2.x shell. Just select "Ubuntu Classic Desktop" from the drop-down box at the bottom of the login screen. The computer remembers the setting, so you have to change this option only once. It is even possible to use GNOME 3 with Natty Narwhal, although this requires installing additional software, because GNOME 3 is not included by default. If you want to try out or use GNOME 3, try these commands:

sudo add-apt-repository ppa:gnome3-team/gnome3
sudo apt-get update
sudo apt-get dist-upgrade
sudo apt-get install gnome-shell

Ubuntu Newbie Tips

I’ve been using Linux on servers in various flavours since 1997, but I am relatively new to Ubuntu and have just started using it as a desktop OS. Despite some installation problems, the overall experience was very positive. I had made earlier attempts to switch over to Linux, but for one reason or another these were thwarted, mostly because of the professional necessity of testing software under Windows. Since I am now working on cross-platform applications, that particular constraint has evaporated. I spend most of my day developing software and writing documentation. Before installing Ubuntu, I was slightly concerned that there would be a temporary decrease in productivity due to having to learn new software. However, this concern turned out to be largely unfounded.

Most of the key applications like Eclipse, Firefox, Thunderbird, and OpenOffice work exactly the same under Linux as they do under Windows. The only major change was replacing Notepad++ (which only runs on Windows) with vi/vim. These editors are suitable for programming in situations where you don’t want to fire up an IDE. Furthermore, I have made some customisations to ease the transition, which I’d like to share with you. If you are new to Linux, you might find one or another of them useful for your own work. The following list is by no means exhaustive, just a number of things I stumbled across during my first two weeks with desktop Ubuntu.

Repositories and download servers
Ubuntu maintains software packages with the Synaptic package manager. Because as a new user you are likely to make frequent use of this tool, one of the most useful things to do is to optimise its usage. This involves defining the repositories and the download server. Choose System/Administration/Software Sources from the main menu. In the first tab, “Ubuntu Software”, select the four items marked “main”, “universe”, “restricted”, and “multiverse” for the widest choice of software packages. Next, optimise the download server. I once wasted a whole day downloading the 9.04->9.10 update because of a slow server. Ubuntu can find the fastest server for you: select “Other…” in the “Download from” drop-down box, and a dialogue with a list of servers appears. Click “Select Best Server” to let Ubuntu test all available servers for their response time and select the fastest one.
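Under the hood, those four components correspond to entries in /etc/apt/sources.list. A line enabling all of them looks roughly like the sketch below; the release codename and mirror hostname are placeholders for whatever your installation actually uses:

```
deb http://archive.ubuntu.com/ubuntu/ jaunty main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu/ jaunty-updates main restricted universe multiverse
```

Editing this file by hand and ticking the boxes in the Software Sources dialogue amount to the same thing, so you can use whichever you are more comfortable with.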

Keyboard and language customisations
If, like me, you frequently type text in different languages, chances are that the default language and keyboard settings will not suit you. Fortunately, Ubuntu is easy to configure for international use, possibly even superior to Windows in this regard. First, I added Thai language support in System/Administration/Language Support. Then I configured two additional keyboard layouts, German and Thai, in System/Preferences/Keyboard/Layout. As I am using a Thai/English keyboard, I have to know the German key mapping by heart, which makes it only of limited use. On Windows I got used to producing international characters by typing Alt+numeric key sequences. On Linux, this is even easier thanks to the concept of the compose key. In the keyboard layout dialogue, click “Layout Options”, which presents a number of intricate keyboard customisation options. Click “Compose key position” and pick a key, for instance “Right Alt”. Now you can use this key to compose international characters. For example, type right Alt, double quotation mark, and the letter ‘u’ to produce the German umlaut ‘ü’. Type right Alt, backtick, and the letter ‘a’ to produce ‘à’ with an accent grave. Voilà!

Customising Nautilus
Nautilus is the Linux/GNOME equivalent of the Windows Explorer. In fact, I find it to be superior to the latter, because it supports protocols for remote access (such as ftp/sftp) and offers better search capability and better support for compressed files. If you prefer to work with a GUI rather than the command line, you will probably want to customise Nautilus in some way. The most obvious candidates for customisation are file associations. These can be defined by right-clicking on a file, selecting “Properties” from the context menu, and switching to the “Open With” tab in the property dialogue. Here you can define alternative applications to use for opening a file, as well as the default application that is started upon double-click. If you need even more customisation options, install the package named “nautilus-actions”. This package lets you define custom actions for file entries in Nautilus which can be incorporated into the context menu. Predefined Nautilus extensions (aka shell extensions) for various file display and transformation purposes are also available.
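Besides nautilus-actions, Nautilus also runs any executable you drop into its scripts folder and lists it in the right-click menu under "Scripts". The sketch below assumes the GNOME 2 folder location and uses a made-up script name; Nautilus passes the selected local files to the script as command-line arguments:

```shell
# Create a Nautilus script that opens the selected files in gedit.
# GNOME 2 keeps user scripts in ~/.gnome2/nautilus-scripts.
SCRIPTS_DIR="$HOME/.gnome2/nautilus-scripts"
mkdir -p "$SCRIPTS_DIR"
cat > "$SCRIPTS_DIR/open-in-gedit" <<'EOF'
#!/bin/sh
# Nautilus passes the selected (local) files as arguments.
exec gedit "$@"
EOF
chmod +x "$SCRIPTS_DIR/open-in-gedit"
```

After that, right-clicking one or more files should show the new entry under Scripts; no restart of Nautilus is needed.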

Command line and terminal customisations
Ubuntu comes with bash (the Bourne-again shell) and the GNOME Terminal as command line defaults. These are fine for me. However, there is one feature I found missing in the terminal application: it is not possible to search the output buffer. For example, when I run applications that produce a large amount of diagnostic output, there is no intuitive way to search through this data, other than piping it into a command like “less”. I have found a little program named “screen” which appears to solve this problem. After “screen” is started, virtual sessions can be created within the same terminal window, each with its own searchable buffer. “Screen” involves remembering some arcane keyboard commands, but it’s the best I could find so far. Another command line annoyance is that the “vi” editor runs in vi-compatible mode by default. This makes the cursor keys produce character output in insert mode; in other words, the cursor keys are broken. There is an easy fix, however: put a file named .vimrc in your home directory that contains a single line saying “set nocompatible”, and the cursor keys will work again.
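The .vimrc fix boils down to a one-liner; the snippet below appends the setting only if it is not already present. (As for screen's arcane keys: Ctrl-a [ enters copy mode, in which / searches the scrollback buffer forward and ? searches backward.)

```shell
# Append "set nocompatible" to ~/.vimrc unless it is already there.
VIMRC="$HOME/.vimrc"
touch "$VIMRC"
grep -q '^set nocompatible$' "$VIMRC" || echo 'set nocompatible' >> "$VIMRC"
```

The grep guard makes the snippet safe to run more than once without duplicating the line.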

Backup and antivirus software
Surprisingly, neither backup nor antivirus software packages are included in the default Ubuntu installation. Although viruses are probably not an immediate threat on a Linux system, I would rather not breed any of them on my machine. There is the open-source ClamAV as well as a number of free-for-private-use commercial offerings for Linux. I am still evaluating antivirus software. So far I found ClamAV and AVG quite usable, but not quite as convenient as under Windows. Backup software is an absolute necessity in my opinion, and I am surprised that it isn’t included in the default Ubuntu installation either. Of course, individual backup needs differ, but a simple mirroring and archiving facility is probably required for even the most basic usage. Initially, I planned to hack together an rsync-based script for that purpose, but I have found something much nicer. The “backintime” package lets you create incremental backups with great ease and minimal storage requirements. Backintime revolves around the concept of snapshots; it is a GUI framework for rsync, diff, and cron. I highly recommend it.
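The minimal storage requirement comes from hard-link snapshots, the technique backintime builds on: unchanged files in a new snapshot are hard links to the previous snapshot, so every snapshot looks like a complete copy while only changed files consume additional disk space. A rough sketch of the idea using plain cp (the /tmp paths are just for demonstration):

```shell
# Set up a tiny source directory and take a first full snapshot.
rm -rf /tmp/snapdemo && mkdir -p /tmp/snapdemo/src
echo "hello" > /tmp/snapdemo/src/file.txt
cp -a /tmp/snapdemo/src /tmp/snapdemo/snap1
# Second snapshot: cp -al hard-links every file from snap1 instead of
# copying it; only files that changed would then be copied over the links.
cp -al /tmp/snapdemo/snap1 /tmp/snapdemo/snap2
# file.txt now appears in both snapshots but occupies disk space only once.
stat -c '%h' /tmp/snapdemo/snap2/file.txt
```

In practice, backintime (and rsync's --link-dest option, which it uses) automates exactly this bookkeeping across many snapshots.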

Zend Framework Review

Earlier this week, I gave the latest version of the Zend Framework, 1.9.2, another test drive. I had previously dabbled in 1.7.4 as well as a pre-1.0 incarnation of the framework. I will not list the whole breadth of its functionality here, since you can find that elsewhere on the Internet. Neither will I present a point-by-point analysis, just the salient points, short and sweet, which you can expect to be coloured by my personal view.

Suffice it to say that the ZF (Zend Framework) is based on MVC (you’d never have guessed) and provides functionality for database access, authentication and access control, form processing, validation, I/O filtering, web services access, and a bunch of other things you would expect from a web framework. The first thing to notice is that the framework has grown up, and I mean this quite literally: from a few megabytes in its early days to a whopping 109 MB (unzipped) distribution package. Only about 21 MB are used by the framework itself; the rest contains demos, tests, and the Dojo toolkit, an old acquaintance, which is optional.

The documentation for the ZF was excellent right from the beginning and it has stayed that way. Included is a 1170-page PDF file, which also bears testimony to the growing size and complexity of the framework. Gone are the days when one could hack together a web application without reading a manual. One of the first things to realise is that ZF is a glue framework rather than a full-stack framework. This means it feels more like a library or a toolkit. ZF does not prescribe an architecture and programming idioms like many other web frameworks do. This appears to fit the PHP culture well, though it must be mentioned that most ZF idioms come highly recommended, since they represent best OO practices.

Another thing that catches the eye is the lack of an ORM component, which may likewise be rooted in traditional PHP culture. If you want object mapping, you have to code around ZF’s DB abstraction and use Doctrine, Propel, or something similar. Let’s start with that item.

Database Persistence
ZF provides a number of classes for DB abstraction. Zend_Db_Table implements a table data gateway using reflection and DB metadata; you only need to define table names and primary keys. Zend_Db_Adapter, Zend_Db_Statement, and Zend_Db_Select provide database abstraction and let you create DB-independent queries and SQL statements in an object-oriented manner. However, as you are dealing directly with the DB backend, all your data definitions go into the DB rather than into objects. Although this matches the traditional PHP approach, it means that you need to create schemas by hand, which may irritate people who have been using ORM layers, like Hibernate, for years. On the other hand, a full-blown ORM layer would likely incur a significant performance cost in PHP, so maybe the ZF approach is sane.

Fat Controller
Like many other frameworks, ZF puts a lot of application logic into the controller, and this is my main gripe with it. It seems to be the result of the idea that the “model” should concern itself only with shovelling data from the DB into the application and vice versa. A case in point is the coupling between Zend_Form and validation, which leaves you no option but to put both into the controller. I think that data validation logically belongs to the model, while form generation logically belongs to the view. If you pull both into the middle, it will not only bloat the controller, but is also likely to lead to repetition of validation logic in the long run. That’s why I love slim controllers. Ideally, a controller should do nothing but filtering, URL rewriting, dispatching, and error processing.

MVC Implementation
Speaking of coupling, it would do ZF an injustice to say that things are tightly coupled. Actually, the opposite is the case, as even the MVC implementation is loosely coupled. At the heart you find the Zend_Controller_Front class, which is set up to intercept all requests to dynamic content via URL rewriting. The rewriting mechanism also allows user-friendly and SEO-friendly URLs. The front controller dispatches to custom action controllers implemented via Zend_Controller_Action; if non-standard dispatching is required, it can be achieved by implementing a custom router interface with special URL inference rules. The Zend_Controller_Action class is aptly named, because that’s where the action is, i.e. where the application accesses the model and does its magic. The controller structure provides hooks and interfaces for the realisation of a plugin architecture.

Views
Views are *.phtml files that contain HTML interspersed with plenty of display code in the traditional <? ?> tags, so it should be possible to edit *.phtml files with a standard HTML editor. The Zend_View class is a thin object from which view files pull display data. View fragments are stitched together with the traditional PHP require() or with layouts. It is also possible to use a third-party templating system. Given the <? ?> tags, there is little to prevent application logic from creeping into the view, except reminding developers that this is an abominable practice punishable by public ridicule.

Layouts
Layouts are a view abstraction. They enable you to arrange the logical structure of page layouts into neat and clean XML, which is then transformed into suitable output (meaning HTML in most cases). As you can probably infer, this takes a second parsing step inside the PHP application, which is somewhat unfortunate, since PHP itself already parses view components. While layouts are optional, they are definitely nice to have. I think it’s probably the best a framework can do given the language limitations of PHP, which only understands the <?php> tag. If the XML capabilities of PHP itself were extended to process namespaced tags like <php:something>, one could easily create custom tags, and the need for performance-eating two-step processing would probably evaporate. Ah, wouldn’t it be nice?

Ajax Support
ZF does not include its own JavaScript toolkit or set of widgets, but it comes bundled with Dojo and offers JSON support. The Zend_Json class provides super-simple serialisation and deserialisation of PHP objects from/to JSON; it can also translate XML to JSON. The Zend_Dojo class provides an interface to the Dojo toolkit and makes Dojo’s widgets (called dijits) play nicely with Zend_Form. Of course, you are free to use any other Ajax toolkit instead of Dojo, such as YUI, jQuery, or Prototype.

Flexibility
As mentioned, ZF is very flexible. It’s loosely coupled at the design level, which is both a blessing and a curse. It’s a blessing because it puts few restrictions on application architecture, and a curse because it creates gaps for code to fall through. A case in point is dependency injection à la Spring: in short, there isn’t much in the way of dependency management, apart from general OO practices of course. Nothing keeps programmers from having dependencies floating around in global space or in the registry. A slightly more rigid approach that enforces inversion of control when wiring together the Zend components would probably not have hurt.

Overall Impression
My overall impression of the ZF is very good. It is a comprehensive and well-designed framework for PHP web applications. What I like best about it is that it offers a 100% object-oriented API that looks very clean and makes extensive use of best OO practices, such as the open/closed principle, programming to interfaces, composition over inheritance, and standard design patterns. The API is easy to read and understand. The internals of its implementation likewise make a good impression. The code looks clean and well structured, which is quite a nice change from PHP legacy code. ZF still involves a non-trivial learning curve because of its size. I’ve only had time to look into the key aspects, and didn’t get around to trying out more specialised features like Zend_Captcha, Zend_Gdata, Zend_Pdf, Zend_Soap, web services, and all the other features that ZF offers to web developers. If I had to choose a framework for a new web application, ZF would definitely be among the top contenders.

Galileo Troubles

Another year has passed in the Eclipse universe, and this means another minor release number and another Jupiter moon. Eclipse has moved from 3.4 to 3.5, or from Ganymede to Galileo respectively. Using a small gap in my busy development schedule, I decided to install the latest version this morning. Thanks to broadband Internet, the 180 MB JEE package was downloaded in a breeze and installed in a few minutes. Unfortunately, that’s where things stopped being easy.

When I downloaded the PDT plugin for PHP development, I found a bug in it that prevented Eclipse from creating a PHP project from existing sources. After some research on the Internet, I found that this was a well-documented bug which had been fixed in the meantime. I tried installing the latest PDT release via the Eclipse install and update feature, but the process came to a crashing halt with a message demanding some Mylyn jars that could not be found. Although I had no idea why PDT required that particular jar, I dutifully installed the Mylyn plugins with the required version number.

Unfortunately, this did not impress Galileo, as it now demanded other jars when installing the PDT update. Perhaps a case of workspace pollution, I thought. Clearly, it was time for a fresh start. I scrapped the installation and started anew with a blank workspace and a new install location. This time, everything seemed to install fine. I was able to create Java and PHP projects. However, Galileo suddenly wouldn’t open *.xml, *.xsl, or *.html files any more. It complained that there was no editor for this content type, which appeared fishy, since both the web tools (WTP) and PDT were installed. I tried to solve the problem by playing around with the configuration, but to no avail.

After several fresh attempts and considerable time spent looking up error messages on the Internet, I decided to stay with Ganymede. Since I had wasted my entire morning and had some real work to do as well, this seemed to be the best course of action. Maybe I will give Galileo another go when an updated distro package becomes available. With Ganymede I never ran into this sort of trouble, despite having PDT, WTP, the Scala plugin, and JBoss Tools installed. I am still clueless as to what went wrong, and I wonder if anybody else has had a similar experience.

Laments of a would-be Ubuntuist

I have been a Linux fan for more than a decade. I have used Linux in my own company and projects since 1996, and I was also one of the founding members of the Bangkok Linux User Group. Oddly, however, the computer on my desktop still runs on Windows. It’s a glaring contradiction. I’ve wanted to replace Windows for years, but there’s always been a reason not to, mainly because I need to test software under Windows for my customers. Last weekend, the XP installation on my laptop “forgot” my user account and with it all account data. Simultaneously, the file system started to behave funny. “Ah, a sign from above,” I thought. “Finally the day has come; I will install Ubuntu on my laptop.” So I did. Ubuntu Desktop 9.04 installed with ease and, even more impressively, recognised all of my Thinkpad hardware. Even the Wifi connection was up and running without fiddling about.

I should have said “almost all” hardware. Unfortunately, one piece of hardware refused to cooperate with Linux, namely my Novatel USB modem. Since I’ve come to rely on 3G mobile Internet, this is a knockout criterion: no modem, no Internet. After hours of scouring the Web for possible solutions and trying out various settings, I gave up in frustration. There wasn’t anything I could do except zap the Linux partition and reinstall my old friend XP. To soften the disappointment, I will make it a dual-boot machine, though. Note to hardware vendors: please take Linux seriously and provide drivers for your nifty electronics. That would make life much easier. I guess I have to postpone my switch-over to Linux for another year. Hopefully I will be able to resist the urge to buy another piece of exotic hardware in the meantime.