GNU/Linux Desktop Survival Guide
by Graham Williams
20190630

A particularly touted feature of Unix comes from its tools philosophy, where complex tasks are performed by bringing together a collection of simpler tools. This contrasts with the philosophy of providing monolithic applications that supposedly solve all your problems in one fell swoop. The reality is often different.
Most operating systems supply a collection of basic utility programs for managing your files (things like arranging your files into directories, trashing files, and copying files from one place to another). Large applications then provide the word processing, spreadsheet, and web browsing functionality.
Unix places less emphasis on monolithic applications. Instead, tools provide simple functionality, focusing on doing well what they are designed to do, and pass their results on to another tool once they are done. Unix pipes provide the mechanism for doing this: one tool pipes its output on to another tool as input. This allows complex actions to be performed by piping together a collection of simpler commands.
A simple example is to determine the number of users logged on to your system:
$ who | wc -l
The who command lists, one per line, each user logged on. The wc command counts the number of characters, words, and lines in its input, with the -l option restricting the count to the number of lines. (GNU tools, like Unix tools, introduce options with the minus sign.)
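A longer pipeline illustrates how several simple tools combine. The sketch below counts distinct users rather than sessions, so a user logged on twice is counted once. To keep the example reproducible it feeds printf output shaped like who's output (the user names and terminals here are made up for illustration); on a live system you would replace the printf stage with who itself.

```shell
# Each stage does one small job:
#   cut  -d' ' -f1   extracts the first space-separated field (the user name)
#   sort             orders the names so duplicates become adjacent
#   uniq             drops adjacent duplicate lines
#   wc -l            counts the lines that remain
printf 'alice tty1\nbob tty2\nalice pts/0\n' | cut -d' ' -f1 | sort | uniq | wc -l
```

Here alice appears twice and bob once, so the pipeline prints 2. Each command knows nothing of the others; the pipe is the only glue needed.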
For various reasons, though, this tools philosophy was often overlooked as large monolithic applications arose that did not adhere to it: they did not share components. Common tools such as Netscape, ghostview, Acrobat, FrameMaker, and OpenOffice share essentially very little. Compare that with the Microsoft community where, for example, an application like Internet Explorer is component-based. This is now changing in the GNU world, with the operating system software and the GNOME project encouraging component-based architectures.