Window Manager and xcompmgr

In my current Window Maker configuration, I use xcompmgr to create some nice shadows for the windows. The problem is that xcompmgr sometimes doesn’t realize that a window has closed, so the shadow remains even though the window is no longer there. It is possible to “fix” this by reopening and then closing the window, but that is not practical: the only workable work-around is to restart xcompmgr. Since xcompmgr is small and fast, this is not a problem.

There are a number of ways to automate this…

Currently, xcompmgr is started in ~/GNUstep/Library/WindowMaker/autostart:

xcompmgr -c -C -r 8 -l -12 -t -12 &

To make it easy to restart xcompmgr, let’s wrap it in a program that automatically starts it again if it dies. Since xcompmgr does not go into the background, we can simply start it and, when it exits, start it again (in a loop).

With Ruby, we can do this:

#!/usr/bin/ruby
#
# xcompmgr2: run xcompmgr and keep it running:
#            permits autorestart of xcompmgr

# Get rid of any copies already running...
`pkill -x xcompmgr`

loop do

    # Main program: run it!
    `/usr/bin/xcompmgr -c -C -r 8 -l -12 -t -12`

    # Catch and log any unusual exit (a clean exit, or being
    # killed by pkill's SIGTERM, is expected)...
    if !$?.success? && $?.termsig != 15 then
        $stderr.puts "xcompmgr exited unexpectedly (#{$?.inspect}); restarting"
    end
end

If xcompmgr exits for any reason, the loop simply starts it again. This sort of program could be written in almost any language: certainly Perl, ksh, and PHP are up to the task (as are many others).

Thus, when xcompmgr gets confused, run this command to fix it:

pkill -x xcompmgr

The -x option ensures that xcompmgr is killed, and not xcompmgr2: -x matches the process name exactly.

5 Programs That Should Be in the Base Install

There are a number of programs that never seem to be installed with the base system, but should be. In this day and age of click-to-install, these programs often require an additional installation – I maintain that this should not be necessary.

Most of these are relevant to Linux, but the programs are often missing on commercial UNIXes as well.

  • Ruby. This is the first that comes to mind. I have been installing Ruby onto systems since version 1.4.6 – and Ruby is still a fantastic scripting language, and one of the best implementations of object-oriented programming since Smalltalk.
  • m4. I recently wrote about m4, and thought it was already installed on my Ubuntu Karmic system – not so. I used it to create a template for the APT sources.list file.
  • ssh. This should be installed everywhere automatically, and not as an add-on. For many UNIX systems, ssh is an add-on product that must be selected or compiled from source.
  • rsync. Rsync is a fabulous way to copy files across the network while minimizing traffic – even though raw speed is not its primary design goal.
  • ksh. This will surprise most commercial UNIX administrators. However, Linux does not come with ksh installed – and the emulation by GNU bash is weak. Now you can install either AT&T ksh-93 (the newest version!) or the old standby, pdksh (which is close to ksh-88).

OpenVMS is a different animal – some of the things that should be installed there by default are Perl, Ruby, Java, SMH, and ssh. I’m not sure whether Perl or ssh is installed by default, but they should be. OpenVMS should also provide compliant NFS v3 and v4 support out of the box – without making it difficult to connect to other NFS servers.

What programs do you think should be in the base install?

Scripting on the Java Virtual Machine

Now that OpenVMS has Java, and HP-UX has Java, I started wondering about the possibility of running a scripting language on the Java Virtual Machine (JVM) as a way of supporting all of these diverse environments with the same code and tools.

Choosing a language can be difficult, especially when there are so many good ones to choose from. I’ll assume for purposes of discussion that you are already familiar with at least one programming language (you should be!).

So what are the criteria a system administrator should use to choose a language on the JVM?

  • Does the language have a strong and vibrant community around it? The language might be nice, but if no new development is being done on it, it will eventually stop working on newer JVMs. Bugs will not be fixed if development has halted. It also helps to have a large variety of people to call on when trouble arises, or when your code has to be maintained in the future.
  • Does the language support your favored method of programming? If you have no desire to learn functional programming, then don’t choose a language that is a functional language. Find a language that thinks the way you do (unless you are specifically trying to stretch your mind…).
  • Is your preferred language already available for the JVM? There are implementations of Ruby (JRuby), Python (Jython), LISP (Armed Bear Common Lisp), Tcl (Jacl), and many others. A language that you already know will reduce your learning time on the Java Virtual Machine to near zero (see the sketch after this list).
  • What are the requirements? For example, JRuby requires a dozen libraries; Clojure and Armed Bear Common Lisp have no outside requirements. Which is simpler to install onto a new machine?
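
On the familiarity point, Ruby is a good illustration: a script that sticks to the standard library runs unchanged under both the C interpreter and JRuby on the JVM. A minimal sketch, assuming JRuby is installed and on the PATH as jruby (the file name is hypothetical):

#!/usr/bin/env ruby
# jvm-hello.rb: the same script runs under "ruby jvm-hello.rb"
#               and under "jruby jvm-hello.rb" on the JVM.

puts "Platform:     #{RUBY_PLATFORM}"   # reports "java" under JRuby
puts "Ruby version: #{RUBY_VERSION}"

# Pure-Ruby standard library code behaves the same on either:
require 'digest/md5'
puts Digest::MD5.hexdigest("hello from the JVM")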

So what languages am I looking at? I am looking at these:

  • Clojure – a LISP-like functional programming language which seems to be taking off handsomely.
  • JRuby – Ruby is my all-time favorite scripting language, and having it available wherever the Java VM runs is a very tantalizing prospect. It’s also directly supported by Sun, the makers of Java.
  • Groovy – this is a new language that takes after Ruby and Smalltalk, and it is growing in popularity at a dramatic pace.
  • Scala – this is a language with a strong developer base and an object-oriented and functional design. Don’t know much more about it yet.
  • Armed Bear Common Lisp – ABCL is a full Common LISP implementation for the Java VM, and is used as part of the J editor. Development on J seems to have stopped; development on ABCL, however, has gone through a resurgence after nearly dying off for several years. ABCL is the closest thing to LISP on the JVM, and is usually the first mentioned – even though its development community is not nearly as strong as those of Scala or JRuby.

These are only the ones I’ve chosen to focus on; there are many, many more.

Perl Tidbits: Annoyances and Surprises

Having given up Perl at one time to use Ruby almost exclusively, I have returned to Perl in recent years since Ruby is not available for HP-UX 11i v3 or for OpenVMS 8.3. Perhaps some Ruby folks are listening; it is abominable that HP-UX 11i has gone without Ruby for so long. It would be nice to be able to say that it is shameful that Ruby doesn’t exist on OpenVMS 8.3 – but we know how popular OpenVMS is…

Here are some of the things I’ve discovered in bits and pieces about Perl since my “return”:

  • HP-UX does not use the Perl core, but rather ActiveState Perl. This may not sound like much, but it can be important when using modules. For instance, the Net::FTP module has a bug in it that causes the last character of ASCII transfers to be “snipped”, and several Perl core modules are missing entirely.
  • It is possible for Perl to quietly abort without warnings. This makes no sense, but I’ve experienced it. Only stepping through the code with the debugger will show which line it dies on – and removing the line in question makes the program start working again.
  • Using CPAN can corrupt your HP-UX Perl installation. I suspect this is because CPAN assumes that you are using the Perl core, and if not, things can break. In my case, the module Data::Dumper was hosed so badly that I needed to reinstall Perl.
  • Perl requires an additional module to emulate Ruby’s p command. This was a real disappointment; printing data in a readable format (no matter what its type) is a single built-in command in Ruby, while in Perl you must include an additional module and use a function with a confusing name (is it Dump? or is it Dumper? or is it dump?) – I often have to correct my code to use the right one. (Ruby’s side of this is shown after this list.)
  • Testing for a module’s existence is easy. Use the command perl -e 'use My::Module;' to see if the module is there.
  • Don’t think of $var as the name of a variable, but as the value of a variable. Just this one change in thinking has made a tremendous amount of difference in programming. When you think this way, you see an entry like @myarray[4] for what it is: a list with a single element in it. Likewise, $mine{entry} is a scalar value returned from a hash. The same “variable” can be found with differing leading punctuation based on its value, not on its type.
  • If your script needs to be portable across platforms, then write it in Perl. The Korn shell is nice and even more ubiquitous than Perl, and Ruby is (in my opinion, anyway) much nicer than Perl, but Perl is where ksh and Ruby are not. Being there is a significant part of the battle: if that snazzy scripting language isn’t installed, you can’t use it – and Perl is there.
  • The documentation (in perldoc) is tremendous! Use perldoc whenever you can: it includes more than a man page would. It includes tutorials, frequently asked questions (FAQs), module documentation, function documentation, and more.
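
For the curious, this is the Ruby behavior that Data::Dumper has to emulate: p prints any data structure readably with no extra module (the data here is hypothetical):

# Ruby's p: no extra module required, works on any type of data
host = { "name" => "alpha", "disks" => [ "/dev/sda", "/dev/sdb" ] }
p host    # prints the whole structure readably on one line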

Though I’d still write in Ruby at the drop of a hat, I have indeed rediscovered the joy of Perl. Part of this is due to a book that has illuminated the dusty corners of my knowledge: Effective Perl Programming by Joseph Hall with Randal Schwartz – an awesome book!

Automation: Live and Breathe It!

Automation should be second nature to a system administrator. I have a maxim that I try to live by: “If I can tell someone how to do it, I can tell a computer how to do it.” I put this into practice by automating everything I can.

Why is this so important? If you craft every machine by hand, then you wind up with a number of problems (or possible problems):

  • Each machine is independently configured, and each machine is different. No two machines will be alike – which means instead of one machine replicated one hundred times, you’ll have one hundred different machines.
  • Problems that exist on one machine may or may not exist on another – and may or may not get fixed when found. If machine alpha has a problem, how do you know that machine beta or machine charlie doesn’t have the same problem? How do you know the problem is fixed on all machines? You don’t.
  • How do you know all required software is present? You don’t. It might be present on machine alpha, but not machine delta.
  • How do you know all software is up to date and at the same revision? You don’t. If machine alpha and machine delta both have a particular piece of software, maybe it is the same version and maybe not.
  • How do you know if you’ve configured two machines in the same way? Maybe you missed a particular configuration requirement – which will only show up later as a problem or service outage.
  • If you have to recover any given machine, how do you know it will be recovered to the same configuration? Often, the configuration may or may not be backed up – so then it has to be recreated. Are the same packages installed? The same set of software? The same patches?

To avoid these problems and more, automation should be a part of every system wherever possible. Automate the configuration – setup – reconfiguration – backups – and so forth. Don’t miss anything – and if you did, add the automation as soon as you know about it.

Things like Perl, Tcl, Lua, and Ruby are all good for this.
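
As one small example of the kind of check that can be automated (the “is the required software present everywhere?” question above), here is a minimal Ruby sketch. The host names, the tool list, and the reliance on ssh being set up for unattended logins are all assumptions for illustration:

#!/usr/bin/ruby
# check-tools: verify that every machine has the required tools.
# (A sketch, not a complete solution.)

HOSTS    = %w[alpha beta charlie delta]
REQUIRED = %w[ruby m4 ssh rsync ksh]

HOSTS.each do |host|
  # Ask each host whether each command is on the PATH...
  missing = REQUIRED.reject do |cmd|
    system("ssh #{host} command -v #{cmd} > /dev/null 2>&1")
  end
  if missing.empty?
    puts "#{host}: all tools present"
  else
    puts "#{host}: MISSING #{missing.join(', ')}"
  end
end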

Other tools that help tremendously in this area are automatic installation tools: Red Hat Kickstart (as well as Spacewalk), Solaris JumpStart, HP’s Ignite-UX, and openSUSE’s AutoYaST. These systems can, if configured properly, install a machine unattended.

When combined with a tool like cfengine or puppet, these automatic installations can be nearly complete – from turning the system on for the very first time to full operation without operator intervention. This automated install not only improves reliability, but can free up hours of your time.

Using Parallel Processing for Text File Processing (and Shell Scripts)

Over at his blog, Onkar Joshi wrote an article about how to write a shell script that processes a text file using parallel processing. He provided an example script (reproduced here with my commentary in the comments):

# Split the file up into parts
# of 15,000 lines each
split -l 15000 originalFile.txt
#
# Process each part in separate processes -
# which will run on separate processors
# or CPU cores
for f in x*
do
runDataProcessor $f > $f.out &
done
#
# Now wait for all of the child processes
# to complete
wait
#
# All processing completed:
# build the text file back together
for k in *.out
do
cat $k >> combinedResult.txt
done

The commentary should be fairly complete. The main trick is to split the file into independent sections to be processed, then to process the sections independently. Not all files can be processed this way, but many can.

When the file is split up in this way, multiple processes can be started to handle the parts – and since separate processes can be scheduled onto separate processors or CPU cores, the entire job will run faster. Otherwise, with a single process, the job would utilize only one core and gain no benefit from multiple cores.

The command split may already be on your system; HP-UX has it, and Linux has everything.

This combination could potentially be used for more things: splitting a task into parts (separate processes), waiting for the results, and combining things back together as necessary. Remember that each process may get a separate processor only if the operating system supports multiple processors – a Linux kernel built without SMP support will not benefit, for example.

I did something like this in a Ruby script I wrote to rsync some massive files from a remote host – but in that case, there were no multiple cores (darn). I spawned multiple rsync commands within Ruby and tracked them with the Ruby script. It was mainly the network that was the bottleneck there, but it did speed things up some – with multiple cores and a faster CPU, who knows?
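
Something in the same spirit might look like the following; this is a sketch rather than the original script, and the host, directories, and rsync options are hypothetical:

#!/usr/bin/ruby
# parallel-rsync: start several rsync copies at once and track them.

DIRS = %w[/data/part1 /data/part2 /data/part3]

# Spawn one rsync per directory as a separate child process...
pids = DIRS.map do |dir|
  fork do
    exec("rsync", "-a", "remotehost:#{dir}/", "/backup#{dir}/")
  end
end

# ...then wait for each child and report how it finished.
pids.each do |pid|
  Process.wait(pid)
  puts "rsync (pid #{pid}) finished with status #{$?.exitstatus}"
end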

Also, in these days, most every scripting language has threads (which could potentially be run on multiple CPUs). I’ll see if I can’t put something together about threading in Ruby or Perl one of these days.
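
As a small taste in the meantime, here is a minimal Ruby threading sketch (the host names are hypothetical): each thread pings one host, and the checks run concurrently rather than one after another.

#!/usr/bin/ruby
# ping-sweep: check several hosts at once using one thread per host.

hosts = %w[alpha beta charlie delta]

threads = hosts.map do |host|
  Thread.new(host) do |h|
    # Linux-style ping invocation; adjust for other systems.
    up = system("ping -c 1 #{h} > /dev/null 2>&1")
    Thread.current[:report] = "#{h}: #{up ? 'up' : 'down'}"
  end
end

# Wait for every thread, then print the results in order.
threads.each do |t|
  t.join
  puts t[:report]
end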

UPDATE: Fixed links to article and blog (thanks for the tip, Onkar!).