30 March 2007

bad geek joke: the bourne shell

Here's one I made earlier (19 Oct 2006 according to my pc):
It's modified from http://thebourneidentity.com/, which is incidentally a film I very much like.

For those who don't know, this is the Bourne shell:
It's the father of bash (/bin/bash, the Bourne Again SHell).

partimage + stdout, existing code

On first inspection it looks like some code already exists for writing an image to stdout (standard output).

image_disk.cpp, line 558:
  if (strcmp(m_szImageFilename, "stdout")) // non-zero: filename is not "stdout"
    {
      // ... network output code hidden for clarity ...
    }
  else // it's stdout
    {
      m_fImageFile = stdout;
      showDebug(1, "image will be on stdout\n");
    }

Unlike stdin for restore, stdout for save is not currently available in the command-line options. I did an earlier build with it enabled (which I no longer have, due to my build problems). I managed to pipe an image to hexdump and could see some of the user-interface text mixed into the output.

It would seem the problem with the stdout option is that, even in batch mode, the program writes interface output to stdout, which then corrupts the image.
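To see why this corrupts things, here's a toy sketch (not partimage itself, just two stand-in functions) of a program that mixes progress messages with payload data on stdout, and the usual unix fix of sending the chatter to stderr instead:

```shell
# Toy illustration, not partimage: mixing UI messages and payload on
# stdout corrupts a piped stream; writing the messages to stderr fixes it.
faulty_save() { echo "saving partition: 50% done"; printf 'IMAGEDATA'; }
fixed_save()  { echo "saving partition: 50% done" >&2; printf 'IMAGEDATA'; }

faulty_save 2>/dev/null > /tmp/img_bad    # progress text ends up inside the "image"
fixed_save  2>/dev/null > /tmp/img_good   # pure payload on stdout

cat /tmp/img_bad
cat /tmp/img_good
```

The second variant is what the other unix tools do, and is presumably what partimage needs for a usable stdout mode.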

I think I shall attempt to remove all the UI stuff and make it act more like the other unix tools. I might also try to carve a reusable library out of it.

26 March 2007

compiling partimage

Had problems getting partimage to compile on one of my pcs from a fresh checkout:
svn co https://partimage.svn.sourceforge.net/svnroot/partimage/trunk/partimage partimage

The ./autogen.sh script was failing as follows:
tim@lap:~/projects/partimage$ ./autogen.sh
Running "autoreconf -vif" ...

autoreconf: Entering directory `.'
autoreconf: running: autopoint --force
autoreconf: running: aclocal -I m4 --output=aclocal.m4t
autoreconf: `aclocal.m4' is unchanged
autoreconf: configure.ac: tracing
autoreconf: running: libtoolize --copy --force
autoreconf: running: /usr/bin/autoconf --force
autoreconf: running: /usr/bin/autoheader --force
autoreconf: running: automake --add-missing --copy
configure.ac: 16: required file `./[config.h].in' not found
Makefile.am:1: AM_GNU_GETTEXT in `configure.ac' but `intl' not in SUBDIRS
automake: Makefile.am: AM_GNU_GETTEXT in `configure.ac' but `ALL_LINGUAS' not defined
autoreconf: automake failed with exit status: 1

Barked up lots of wrong trees, including looking for missing libraries, gettext configuration, etc.

Turned out to be an old version of automake.

Not sure how my other pc ended up with the right version, but this pc's version was:
$ automake --version
automake (GNU automake) 1.4-p6

Installing new version (with some help from command line auto-completion):
$ apt-get install automake[tab]
automake automake1.5 automake1.8
automake1.4 automake1.6 automake1.9
automake1.4-doc automake1.7 automaken

$ sudo apt-get install automake1.9
$ automake --version
automake (GNU automake) 1.9.6

After updating automake, the ./autogen.sh script ran, I could then run ./configure and make successfully, and was left with a partimage binary in src/client/.
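A quick way to guard against this in future is a small version check before running ./autogen.sh. This sketch uses GNU sort's -V (version sort) to compare dotted version strings; the 1.9 minimum is the one quoted in the mailing-list answer below:

```shell
# Sketch: compare dotted version strings with GNU sort -V.
version_ge() {
  # succeeds when $1 >= $2
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

version_ge "1.9.6" "1.9"  && echo "automake 1.9.6 is new enough"
version_ge "1.4-p6" "1.9" || echo "automake 1.4-p6 is too old"
```

In practice you'd feed it `automake --version | head -n1` instead of the hard-coded strings.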


The solution came from a post by Tibor Simko on cdsware.cern.ch:

Re: problem with autoreconf when installing from cvs

* From: Tibor Simko
* Subject: Re: problem with autoreconf when installing from cvs
* Date: Thu, 18 Jan 2007 18:12:20 +0100


On Thu, 18 Jan 2007, robert forkel wrote:

> $ autoreconf
> Makefile.am:23: AM_GNU_GETTEXT in `configure.ac' but `intl' not in SUBDIRS
> automake: Makefile.am: AM_GNU_GETTEXT in `configure.ac' but
> `ALL_LINGUAS' not defined

Which version numbers of automake, autoconf, and gettext do you have?
E.g. automake versions prior to 1.9 and gettext versions prior to 0.14
will not work.

Best regards
Tibor Simko

16 March 2007

Multi-room music at home

me and slimserver

Today, I wanted to play music/radio in more than one room, and since BBC Radio 4 was playing The Archers, that ruled out the simple FM/Radio 4 option!

So, not liking to do anything the simple way, I set about searching for a way to broadcast sound to multiple rooms, preferably with a UDP/multicast type setup. Didn't manage that in the end, but have got something quite cool running.

Initially came across firefly media server, from an article [pdf] in linux magazine. Was put off by its absence from the ubuntu repositories.

I have a mate with a slimdevice, which is an awesome device. The server side of it is available free as it is OSS, and it is in the ubuntu repo (universe). So install was trivial:
sudo apt-get install slimserver

I could immediately connect with a web browser to http://localhost:9000/ and see the web interface (which is very good), and point any of my media players to http://localhost:9000/stream.mp3 and listen to the selected music. Nice. (Requires mp3 codec support to be installed. See easyubuntu)

Two things tripped me up connecting remotely. I had already spotted "Server Settings / Security / Allowed IP Addresses" and added my local subnet, but wasn't able to connect from another pc.
netstat showed that the server had only bound to the local ip address:
$ netstat -tln
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State
tcp        0      0*               LISTEN

By chance I knew about defaults files in ubuntu. Looking in /etc/default/slimserver, what do I find? Only bind to localhost. Duh!
# This limits http access to the local host.
# Comment it out for network-wide access, or change
# to enable a single interface.

So, I commented out the http_addr line, and restarted the slimserver.
sudo /etc/init.d/slimserver restart
Slim server was now listening on *:9000
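The mechanism is worth knowing: the init script sources the defaults file, so a commented-out setting simply leaves the shell variable unset and the daemon falls back to binding everywhere. A toy sketch of that behaviour (HTTP_ADDR is my assumed variable name here; check the shipped file for the real one):

```shell
# Toy sketch of the /etc/default mechanism. HTTP_ADDR is an assumed
# variable name; the real one is whatever the shipped file uses.
cat > /tmp/slimserver.default <<'EOF'
# This limits http access to the local host.
# Comment it out for network-wide access:
# HTTP_ADDR=
EOF

# Init scripts typically source the defaults file like this:
. /tmp/slimserver.default
bind_addr="${HTTP_ADDR:-}"    # commented out -> unset -> bind all interfaces
echo "bind address: '${bind_addr}' (empty means all interfaces)"
```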

The other thing that tripped me up is that slimserver doesn't multicast; it maintains an independent stream and playlist for each connected device. So when I connected remotely I hadn't added music to the right playlist. In the web interface there is a drop-down list to select which device's playlist you want to modify. Once I figured that out, it all worked. Yay. :-)

Didn't solve the original problem of playing the same audio simultaneously in multiple rooms, but it's cool nonetheless.

13 March 2007

Today's project - partimage enhancement

me and partimage

I recently reorganised the partitions on my laptop, with the help of some invaluable OSS tools.

The laptop was split into OS partitions, with a data and swap partition at the end, but I'd started running out of space. I have since made ubuntu my only OS at home, so no longer require multiple partitions.
My partition table ended up looking something like this: data | OS | more data | swap, and I wanted it to look like this: OS & data | swap, but without having to rebuild (again).

With another linux box available with bags of disc space, I did something like the following:
  • from each data partition and my home folder: tar -cv datafolder | ssh otherbox "cat > laptop/datafolder.tar", which gave me a tarball of all my data
  • boot into knoppix 4
  • use partimage to save the OS partition image into the filesystem of another partition
  • scp osimage.img otherbox:laptop/
  • fdisk to set up new partitions
  • pipe the image back into partimage across the wire: ssh otherbox "cat laptop/osimage.img" | partimage .... plus some flags for batch writing to new partition
  • use parted (partition editor) to stretch partition image back up to full size of new partition.
  • fix grub with help from knoppix - (hd0,2) to (hd0,0) or something.
  • remove references to non existent partitions from fstab
Which was all great, but I feel there's a feature missing from partimage. Although it can read an image from stdin for writing to disc, it can't write an image to stdout from disc. This would have saved me some thinking and some hassle. So in the true spirit of OSS, I shall have a go at adding the functionality.
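The tar-over-ssh step above can be exercised locally. Here's the same pipeline with ssh swapped for a plain cat so the sketch runs on one machine (the paths are throwaway examples, not the ones I actually used):

```shell
# Local stand-in for: tar -cv datafolder | ssh otherbox "cat > laptop/datafolder.tar"
# (ssh replaced by cat so this runs on a single machine)
mkdir -p /tmp/migrate/datafolder
echo "some data" > /tmp/migrate/datafolder/file.txt

( cd /tmp/migrate && tar -c datafolder ) | cat > /tmp/migrate/datafolder.tar

# verify the tarball round-trips
tar -tf /tmp/migrate/datafolder.tar
```

The nice property of this style of pipeline is that nothing ever hits the local disc in between, which is exactly why a partimage-to-stdout mode would be handy.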

So far, I have grabbed the source from sourceforge's svn server, managed to compile it (after being confused by a misleading error message) and installed an IDE. I started with Eclipse, as I've been using it a bit recently and really like it, but figured that perhaps the C++ devs aren't likely to be java fans and might choose something else. So I've installed KDevelop, and will be having a go with that.