Making and installing Python on a 1and1 shared host

In my last post I built and installed Subversion. In this post, I’ll build and install the current 2.x version of Python, which will be needed for Trac, later.

Similar to how we built and installed Subversion, here are the commands to install Python. As before, you will need to get the HTTP address for the current version of Python and modify the procedure for your home directory path, which you can get by running the pwd command.

For line six in the following list of commands, you can get the prefix path by entering cd ~/opt and running the pwd command (or just use $HOME/opt, as shown).

cd ~/dev
wget http://www.python.org/ftp/python/2.6.4/Python-2.6.4.tgz
tar -xzvf Python-2.6.4.tgz
cd Python-2.6.4
mkdir ~/opt
./configure --prefix=$HOME/opt
make install

Modify your ~/.bash_profile file so you can run the new Python without typing the full path. If you’ve already modified your PATH to look in ~/opt/bin, then you can skip this.

echo 'export PATH=$HOME/opt/bin:$PATH' >> ~/.bash_profile

Note that I stuck the new Python at the beginning of the path so it overrides the random old Python that 1and1 has installed.
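To see why the ordering matters: the shell searches PATH left to right and runs the first match it finds. Here is a throwaway demonstration (the stub script and /tmp directory are purely illustrative):

```shell
# Create a stub "python" in a scratch directory and prepend that directory
# to PATH; it then shadows any python later in the search order.
mkdir -p /tmp/demo_bin
printf '#!/bin/sh\necho "custom python"\n' > /tmp/demo_bin/python
chmod +x /tmp/demo_bin/python
PATH="/tmp/demo_bin:$PATH"
hash -r            # forget any cached command locations
python             # prints: custom python
```

Putting ~/opt/bin first works the same way: the freshly built Python shadows the system one.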

Use the source command to reset your environment from your newly modified ~/.bash_profile file.

source ~/.bash_profile

Test the result:

python

Look at the version printed and exit with control-d.

End of procedure.

Making and installing Subversion on a 1and1 shared host

For historical reasons, I have a 1and1 shared host account with a few sites hosted there.

For newer projects, I use Slicehost, which I’ve had great experiences with. Ideally, I’d like to move these 1and1-hosted sites to a Slice, but my experience with 1and1 has been pretty flawless and two of the sites generate enough income that I am loath to mess with them. (Other than changing copyright dates, the code has not been changed in a couple of years. Not broke.)

So…I have this shared hosting. I like to avoid unnecessary expenses, so paying for a Slicehost slice without a clear money-making purpose is to be avoided. The problem is that I’d like a Trac and Subversion setup for my personal stuff. I’ve got a few hosts that I use for various projects, but none are mine alone, so it would be wrong to host my personal stuff on one of them. Thus, I would like to set up Trac and Subversion on 1and1. Tricky. I do not have sudo rights there and can’t configure Apache, for instance.

Here, we will make and install Subversion. Later, I’ll install Python and Trac, then integrate them.

First, download the source for Subversion. I went to the Subversion project site, found the source release area link, and got the HTTP links for the two tarballs I needed. We’ll create a directory called dev in our home directory and build Subversion there.

You’ll need to do the same so you get the current, stable version of Subversion (1.6.9 when I did this).

mkdir ~/dev
cd ~/dev

Esplode da tarballs into a working source tree. The following commands extract the two tarballs into an appropriately named directory (the deps tarball unpacks into the same subversion-1.6.9 tree).

tar -xzvf subversion-1.6.9.tar.gz
tar -xzvf subversion-deps-1.6.9.tar.gz

Build Subversion and install it to a logical place (~/opt/).

cd subversion-1.6.9
mkdir ~/opt
./configure --prefix=$HOME/opt
make install

Modify your ~/.bash_profile file so you can run Subversion commands without typing the full path.

echo 'export PATH=$HOME/opt/bin:$PATH' >> ~/.bash_profile

Use the source command to reset your environment from your newly modified ~/.bash_profile file.

source ~/.bash_profile

Test the result.

svn --version

End of procedure.

Remote support on parent’s computer

The parent’s computer is a special case. Kids have their friends and a lifetime of computer use. Friends have their network of friends and IT contacts. But parents…they just have you for IT needs.

Added on March 11th: A friend suggested the free version of LogMeIn.

Also, every time I examine dad’s computer, something random has been installed: an Internet Explorer off-brand toolbar here, a weird card game there…or whatever. And my dad does not have a system for keeping track of passwords, so fixing an e-mail account includes some investigative work.

None of this is a problem per se, but it takes time, and when I’m visiting them I’d prefer to talk and hang out, not plow through a bunch of IT baloney.

KeePassX – keeping track of passwords

I use KeePassX to store authentication. I use long, weird passwords that I have no hope of remembering, so it saves my butt every single day. I keep my KeePassX database file in Dropbox, so I always have the current iteration of it handy.

When I learn one of dad’s passwords, it goes straight into KeePassX! I now have a group called Dad – perfect.

TightVNC – remote control

TightVNC is free and works great. VNC is a standard and works across platforms, so I can use a client on my Mac or Ubuntu box, controlling my Dad’s Windows 7 machine. Easy breezy.

It was a snap to download and install on dad’s computer as a service.

Problem 1: an IP address

The problem is not installing or using VNC. It is getting access to the computer. As is common, dad’s computer is on a cable modem and has an IP address that is subject to change. So, I cannot just use the IP address, because it will change whenever he reboots his cable modem.

The solution to this is to use a dynamic DNS system, which provides a hostname that always points to his current address. To get this set up, I used the Free Dynamic DNS service at DynDNS to get a hostname for his computer. I then set up a program to update the record whenever the IP address changes: their DynDNS Updater for Windows.

The result is a usable hostname for dad’s computer, with the IP address behind it automatically updated whenever it changes.

Problem 2: getting through the router

One of the valuable services a typical consumer router provides is protection from nasties; the flip side is that we cannot reach the computer behind the router from outside. The answer to this is port forwarding. Some Googling found instructions for my dad’s specific router; I needed to forward port 5900, the standard VNC port, to my dad’s computer.

Problem 3: getting through the firewall

I just sort of hacked my way through allowing port 5900 through the Windows 7 firewall; Googling it afterward turned up a pretty good guide.

You need to allow TCP traffic through on port 5900; TightVNC will then be the one answering on that port.


You can use an open-port checking service to test your port forwarding and potential VNC connection. It gives some feedback on what works and what doesn’t.

Then, use a VNC client to connect to your new VNC server.

My remote Subversion dump/tar/rotating file Perl script

This is the script I use to remotely dump, over SSH, the Subversion repositories on the various servers for which I am responsible.

Before you can use this script, you need to set up SSH so your local cron can access the remote servers without a password (see: Using Public/Private Key Pairs with SSH).

One thing to note about this script is that it automatically rotates the archived dump files, keeping a file for the first day of the week, the first of the month, and the first of the year.

Then, just modify the script for your servers (the %dumpJobs block near the top).
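The passwordless-SSH setup mentioned above boils down to generating a key pair with no passphrase (so cron can use it unattended) and installing the public key on each remote server. A sketch, using a scratch path and a placeholder hostname — in real use you would keep the key under ~/.ssh:

```shell
# Generate a passphrase-less key pair for the backup job (the /tmp path is a demo path).
rm -f /tmp/svn_backup_key /tmp/svn_backup_key.pub
ssh-keygen -q -t rsa -N "" -f /tmp/svn_backup_key

# Install the public key on each remote server (user@servername is a placeholder):
#   ssh-copy-id -i /tmp/svn_backup_key.pub user@servername

ls /tmp/svn_backup_key /tmp/svn_backup_key.pub
```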

This will create a series of files over time with daily/weekly/monthly Subversion dump backup tar files. The point is not so much to have every state of every repository as to grab the daily changes without clobbering the last known good one. More is better, no?
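The rotation boils down to picking a filename suffix from the date, with the start of the year beating the start of the month, which beats a plain weekday name. A simplified shell sketch of the idea (the real script below also tags the first day of a week with the week-of-month number):

```shell
# Pick an archive suffix. Arguments: day-of-year (1-based), day-of-month,
# abbreviated weekday name.
suffix_for() {
    yday=$1; mday=$2; wday=$3
    if [ "$yday" -eq 1 ]; then
        echo "year-1st"       # January 1st: kept as the yearly archive
    elif [ "$mday" -eq 1 ]; then
        echo "month-1st"      # first of a month: kept as the monthly archive
    else
        echo "$wday"          # any other day: overwritten weekly
    fi
}

suffix_for 1 1 fri     # prints: year-1st
suffix_for 32 1 mon    # prints: month-1st
suffix_for 37 6 sat    # prints: sat
```

Because an ordinary day reuses the weekday name, each weekday’s file is clobbered once a week, while the monthly and yearly snapshots survive.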

#!/usr/bin/perl -w
# by Andrew Ault
# No arguments. The program is to be modified to include each Subversion repository to be archived.
# Saves a tar of a remote Subversion dump in a rotating file.
# Of course you have to have SSH authentication already set up.
# This gets cron'd daily on my local workstation.
use strict;
use warnings;

use DateTime;

my $fileError;
my $jobError  = 0;
my $jobErrors = "";
my $result;

# Specify a data block for each remote repository to be archived.
my %dumpJobs = (
	'servername-repositoryname' => {
		'remoteServer' => 'servername',
		'repository'   => 'repositoryname',
		'dumpFilename' => 'servername-repositoryname.dump.svn',
		'svnDumpCmd'   => '/usr/bin/svnadmin dump',    # find svnadmin on your server
		'tarCmd'       => '/bin/tar',                  # find tar on your server
	},
	'servername-repositoryname2' => {
		'remoteServer' => 'servername',
		'repository'   => 'repositoryname2',
		'dumpFilename' => 'servername-repositoryname2.dump.svn',
		'svnDumpCmd'   => '/usr/bin/svnadmin dump',
		'tarCmd'       => '/bin/tar',
	},
);

# Process each specified repository dump/archive job.
for my $dumpJob ( sort keys %dumpJobs ) {
	$fileError = 0;
	my $tarballFilename = "$dumpJobs{$dumpJob}{'dumpFilename'}-" . tarDateSegment() . ".tgz";
	my $svnDumpCmd = $dumpJobs{$dumpJob}{'svnDumpCmd'};
	my $tarCmd     = $dumpJobs{$dumpJob}{'tarCmd'};
	print "$dumpJob\n";

	my $dumpCommand = "ssh $dumpJobs{$dumpJob}{'remoteServer'} '$svnDumpCmd ";
	$dumpCommand .= "/var/lib/svn/$dumpJobs{$dumpJob}{'repository'} > $dumpJobs{$dumpJob}{'dumpFilename'}'";
	print $dumpCommand . "\n";
	$result = system($dumpCommand);
	if ($result) { $fileError = 1; }

	if ( !$fileError ) {
		my $remoteMakeTarball = "ssh $dumpJobs{$dumpJob}{'remoteServer'} '$tarCmd ";
		$remoteMakeTarball .= "cvfz $tarballFilename $dumpJobs{$dumpJob}{'dumpFilename'}'";
		print $remoteMakeTarball . "\n";
		$result = system($remoteMakeTarball);
		if ($result) { $fileError = 1; }
	}

	if ( !$fileError ) {
		my $downloadCommand = "scp $dumpJobs{$dumpJob}{'remoteServer'}:$tarballFilename .";
		print $downloadCommand . "\n";
		$result = system($downloadCommand);
		if ($result) { $fileError = 1; }
	}

	if ($fileError) {
		$jobError = 1;
		$jobErrors .= "$dumpJob ";
	}
}

if ($jobError) {
	warn "Errors were encountered: $jobErrors\n";
}

sub tarDateSegment {
	my $dt = DateTime->now();

	my ( $sec, $min, $hour, $mday, $mon, $year, $wday, $yday, $isdst ) = localtime(time);
	$year += 1900;
	my $dateTime = sprintf "%4d-%02d-%02d %02d:%02d:%02d", $year, $mon + 1, $mday, $hour, $min, $sec;
	my $date     = sprintf "%4d-%02d-%02d",                $year, $mon + 1, $mday;
	my @weekdays = qw( sun mon tue wed thu fri sat );
	my $weekday  = $weekdays[$wday];
	my @months   = qw( jan feb mar apr may jun jul aug sep oct nov dec );
	my $month    = $months[$mon];

	my $weekOfMonth = $dt->week_of_month;

	my $dateTar = "";

	# if the first day of the year, set $dateTar like: 2009-1st
	# (note: $yday from localtime is zero-based, so January 1st is day 0)
	if ( $yday == 0 ) {
		$dateTar = "$year-1st";
	}

	# if the first day of the month, set $dateTar like: feb-1st
	elsif ( $mday == 1 ) {
		$dateTar = "$month-1st";
	}

	# if the first day of the week, set $dateTar like: mon-1
	# where the number is the week of the month number
	elsif ( $wday == 1 ) {
		$dateTar = "$weekday-$weekOfMonth";
	}

	# otherwise, set the $dateTar like: mon
	else {
		$dateTar = "$weekday";
	}

	# $sec      seconds          54
	# $min      minutes          37
	# $hour     hour             11
	# $mon      month            4
	# $year     year             2009
	# $wday     weekday          3
	# $yday     day of the year  146
	# $isdst    is DST           1
	# $weekday  day of the week  wed
	# $month    month            may
	# $dateTime date and time    2009-05-27 11:37:54
	# $date     date             2009-05-27
	return $dateTar;
}

=head1 NAME

Andrew's remote Subversion repository archive program.

=head1 DESCRIPTION

This is a program I wrote to SSH/dump/tar/download/rotate archives of Subversion repositories.

=head1 LICENSE

Use this as you will.

=head1 AUTHOR

Andrew Ault

=cut


Installing Net::Amazon::S3 Perl module on an Ubuntu server

The following applies to recent Ubuntu releases, including Karmic, Lucid and Maverick.

What will not work

There seems to be a problem if you install Net::Amazon::S3 from CPAN. This will not work:

sudo cpan Net::Amazon::S3

Just about every dependency in the world installs, but it fails in the home stretch when XML::LibXML::XPathContext and XML::LibXML fail to install.

What will work

sudo aptitude install libnet-amazon-s3-perl
sudo cpan Net::Amazon::S3::Client

Test your install with this

After throwing some data into S3 with S3Fox, test your installation. You will need to set values for aws_access_key_id and aws_secret_access_key, of course.

use warnings;
use strict;
use Net::Amazon::S3;
use Net::Amazon::S3::Client;

my %s3_hash = (
	aws_access_key_id     => "XXXXXXXXXXXXXXXXX",
	aws_secret_access_key => "YYYYYYYYYYYYYYYYYYYYYYYYYY",
	retry                 => 1,
);

my $s3 = Net::Amazon::S3->new( \%s3_hash );
my $client = Net::Amazon::S3::Client->new( s3 => $s3 );

my @buckets = $client->buckets;
foreach my $bucket (@buckets) {
	print $bucket->name . "\n";
}

XAMPP erroneous error message

While setting up a test system for a new MVC PHP web project, I ran into a hiccup when I restarted Apache under XAMPP on my Mac (OS X).

Googling this error turns out not to be very helpful: the message shows up in plenty of places, but with no solutions.

This is a hard-to-track-down XAMPP error because the error issued has nothing to do with the problem.

Here is the text from the error dialog, so Google et al can find it: “/Applications/XAMPP/xamppfiles/bin/apachectl: line 70: ulimit: open files: cannot modify limit: Invalid argument”.

It turned out I had simply made a typo in the CustomLog line in the httpd-vhosts.conf file, in a code block sort of like this:

<VirtualHost *:80>
    DocumentRoot "/Users/andrewault/www/"
    ErrorLog "/Users/andrewault/www/"
    CustomLog "/Users/andrewault/www/" common
</VirtualHost>

The directory part of the CustomLog line was wrong, causing the error.

Emulating a Z80 and CP/M on Ubuntu Linux

Here is how to emulate a Z-80 processor running CP/M on Ubuntu Linux.

This method is very easy and yields an excellent, easy-to-use-and-understand system. Essentially, the trick is a DOS-based Z80 emulator that works really well inside a DOS emulator under Linux. I haven’t found a good Z80 emulator that runs directly under Linux.

To begin though, a mystery must be told. There apparently was a fellow named Simon Cran in Australia who wrote a lovely CP/M Z-80 emulator for DOS. If you Google his name and “CPM”, you can delve into the mysterious Simon Cran, who created MyZ80 as shareware in the early nineties and then seemingly vanished into ’net anonymity.

I found that MyZ80 works well run in the dosemu DOS emulator on Linux. I used this setup back when I ran SuSE, and it works well on Ubuntu too.

Install DOS Emulator

Install DOS Emulator:

sudo aptitude install dosemu

On my system dosemu has the equivalent of a DOS C: drive inside ~/.dosemu/drive_c/.

I have DOS in a Box installed as well as dosemu. When I run dosemu, it opens in its own window and seems to work fine.

Install MyZ80

Download MyZ80 and unzip it into ~/.dosemu/drive_c/myz80/. An easy way to do this is to plop the ZIP into ~/.dosemu/drive_c/ and then right-click and select Extract here.

Run MyZ80

In a Terminal, run:

dosemu
Then change into the MyZ80 directory and start it:

cd myz80
myz80

You will be greeted with MyZ80’s friendly startup text. A couple of return key presses will then show how to import and export data into the files that MyZ80 uses for the CP/M drives.

The command to exit MyZ80 is exit. The command to exit dosemu is exitemu.

I have run Wordstar and Turbo Pascal using MyZ80, re-living my experience with my Kaypro 10…a machine I miss very much!


Have fun!

Metaplace creating Facebook games

I just saw this Craigslist posting for a new employee at Metaplace (where I worked for a few months):

The title of the ad is: “Facebook game company seeking Flash developer”.

This is a great path for Metaplace. They have developed some wonderful game technology that is well suited for a Flash client embedded in a Facebook application.

Metaplace people are salt of the Earth and I wish them every success! They deserve it.

Perl function that returns info about a video (uses FFMPEG)

This program contains a Perl function I wrote to extract data about a given video.

It also shows how to parse information from program output and organize it usefully.

This has been used a few times in production systems.

If you use this, please drop a comment! It would be fun to know.

Share and enjoy!

#!/usr/bin/perl -w
# by Andrew Ault,
# Please drop me a note if you use this.

use strict;
use warnings;

use IPC::Open3;
use Symbol qw(gensym);    # open3 needs a gensym'd filehandle to capture stderr separately

# example
my $filename  = "yourvideoFilenameHere.mp4";
my %videoInfo = videoInfo($filename);
print "duration: " . $videoInfo{'duration'} . "\n";
print "durationsecs: " . $videoInfo{'durationsecs'} . "\n";
print "bitrate: " . $videoInfo{'bitrate'} . "\n";
print "vcodec: " . $videoInfo{'vcodec'} . "\n";
print "vformat: " . $videoInfo{'vformat'} . "\n";
print "acodec: " . $videoInfo{'acodec'} . "\n";
print "asamplerate: " . $videoInfo{'asamplerate'} . "\n";
print "achannels: " . $videoInfo{'achannels'} . "\n";

# returns media information in a hash
sub videoInfo {
	# ffmpeg command
	my $ffmpeg = '/usr/local/bin/ffmpeg';

	my %finfo = (
		'duration'     => "00:00:00.00",
		'durationsecs' => "0",
		'bitrate'      => "0",
		'vcodec'       => "",
		'vformat'      => "",
		'acodec'       => "",
		'asamplerate'  => "0",
		'achannels'    => "0",
	);

	my $file = shift;

	# escaping characters
	$file =~ s/(\W)/\\$1/g;

	# ffmpeg prints its stream information to stderr, so capture that
	my $err = gensym;
	my $pid = open3( my $in, my $out, $err, "$ffmpeg -i $file" );
	my @res = <$err>;
	waitpid( $pid, 0 );

	# parse ffmpeg output
	foreach (@res) {

		# duration
		if (m!Duration: ([0-9][0-9]:[0-9][0-9]:[0-9][0-9].[0-9][0-9])!) {
			$finfo{'duration'} = $1;
		}

		# bitrate
		if (m!bitrate: (\d*) kb/s!) {
			$finfo{'bitrate'} = $1;
		}

		# vcodec and vformat
		if (/Video: (\w*), (\w*),/) {
			$finfo{'vcodec'}  = $1;
			$finfo{'vformat'} = $2;
		}

		# example line: Stream #0.1(und): Audio: aac, 48000 Hz, 1 channels, s16, 64 kb/s

		# acodec, asamplerate and achannels
		if (m!Audio: (\w*), (\d*) Hz, (\d*)!) {
			$finfo{'acodec'}      = $1;
			$finfo{'asamplerate'} = $2;
			$finfo{'achannels'}   = $3;
		}
	}

	my $tenths  = substr( $finfo{'duration'}, 9, 2 );    # hundredths of a second, despite the name
	my $seconds = substr( $finfo{'duration'}, 6, 2 );
	my $minutes = substr( $finfo{'duration'}, 3, 2 );
	my $hours   = substr( $finfo{'duration'}, 0, 2 );
	$finfo{'durationsecs'} = ( $tenths * .01 ) + $seconds + ( $minutes * 60 ) + ( $hours * 3600 );

	return %finfo;
}
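The duration arithmetic at the end (hours, minutes, seconds, plus the two digits of hundredths) can be sanity-checked with a quick shell equivalent:

```shell
# Convert an ffmpeg-style duration (HH:MM:SS.ss) into seconds.
to_secs() {
    echo "$1" | awk -F: '{ printf "%.2f\n", $1 * 3600 + $2 * 60 + $3 }'
}

to_secs 01:02:03.50    # prints: 3723.50
```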