About the author
I'm Boaz, a software developer working at a small company in the Netherlands. Currently I mostly use technologies like .NET, C#, SQL and jQuery, but I have experience with Java and PHP as well.

The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

I read somewhere that it's very easy to back up a site using wget on Linux. That made me realize that, at the moment, I don't have any backups of my sites. The only backups that (hopefully) exist are the ones my shared host creates. So I figured it would be wise to have my own backups. Since I've been running my own Ubuntu home server for a few years now, I have a backup target too.

The only problem left is the databases. Luckily dasBlog does not use a database to store its posts, but an XML datastore. This makes it less painful to make a backup, especially on shared hosting. My MySQL databases can be accessed externally, so they can be backed up using a (separate) script too. I already do this for my e-mail database.
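The MySQL side can be scripted along the same lines. A minimal sketch, assuming an externally reachable MySQL host and a read-only backup account; the host, user and database names below are placeholders, not the ones from this post:

```shell
#!/bin/sh
# Hedged sketch: dump one MySQL database and gzip it, dated like the ftp backup.
# Host, user and database names are placeholders.
db=mail
host=127.0.0.1
outfile=$(date +%Y%m%d)-$db.sql.gz

if command -v mysqldump >/dev/null 2>&1; then
    # --single-transaction gives a consistent snapshot for InnoDB tables
    mysqldump --host="$host" --user=backup --password="$MYSQL_PASS" \
        --single-transaction "$db" | gzip -c > "$outfile"
    echo "wrote $outfile"
else
    echo "mysqldump not installed; would have written $outfile"
fi
```

The dated filename keeps one dump per day, matching the naming scheme of the ftp script below.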

So, I created the following script:

cd "$(dirname "$0")"

backupfile=$(date +%Y%m%d)
savetodir=./backup-$(date +%Y)/

if [ -d $host ]; then
    rm -r $host

wget --recursive --level=inf --quiet --user=$user --password=$pass ftp://$host/

if [ $? -ne 0 ]; then
    echo "wget faild during the backup of $host" >&2
    exit 1

tar -cf $backupfile.tar $host && gzip -c $backupfile.tar > $backupfile.tar.gz && rm $backupfile.tar

if [ $? -ne 0 ]; then
    echo "faild to create a tar.gz during the backup of $host" >&2
    exit 1

if [ ! -d $savetodir ]; then
    mkdir $savetodir

mv $backupfile.tar.gz $savetodir

exit 0
First a few variables are set: the ftp site, username and password. I created a special backup user that only has read privileges, since no writing is required when making backups.

By default wget creates a directory named after the host, so I start by removing that directory and its contents, but only if it exists (from a previous backup).

This is followed by the wget command, which does a fully recursive download (note the --level=inf). After it finishes, the exit code of wget is checked for errors. The backup is then compressed and stored in a per-year directory.
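As an aside, the tar-then-gzip sequence can also be done in a single step with tar's -z flag. A small self-contained sketch; the directory and file here are made up for demonstration:

```shell
#!/bin/sh
# Create a dated .tar.gz in one step; 'sitefiles' stands in for the
# directory wget leaves behind after the download.
backupfile=$(date +%Y%m%d)
mkdir -p sitefiles
echo "demo content" > sitefiles/index.html

tar -czf "$backupfile.tar.gz" sitefiles && echo "created $backupfile.tar.gz"
```

This avoids the intermediate .tar file and the extra rm, at the cost of not matching the original script's explicit pipeline.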

Finally I scheduled a cron job to automatically create a backup once a week:
0 3 * * 3 boaz sh /mnt/array1/backup/n3rd.nl-sohosted/backup.sh
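Note the user field ("boaz") between the schedule and the command: that means this line is in the system crontab format, so it belongs in /etc/crontab or a file under /etc/cron.d rather than a per-user crontab (which has no user field). Spelled out:

```
# /etc/cron.d/site-backup (sketch; the path is the one from this post)
# minute hour day-of-month month day-of-week user command
# runs every Wednesday at 03:00
0 3 * * 3 boaz sh /mnt/array1/backup/n3rd.nl-sohosted/backup.sh
```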

November 21, 2011 - Comments [0] - Posted in Backup | Ubuntu
Logging in on a site served by Apache can be done in a lot of different ways, e.g. .htpasswd or MySQL. All of them have one thing in common: you get yet another password. What I wanted to accomplish is to be able to log in using my normal Ubuntu username/password. This makes it a lot easier to change my password once in a while.

The first step is to install the required applications:
sudo apt-get install libapache2-mod-authnz-external libapache2-mod-authz-unixgroup pwauth
Enable the apache module:
sudo a2enmod authnz_external
Edit the appropriate Apache site in /etc/apache2/sites-available, and make sure the site is only available over SSL, otherwise your password will travel over the Internet unencrypted!
AddExternalAuth pwauth /usr/sbin/pwauth
SetExternalAuthMethod pwauth pipe
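To enforce the SSL-only requirement mentioned above, one option (a sketch, assuming the site also answers on plain HTTP and that example.com stands in for your hostname) is to redirect port 80 to https:

```apache
<VirtualHost *:80>
    ServerName example.com
    # mod_alias: send every plain-HTTP request to the SSL site
    Redirect permanent / https://example.com/
</VirtualHost>
```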

<Location /sickbeard/>
    Order deny,allow
    Deny from all
    Allow from all

    ProxyPass http://localhost:8081/sickbeard/
    ProxyPassReverse http://localhost:8081/sickbeard/

    AuthType Basic
    AuthName "Boaz' Sick Beard"
    AuthBasicProvider external
    AuthExternal pwauth
    Require valid-user
</Location>
Finally restart Apache and you're ready to go!
sudo /etc/init.d/apache2 restart
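Before wiring it into Apache, it can help to check pwauth by hand. A hedged sketch, assuming pwauth is installed at /usr/sbin/pwauth; it reads the username and password on two stdin lines and exits 0 on success:

```shell
#!/bin/sh
# Hand-test pwauth: feed username and password on stdin, check the exit code.
# Note pwauth normally only accepts calls from configured server UIDs,
# so it may refuse this when run as a regular user.
if [ -x /usr/sbin/pwauth ]; then
    printf '%s\n%s\n' "$USER" "not-my-real-password" | /usr/sbin/pwauth
    echo "pwauth exit status: $?"
else
    echo "pwauth not installed"
fi
```

A nonzero status with a correct password usually points at the UID restriction rather than at the credentials.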
October 23, 2011 - Comments [1] - Posted in Ubuntu