Backing up your data in Linux using "rdiff-backup" or "rsync"
Spencer Stirling

There are several good standard ways to perform backups on a Linux system. The best seems to be rdiff-backup, although I have been successfully using rsync for a long time and so I haven't felt the need to change. There is also the incremental tar backup method. I won't describe that in detail either, but I have included a script (at the end) that I use for such purposes.

Rsync
If you want to learn the basics of rsync then first read this website, which will open your eyes. Outlined there is a "rotating" backup method that cleverly exploits hard links in the Linux filesystem.

In Debian you can install rsync by installing the package "rsync" on the client machine. It is useful to install the same package on the server machine as well, and then to set the following lines in /etc/default/rsync:

RSYNC_ENABLE=true
RSYNC_OPTS='--address=127.0.0.1 --port=873'
This starts the rsync server daemon listening on the loopback interface on port 873. Although the rsync daemon is not necessary (we will use rsync over SSH instead), I have found that I have problems backing up largish (~250GB) systems without it. (I have no idea why this daemon helps when using the SSH method, since the two methods seem to have nothing to do with each other, but it does.)

If you want to use the rsync daemon for real (instead of rsync over SSH) then you will need to leave off "--address=127.0.0.1" so that the daemon listens on all network interfaces (or, alternatively, keep this entry and use an SSH tunnel).

Although the scripts below are written for rsync over SSH, it is trivial to modify them to use the rsync daemon instead. This can be extremely handy, since then you are not required to SSH into the remote machine (good for FTP sites, and also good for security reasons). Check the manpages for instructions concerning the rsync daemon.
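For the daemon route the server also needs an /etc/rsyncd.conf defining at least one "module" (a named, exported directory). A minimal sketch - the module name, path, and uid here are examples, not taken from anything above:

```
# /etc/rsyncd.conf (module name, path, and uid are examples)
[backup]
    path = /disks/data/backup
    read only = no
    uid = backupuser
```

A client then addresses the module with the double-colon syntax instead of "-e ssh", e.g. rsync -vapu --delete --exclude-from=$EXCLUDEFILE $SRC yourserver::backup/work/ (equivalently rsync://yourserver/backup/work/).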

Tip: if you are backing up to/from FAT32 partitions then you SHOULD mount these partitions with the "iocharset=utf8" option. Otherwise any international characters will be converted into "?" - and rsync does not handle question marks very well!
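For example, a suitable /etc/fstab entry might look like this (the device and mount point are examples):

```
# mount a FAT32 partition with UTF-8 filename translation
/dev/sda5   /disks/fat32   vfat   defaults,iocharset=utf8   0   0
```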

To get the flavor of rsync, here is a simple backup script that simply SYNCS the source to the destination - nothing fancy. (Note that if either SRC or DEST is remote (using SSH) then this WILL NOT WORK in a cron job: you need to "source" the keychain variables first! Check the next script.)


#!/bin/sh
# this script automatically syncs SRC to DEST

# don't forget the trailing "/" on SRC
# this can be a valid ssh path like
# yourname@ma.utexas.edu:work/  OR something like
# yourname@ma.utexas.edu:/home/yourname/work/
SRC="/home/stirling/movies/"
# don't forget the trailing "/" on DEST
DEST="/disks/sdc1/movies/"
# list your exclusions here - REQUIRED even if empty
EXCLUDEFILE="excludethisjunk.txt"

# check to see if the source even exists (don't worry if it's remote)
echo "$SRC" | grep -q ":"
if [ $? -ne 0 ]
then
  if [ ! -d "$SRC" ]
  then
    echo "error: directory $SRC does not exist"
    exit 1
  fi
fi

# the source exists at this point (or it's remote)

# check to see if the destination even exists (don't worry if its remote)
echo $DEST | grep ":"
if [ ! $? -eq 0 ]
then
  if [ ! -d $DEST ]
  then
    echo "error: directory $DEST does not exist"
    exit 1
  fi
fi

rsync -vapu --delete -e ssh --exclude-from="$EXCLUDEFILE" "$SRC" "$DEST"
if [ $? -ne 0 ]
then
  echo "Failed!!!"
  exit 1
fi

echo "Success!!!"
exit 0

I have written a couple of (what I think are) improvements to the techniques outlined in the "rotating backup" article above. Most notably, I don't like having my weekly/monthly/half-yearly backup schedule determined by actual dates and times, because I deal with several machines that sometimes go for weeks without being "on", and I don't want them to miss their monthly or half-yearly backups just because they are not powered. With this script I can make it a cron job and run it as often as I like - each run corresponds to "1 day".
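For example, to make each night count as one "day", a crontab entry along these lines would do (the script path and log file are examples):

```
# run the rotating backup every night at 03:10
10 3 * * *   /home/johnsmith/bin/rotating_backup.sh >> /home/johnsmith/backup_cron.log 2>&1
```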

Here is the script file that I use. This script takes a "source" directory and backs it up to a "destination" directory, although the destination MUST be locally mounted (because I'm rotating backups - you can get around this by mounting with NFS, SMB, or my favorite, SHFS/LUFS networking; see my "no promises" tar script included at the VERY end if you want an idea of how to write an rsync script that mounts first). The backups are kept on a "today, yesterday, week, month, half-year" schedule (again, these increments of "time" are really determined by how often you run the script - each run corresponds to "1 day").

Also included is an option for an "exclusions file" so that certain things can be left out of the backup (like web caches or other worthless things). YOU MUST HAVE AN EXCLUSIONS FILE, EVEN IF IT IS EMPTY. Furthermore, there is the ability to use the "keychain" utility for passwordless SSH (handy if the SRC directory is on a different machine and you want to run the backup in the middle of the night). You can read my blurb about keychain for guidance on that.

The format of the EXCLUDEFILE is as follows: assuming that SRC="/home/johnsmith/work/" and EXCLUDEFILE="/home/johnsmith/excludethisjunk.txt", then in "excludethisjunk.txt" you can put a list of files/directories that you don't want backed up (pathnames are RELATIVE TO SRC!). For example, if I put in the following lines

junkdirectory1
junkfile

Then the directory /home/johnsmith/work/junkdirectory1 and the file /home/johnsmith/work/junkfile will be omitted from the backup. Notice that I DID NOT use the full pathname - this WILL NOT WORK. All files/directories put in your EXCLUDEFILE are relative to your SRC directory (this is an rsync thing, not a shortcoming in my script).

For completeness I have included below this script ANOTHER script that uses incremental tar backups (but otherwise is very similar). I prefer the "rsync" way, but "tar" backups can be useful for some things.


#!/bin/sh
# this program creates a "daily" rsync backup of the data specified
# in SRC (note the trailing "/", which is necessary for rsync)
# it puts the result in DEST
# IT IS ABSOLUTELY NECESSARY THAT "DEST" BE A LOCALLY MOUNTED DIRECTORY
# NOTE: "daily" doesn't really mean anything - each "day" corresponds
# to a single run (this script is intended to be run every night, but it
# doesn't have to) 

# Don't forget the trailing /
# Note that SRC can be a remote system like
#   johnsmith@machine.ma.utexas.edu:backup_this_directory/
SRC="/home/johnsmith/work/"
# DEST MUST be a LOCAL directory... don't forget the trailing /
DEST="/disks/data/backup/work/"
# This is the file which contains the exclusions (files/directories to leave out)
# MUST BE CREATED (even if empty)
EXCLUDEFILE="/home/johnsmith/excludethisjunk.txt"

# we need to "source" the keychain information
# so that we don't need to put in a password in
# order to make use of ssh tools (like scp)
# IF YOU ARE NOT USING SSH THEN YOU CAN IGNORE THIS
# OTHERWISE LOOK UP HOW TO USE KEYCHAIN
#KEYCHAINVARS="${HOME}/.keychain/yourkeychainfile-sh"
#. $KEYCHAINVARS

# These are the number of days since the last backup to wait
# in order to make a new backup
DAYSINWEEK=7
DAYSINMONTH=30
DAYSINHALFYEAR=180

# These are the files which store the days since last backup
DAYSSINCELASTWEEKFILE=${DEST}days_since_last_week_backup
DAYSSINCELASTMONTHFILE=${DEST}days_since_last_month_backup
DAYSSINCELASTHALFYEARFILE=${DEST}days_since_last_halfyear_backup

DEST_TEMP=${DEST}temp
DEST_TODAY=${DEST}today
DEST_YESTERDAY=${DEST}yesterday

WEEK_CANDIDATE=${DEST}week.candidate
DEST_WEEK=${DEST}week

MONTH_CANDIDATE=${DEST}month.candidate
DEST_MONTH=${DEST}month

HALFYEAR_CANDIDATE=${DEST}halfyear.candidate
DEST_HALFYEAR=${DEST}halfyear

# First check to make sure that DEST is not a remote directory
# this is not allowed
echo "$DEST" | grep -q ":"
if [ $? -eq 0 ]
then
  echo "No remote destinations allowed!!!"
  exit 1
fi

# check to see if the source even exists (don't worry if it's remote)
echo "$SRC" | grep -q ":"
if [ $? -ne 0 ]
then
  if [ ! -d "$SRC" ]
  then
    echo "error: directory $SRC does not exist"
    exit 1
  fi
fi

# the source exists at this point (or it's remote)


  # check to see if the destination exists... if not, then create it
  if [ ! -d "$DEST" ]
  then
    mkdir "$DEST"
  fi

  # still check to see if the destination directory exists, and if so, make
  # it writeable
  if [ -d "$DEST" ]
  then
    chmod u+w "$DEST"
    cd "$DEST"
    chmod u+w *
  else
    echo "Cannot create destination directory"
    exit 1
  fi

  # We are going to rotate the halfyear backups first, if necessary
  # first see if it is necessary
  if [ -f $DAYSSINCELASTHALFYEARFILE ]
  then
    DAYSSINCELASTHALFYEAR=`cat $DAYSSINCELASTHALFYEARFILE`
  else
    DAYSSINCELASTHALFYEAR=$DAYSINHALFYEAR
  fi
  DAYSSINCELASTHALFYEAR=`expr $DAYSSINCELASTHALFYEAR + 1`  
  if [ $DAYSSINCELASTHALFYEAR -gt $DAYSINHALFYEAR ]
  then
    # we are going to rotate... reset the count variable
    DAYSSINCELASTHALFYEAR=0
    
    # we need to make sure that the candidate even exists
    if [ -d $HALFYEAR_CANDIDATE ]
    then
    
      # rotate the files
      # start by deleting the halfyear backup
      if [ -d $DEST_HALFYEAR ]
      then
        rm -r $DEST_HALFYEAR
      fi
      # then rotate the candidate to "halfyear"
      mv $HALFYEAR_CANDIDATE $DEST_HALFYEAR

      # put a timestamp into the log file
      echo -n "Halfyear backup: " >> "${DEST}log"
      date >> "${DEST}log"
    
    else
      echo "No halfyear candidate exists: nothing to do"
    fi

  fi
  # write the day count out
  echo $DAYSSINCELASTHALFYEAR > $DAYSSINCELASTHALFYEARFILE

  
  # We are going to rotate the monthly backups next, if necessary
  # first see if it is necessary
  if [ -f $DAYSSINCELASTMONTHFILE ]
  then
    DAYSSINCELASTMONTH=`cat $DAYSSINCELASTMONTHFILE`
  else
    DAYSSINCELASTMONTH=$DAYSINMONTH
  fi
  DAYSSINCELASTMONTH=`expr $DAYSSINCELASTMONTH + 1`  
  if [ $DAYSSINCELASTMONTH -gt $DAYSINMONTH ]
  then
    # we are going to rotate... reset the count variable
    DAYSSINCELASTMONTH=0
    
    # we need to make sure that the candidate even exists
    if [ -d $MONTH_CANDIDATE ]
    then
    
      # rotate the files
      # start by rotating the "month" to "halfyear.candidate"
      if [ -d $DEST_MONTH ]
      then
        if [ ! -d $HALFYEAR_CANDIDATE ]
        then
          mv $DEST_MONTH $HALFYEAR_CANDIDATE
        else
          rm -r $DEST_MONTH
        fi  
      fi
      # then rotate the candidate to "month"
      mv $MONTH_CANDIDATE $DEST_MONTH

      # put a timestamp into the log file
      echo -n "Monthly backup: " >> "${DEST}log"
      date >> "${DEST}log"
    
    else
      echo "No monthly candidate exists: nothing to do"
    fi

  fi
  # write the day count out
  echo $DAYSSINCELASTMONTH > $DAYSSINCELASTMONTHFILE
  

  # We are going to rotate the weekly backups next, if necessary
  # first see if it is necessary
  if [ -f $DAYSSINCELASTWEEKFILE ]
  then
    DAYSSINCELASTWEEK=`cat $DAYSSINCELASTWEEKFILE`
  else
    DAYSSINCELASTWEEK=$DAYSINWEEK
  fi
  DAYSSINCELASTWEEK=`expr $DAYSSINCELASTWEEK + 1`  
  if [ $DAYSSINCELASTWEEK -gt $DAYSINWEEK ]
  then
  
    # we are going to rotate... reset the count variable    
    DAYSSINCELASTWEEK=0
    
    # we need to make sure that the candidate even exists
    if [ -d $WEEK_CANDIDATE ]
    then
    
      # rotate the files
      # start by rotating the "week" to "month.candidate"
      if [ -d $DEST_WEEK ]
      then
        if [ ! -d $MONTH_CANDIDATE ]
        then
          mv $DEST_WEEK $MONTH_CANDIDATE
        else
          rm -r $DEST_WEEK
        fi  
      fi
      # then rotate the candidate to "week"
      mv $WEEK_CANDIDATE $DEST_WEEK

      # put a timestamp into the log file
      echo -n "Weekly backup: " >> "${DEST}log"
      date >> "${DEST}log"
    
    else
      echo "No weekly candidate exists: nothing to do"
    fi

  fi
  # write the day count out
  echo $DAYSSINCELASTWEEK > $DAYSSINCELASTWEEKFILE
  
  
  # NOW FOR THE DAILY BACKUPS
  # start by copying "today" backup to "temp" (just copy links, remember)
  # if it exists
  # first delete "temp" from any unfinished backups
  if [ -d $DEST_TEMP ]
  then
    rm -r $DEST_TEMP
  fi 
  if [ -d $DEST_TODAY ]
  then
    cp -lax $DEST_TODAY $DEST_TEMP
  fi 

  # NOW perform the rsync
  rsync -vapu --delete -e ssh --exclude-from="$EXCLUDEFILE" "$SRC" "$DEST_TEMP"
  if [ $? -ne 0 ]
  then
    echo "backup failed"
    exit 1
  fi

  # rotate the files 
  # start by rotating the "yesterday" to the "week candidate"
  if [ -d $DEST_YESTERDAY ]
  then
    if [ ! -d $WEEK_CANDIDATE ]
    then
      mv $DEST_YESTERDAY $WEEK_CANDIDATE
    else
      rm -r $DEST_YESTERDAY
    fi  
  fi

  # now rotate "today" to "yesterday"
  if [ -d $DEST_TODAY ]
  then
    mv $DEST_TODAY $DEST_YESTERDAY
  fi 

  # now finally rotate the backup just done "temp" to "today"
  if [ -d $DEST_TEMP ]
  then
    mv $DEST_TEMP $DEST_TODAY
    touch $DEST_TODAY
  fi

  # put a timestamp on the backup
  date >> "${DEST}log"

  # change the backup directory back to nonwriteable
  cd $DEST
  chmod a-w *
  chmod a-w $DEST

echo "Success!!!"
exit 0

Incremental TAR backups
The following script gives a similar incremental backup using "tar" from a "source" directory to a "destination" directory (both MUST be locally mounted - although you can get around this by mounting with NFS, SMB, or SHFS/LUFS networking). I should probably put an "EXCLUDEFILE" option in here (as above with rsync), but I've been too lazy.
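One thing worth knowing in advance is how to restore from such a chain: with GNU tar's listed-incremental format you extract the weekly full backup first and then replay each daily archive in order, each time passing "-g /dev/null" so that tar applies the incremental metadata (including deletions). A sketch, assuming the week.tgz/dayN.tgz file names used by the script below (the helper name is mine; the day?.tgz glob matches the single-digit day numbers the script produces):

```shell
# Restore a week.tgz + dayN.tgz incremental chain (helper name is mine).
restore_tar_chain() {
    backupdir="$1"    # directory holding week.tgz and day1.tgz ... dayN.tgz
    restoredir="$2"   # directory in which to rebuild the tree
    mkdir -p "$restoredir"
    # full backup first; -g /dev/null makes tar honour incremental metadata
    tar zxpf "$backupdir/week.tgz" -g /dev/null -C "$restoredir"
    # then each daily increment, in order
    for day in "$backupdir"/day?.tgz
    do
        [ -f "$day" ] || continue
        tar zxpf "$day" -g /dev/null -C "$restoredir"
    done
}
```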


#!/bin/sh
# this program creates a "daily" tar incremental backup of the data specified
# in SRC and puts the result in DEST
# NOTE: Here "daily" doesn't mean anything - it just corresponds to
# each time the program is run (this batch script is written to be run
# each night, but that's not necessary)
# everything must be locally mounted
SRC="/home/stirling/work/"
# don't forget the trailing "/" in DEST
DEST="/disks/sdd1/backup/work/"

# These are the number of days since the last full backup to wait
# in order to make a new full backup
DAYSINWEEK=7

# These are the files which store the days since last backup
DAYSSINCELASTWEEKFILE=${DEST}days_since_last_week_backup

DEST_TEMP=${DEST}temp.tgz
DEST_DAY_PART=${DEST}day
DEST_WEEK=${DEST}week.tgz
DEST_FILELIST=${DEST}filelist
DEST_FILELIST_TEMP=${DEST}filelist.temp

# check to see if the source even exists
if [ -d $SRC ]
then

  # check to see if the destination exists... if not, then create it
  if [ ! -d $DEST ]
  then
    mkdir $DEST
  fi  

  # still check to see if the destination directory exists, and if so, make
  #it writeable
  if [ -d $DEST ]
  then
    chmod u+w $DEST
    cd $DEST
    chmod u+w *
  else
    echo "Cannot create destination directory" 
    exit 1
  fi

  # We are going to do the weekly full backup, if necessary
  # first see if it is necessary
  if [ -f $DAYSSINCELASTWEEKFILE ]
  then
    DAYSSINCELASTWEEK=`cat $DAYSSINCELASTWEEKFILE`
  else
    DAYSSINCELASTWEEK=$DAYSINWEEK
  fi
  DAYSSINCELASTWEEK=`expr $DAYSSINCELASTWEEK + 1`  
  if [ $DAYSSINCELASTWEEK -gt $DAYSINWEEK ]
  then
  
    # we are going to rotate backups... reset the count variable    
    DAYSSINCELASTWEEK=0
    
    # we need to make sure to delete any partial old backups
    if [ -f $DEST_TEMP ]
    then
      rm $DEST_TEMP
    fi
    if [ -f $DEST_FILELIST_TEMP ]
    then
      rm $DEST_FILELIST_TEMP
    fi

    # now make the temporary full backup (complete with a filelist)
    tar zcvpf $DEST_TEMP -g $DEST_FILELIST_TEMP $SRC > /dev/null
    if [ $? -ne 0 ]
    then
      echo "backup failed"
      exit 1
    fi

    # we have a successful full backup - we need to remove the old backups
    if [ -f $DEST_WEEK ]
    then
      rm $DEST_WEEK
    fi
    if [ -f $DEST_FILELIST ]
    then
      rm $DEST_FILELIST
    fi
    mv $DEST_TEMP $DEST_WEEK
    mv $DEST_FILELIST_TEMP $DEST_FILELIST
    if [ -f "${DEST_DAY_PART}1.tgz" ]
    then
      rm ${DEST_DAY_PART}*
    fi
    
    
    # put a timestamp into the log file
    echo -n "Weekly backup: " >> ${DEST}log
    date >> ${DEST}log
    
  fi
  # write the day count out
  echo $DAYSSINCELASTWEEK > $DAYSSINCELASTWEEKFILE
  
  
  # We are going to do the daily backups, if necessary
  # first see if it is necessary
  if [ $DAYSSINCELASTWEEK -gt 0 ]
  then

    DEST_DAY="${DEST_DAY_PART}${DAYSSINCELASTWEEK}.tgz"

    # we need to make sure to delete any partial old backups
    if [ -f $DEST_TEMP ]
    then
      rm $DEST_TEMP
    fi
    if [ -f $DEST_FILELIST_TEMP ]
    then
      rm $DEST_FILELIST_TEMP
    fi

    # now make the daily incremental backup (using the filelist)
    cp $DEST_FILELIST $DEST_FILELIST_TEMP
    tar zcvpf $DEST_TEMP -g $DEST_FILELIST_TEMP $SRC > /dev/null
    if [ $? -ne 0 ]
    then
      echo "backup failed"
      exit 1
    fi
    if [ -f $DEST_FILELIST ]
    then
      rm $DEST_FILELIST
    fi
    mv $DEST_TEMP $DEST_DAY
    mv $DEST_FILELIST_TEMP $DEST_FILELIST
    touch $DEST_DAY
    touch $DEST_FILELIST

    # put a timestamp on the backup
    date >> ${DEST}log

  fi
  
  # change the backup directory back to nonwriteable
  cd $DEST
  chmod a-w *
  chmod a-w $DEST

else
  echo "error: directory $SRC does not exist"
  exit 1
fi

echo "Success!!!"
exit 0

"No Promises" tar script which first mounts using SHFS
Here is a VERY ROUGH script that mounts via SHFS before performing a tar backup (as we've seen, some things like destination directories need to be locally mounted for any of the above to work). You should have SHFS/LUFS networking set up before trying to use this, and you WILL NEED TO MODIFY IT!!! It's JUST AN EXAMPLE:


#!/bin/sh
# this program backs up the "source" directory (REMOTE_DIR) on a remote machine to
# a file on the local machine using incremental TAR backups
# first the remote directory is mounted using the SHFS system tools

# don't forget the trailing "/" in DEST
DEST="/disks/sdc1/backup/laptop_junk/"
# REMOTE_DIR IS THE SOURCE DIRECTORY - put in any valid SHFS pathname 
REMOTE_DIR="stirling@laptop"

# this is where REMOTE_DIR will be mounted
# DO NOT MODIFY THIS - MODIFY ONLY REMOTE_DIR as the "source"
SRC="${DEST}mnt/"

# check to see if the destination exists... if not, then create it
if [ ! -d $DEST ]
then
  mkdir $DEST
fi  

# make sure that the local mount directory exists
if [ -d $DEST ]
then
  if [ ! -d $SRC ]
  then
    mkdir $SRC
  fi
  if [ ! -d $SRC ]
  then
    echo "cannot create temporary mount"
    exit 1
  fi
else
  echo "cannot create destination"
  exit 1
fi

# mount the laptop directory
shfsmount -o ro $REMOTE_DIR $SRC
if [ ! $? -eq 0 ]
then
  echo "Cannot mount laptop"
  exit 1
fi

# NOW JUST USE THE OLD TAR INCREMENTAL PROGRAM (EVERYTHING IS LOCALLY MOUNTED)

# this program creates a "daily" tar incremental backup of the data specified
# in SRC and puts the result in DEST
# NOTE: Here "daily" doesn't mean anything - it just corresponds to
# each time the program is run (this batch script is written to be run
# each night, but that's not necessary)

# These are the number of days since the last full backup to wait
# in order to make a new full backup
DAYSINWEEK=7

# These are the files which store the days since last backup
DAYSSINCELASTWEEKFILE=${DEST}days_since_last_week_backup

DEST_TEMP=${DEST}temp.tgz
DEST_DAY_PART=${DEST}day
DEST_WEEK=${DEST}week.tgz
DEST_FILELIST=${DEST}filelist
DEST_FILELIST_TEMP=${DEST}filelist.temp

# check to see if the source even exists
if [ -d $SRC ]
then

  # check to see if the destination exists... if not, then create it
  if [ ! -d $DEST ]
  then
    mkdir $DEST
  fi  

  # still check to see if the destination directory exists, and if so, make
  #it writeable
  if [ -d $DEST ]
  then
    chmod u+w $DEST
    cd $DEST
    chmod u+w *
  else
    echo "Cannot create destination directory" 
    exit 1
  fi

  # We are going to do the weekly full backup, if necessary
  # first see if it is necessary
  if [ -f $DAYSSINCELASTWEEKFILE ]
  then
    DAYSSINCELASTWEEK=`cat $DAYSSINCELASTWEEKFILE`
  else
    DAYSSINCELASTWEEK=$DAYSINWEEK
  fi
  DAYSSINCELASTWEEK=`expr $DAYSSINCELASTWEEK + 1`  
  if [ $DAYSSINCELASTWEEK -gt $DAYSINWEEK ]
  then
  
    # we are going to rotate backups... reset the count variable    
    DAYSSINCELASTWEEK=0
    
    # we need to make sure to delete any partial old backups
    if [ -f $DEST_TEMP ]
    then
      rm $DEST_TEMP
    fi
    if [ -f $DEST_FILELIST_TEMP ]
    then
      rm $DEST_FILELIST_TEMP
    fi

    # now make the temporary full backup (complete with a filelist)
    tar zcvpf $DEST_TEMP -g $DEST_FILELIST_TEMP $SRC > /dev/null
    if [ $? -ne 0 ]
    then
      echo "backup failed"
      exit 1
    fi

    # we have a successful full backup - we need to remove the old backups
    if [ -f $DEST_WEEK ]
    then
      rm $DEST_WEEK
    fi
    if [ -f $DEST_FILELIST ]
    then
      rm $DEST_FILELIST
    fi
    mv $DEST_TEMP $DEST_WEEK
    mv $DEST_FILELIST_TEMP $DEST_FILELIST
    if [ -f "${DEST_DAY_PART}1.tgz" ]
    then
      rm ${DEST_DAY_PART}*
    fi
    
    
    # put a timestamp into the log file
    echo -n "Weekly backup: " >> ${DEST}log
    date >> ${DEST}log
    
  fi
  # write the day count out
  echo $DAYSSINCELASTWEEK > $DAYSSINCELASTWEEKFILE
  
  
  # We are going to do the daily backups, if necessary
  # first see if it is necessary
  if [ $DAYSSINCELASTWEEK -gt 0 ]
  then

    DEST_DAY="${DEST_DAY_PART}${DAYSSINCELASTWEEK}.tgz"

    # we need to make sure to delete any partial old backups
    if [ -f $DEST_TEMP ]
    then
      rm $DEST_TEMP
    fi
    if [ -f $DEST_FILELIST_TEMP ]
    then
      rm $DEST_FILELIST_TEMP
    fi

    # now make the daily incremental backup (using the filelist)
    cp $DEST_FILELIST $DEST_FILELIST_TEMP
    tar zcvpf $DEST_TEMP -g $DEST_FILELIST_TEMP $SRC > /dev/null
    if [ $? -ne 0 ]
    then
      echo "backup failed"
      exit 1
    fi
    if [ -f $DEST_FILELIST ]
    then
      rm $DEST_FILELIST
    fi
    mv $DEST_TEMP $DEST_DAY
    mv $DEST_FILELIST_TEMP $DEST_FILELIST
    touch $DEST_DAY
    touch $DEST_FILELIST

    # put a timestamp on the backup
    date >> ${DEST}log

  fi
  
  # change the backup directory back to nonwriteable
  cd $DEST
  chmod a-w *
  chmod a-w $DEST

else
  echo "error: directory $SRC does not exist"
  exit 1
fi

echo "Success!!!"
exit 0
