I saw a post on Reddit today where someone asked for help diagnosing a system that wouldn't boot. The error messages looked like this:

Error reading block 128624319 (Input/output error).

/dev/sda2: UNEXPECTED INCONSISTENCY: RUN fsck MANUALLY
              (i.e. without -a or -p options)

fsck exited with status code 4 
The root filesystem on /dev/sda2 requires a manual fsck 

The system drops into an emergency shell and offers a limited set of commands so you can try to fix the problem. In most cases a problem like this means the end of life for the sda disk, and you are left with the choice of running the system on an unstable drive (for a short time) or, ultimately, replacing it.

By following a regular backup plan, a failure like this becomes a minor nuisance rather than a total loss. Anyone who has worked in IT for any length of time knows how commonly drives fail, and it's almost as common that people don't back up their data. Let's change that using cron, tar, and a bash script!

On Linux and Unix-like machines, most people keep their data in a "home" directory. To find the path to your home directory, run the following command:

echo $HOME

The output of that command is the directory we need to back up.

The script I use creates a full (level 0) backup the first time it runs, then each time after creates an incremental backup. The method used to create incremental backups is built into the tar command (see the man page for tar(1) to learn more) and uses a snapshot file. To start over with a new level 0 backup, just delete the snapshot file.
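To see how the snapshot file drives this, here is a small sketch you can run in a throwaway directory (the /tmp paths are purely illustrative):

```shell
# Illustrative throwaway paths; start clean so the first run is level 0
rm -rf /tmp/tar-demo
mkdir -p /tmp/tar-demo/data
echo "one" > /tmp/tar-demo/data/a.txt

# First run: the snapshot file doesn't exist yet, so tar makes a full
# (level 0) backup and creates /tmp/tar-demo/backup.snf
tar -czf /tmp/tar-demo/full.tar.gz -g /tmp/tar-demo/backup.snf -C /tmp/tar-demo data

# Change the data, then run again with the same snapshot file:
# only files new or modified since the last run get archived
echo "two" > /tmp/tar-demo/data/b.txt
tar -czf /tmp/tar-demo/incr.tar.gz -g /tmp/tar-demo/backup.snf -C /tmp/tar-demo data

# Listing the incremental archive shows b.txt, not the unchanged a.txt
tar -tzf /tmp/tar-demo/incr.tar.gz
```

Deleting /tmp/tar-demo/backup.snf and running tar again would start over with a fresh level 0 backup, which is exactly what the weekly cron job below relies on.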

Here is the script I use to back up $HOME. Remember to change the paths to meet your requirements.

#!/bin/bash
# Incremental Backup script

# Store the current date and time
NOW=$(date +%Y-%b-%d-%H%M%S)

# Assign a location to store backups
# This should be outside of your home dir
DESTINATION=/path/to/backups/directory

# Name the backup files by date
FN=backup-$NOW.tar.gz

# Identify the data to be backed up
# Use the full path here & not an env variable
SOURCEDIR=/home/username

# Create a record of files so we can do incremental backups
# This is the snapshot file
SNF=/home/username/.cache/backup.snf

# Create the backup, excluding some locations
tar --exclude='/home/username/exclude_dir1' \
    --exclude='/home/username/exclude_dir2' \
    --exclude='/home/username/exclude_file1' \
    --exclude='/home/username/exclude_file2' \
    -cpzf "$DESTINATION/$FN" -g "$SNF" "$SOURCEDIR"
Save the script as backup_script.sh (for example in /home/username/bin) and make it executable with chmod +x backup_script.sh.
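One thing the script doesn't cover is restoring. With GNU tar's incremental format you extract the level 0 archive first, then each incremental in date order, passing -g /dev/null so tar replays the recorded changes (including deletions) without reading or writing a real snapshot file. A self-contained sketch, using illustrative /tmp paths:

```shell
# Build a tiny full + incremental pair to restore from (illustrative paths)
rm -rf /tmp/restore-demo
mkdir -p /tmp/restore-demo/src /tmp/restore-demo/out
echo "one" > /tmp/restore-demo/src/a.txt
tar -czf /tmp/restore-demo/full.tar.gz -g /tmp/restore-demo/snap -C /tmp/restore-demo src
echo "two" > /tmp/restore-demo/src/b.txt
tar -czf /tmp/restore-demo/incr.tar.gz -g /tmp/restore-demo/snap -C /tmp/restore-demo src

# Restore: level 0 first, then each incremental in order.
# -g /dev/null treats the archives as incremental on extraction
# without touching a snapshot file.
tar -xpzf /tmp/restore-demo/full.tar.gz -g /dev/null -C /tmp/restore-demo/out
tar -xpzf /tmp/restore-demo/incr.tar.gz -g /dev/null -C /tmp/restore-demo/out
```

After both extractions, /tmp/restore-demo/out/src contains a.txt from the full backup and b.txt from the incremental.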

Now that you have a script that backs up your home directory, let's put it to work with cron. Cron is a system scheduler that allows you to run tasks according to a pre-configured schedule. I highly recommend using https://crontab.guru/ to help with the syntax when you create cron jobs.

To create cron jobs you need to run the following command:

crontab -e

The system default editor will open your personal crontab for editing.

Add this line to the bottom to have the script created above run at 4 a.m. every morning:

0 4 * * * /home/username/bin/backup_script.sh >/dev/null 2>&1

I also like to have a new level 0 backup created once a week (on Sunday); for that, add the following line:

0 5 * * 0 /bin/rm -f /home/username/.cache/backup.snf >/dev/null 2>&1

One important thing to remember about cron is that it runs without access to your environment variables so full paths are needed for any commands and directory locations.
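A quick way to find the absolute paths to hard-code is command -v; a short sketch (the resulting paths vary by distribution):

```shell
# cron typically provides only a minimal PATH (often /usr/bin:/bin),
# so look up full command paths ahead of time and use those in the
# crontab entries and inside the script
command -v tar
command -v rm

# You can also approximate cron's bare environment to test the script,
# e.g.:  env -i /bin/sh -c '/home/username/bin/backup_script.sh'
```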

That's it! By following the steps above, you should feel confident about restoring any data lost to a failing hard drive or accidental deletion. A follow-up article will cover saving the backups to a network share or cloud storage.