Updating the Backup Scripts

So, it’s been a while since I actually looked at the code for my backup scripts. I had been using them religiously for quite some time, but when I got rid of my server cluster, I stopped using them.

Recently, I set up a server in the Azure cloud to run this website and a few others. I figured, what a perfect use for the scripts!

Upon firing them up, I found that several pieces of the script (no, really, pretty much every bit of code) needed overhauling, as errors were thrown all over the place.

The first thing I started with was the home-backup script. I used a really handy website called ShellCheck that assisted with the ‘QA’ of the code.

I think it’s about time to start versioning the script instead of just making changes whenever. I’m not sure what version it’ll start with, but compared to the original, there will be a lot of improvements.

First, I’m working on the ability to pass flags on the command line, for example: “./home-backup -d /some/backup/dir -u some_user”. This will make it easy to set up single-user backups instead of backing up everything.
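The flag handling could be sketched with bash’s built-in getopts; a minimal example, where the default destination and variable names are hypothetical placeholders (the -d/-u flags follow the example above):

```shell
#!/bin/bash
# Minimal getopts sketch of the planned -d/-u flags.
backup_dir="/backup/users"   # hypothetical default destination
backup_user=""               # empty means back up every user

# Simulated command line for demonstration:
set -- -d /tmp/backups -u some_user

while getopts "d:u:" opt; do
  case "$opt" in
    d) backup_dir="$OPTARG" ;;
    u) backup_user="$OPTARG" ;;
    *) echo "Usage: $0 [-d backup_dir] [-u user]" >&2; exit 1 ;;
  esac
done

echo "Backing up ${backup_user:-all users} to $backup_dir"
```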

Second, error handling. Right now, the script doesn’t have any at all. Since it’s a backup script, the main risk is that the script fails and no backup is taken. I’m building error handling around the backup destination, so that if the directory doesn’t exist, it will prompt you to create it. I’ll likely also work in automatic creation when a flag is used.
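That destination check could look something like this; a sketch, assuming a hypothetical helper name (ensure_dest) and demo path, with the prompt-or-auto behavior described above:

```shell
#!/bin/bash
# Sketch: prompt to create a missing backup directory, or create it
# automatically when "auto" is passed. Helper name and paths are
# hypothetical placeholders.
ensure_dest() {
  local dest="$1" mode="$2"
  [ -d "$dest" ] && return 0
  if [ "$mode" = "auto" ]; then
    mkdir -p "$dest"
  else
    read -r -p "$dest does not exist. Create it? [y/N] " answer
    if [ "$answer" = "y" ]; then mkdir -p "$dest"; else return 1; fi
  fi
}

# Demo of the automatic-creation behavior:
ensure_dest /tmp/demo-backup auto && echo "destination ready"
```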

The mysql-backup script will mimic the home-backup script, since they both go through the same paces.

Then there’s the home-restore script. This is one I haven’t really dug into yet. Obviously, the first things that need to be done are error handling and a “rollback” feature. Right now, it expects that all conditions are met and just goes. This one may take quite a bit longer to be “ready”.
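One way the “rollback” idea could start: snapshot the target path before a restore overwrites it, so a bad restore can be undone by extracting the snapshot. A sketch, where the helper name and demo paths are hypothetical:

```shell
#!/bin/bash
# Snapshot a path before restoring over it, so the restore can be rolled
# back. Helper name and demo paths are hypothetical placeholders.
pre_restore_snapshot() {
  local target="$1" snapdir="$2"
  mkdir -p "$snapdir"
  # Archive the target relative to / so it can be extracted the same way
  tar czf "$snapdir/rollback-$(date +%Y%m%d%H%M%S).tgz" -C / "${target#/}"
}

# Demo against a throwaway directory:
mkdir -p /tmp/rollback-demo/home/some_user
echo "important" > /tmp/rollback-demo/home/some_user/file.txt
pre_restore_snapshot /tmp/rollback-demo/home/some_user /tmp/rollback-demo/snaps
```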

For now, I’ve put the WebUI portion of this project on the shelf. Somewhere down the line, I might pick it back up and actually make it happen. But, for now, it’s on the shelf.

If you’ve got any bash experience, and wish to lend a hand, please by all means contact me!

Thanks for reading!

Dan

Garden Cam Gallery Project

For a while now, I’ve had my Garden Cam up and running, but I wanted a photo gallery that wasn’t a pain to set up or maintain. I also needed it to auto-detect directories and photos. I stumbled across SFPG (Single File Photo Gallery), which does just that!

My only problem was that the webcam only uploads a single image every 5 minutes. My fix? A simple bash script that copies that file into a directory one minute after the new image uploads. The image starts out as cam_1.jpg but gets renamed with a date and time stamp for easier sorting by another bash script. Once it’s in the new directory and renamed, it gets sorted:

DATE=$(date +%m%d)
DIR=$(date +%m-%d)

# Move today's images into a MM-DD directory
mkdir -p "$DIR"
for file in cam-"$DATE"*.jpg; do
  mv "$file" "$DIR"
done

I have the sort script run one minute after the file gets moved; this way, I don’t have any issues with the sort script running before the image is in place.
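The copy-and-rename step itself could be sketched like this: the camera always writes cam_1.jpg, and this copies it out with a date/time stamp so the sort script can file it by day. The paths here are demo placeholders, not the real webcam paths:

```shell
#!/bin/bash
# Sketch of the copy-and-rename step. Paths are demo placeholders.
src="/tmp/cam-demo/cam_1.jpg"
outdir="/tmp/cam-demo/archive"

mkdir -p "$(dirname "$src")" "$outdir"
[ -f "$src" ] || : > "$src"   # stand-in for the uploaded image

# Stamp the copy with date and time for later sorting
stamp=$(date +%m%d-%H%M%S)
cp "$src" "$outdir/cam-$stamp.jpg"
echo "copied to $outdir/cam-$stamp.jpg"
```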

So check it out: Gallery

Something you may have caught onto if you’ve read any of my posts: I do things in a roundabout way, mainly because I don’t want to pay for software when I can make it happen my own way. (Prime example: my backup and restore scripts.)

If you want to talk to me about my scripts or want to use them, feel free to contact me! (the ways to contact me are over to the right)

Bash Scripts.

I run a web server that hosts multiple websites and multiple SQL databases. I decided to use bash scripts to manage my backups and any file restores that were needed. I know the argument will be, “Why bash scripts? There are software options that do the same thing.” The answer is, I couldn’t find one that fit my specific needs, so I decided to write something that did.

I read several forum threads of people trying to accomplish parts of what I wanted, but nobody had meshed everything together. If you are looking for a bash script that does a grandfather-father-son archive rotation of each user directory into its own archive, and does the same thing for databases, then these may work for you. The archives are stored as Daily, Weekly, and Monthly. The rotation does a daily backup Sunday through Friday. On Saturday, a weekly backup is done, giving you four weekly backups a month. The monthly backup is done on the first of the month, rotating two monthly backups based on whether the month is odd or even.

I hope you find these as useful as I did.

I found this first script on the Ubuntu server Archive page.

This one archives the specified files/directories.

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of specified directories
#
####################################

# What to backup.
backup_files="/etc /root"

# Where to backup to.
dest="/backup/system"

# Setup variables for the archive filename.
day=$(date +%A)
hostname=$(hostname -s)

# Find which week of the month (1-4) it is.
day_num=$(date +%-d)   # no leading zero, so arithmetic won't choke on octal
if (( day_num <= 7 )); then
  week_file="$hostname-week1.tgz"
elif (( day_num > 7 && day_num <= 14 )); then
  week_file="$hostname-week2.tgz"
elif (( day_num > 14 && day_num <= 21 )); then
  week_file="$hostname-week3.tgz"
elif (( day_num > 21 && day_num < 32 )); then
  week_file="$hostname-week4.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(( 10#$month_num % 2 ))
if [ "$month" -eq 0 ]; then
  month_file="$hostname-month2.tgz"
else
  month_file="$hostname-month1.tgz"
fi

# Create archive filename.
if [ "$day_num" -eq 1 ]; then
  archive_file=$month_file
elif [ "$day" != "Saturday" ]; then
  archive_file="$hostname-$day.tgz"
else
  archive_file=$week_file
fi

# Print start status message.
echo "Backing up $backup_files to $dest/$archive_file"
date
echo

# Backup the files using tar ($backup_files is deliberately unquoted
# so each path becomes its own argument).
tar czf "$dest/$archive_file" $backup_files

# Print end status message.
echo
echo "Backup finished"
date

# Long listing of files in $dest to check file sizes.
ls -lh "$dest/"


This one keeps the same concept, except that instead of defining which files/directories to grab, the script grabs all the directories within /home.

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of each home directory within
# its own archive
#
####################################

# Where to backup to.
dest="/backup/users"

# Setup variables for the archive filename.
day=$(date +%A)

# Find which week of the month (1-4) it is.
day_num=$(date +%-d)   # no leading zero, so arithmetic won't choke on octal
if (( day_num <= 7 )); then
  week_file="-week1.tgz"
elif (( day_num > 7 && day_num <= 14 )); then
  week_file="-week2.tgz"
elif (( day_num > 14 && day_num <= 21 )); then
  week_file="-week3.tgz"
elif (( day_num > 21 && day_num < 32 )); then
  week_file="-week4.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(( 10#$month_num % 2 ))
if [ "$month" -eq 0 ]; then
  month_file="-month2.tgz"
else
  month_file="-month1.tgz"
fi

# Create archive filename.
if [ "$day_num" -eq 1 ]; then
  archive_file=$month_file
elif [ "$day" != "Saturday" ]; then
  archive_file="-$day.tgz"
else
  archive_file=$week_file
fi

# Print start status message.
echo "Backing up /home. This may take a few minutes."

# Backup each home directory using tar.
for path in /home/*/; do
  folder=$(basename "$path")
  sudo -u "$folder" tar czf "$dest/$folder$archive_file" "/home/$folder"

  # Print end status message.
  echo
  echo "Backup $folder complete."
done

# Long listing of files in $dest to check file sizes.
ls -lh "$dest/"
echo
echo "Backup is complete"
exit


Now, if you have multiple SQL databases on your server, this will benefit you a lot.

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of all your SQL databases.
#
####################################

# Where to backup to.
dest="/backup/sql-backup"

# Setup variables for the archive filename.
day=$(date +%A)

# Find which week of the month (1-4) it is.
day_num=$(date +%-d)   # no leading zero, so arithmetic won't choke on octal
if (( day_num <= 7 )); then
  week_file="-week1.sql.tgz"
elif (( day_num > 7 && day_num <= 14 )); then
  week_file="-week2.sql.tgz"
elif (( day_num > 14 && day_num <= 21 )); then
  week_file="-week3.sql.tgz"
elif (( day_num > 21 && day_num < 32 )); then
  week_file="-week4.sql.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(( 10#$month_num % 2 ))
if [ "$month" -eq 0 ]; then
  month_file="-month2.sql.tgz"
else
  month_file="-month1.sql.tgz"
fi

# Create archive filename.
if [ "$day_num" -eq 1 ]; then
  archive_file=$month_file
elif [ "$day" != "Saturday" ]; then
  archive_file="-$day.sql.tgz"
else
  archive_file=$week_file
fi

# Backup the databases.
MYSQL='/usr/bin/mysql'
MYSQLDUMP='/usr/bin/mysqldump'
DUMPOPTS='--opt --hex-blob --skip-extended-insert'

user="CHANGEME"
pass="CHANGEME"

# Get the names of the databases.
databases=$($MYSQL -u"$user" -p"$pass" --skip-column-names -e 'SHOW DATABASES')

# Write the compressed dump for each database.
for db in $databases; do
  filename="$dest/$db$archive_file"
  echo "creating $filename"
  $MYSQLDUMP $DUMPOPTS -u"$user" -p"$pass" --databases "$db" \
    | gzip -9 > "$filename"
done

echo "Backup of SQL Databases Complete"
exit


I currently have not written a restore for the SQL backups; if you have phpMyAdmin, you can use the import function.
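Until a proper SQL restore script exists, a single database can also be restored from the shell by piping the dump back into mysql. This helper just builds the command as a dry run so it can be inspected before being run for real; the archive path, database name, and credentials are placeholders:

```shell
#!/bin/bash
# Build (but don't run) a restore command for one database archive.
# Archive path, db name, and credentials are placeholders.
restore_cmd() {
  local archive="$1" db="$2"
  echo "gunzip < $archive | mysql -u CHANGEME -pCHANGEME $db"
}

restore_cmd /backup/sql-backup/mydb-Monday.sql.tgz mydb
```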

Okay, so we have backed up all the home directories. What if someone needs something specific from a specific backup? I know that this script doesn’t account for failures; I’m working on that.

#!/bin/bash
##############################
#
# Script written by Dan Walker
# for Merval.Org Hosting to
# restore a specific file
# from a specific backup.
#
##############################

# Specify backup directory
read -e -p "Where are we restoring from? (default is /backup/users): " backup_path
if [ -z "$backup_path" ]; then
  backup_path="/backup/users"
  echo "Using default: $backup_path"
else
  echo "Using custom location: $backup_path"
fi

# Ask which user to restore
read -e -p "Which user?: " user

read -e -p "What are we restoring? (leave out /home/): " source

# Now figure out which backup to restore
read -e -p "Which backup? (Daily, Weekly or Monthly): " choice

if [ "$choice" = "Daily" ]; then
  read -e -p "Which day? (Sunday - Friday): " day
  echo "You chose $day"
  echo "Starting restore process. This may take a moment."
  cd "$backup_path" || exit 1
  sudo -u "$user" tar -xzf "$user-$day.tgz" -C / "home/$user/$source"
  echo "Restored /home/$user/$source from the $day backup"
fi

if [ "$choice" = "Weekly" ]; then
  read -e -p "Which week? (1-4): " week
  echo "You chose the week $week backup"
  echo "Starting restore process. This may take a moment."
  cd "$backup_path" || exit 1
  sudo -u "$user" tar -xzf "$user-week$week.tgz" -C / "home/$user/$source"
  echo "Restored /home/$user/$source from the week $week backup"
fi

if [ "$choice" = "Monthly" ]; then
  read -e -p "Which month? (1 or 2): " month
  echo "You chose the month $month backup"
  echo "Starting restore process. This may take a moment."
  cd "$backup_path" || exit 1
  sudo -u "$user" tar -xzf "$user-month$month.tgz" -C / "home/$user/$source"
  echo "Restored /home/$user/$source from the month $month backup"
fi

exit


Suggestions are always appreciated!

Thanks for reading!