Adding on to backup bash scripts

So, ages ago, I wrote up a couple of bash scripts that automated the backup process on a server. It’s all text based, so you’d have to be logged into a terminal or be ssh’d into the box.

Today, I got to thinking: I should give it a pretty web GUI that makes the whole process easier.

It's still really early in the thought process, but I'm intrigued by the project and will start working on it soon.

Original Post: http://www.merval.org/2012/03/bash-scripts/

Github: https://github.com/merval/BackupScripts

The Hacker's Diet WordPress plugin

A while back, I started looking into fixing the Hacker's Diet WordPress plugin and publishing it for the world to use. Recently, I finally started working toward that goal. I reached out to the original creator and got his blessing to go forward.


So, first on the to-do list is to figure out the series of errors that keep showing up. I was able to correct the installation issues. The biggest issue now is that when you enter your weight and save it, a whole slew of errors appear.

Second on the list is to update the jQuery graphing to something a bit more aesthetically pleasing.

With any luck, I will be able to give this plugin a new life and add a few features in the process.

When @afex wrote the original code, he laid the groundwork for a widget, which I have since completed, and it works perfectly. Another issue I ran into was that when entering a weight on a given day, it always rolled back a day. I got that fixed as well.

Thanks for reading, until next time.

Dan


I’ve been a code junkie recently.

So, recently I have been doing a lot of coding, both bash and PHP/HTML. The other day I was at work and a co-worker of mine who handles abuse complaints mentioned he had to process over 100 domains for something he was working on. I instantly thought about writing code to accomplish it. I asked what he needed from the domains, to which he responded: the IP and who owns it.

At first I started with a pretty simple script, with two domains hard coded into the script for testing purposes, which worked decently. 

Here is the original script.

<?php
$x = array("domain.tld", "domain.tld");
foreach ($x as $host) {
    $ip = gethostbyname($host);

    echo "$host - $ip";
    echo "<br />";
}
?>

While this works, it was missing a few things. It would be really difficult to enter all 100 domains, each needing to be wrapped in quotes and separated by a comma; just a little too much work for me. Plus, it doesn't have the whois information. So, onward I go.


My second incarnation of the script allows for user entry of the domains; the only thing that's needed is a comma-separated list.

Here is the second script.

<html>
<form name="list" action="" method="post">
<input type="text" name="list">
<input type="submit" name="submit" value="Submit">
</form>
</html>
<?php
if (isset($_POST['submit'])) {
    $list = $_POST['list'];

    $listreturn = explode(',', $list);
    foreach ($listreturn as $host) {
        $host = trim($host);
        $ip = gethostbyname($host);

        // escapeshellarg() keeps the user-supplied value from breaking the shell command
        $whois = shell_exec('whois -h whois.arin.net ' . escapeshellarg($ip));
        $result = preg_match('/^OrgName:\s*(.+)$/m', $whois, $matches);
        $org = $result ? $matches[1] : 'unknown';
        echo "$host - $ip - <b>$org</b>";
        echo "<br />";
    }
}
?>

This script accepts a comma-separated list of domains and does all the work for you. It takes each domain, gets its IP address, does a whois on that IP address to determine who owns it, and then presents a nice list of information.

Within an hour I had the second version up and running, and about five minutes later the list of domains my co-worker had compiled was processed and done.


“Off The Grid” Cloud File Hosting

I have used Google Drive and Dropbox for quite some time without any encryption, and thought nothing of it. That is, until this whole N.S.A. thing cropped up.

I had previously installed and played with OwnCloud and found it to be quite nice. It doesn't have all of the options that Dropbox or Google Drive have, but for off-the-grid cloud hosting, it's pretty damn nice!

One thing I wanted with my OwnCloud was encryption for all my files. It turns out OwnCloud has an app for that! So I've enabled it and got it all working.

That got me thinking: even though there are a lot of services out there offering encrypted cloud file hosting, I still don't entirely trust them. So, what I've decided to do is offer 5 GB of free cloud file hosting. I have limited it to just 15 people, though, mainly because, unlike the big guys, I don't have endless amounts of data storage. Just email dwalker at merval dot org asking for a cloud hosting account.

Garden Cam Gallery Project

For a while now, I've had my Garden Cam up and running, but I wanted a photo gallery that wasn't a pain to maintain or set up. I also needed it to auto-detect directories and photos. I stumbled across SFPG (Single File Photo Gallery), which does just that!

My only problem was that the webcam only uploads a single image every 5 minutes. My fix? A simple bash script that copies that file into a directory 1 minute after the new image uploads. The image starts out as cam_1.jpg but gets copied with a date and time stamp on it for easier sorting by another bash script. Once it's in the new directory and renamed, it gets sorted:

#!/bin/bash
# Today's date, matching the stamp used in the file names and the directory name
DATE=$(date +%m%d)
Dir=$(date +%m-%d)

# Move today's images into their own directory
mkdir -p "$Dir"
for file in cam-"$DATE"*.jpg; do
mv "$file" "$Dir"
done

I have the sort script run one minute after the file gets moved; this way I don't have any issues with the sort script running before the image is moved.
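For reference, here's roughly how the copy-and-rename step and the timing fit together. This is only a sketch of the approach, not the exact scripts I run; the camera upload path, gallery path, and script locations below are placeholders you'd adjust to your own setup.

#!/bin/bash
# rename-cam.sh - copy the camera's latest upload and stamp it with date and time
# (paths are placeholders; point SRC at wherever the cam drops cam_1.jpg)
SRC="/var/www/gardencam/cam_1.jpg"
DEST="/var/www/gardencam/gallery"

STAMP=$(date +%m%d-%H%M)
cp "$SRC" "$DEST/cam-$STAMP.jpg"

And the cron timing, assuming the camera uploads on the 5-minute marks:

# copy/rename one minute after each upload, sort one minute after that
1-56/5 * * * * /home/dan/bin/rename-cam.sh
2-57/5 * * * * cd /var/www/gardencam/gallery && /home/dan/bin/sort-cam.sh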

So check it out: Gallery

Something you may have caught onto if you read any of my posts: I do things in a roundabout way, mainly because I don't want to have to pay for software to do something if I can make it happen my own way. (Prime example: my backup and restore scripts.)

If you want to talk to me about my scripts or want to use them, feel free to contact me! (the ways to contact me are over to the right)

OwnCloud

You're familiar with the term cloud, right? If not, I'm sure you have heard of Dropbox, Box, Google Drive, etc. These are all cloud services; they store your data on a server that is accessible from anywhere. So, what I'm going to do is create my own using OwnCloud and an Ubuntu server I have (currently it sits and runs PyTivo, but that's a different blog post 🙂 ). (Edit: Yes, I am aware of Ubuntu Enterprise Cloud Server. I will look at installing and playing with that.)

I was watching TWiT's new show "Know How..." and their first episode was about rolling out your own cloud a few different ways; they showed the Tonido Plug and the Pogo Plug. Now, Tonido has a software suite you can download and use, but from the sounds of it, you actually allow them to see some of your data (at least that is how it is with the Tonido Plug device). I'm not so interested in allowing that to happen, not because I'm doing anything illegal but because I don't like the idea of people willy-nilly looking at my data.

Now, looking at OwnCloud, you can run the software on your machine (desktop or server; I prefer server) and it will simply host a web GUI that you can access your data from. OwnCloud also comes with a client you install on your device so you can access your data from your server. I don't know if there is a mobile app yet. I mainly want this setup so I can easily access content from my home server without needing to worry about a super low storage limit (Dropbox currently gives me 4 GB on my account and Box gives 50 GB).

I plan on toying with it, seeing what all it does, and then writing a review of it. Running your own cloud (if you have an ISP that gives you a nice upload speed and doesn't cap you) is a really neat idea. You don't have to worry about uploading to a server where God knows who is looking at your data.

The other option is to use software to encrypt your data before you upload it to these cloud services. The reason I dislike that idea is that when I want to run in and grab something quickly, I don't want to have to worry about, "Does this computer/device have the software to allow me to view this?"

So, I will install and toy with OwnCloud on my sandbox machine and see what I come up with. If you do not have a sandbox machine, you should REALLY invest in one. Mine is basically an old computer I had lying around after an upgrade that I tossed some hardware into. You can also pick up computers pretty cheap on eBay or at a local computer recycler. Free Geek is a good place to look too.

See you on the other side!

-Dan

Bash Scripts.

I run a web server that contains multiple websites and multiple SQL databases. I decided to use bash scripts to manage my backups and any file restores that were needed. I know the argument will be, "Why bash scripts? There are software options that do the same thing." The answer is, I couldn't find one that fit my specific needs, so I decided to write something that did.

I read several forum threads of people trying to accomplish parts of what I wanted, but nobody had meshed everything together. If you are looking for a bash script that does a grandfather-father-son archive rotation of each user directory into its own archive and also does the same thing for databases, then these may work for you. The archives are stored in daily, weekly, and monthly sets. The rotation does a daily backup Sunday through Friday. On Saturday a weekly backup is done, giving you four weekly backups a month. The monthly backup is done on the first of the month, rotating between two monthly archives based on whether the month is odd or even.
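Since the rotation is driven entirely by the date, the scripts are meant to be run once a day, most easily from cron. Something like the following would do it; the paths and times here are placeholders rather than my actual setup:

# Nightly backups; each script decides on its own whether today gets a
# daily, weekly, or monthly archive.
30 1 * * * /root/bin/backup-system.sh
45 1 * * * /root/bin/backup-homes.sh
0 2 * * * /root/bin/backup-sql.sh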

I hope you find these as useful as I did.

So, this first script I found on the Ubuntu Server archive page.

This one archives the specified files/directories

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of specified directory
#
####################################

# What to back up.
backup_files="/etc /root"

# Where to back up to.
dest="/backup/system"

# Setup variables for the archive filename.
day=$(date +%A)
hostname=$(hostname -s)

# Find which week of the month (1-4) it is.
# (%-d strips the leading zero so bash doesn't treat 08/09 as octal)
day_num=$(date +%-d)
if (( $day_num <= 7 )); then
week_file="$hostname-week1.tgz"
elif (( $day_num > 7 && $day_num <= 14 )); then
week_file="$hostname-week2.tgz"
elif (( $day_num > 14 && $day_num <= 21 )); then
week_file="$hostname-week3.tgz"
elif (( $day_num > 21 && $day_num < 32 )); then
week_file="$hostname-week4.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(expr $month_num % 2)
if [ $month -eq 0 ]; then
month_file="$hostname-month2.tgz"
else
month_file="$hostname-month1.tgz"
fi

# Create archive filename.
if [ $day_num == 1 ]; then
archive_file=$month_file
elif [ $day != "Saturday" ]; then
archive_file="$hostname-$day.tgz"
else
archive_file=$week_file
fi

# Print start status message.
echo "Backing up $backup_files to $dest/$archive_file"
date
echo

# Backup the files using tar.
tar czf $dest/$archive_file $backup_files

# Print end status message.
echo
echo "Backup finished"
date

# Long listing of files in $dest to check file sizes.
ls -lh $dest/


This one keeps the same concept, except that instead of defining which files/directories to back up, the script grabs all the directories within /home.

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of each home directory within
# its own archive
#
####################################

# Where to back up to.
dest="/backup/users"

# Setup variables for the archive filename.
day=$(date +%A)

# Find which week of the month (1-4) it is.
# (%-d strips the leading zero so bash doesn't treat 08/09 as octal)
day_num=$(date +%-d)
if (( $day_num <= 7 )); then
week_file="-week1.tgz"
elif (( $day_num > 7 && $day_num <= 14 )); then
week_file="-week2.tgz"
elif (( $day_num > 14 && $day_num <= 21 )); then
week_file="-week3.tgz"
elif (( $day_num > 21 && $day_num < 32 )); then
week_file="-week4.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(expr $month_num % 2)
if [ $month -eq 0 ]; then
month_file="-month2.tgz"
else
month_file="-month1.tgz"
fi

# Create archive filename.
if [ $day_num == 1 ]; then
archive_file=$month_file
elif [ $day != "Saturday" ]; then
archive_file="-$day.tgz"
else
archive_file=$week_file
fi

# Print start status message.
echo "Backing up home directories. This may take a few minutes."

# Backup each home directory into its own archive.
for folder in $(ls /home); do
sudo -u $folder tar czf "$dest/$folder$archive_file" /home/"$folder"

# Print end status message.
echo
echo "Backup of $folder complete."
done

# Long listing of files in $dest to check file sizes.
ls -lh $dest/
echo
echo "Backup is complete"
exit


Now, if you have multiple SQL databases on your server, this will benefit you a lot.

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of all your SQL Databases.
#
####################################

# Where to back up to.
dest="/backup/sql-backup"

# Setup variables for the archive filename.
day=$(date +%A)

# Find which week of the month (1-4) it is.
# (%-d strips the leading zero so bash doesn't treat 08/09 as octal)
day_num=$(date +%-d)
if (( $day_num <= 7 )); then
week_file="-week1.sql.tgz"
elif (( $day_num > 7 && $day_num <= 14 )); then
week_file="-week2.sql.tgz"
elif (( $day_num > 14 && $day_num <= 21 )); then
week_file="-week3.sql.tgz"
elif (( $day_num > 21 && $day_num < 32 )); then
week_file="-week4.sql.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(expr $month_num % 2)
if [ $month -eq 0 ]; then
month_file="-month2.sql.tgz"
else
month_file="-month1.sql.tgz"
fi

# Create archive filename.
if [ $day_num == 1 ]; then
archive_file=$month_file
elif [ $day != "Saturday" ]; then
archive_file="-$day.sql.tgz"
else
archive_file=$week_file
fi

# Backup the databases.
MYSQL='/usr/bin/mysql'
MYSQLDUMP='/usr/bin/mysqldump'
DUMPOPTS='--opt --hex-blob --skip-extended-insert'

user="CHANGEME"
pass="CHANGEME"

# Get the names of the databases
databases=`$MYSQL -u$user -p$pass --skip-column-names -e'SHOW DATABASES'`

# Write the compressed dump for each database
for db in $databases; do
filename="$dest/$db$archive_file"
echo "creating $filename"
$MYSQLDUMP $DUMPOPTS -u$user -p$pass --databases $db | gzip -9 > $filename
done

echo "Backup of SQL Databases Complete"
exit


I currently have not written a restore for the SQL backups; if you have phpMyAdmin, you can use the import function.
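In the meantime, a dump can also be restored from the command line. A minimal sketch, assuming the file names produced by the SQL backup script above (swap in your own database name, archive, and credentials):

# Restore one database from its compressed dump. Because the dump was made
# with --databases, it already contains the CREATE DATABASE and USE statements.
gunzip < /backup/sql-backup/CHANGEME-Monday.sql.tgz | mysql -u CHANGEME -pCHANGEME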

Okay, so we have backed up all the home directories. What if someone needs something specific from a specific backup? I know that this script doesn't account for failures; I'm working on that.

#!/bin/bash
##############################
#
# Script written by Dan Walker
# for Merval.Org Hosting to
# restore a specific file
# from a specific backup.
#
##############################

# Specify backup directory
echo -n "Where are we restoring from? (default is /backup/users): "
while read -e inputline
do
backup_path="$inputline"
# Display what the user typed
if [ -z "${backup_path}" ]
then
echo "You didn't type anything"
backup_path="/backup/users"
echo "Using $backup_path"
else
echo "Using custom location: $backup_path"
fi

# Let's ask which user to restore
echo -n "Which user?: "
read -e user

echo -n "What are we restoring? (leave out /home/): "
read -e source

# Now let's figure out which backup to restore
echo -n "Which backup? (Daily, Weekly or Monthly): "
read -e choice
if [ "$choice" = "Daily" ];
then
echo -n "Which day? (Sunday - Friday): "
read -e date
echo "You chose $date"
echo "Starting restore process. This may take a moment."
cd "$backup_path"
# Archives store paths relative to /, so extract from / and pull only the requested path
sudo -u "$user" tar -xzf "$user-$date.tgz" -C / "home/$user/$source"
echo "Restored /home/$user/$source from the $date backup"
fi

if [ "$choice" = "Weekly" ];
then
echo -n "Which week? (1-4): "
read -e week
echo "You chose the week $week backup"
echo "Starting restore process. This may take a moment."
cd "$backup_path"
sudo -u "$user" tar -xzf "$user-week$week.tgz" -C / "home/$user/$source"
echo "Restored /home/$user/$source from the week $week backup"
fi

if [ "$choice" = "Monthly" ]
then
echo -n "Which month? (1 or 2): "
read -e month
echo "You chose the month $month backup"
echo "Starting restore process. This may take a moment."
cd "$backup_path"
sudo -u "$user" tar -xzf "$user-month$month.tgz" -C / "home/$user/$source"
echo "Restored /home/$user/$source from the month $month backup"
fi

exit
done


Suggestions are always appreciated!

Thanks for reading!

Merval.Com

Well, the auction for the domain ends in about 10 days. They are asking $100 for the domain, which is a little steep. I offered $25, and so far I am the only person to bid. I started at $10, then $19, and now $25. With 10 days to go in the auction, I would think they would realize that nobody else will bid on this domain and cave in. Besides, I'm not paying $100 for a domain.

Now we wait.

Dan

So.. Merval.Org

So I re-launched merval.org the other day, and I really don't know what I want to do with it. I did not own the domain for several years, as some guy bought it up the moment I allowed it to expire; I didn't have the money to renew it. When I did have enough money to renew the domain, I contacted the guy who bought it (or a secretary of the guy... it was weird). He was asking me for $500 to sell the domain back. Had I REALLY generated that much buzz in the prior 3 years I owned the domain? Hardly. I think I had something like 200-400 unique hits a month, which isn't anything to write home about.

My other domain, pdxrevolution.com, gets somewhere in the range of 500 a month. I don't know what this guy was thinking trying to sell me a domain I used to own for $500; maybe he saw something in the domain I didn't? I know there is a company overseas called Merval and they own... wow, never mind, I just checked merval.com and they don't own the domain anymore. GoDaddy is auctioning the domain off for $100. It expires in June. I wonder how many have bid on it... *looking* hah, 0 people. It has been viewed by 5 people. The domain should have some importance, since (http://en.wikipedia.org/wiki/MERVAL) it is a stock exchange in Buenos Aires (http://en.wikipedia.org/wiki/Buenos_Aires_Stock_Exchange). Not too sure why they let that domain expire. I'm going to keep my eye on that and see if anyone bids.

Back to Merval.Org. I'm having a hard time deciding what I want to do with it. I don't blog that often, and when I actually blog I do it here (LiveJournal). I will sit here and drink my coffee and ponder the thought.

Thanks for reading!

Dan