Easily Maintain Multiple Websites with SSH


Over the past two weeks, here and here, I told you how to set up SSH so that your remote access is both more secure and more convenient.

Let’s put that to use!

The Scenario

Let’s say that you maintain a website, and you create and edit pages on both your office desktop computer and your laptop. Their hostnames are desktop and laptop, respectively. You run Apache on both, so your local copies are kept under /var/www/htdocs/. After you have made some changes, you want to bring your laptop into the office and push all the updates from wherever you made them to the other local computer and to the public web server. That public web server is hosted by GoDaddy, so you can connect in with SSH and your website is stored there in ~/html/.

All directory and file names are built entirely from upper- and lower-case letters A-Z and a-z, digits 0-9, and the punctuation marks period “.”, hyphen “-”, and underscore “_”.

To keep things simple, you are only doing this for a single website. You can expand this to maintain multiple websites by keeping them in multiple locations and putting some logic in the loop to decide what goes where.

You could solve this problem with rsync, but doing it without rsync gives me a great excuse (er, I mean opportunity) to show some possible uses of SSH within a script. The script will use scp, authenticating seamlessly over SSH with cryptographic keys. No passwords, so the script can automatically loop through a large number of scp connections.

What Does “New” Mean?

I maintain an empty file named /var/www/TIMESTAMP. Once everything is in sync on the three machines, I use touch to update that file’s timestamp. Everything under /var/www/htdocs/ newer than TIMESTAMP needs to be pushed out, and then TIMESTAMP will be updated.
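The mechanism is easy to try in a scratch directory (the paths below are throwaways; the real script uses /var/www/TIMESTAMP and /var/www/htdocs/):

```shell
#!/bin/sh
# Demonstrate the timestamp trick: anything newer than the marker
# file is what still needs to be pushed out.
WORK=$(mktemp -d)
touch "$WORK/TIMESTAMP"
sleep 1                          # guarantee a later modification time
echo 'new page' > "$WORK/new.html"
# Only new.html is listed; TIMESTAMP is not newer than itself.
find "$WORK" -type f -newer "$WORK/TIMESTAMP"
```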

I have to be careful about directories. If I have created new directories and new files within them, the script would fail if it blindly attempted to copy a file into a directory that doesn’t exist yet.

I also have to be careful about Vim temporary files (or “swap files,” as Vim calls them), which are named .*.swp. If I’m still in an editor session, I don’t want to scatter those around.
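The filter itself is a one-line grep. Note how the pattern needs both the leading dot and the .swp suffix anchored at the end of the name:

```shell
#!/bin/sh
# Filter Vim swap files (.*.swp) out of a candidate file list.
LIST='index.html
.index.html.swp
css/style.css
css/.style.css.swp'
KEPT=$(printf '%s\n' "$LIST" | grep -v '\..*\.swp$')
echo "$KEPT"    # only index.html and css/style.css survive
```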

The Overall Plan

  1. Figure out where I am, and what the other local machine is.
  2. Find the list of new directories, creating them as needed.
  3. Find the list of new files, copying them with attributes intact.

Check every step, and exit early with a distinctive error message and return value if anything goes wrong.

Only if everything works will I update the timestamp file, indicating that only files newer than it still need to be distributed.


# Where are we, and what is the other local host?
THISHOST="$( uname -n | sed 's/\..*//' )"
if [ "$THISHOST" = "laptop" ]; then
    OTHERHOST="desktop"
else
    OTHERHOST="laptop"
fi

# Do we have an SSH agent ready with keys?
# Command return status is:
#  0 = success
#  1 = agent has no keys
#  2 = no agent running
ssh-add -l > /dev/null
if [ $? != "0" ]; then
    echo ''
    echo 'ERROR: No SSH agent with keys!'
    echo ''
    echo 'Hint: run ssh-add and try again.'
    echo ''
    exit 1
fi

# Change this to your GoDaddy hosted account:
ACCOUNT="username@yourdomain.com"

# Define the timestamp file.
TS="/var/www/TIMESTAMP"

# Go to the locally maintained web site.
cd /var/www/htdocs

# Find the list of new files.
# Need to explicitly name .htaccess to also consider it.
# Strip out files named "*.swp".
# Sorting isn't needed for technical reasons,
# but it is needed to make the output easier to read.
NEWFILES=$(find .htaccess * -type f -newer "$TS" | grep -v '\..*\.swp$' | sort)

# Make an archive.
# NEWFILES is deliberately unquoted so each name becomes its own
# argument; the restricted file-name characters make this safe.
tar cf /tmp/archive-$$.tar ${NEWFILES}
if [ $? != "0" ]; then
        echo ""
        echo "ERROR: Failed to create archive."
        echo ""
        exit 2
fi
echo "Created archive of $(tar tf /tmp/archive-$$.tar | wc -l) files."

# Copy the archive into place.
scp /tmp/archive-$$.tar ${ACCOUNT}:archive.tar
if [ $? != "0" ]; then
        echo ""
        echo "ERROR: Copy of archive to account ${ACCOUNT} failed."
        echo ""
        exit 3
fi
scp /tmp/archive-$$.tar ${OTHERHOST}:archive.tar
if [ $? != "0" ]; then
        echo ""
        echo "ERROR: Copy of archive to ${OTHERHOST} failed."
        echo ""
        exit 4
fi

# Extract the archive on the remote machines.
ssh $ACCOUNT "cd html ; tar xvf ~/archive.tar ; rm ~/archive.tar"
if [ $? != "0" ]; then
        echo ""
        echo "ERROR: Extracting archive failed on ${ACCOUNT}."
        echo ""
        exit 5
fi
ssh $OTHERHOST "cd /var/www/htdocs ; tar xvf ~/archive.tar ; rm ~/archive.tar"
if [ $? != "0" ]; then
        echo ""
        echo "ERROR: Extracting archive failed on ${OTHERHOST}."
        echo ""
        exit 6
fi

# If we reached this point, everything has worked.
# Update the timestamp file so we know we don't have
# to do these again.
touch "$TS"
ssh $OTHERHOST touch "$TS"
rm /tmp/archive-$$.tar

So here’s the challenge: This is just my duct-tape-and-baling-wire solution to the problem. If you take the shell programming course you will learn how to really solve a problem like this. What did I get wrong, or at least what could I have done better?
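For what it’s worth, here is one candidate improvement, only a sketch: let the shell abort on the first failure instead of testing $? after every single command. Both set -e and trap are standard POSIX shell features:

```shell
#!/bin/sh
# Sketch: set -e aborts on the first failing command, and an EXIT
# trap reports the failure, replacing the repeated $? checks.
set -e
trap 'echo "ERROR: a step failed." >&2' EXIT
# ... the cd, tar, scp, and ssh steps would go here ...
STATUS=ok      # placeholder for the real work
trap - EXIT    # clear the trap once everything has worked
echo "All steps succeeded."
```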
