Versioning WordPress Content


Something that winds up being difficult when developing for WordPress (or Drupal, or any other database-heavy CMS) is versioning content. Databases aren’t under your SCM directly, so deploying changes to production can be dicey.

The scenario is typically something like this:

  • Client wants 5 new pages, and a few revisions to existing pages.
  • Developer creates pages and makes modifications on staging environment.
  • Client requests revisions.
  • Developer revises staging environment as requested.
  • Client approves.
  • Developer has to remember what he did, and repeat those steps in production.

Needless to say, this practice is fraught with potential for error. I’d be surprised if it works even 80% of the time, and that’s not going to cut it for most clients. If your doctor was 80% successful, you’d be very upset. If traffic lights were 80% successful, you might be dead.

So, there needs to be a way to snapshot your staging environment and push it to production. Unfortunately, there aren’t many tools that do this sort of thing directly, and even the ones that do often don’t do it very well. A hand-written solution is possible, however, and it’s really not that hard to build.

Enter bash, and ant.

I suspect many of you are rolling your eyes right now. “BASH? ANT? That’s for nerds!” You’re right about that. However, to get across this issue, it’ll help you to learn from some nerds.

For this tutorial, let’s assume you have the following folder structure for your WordPress site:

ROOT
 - public (wordpress content)
 - var
 - - log
 - - backups
 - tmp (sessions, cache, etc)
 - etc (configurations)
 - bin (scripts)
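If you’re starting a site from scratch, that layout is quick to recreate from the site root (the directory names come straight from the listing above):

```shell
# Create the project skeleton: public holds the WordPress install,
# and the rest is supporting scaffolding (logs, backups, scripts).
mkdir -p public var/log var/backups tmp etc bin
```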

Here’s a bash script (bin/backup-development.sh in the layout above) you can run to dump your database in three flavors, one per environment (dev, staging, and production):

#!/bin/bash

# dump the database to var/backups, rewriting the local hostname for each environment
mysqldump -u root -p database_name > var/backups/dev.$(date +%Y-%m-%d).sql
mysqldump -u root -p database_name | sed 's/dev\.site_name/staging.site_name/g' > var/backups/staging.$(date +%Y-%m-%d).sql
mysqldump -u root -p database_name | sed 's/dev\.site_name/www.site_name/g' > var/backups/production.$(date +%Y-%m-%d).sql

# navigate to the backups folder and refresh the symlinks
cd var/backups

# remove old symlinks (-f keeps the first run from failing when none exist yet)
rm -f development.sql staging.sql production.sql

# create new symlinks
ln -s dev.$(date +%Y-%m-%d).sql development.sql
ln -s staging.$(date +%Y-%m-%d).sql staging.sql
ln -s production.$(date +%Y-%m-%d).sql production.sql
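The dated files give you a history, while the symlinks give the ant targets below a stable name to import from. Here’s the trick in isolation (the file names follow the script above):

```shell
# A stable symlink that always points at the newest dated dump, so other
# tooling never needs to know the date. touch stands in for mysqldump here.
mkdir -p var/backups && cd var/backups
touch "dev.$(date +%Y-%m-%d).sql"
ln -sf "dev.$(date +%Y-%m-%d).sql" development.sql
readlink development.sql   # shows the dated file name the link points at
```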

That will dump the contents of your development environment, replacing anything that says “dev.site_name” with the appropriate hostname for each of your other environments. I use development, staging, and production environments for most of my work, but you might have different ones. Adjust these scripts as necessary to accommodate what you require.
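To see exactly what that substitution does, you can pipe a sample line through the same sed expression the script uses:

```shell
# One line of a dump, run through the production substitution:
printf 'siteurl: http://dev.site_name/blog\n' \
    | sed 's/dev\.site_name/www.site_name/g'
# prints: siteurl: http://www.site_name/blog
```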

One more note: that script will ask you for the root password three times. You’re more than welcome to change the user from root as well. I don’t bother, because I typically run these scripts on my local machine, and I don’t really care about the username/password security of the dev sites there; I’m not using those credentials anywhere else.
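If the triple prompt bothers you, one alternative is to take a single real dump and derive the other two files from it with sed. This is only a sketch: it assumes the same placeholder hostnames and paths as the script above, and the "run" guard and derive_dump helper are my own additions.

```shell
#!/bin/bash
# One-prompt variant: dump the database once, then derive the staging and
# production copies from that file instead of calling mysqldump three times.
set -eu

backup_dir="var/backups"
stamp=$(date +%Y-%m-%d)

# Rewrite the development hostname for another environment.
#   $1 = target host prefix (staging, www)   $2 = output file prefix
derive_dump() {
    sed "s/dev\.site_name/${1}.site_name/g" "${backup_dir}/dev.${stamp}.sql" \
        > "${backup_dir}/${2}.${stamp}.sql"
}

# Guarded so the file can be sourced just for derive_dump; pass "run"
# to actually take the dump (this is the only password prompt).
if [ "${1:-}" = "run" ]; then
    mysqldump -u root -p database_name > "${backup_dir}/dev.${stamp}.sql"
    derive_dump staging staging
    derive_dump www production
fi
```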

So, to import these changes into a staging or production environment, I typically use ant. This automates most of the work for me. Here’s an example build file:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE project>
<project name="sitename.com" default="build">

    <!-- build details left out for brevity -->

    <target name="backup-development">
        <exec dir="${basedir}" executable="bash" failonerror="true">
            <arg line="../bin/backup-development.sh" />
        </exec>
    </target>

    <target name="import-development">
        <exec dir="${basedir}" executable="mysql" failonerror="true">
            <arg line="-u mysql_user -pdatabase_password mysql_db_name -e 'source ${basedir}/../var/backups/development.sql'"/>
        </exec>
    </target>

    <target name="import-staging">
        <exec dir="${basedir}" executable="mysql" failonerror="true">
            <arg line="-u mysql_user -pdatabase_password mysql_db_name -e 'source ${basedir}/../var/backups/staging.sql'"/>
        </exec>
    </target>

    <target name="import-production">
        <exec dir="${basedir}" executable="mysql" failonerror="true">
            <arg line="-u mysql_user -pdatabase_password mysql_db_name -e 'source ${basedir}/../var/backups/production.sql'"/>
        </exec>
    </target>

    <target name="development" depends="clean,build,import-development" />
    <target name="staging" depends="clean,build,import-staging" />
    <target name="production" depends="clean,build,import-production" />
</project>

So, the process now goes like this:

  • Client wants 5 new pages, and a few revisions to existing pages.
  • Developer creates pages and makes modifications on staging environment.
  • Client requests revisions.
  • Developer revises staging environment as requested.
  • Client approves.
  • Developer runs the backup scripts, and adds the updated files to SCM.
  • Developer runs the update on production, which should also fire the import-production target.

There’s plenty more to this than what I’ve written so far, but this is the general approach I’ve been working with for a while. It’s served me pretty well.
