Quicker Dumping of a Magento MySQL Database for Branching

Dump Magento Database Fast

Any experienced Magento developer will have felt the pain of dumping a multi-gigabyte MySQL database for a Magento store - a tedious process when you want to make a quick branch or rapidly update your staging environment.

We're big advocates of branching often and merging often - but this also means updating the DB relatively frequently too (if there have been DB changes on master/trunk). Usually, it's just an excuse to go and get a fresh cup of coffee whilst you while away the 2 minutes it takes to dump and however long it then takes to re-import into a new DB. But, unfortunately, there is only so much coffee we should be drinking in a day, so speeding up this process is a must.

Typically, a staging or development branch doesn't actually require all the data the live site has (sales quotes, logs, reports, dataflow history), so we gain a big speed advantage by simply ignoring these tables.
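Under the hood, skipping tables comes down to mysqldump's --ignore-table flag. A minimal sketch of the idea - with an illustrative table list and database name, not the script's actual ones - looks something like:

```shell
#!/bin/bash
# Illustrative values only - the real script has its own (longer) table
# list and reads the database name from app/etc/local.xml
DBNAME="mage_live"
IGNORE_TABLES=( log_visitor log_url dataflow_batch_export report_event )

# Build one --ignore-table=db.table argument per table we can live without
IGNORE_STRING=""
for TABLE in "${IGNORE_TABLES[@]}"; do
  IGNORE_STRING="$IGNORE_STRING --ignore-table=$DBNAME.$TABLE"
done
echo "$IGNORE_STRING"

# Those flags then feed two passes: structure for every table, data only
# for the tables we keep, with --single-transaction avoiding table locks:
#   mysqldump -u "$DBUSER" -p "$DBNAME" --no-data --single-transaction > db.sql
#   mysqldump -u "$DBUSER" -p "$DBNAME" $IGNORE_STRING --no-create-info --single-transaction >> db.sql
```

Dumping the schema separately from the data means the ignored tables still exist (empty) after a restore, so Magento doesn't fall over looking for them.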

Please note: this script is not intended to serve as a means of backing up a database, as it omits data from tables that we deem non-essential when branching/mirroring a store for development.

The features

  • Using the script is really easy
  • Significantly faster dumps (on a 3GB DB, total time is 0m46s vs 2m10s)
  • Auto-population of username, password, database name and host name
  • Support for CE and EE
  • No table-level locks - so safe to use on a production/live site
  • You can use it to re-import the dump too
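The credential auto-population boils down to pulling the CDATA values out of app/etc/local.xml. A rough sketch of that idea (the real script's parsing, and its getParam helper, may well differ):

```shell
#!/bin/sh
# Sketch: extract a CDATA value for a given tag from a local.xml-style file.
# Assumes the conventional single-line <tag><![CDATA[value]]></tag> layout.
getParam() {
  sed -n "s|.*<$1><!\[CDATA\[\(.*\)\]\]>.*|\1|p" "$2" | head -n1
}

# Usage against a real store (path assumed, not verified here):
#   DBUSER=$(getParam "username" app/etc/local.xml)
#   DBNAME=$(getParam "dbname" app/etc/local.xml)
```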

Dump a database

Running the script is really easy: it will automatically populate the username, password, database name and host from your local.xml file.

cd /path/to/my/magento/store
wget -O mage-dbdump.sh sys.sonassi.com/mage-dbdump.sh
chmod +x mage-dbdump.sh
./mage-dbdump.sh

So in practice, it looks a little like this. Execution on a 3GB database takes just under 50 seconds.

cd /home/sonassi/public_html
wget -O mage-dbdump.sh sys.sonassi.com/mage-dbdump.sh
chmod +x mage-dbdump.sh
./mage-dbdump.sh
########################################    

MYSQL DUMP COMPLETE

Backup Location: ./var/db.sql

########################################  

Restore a database

cd /home/sonassi/public_html
wget -O mage-dbdump.sh sys.sonassi.com/mage-dbdump.sh
chmod +x mage-dbdump.sh
./mage-dbdump.sh --restore
Are you sure you want to restore ./var/db.sql to *mage_live*? [y/N]: *y*
########################################    

MYSQL IMPORT COMPLETE

########################################  

You can download the file here.

  • Pingback: Our Magento Git Guide and Work Flow | Sonassi Makes Magento Ecommerce Websites

  • http://twitter.com/hornofj Johan Hornof

    Nice. I'd add --routines into the mysqldump command as some modules use mysql stored procedures. (e.g. Camiloo AmazonImport module)

  • http://twitter.com/groggu greg croasdill

    Thanks! This works like a champ. I've been cutting and pasting these commands for years, good to have them packaged up in a handy script.

  • http://www.8uhr30.com Kim

    Just realized the file was updated. Thanks!

  • James Phillips

    This is probably a basic question, but where do I run this script?

    • sonassi

      Put it in the document root of your Magento installation and run it via SSH

  • Silas Palmer

    Great script guys. I've got a slight change that will make it work with databases on a non-standard port (eg 192.168.1.1:1234)

    Find this part of the code (around line 154)

    DBHOST=$(getParam "host")
    DBUSER=$(getParam "username")
    DBPASS=$(getParam "password" "sanitise" )
    DBNAME=$(getParam "dbname")

    Insert this line just under DBHOST=...

    DBHOST=`echo $DBHOST | sed 's/:/ -P /'`

    so that the new code snippet looks like:
    DBHOST=$(getParam "host")
    DBHOST=`echo $DBHOST | sed 's/:/ -P /'`
    DBUSER=$(getParam "username")
    DBPASS=$(getParam "password" "sanitise" )
    DBNAME=$(getParam "dbname")

    Cheers,
    Silas.
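Silas's substitution simply turns the host:port pair into a hostname plus a separate -P port argument; a quick check of what the sed expression produces:

```shell
#!/bin/sh
# A host string with a non-standard port, as in Silas's example
DBHOST="192.168.1.1:1234"

# Replace the first colon so the port becomes a separate -P argument
# for mysql/mysqldump
DBHOST=$(echo "$DBHOST" | sed 's/:/ -P /')

echo "$DBHOST"   # prints: 192.168.1.1 -P 1234
```

A host with no colon passes through unchanged, so the tweak is harmless on a standard setup.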

  • slice mario

    MY default_setup is................................

    http://s13.postimg.org/glglrp91j/Snap1.jpg

    THE ERROR is ....................................................

    $ ./mage-dbdump.sh -d
    mysqldump: Got error: 1049: Unknown database '/var/run/mysqld/mysqld.sock' when selecting the database
    ######################################

    MYSQL DUMP COMPLETE
    ######################################

    THE SOLUTION is to support UNIX Socket in mage-dbdump.sh

    if [[ $DBHOST == /* ]]; then
        MYSQL_ARGS="--protocol=socket -S $DBHOST -u $DBUSER $DBNAME"
    else
        MYSQL_ARGS="-h $DBHOST -u $DBUSER $DBNAME"
    fi

  • matgen

    mage-dbdump.sh is a great script, thanks!

    I don't know if it's always this way, but I found that the Aggressive option was not working properly (and I added a new option that expanded upon the Aggressive option, and a new array that included more tables I wanted to exclude). The array assignment on line 69:

    IGNORE_TABLES=( $IGNORE_TABLES $IGNORE_TABLES_AGGRESSIVE )

    This only assigns the first index value of each array to IGNORE_TABLES. So instead of having all of the tables from IGNORE_TABLES and IGNORE_TABLES_AGGRESSIVE, you end up only excluding IGNORE_TABLES[0]='dataflow_batch_export' and IGNORE_TABLES_AGGRESSIVE[0]='report_compared_product_index'.

    I fixed this by expanding the arrays in the assignment like this:

    IGNORE_TABLES=( ${IGNORE_TABLES[@]} ${IGNORE_TABLES_AGGRESSIVE[@]} )

    If you are interested in adding a custom option for adding more tables to exclude (maybe you have extensions that created tables you do not want), insert a new array variable at line 5 in the same format as IGNORE_TABLES and IGNORE_TABLES_AGGRESSIVE.

    For usability, add your new option explanation to the list that begins on line 28 (after inserting line 5 with the new variable).

    In the getopts string of the switch-case while loop, add your new letter choice (after my other additions, this is on/around line 120).

    And finally, on line 72, immediately following what was originally line 69 described above, recreate line 69 and change the $OPT_ letter to your letter choice. Then insert your new array variable from line 5 to the end of the array assignment list in the same format as my fix above. e.g. line 72:
    [[ ! "$OPT_M" == "" ]] && IGNORE_TABLES=( ${IGNORE_TABLES[@]} ${IGNORE_TABLES_AGGRESSIVE[@]} ${MY_NEW_ARRAY[@]} )

    Again, great script Sonassi! Thank you so much. It will be a lot of help as we get started with it. I hope this can help someone.
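The expansion difference matgen describes is easy to reproduce with two throwaway arrays:

```shell
#!/bin/bash
# Two example arrays standing in for IGNORE_TABLES and IGNORE_TABLES_AGGRESSIVE
A=( one two )
B=( three four )

# Unquoted $A expands to A[0] only, so C ends up with just two elements...
C=( $A $B )

# ...whereas ${A[@]} expands to every element, so D gets all four.
D=( "${A[@]}" "${B[@]}" )

echo "C has ${#C[@]} elements, D has ${#D[@]} elements"
# prints: C has 2 elements, D has 4 elements
```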

    • sonassi

      Hey Matgen,

      Thanks for the bug fix and feature request - I've integrated them for you.

  • paul(uk)

    Hi, this looks useful, but I was wondering if you have a more detailed user guide on how to use it. My SSH skills are very poor, so I'm looking for a step-by-step kind of user guide.

  • http://softag.pt Alex Parra

    Hey Sonassi, this looks like a great script! Thanks.

    I'm trying to use it on a Mac and get this error on dump:

    $ ./mage-dbdump.sh -dz
    sed: RE error: illegal byte sequence
    mysqldump: Got errno 32 on write

    Could not find out what it means...
    Thanks in advance.

    • https://www.sonassi.com sonassi

      Hi Alex,

      Try running this before using the script, `export LANG=C`

      • http://softag.pt Alex Parra

        PERFECT! Works now.
        Thank you very much!

  • https://www.kerstner.at mkerstner

    Hi, great script thanks!

    Here is a little tweak to be able to specify the path to the mysqldump executable, e.g. when using XAMPP for development:

    1) Add

    MYSQLDUMPPATH="/Applications/XAMPP/xamppfiles/bin/mysqldump"

    to the configuration section and

    2) Replace

    ( mysqldump -p"$DBPASS" $MYSQL_ARGS --no-data --routines --triggers --single-transaction;
    mysqldump -p"$DBPASS" $MYSQL_ARGS $IGNORE_STRING --no-create-db --single-transaction ) | sed 's/DEFINER=[^*]**/*/g'

    with

    ( $MYSQLDUMPPATH -p"$DBPASS" $MYSQL_ARGS --no-data --routines --triggers --single-transaction;
    $MYSQLDUMPPATH -p"$DBPASS" $MYSQL_ARGS $IGNORE_STRING --no-create-db --single-transaction ) | sed 's/DEFINER=[^*]**/*/g'

    Thanks again
    Matthias

  • Lorenzo

    great! thank you