You want to keep a periodic, automatic backup of your database. And, for safety, you are thinking of keeping these backups in a (preferably free) cloud storage. Then this post is for you 🙂.
It can be done in a thousand ways, but I'm going to show you an easy, almost effortless process. Just bear with me for 10 minutes and you'll see it happen.
Let’s start.
Create SQL Dump of Database
Every database system has its own way of generating an SQL dump: for example, mysqldump for MySQL and pg_dump for PostgreSQL. We'll use MySQL as the example database for this post. The simplest example is (assuming the DB is on localhost) –
mysqldump -u db_user -p[db_pass] db_name > dump_file.sql
To ensure write permission and avoid naming conflicts portably on (almost) all UNIX based systems, we'll write the dump into an auto-generated temporary file.
tmpfile=$(mktemp)
mysqldump -u db_user -p[db_pass] db_name > ${tmpfile}
Upload it to cloud storage
Now, let's say we'll generate this kind of dump file every midnight. Where should we store these backup files? Backing them up to cloud storage is much preferable for safety and reliability. I've chosen MEGA because of the 50GB of space with a free account, native apps for almost every desktop and mobile platform, strong security features, and much more.
Go to MEGA and create a free account, hurry!
Install megatools
Another useful addition to MEGA's features is that it provides APIs, SDKs, and tools for building our own integrations! We'll use a command line tool, megatools, to upload our database backup files to MEGA. Installing megatools is simple –
# For macOS
brew install megatools

# For Ubuntu/Debian
sudo apt-get install megatools
For other systems, see the installation section of the megatools README.
Preserve backup files to MEGA cloud storage
If you have activated your MEGA account and installed megatools, it's just a one-line command to upload your backup. We're dynamically naming the file after the current date-time to make it easy to identify and to avoid naming conflicts.
# Comment out the next (megamkdir) line if the target directory is already created for remote_path
megamkdir -u mega_account_email.com -p mega_password Root/DB_BACKUP
megaput -u mega_account_email.com -p mega_password --path Root/DB_BACKUP/db_name_$(date '+%Y-%m-%d-%H-%M').sql ${tmpfile}
Remember that the writable path for your MEGA storage starts with Root/. Full documentation on megatools is here.
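To verify that the file actually landed in your MEGA storage, you can list the remote directory with megals, another command from the megatools suite (substitute your own account credentials):

megals -u mega_account_email.com -p mega_password Root/DB_BACKUP

You should see your freshly uploaded dump file in the output.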
Put everything together (and set Cron Job)
At this point, we've built all the pieces of our solution. Let's put them all into a shell script so that we can run all these steps with a single command. Also, let's make it a little generic so that we can use it on any server, with any MEGA account, manageably.
Don't worry, I've already done it for you. Just go there and check it out (and maybe star it, too). I've moved all the variable parts to the top, into a CONFIG section. So, just put the script on any UNIX based system, set the config variables, and it should work!
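If you'd rather roll your own, here is a minimal sketch of how such a script could be structured, based on the commands we've used so far. The CONFIG variable names are illustrative, not necessarily the same as in my script:

#!/bin/sh
# ---- CONFIG (illustrative names) ----
DB_USER="db_user"
DB_PASS="db_pass"
DB_NAME="db_name"
MEGA_EMAIL="mega_account_email.com"
MEGA_PASS="mega_password"
REMOTE_DIR="Root/DB_BACKUP"
# -------------------------------------

# Dump the database into a temporary file
tmpfile=$(mktemp)
mysqldump -u ${DB_USER} -p${DB_PASS} ${DB_NAME} > ${tmpfile}

# Upload the dump to MEGA, named after the current date-time
megaput -u ${MEGA_EMAIL} -p ${MEGA_PASS} --path ${REMOTE_DIR}/${DB_NAME}_$(date '+%Y-%m-%d-%H-%M').sql ${tmpfile}

# Remove the local temporary file so dumps don't pile up on the server
rm -f ${tmpfile}

Note the cleanup at the end: since the dump lives in a temporary file, we remove it once the upload finishes.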
Take database backup automatically
The last task is to run this script automatically, on a schedule. On any Linux server or macOS, you can do this easily by adding the script as a cron job.
Now, for example, say we want to take a backup every midnight. First, copy the script to your home directory with the name backup_db.sh. Set your database and MEGA account info in the CONFIG section. Then run it to check that the backup is uploaded to your MEGA account.
sh ~/backup_db.sh
If you see the backup file uploaded to your MEGA account, it's time to add this command as a cron job. Run –
crontab -e
It will open the cron table in your system's default text editor. Add the following line at the bottom of the file and save it.
00 00 * * * sh ~/backup_db.sh
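For reference, the five fields before the command are minute, hour, day of month, month, and day of week; a * means "every". A couple of illustrative variations:

# minute hour day-of-month month day-of-week command
# Every day at midnight:
00 00 * * * sh ~/backup_db.sh
# Every Sunday at 03:30 instead:
30 03 * * 0 sh ~/backup_db.sh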
For confirmation, you can check whether our task has been added to the cron job list –
crontab -l
If the output shows our added task in the list, we're done. Just relax and wait for midnight to see our first automatic backup uploaded to MEGA storage. 🙂
To learn more about cron scheduling syntax, see the crontab(5) man page.
Let us know if you've implemented it successfully. Also, feel free to ask if you have any questions or suggestions. Happy scripting!
I wouldn't upload non-encrypted, sensitive data to a free cloud storage account. So I propose an additional step where the dump is encrypted.
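For example, the dump could be encrypted symmetrically with gpg before uploading. A sketch, assuming GnuPG is installed and the passphrase is stored securely (on GnuPG 2.1+ you may also need --pinentry-mode loopback):

# Encrypt the dump with a symmetric passphrase (AES256)
gpg --batch --symmetric --cipher-algo AES256 --passphrase "your_secret_passphrase" -o ${tmpfile}.gpg ${tmpfile}

# Upload the encrypted file instead of the plain dump
megaput -u mega_account_email.com -p mega_password --path Root/DB_BACKUP/db_name_$(date '+%Y-%m-%d-%H-%M').sql.gpg ${tmpfile}.gpg

To restore, download the file and decrypt it with gpg -d before importing.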
Wouldn't this require SSH access to the server? Most hosts don't provide SSH access for security reasons.