
Drupal: How To Automate Creating Database Backups

Submitted on November 30, 2011 – 11:30 am

In this article I will show you how to automate the process of making database backup copies.

Solution

Cron lets you run a process at a required time or with a defined frequency. Cron is Unix-based, so if your hosting runs Windows, please contact your hosting provider to find out how a process can be scheduled there. This article is for Unix users.

Run crontab -e in a Unix shell and create a rule that dumps the database:

0 0 * * * mysqldump -uLOGIN -PPORT -hHOST -pPASS DBNAME | gzip -c > `date +\%Y-\%m-\%d`.gz

Note that the percent signs must be escaped with backslashes: inside a crontab an unescaped % is treated as a newline.

Cron will run this command every day at 00:00 (the five fields are minute, hour, day of month, month, and day of week). It creates a dump of your database (DBNAME) and gzips it; the name of the archive corresponds to the current date. For example, a dump created on January 3, 2002 produces the file 2002-01-03.gz.

We use the standard 'date' command to name the files with the current date. This command lets you set the output format: date "+%Y-%m-%d". We placed the command in backticks; for a Unix shell this means command substitution: the output of the inner command is inserted into the outer command line.
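You can try the naming scheme directly in a shell (outside a crontab the percent signs need no escaping); the archive name here is just an illustration:

```shell
#!/bin/sh
# Command substitution: the shell replaces `...` (or the equivalent $(...))
# with the command's standard output before running the outer command.
BACKUP_NAME="`date +%Y-%m-%d`.gz"
echo "$BACKUP_NAME"    # e.g. 2002-01-03.gz
```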

Save the new cron rule and wait for the results. Every day you will get a new archived copy of your database, so you can quickly find the archive by its date and restore corrupted data. If you want to automate deleting old archives, use the standard Unix 'find' command from cron.

Running find ~/folder-with-archives -name "*.gz" -mtime +7 -delete from time to time deletes archives that were created more than 7 days ago (without the -delete action, find only lists the matching files). See the documentation for 'find'; it is available via the man find command in a Unix shell.
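A safe way to see what the rotation rule does is to rehearse it in a throwaway directory before pointing it at real backups. The file names below are made up, and 'touch -d' assumes GNU coreutils:

```shell
#!/bin/sh
# Rehearse the 7-day rotation in a temporary directory.
BACKUP_DIR=$(mktemp -d)
touch "$BACKUP_DIR/2002-01-10.gz"                   # fresh archive (mtime = now)
touch -d "10 days ago" "$BACKUP_DIR/2002-01-01.gz"  # stale archive
# Same rule as above: remove gzipped archives older than 7 days.
find "$BACKUP_DIR" -name "*.gz" -mtime +7 -delete
REMAINING=$(ls "$BACKUP_DIR")
echo "$REMAINING"    # only the fresh archive is left
```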

If you have a PC that is constantly connected to the Internet, you can also copy the created backup there via cron. Of course, the provider's hosting machine is quite reliable. But God helps those who help themselves.

Use the 'ftp' or 'scp' commands to copy files to another PC, and add them to the cron schedule. If your PC supports the SSH protocol, use the secure copy client, scp. You can read more about this command on its man page: man scp.

Example: scp 2002-01-03.gz login@your.host.ru: – uploads the 2002-01-03.gz file to your.host.ru (authenticating there as login).
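The copy can be scheduled as well. This crontab sketch assumes the dump rule writes into a hypothetical ~/backups directory and that key-based SSH authentication is already set up (scp running from cron cannot prompt for a password):

```shell
# Hypothetical crontab entries: dump at 00:00, ship the archive at 00:30.
0 0 * * * mysqldump -uLOGIN -PPORT -hHOST -pPASS DBNAME | gzip -c > $HOME/backups/`date +\%Y-\%m-\%d`.gz
30 0 * * * scp $HOME/backups/`date +\%Y-\%m-\%d`.gz login@your.host.ru:backups/
```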

Good luck!
