Can I backup my database and site files automatically?

Requested and Answered by Carnuke on 2004/10/30 13:00:02

How to automate the database backup and restore process

It won't be long before your site builds up a large database, probably with valuable content that you want to be able to back up and restore easily and regularly.

The problem is that using phpMyAdmin can result in SQL timeouts that cause difficulties during the restore process. If you have access to cPanel, this is the easiest way to back up a database and make a complete file backup of your entire webspace. The cPanel backup facility is available from your database management pages.

The alternative is phpMyAdmin, which can export database contents, but this is very labour intensive. Moreover, after a while your database becomes too large to be restored with phpMyAdmin within the 30-second limit the server allows for a single SQL query. Queries are often too long to run with this method because the export writes a table's full contents in one INSERT query, which explains why the SQL timeout is reached once a table grows past a certain size.

There is now a solution that provides automatic backups as follows:

- Automatic and scheduled backups,
- record-by-record INSERTs,
- the ability to import data before the SQL timeout is reached.
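The record-by-record approach can be sketched in a few lines. This is not DB_Backup's actual code (DB_Backup is a PHP script); it is an illustrative Python sketch, with a hypothetical table and columns, showing why emitting one INSERT per row keeps each query small enough to run inside the server's timeout:

```python
# Illustrative sketch (not DB_Backup's actual code): one INSERT per row
# instead of a single huge multi-row query. Table and column names are
# hypothetical; all values are quoted as strings for simplicity.

def dump_rows(table, columns, rows):
    """Emit one INSERT statement per row."""
    statements = []
    for row in rows:
        values = ", ".join(
            "NULL" if v is None else "'%s'" % str(v).replace("'", "''")
            for v in row
        )
        statements.append(
            "INSERT INTO %s (%s) VALUES (%s);"
            % (table, ", ".join(columns), values)
        )
    return statements

stmts = dump_rows("xoops_posts", ["id", "title"], [(1, "Hello"), (2, "It's me")])
for s in stmts:
    print(s)
```

Each statement can then be executed (or imported) independently, so no single query grows with the size of the table.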


DB_Backup, THE script

DB_Backup is a PHP script by wolf (wolf at that fits our requirement to schedule backups.

Description: Designed to create a daily snapshot of the dynamic data (database) on your website. (e.g. content management systems, forums, guestbooks etc.) Some of the features include:

- Backup multiple databases and servers with different users and passwords.
- Backups can be scheduled using cron (on UNIX systems) or AT (on Windows systems).
- Create daily, weekly and monthly snapshots of your database.
- Keep the daily, weekly and monthly backups for a user-specified time.
- Archive and compress the SQL files (*.tar.gz).
- Creates a detailed report of everything that has been done.
- Reports can be mailed to you or any other user-specified email address, saved to disk or displayed in the browser.
- Generated SQL files can be mailed to you or any other user-specified email-address.
- Generated SQL files can be uploaded by FTP to any user-specified FTP server.
- Choose to create SQL files for each table or one for each database.
- No system calls, everything in 100% pure PHP.


Installation and configuration

- Unzip the .tar.gz file,
- copy into and open it,
- fill in the blank configuration vars (servers, DBs, users, passwords and all other parameters),
- upload db_backup to your web space,
- in your browser, test the script with this link:


If you have access to your server's cron jobs, e.g. in cPanel, you can schedule a task to run the script periodically.

Details for this installation are available in the INSTALL file.


This script is self-contained; it includes all its own resources.
The gzipped format, combined with mail support, lets you regularly receive a copy of your databases on your desktop.
The generated SQL uses record-by-record INSERTs, so the import timeout limit shouldn't be reached anymore.
The file is directly usable in phpMyAdmin (with gzip support activated).
A XOOPS-independent script that should complete your web site setup.


Bigdump, The script

Thanks to Alexey Ozerov (alexey at for this script, which imports big SQL dump files on servers with limited session times. The script is valid with IE 6.0 SP1, Mozilla 1.x and Netscape 4.8.


Staggered import of large MySQL dumps (such as phpMyAdmin 2.x dumps, even GZip-compressed ones). Do you want to restore a large backup of your MySQL database (or a part of it) into a new or the same MySQL database? Perhaps you can't access the server shell and can't import the dump using phpMyAdmin due to the hard memory/runtime limits of the web server. BigDump achieves this even on web servers with hard runtime limits, and those in safe mode, in a couple of short sessions.
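The staggered-import idea can be sketched briefly. BigDump itself is a PHP script and works differently in detail; this minimal Python sketch (using an in-memory SQLite database, with illustrative names) just shows the principle: each "session" executes only a limited number of statements and records where it stopped, so no single run exceeds the server's time limit:

```python
import sqlite3

# Minimal sketch of BigDump's staggered-import idea (BigDump itself is
# a PHP script; everything here is illustrative). Each "session" runs
# at most `chunk` statements, then returns the offset where it stopped
# so the next session can resume.

def import_chunk(conn, statements, start, chunk=2):
    """Run up to `chunk` statements beginning at `start`; return new offset."""
    end = min(start + chunk, len(statements))
    for stmt in statements[start:end]:
        conn.execute(stmt)
    conn.commit()
    return end  # persisted between sessions (BigDump restarts itself over HTTP)

dump = [
    "CREATE TABLE posts (id INTEGER, title TEXT);",
    "INSERT INTO posts VALUES (1, 'first');",
    "INSERT INTO posts VALUES (2, 'second');",
    "INSERT INTO posts VALUES (3, 'third');",
]

conn = sqlite3.connect(":memory:")
offset = 0
while offset < len(dump):  # each iteration stands for one short session
    offset = import_chunk(conn, dump, offset)

print(conn.execute("SELECT COUNT(*) FROM posts").fetchone()[0])  # prints 3
```

Because the dump is restored in many short runs rather than one long one, a hard per-request runtime limit no longer matters.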


Installation and configuration

A single script. Modify the setup vars (DB connection and gzip file URL) and upload it to your server.


This is a manual operation (databases are not usually restored daily).
Post the gzipped SQL dump at the devoted URL and run the script.

Simple: this script is an excellent complement to DB_Backup. No more SQL files cut up to fit phpMyAdmin sessions, which is so tedious (and dangerous) as your databases grow too big.

The only thing that's certain to happen quite regularly on a web page is page requests. This is where pseudo-cron comes into play: with every page request it checks whether any cron jobs should have run since the previous request. If there are any, they are run and logged. pseudo-cron uses a syntax very much like Unix cron's. All job definitions are made in a text file on the server with a user-definable name (usually crontab.txt). An extensive logging facility is also included for logging job results, and you may choose to receive emails with the job results.
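The mechanism can be sketched as follows. The real pseudo-cron is a PHP script that parses crontab-like schedules; this minimal Python sketch simplifies each schedule to a plain interval in seconds, but shows the same idea: on every page request, run each job whose interval has elapsed since its last recorded run, and remember the new run time:

```python
# Minimal sketch of the pseudo-cron idea (the real pseudo-cron is a PHP
# script with crontab-like syntax; the interval-in-seconds schedule here
# is a simplification). Called on every page request.

def check_jobs(jobs, last_run, now):
    """Run due jobs; `jobs` maps name -> (interval_seconds, callable)."""
    ran = []
    for name, (interval, func) in jobs.items():
        if now - last_run.get(name, 0) >= interval:
            func()                # run the overdue job
            last_run[name] = now  # log when it last ran
            ran.append(name)
    return ran

log = []
jobs = {"db_backup": (86400, lambda: log.append("backup done"))}
last_run = {"db_backup": 0}

# Two page requests at the same timestamp, one day after the last run:
# the first triggers the daily job, the second finds nothing due.
first = check_jobs(jobs, last_run, now=86400)
second = check_jobs(jobs, last_run, now=86400)
print(first, second)  # prints ['db_backup'] []
```

The trade-off is that jobs only run when someone visits a page, so a quiet site may run them late; real cron does not have this limitation.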


Happy backups

Adapted from the French XOOPS Team:, and

This Q&A was found on XOOPS Web Application System :