SmartFAQ is developed by The SmartFactory (http://www.smartfactory.ca), a division of InBox Solutions (http://www.inboxsolutions.net)

Can I backup my database and site files automatically?
Requested and Answered by Carnuke on 2004/10/30 13:00:02 (14882 reads)
How to automate the database backup and restore process

Before long, your site will build up a large database, probably with valuable content that you want to be able to back up and restore easily and regularly.

The problem is that using phpMyAdmin can result in SQL timeouts that cause difficulties in the restore process. If you have access to cPanel, this is the easiest way to back up a database and take a full file backup of your complete webspace. The cPanel backup facility is available from your database management pages.

The alternative is phpMyAdmin, which can export DB contents, but this is very labour intensive. Moreover, after a while your DB becomes too large to be restored with phpMyAdmin in under 30 seconds (the server's SQL session limit per query). In fact, queries are often too long to run with this method: the export writes each table's full contents in a single INSERT query, which explains why the SQL timeout is reached once the data passes a certain volume.

There is now a solution that provides the following:

- automatic and scheduled backups,
- record-by-record INSERTs,
- the ability to import data before the SQL timeout.
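If you have shell access, the same record-by-record format can also be produced by hand with mysqldump's --skip-extended-insert option. This is only a sketch: the database name, user and output path below are placeholders, not values from this guide.

```shell
# Sketch: dump one INSERT per row so a later import never piles a whole
# table into a single query. Database name, user and output path are
# placeholders -- adjust them to your own setup.
mysqldump --user=backup_user --password \
          --skip-extended-insert mydb | gzip > /home/backups/mydb.sql.gz
```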


Backups


DB_Backup, THE script

DB_Backup is a PHP script by wolf (wolf at restkultur.ch) that fits our requirements for scheduled backups exactly.

Description: Designed to create a daily snapshot of the dynamic data (database) on your website (e.g. content management systems, forums, guestbooks, etc.). Some of the features include:

- Backup multiple databases and servers with different users and passwords.
- Backups can be scheduled using cron (on UNIX systems) or AT (on Windows systems).
- Create daily, weekly and monthly snapshots of your database.
- Keep the daily, weekly and monthly backups for a user-specified time.
- Archive and compress the SQL files (*.tar.gz).
- Creates a detailed report of everything that has been done.
- Reports can be mailed to you or any other user-specified email address, saved to disk or displayed in the browser.
- Generated SQL files can be mailed to you or any other user-specified email address.
- Generated SQL files can be uploaded by FTP to any user-specified FTP server.
- Choose to create one SQL file per table or one per database.
- No system calls, everything in 100% pure PHP.

Download:
http://restkultur.ch/personal/wolf/scripts/db_backup/

Installation and configuration
- Unzip the .tar.gz file.
- Copy def_config.inc.php to config.inc.php and open it.
- Fill in the blank configuration vars (servers, DBs, users, passwords and all other parameters).
- Upload db_backup to your web space.
- In your browser, test the script with this link:
  http://localhost/db_backup/db_backup.php


Automation

If you have access to your server's cron jobs, e.g. in cPanel, you can schedule a task to run the script periodically.
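For example, a crontab entry to run the backup every night at 3 a.m. could look like the line below. The PHP binary location and the script path are illustrative; adjust them to your host's layout.

```shell
# Illustrative crontab line: run db_backup.php daily at 03:00.
# The PHP binary and script path are examples, not values from this guide.
0 3 * * * /usr/bin/php /home/user/public_html/db_backup/db_backup.php
```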

Details for this installation are available in the INSTALL file.

Comment

This script is self-contained; it includes all of its own resources.
The gzipped format, combined with mail support, lets you regularly receive a copy of your DBs on your desktop.
The generated SQL uses record-by-record INSERTs, so the import timeout limit shouldn't be reached anymore.
The file is directly usable in phpMyAdmin (with gzip support activated).
A XOOPS-independent script that should round out your web site setup.


Restore

Bigdump, The script

Thanks to Alexey Ozerov (alexey at ozerov.de) for this script, which imports big SQL dump files on servers with limited session times. The script has been validated with IE 6.0 SP1, Mozilla 1.x and Netscape 4.8.

Description:

Staggered import of large MySQL dumps (such as phpMyAdmin 2.x dumps, even GZip-compressed ones). Do you want to restore a large backup of your MySQL database (or a part of it) into a new or the same MySQL database? Maybe you can't access the server shell and can't import the dump using phpMyAdmin due to a hard memory/runtime limit on the web server. BigDump achieves this even on web servers with hard runtime limits, and on those in safe mode, in a couple of short sessions.

Download:
http://www.hotscripts.com/Detailed/20922.html

Installation and configuration

A single script. Modify the setup vars (DB connection and gzip file URL) and upload it to your server.

Use

This is a manual operation (databases are not usually restored daily).
Upload the gzipped SQL dump to the dedicated URL and run the script.

Comment
Simple: this script is an excellent complement to DB_Backup. No more cutting SQL files to fit phpMyAdmin sessions, which becomes tedious (and dangerous) as your DBs grow too big.

Pseudo-Cron
The only thing that's certain to happen quite regularly on a web page is page requests. This is where pseudo-cron comes into play: with every page request it checks whether any cron jobs should have run since the previous request. If there are, they are run and logged. pseudo-cron uses a syntax very much like that of Unix cron. All job definitions are made in a text file on the server with a user-definable name (usually crontab.txt). An extensive logging facility is included for logging job results, and you may choose to receive emails with the results.
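As a sketch, a crontab.txt entry for pseudo-cron follows the familiar five cron time fields followed by the job to run; the exact field layout is documented with the script, and the job names below are made up for illustration.

```shell
# Hypothetical crontab.txt for pseudo-cron: five standard cron time
# fields (min hour day month weekday) followed by the job to run.
# Job names are illustrative only -- see pseudo-cron's own documentation.
30 2 * * * db_backup.php
0  4 * * 0 weekly_cleanup.php
```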


Download:
http://www.bitfolge.de/pseudocron-en.html

Happy backups

Adapted from the French XOOPS Team: http://www.frxoops.org, http://www.xoops-themes.org and http://themes.xoops.org.


The comments are owned by the author. We aren't responsible for their content.

 So how about the backup of the html files?

Can anybody recommend a nice automated tool to backup the site files, including the CHMOD status of the files?

 

 Re: So how about the backup of the html files?

Thank you noisia for your PM to me about THIS THREAD

Another site backup module that taluncleford is developing also looks like it will be top notch... take a look.

 

 Re: Backup/Restore XOOPS Site

We need to update this FAQ....

Short story is that you need to backup:

1. Database
2. Files
   a. XOOPS_ROOT folders
   b. Folders "above" XOOPS_ROOT
      i. XOOPS_TRUSTED folders
      ii. others (for uploaded module files, etc.)

You can back up the database using the script noted above, the XoopsCare module, the DB Backup & Restore module, the BackPack module, or via phpMyAdmin.

Files should be tar/gzipped and kept after (ideally also before) every major change to your site.
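On a Unix host, a tar/gzip snapshot stores each file's permissions (its CHMOD status) inside the archive, and tar's -p flag re-applies them on extraction. The sketch below uses a throwaway /tmp directory standing in for XOOPS_ROOT; substitute your real site path.

```shell
# Sketch: archive a site directory and restore it elsewhere while
# preserving file modes. /tmp/demo_site stands in for XOOPS_ROOT.
mkdir -p /tmp/demo_site
echo "<?php // config" > /tmp/demo_site/mainfile.php
chmod 640 /tmp/demo_site/mainfile.php      # this mode is stored in the archive

tar czf /tmp/site-backup.tar.gz -C /tmp demo_site

# Restore: -p re-applies the stored permissions on extraction.
mkdir -p /tmp/restore
tar xzpf /tmp/site-backup.tar.gz -C /tmp/restore
stat -c %a /tmp/restore/demo_site/mainfile.php   # → 640
```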

Restoring is the reverse: restore the database, then restore the files.

Clear as mud?