Backing up my stuffs
  • 2007/9/8 3:15

  • Brittany

  • Just popping in

  • Posts: 4

  • Since: 2007/3/28

I feel like a dummy asking this question, but lately I've been concerned about my website's security. I'm afraid that my website will get hacked one of these days.

I was wondering how I can back up all of my stuff? If for some reason my site gets hacked, I want to use the backup file to get my website going again without losing anything.

As a rookie... please explain how I can do this :)

Re: Backing up my stuffs
  • 2007/9/8 12:30

  • McDonald

  • Home away from home

  • Posts: 1072

  • Since: 2005/8/15

There are 2 things to back up:
1. your database
2. the files on your server

1. Database backup
You can back up your database by using phpMyAdmin. Normally it creates one big file which is too big for phpMyAdmin to restore. The script BigDump is helpful in this case.
Other options are the XOOPS modules BackPack and DB Backup & Restore.
Both modules have BigDump integrated and are also able to back up the database tables separately.
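
The "one big file" problem can also be worked around by splitting the dump before import, which is roughly the idea behind BigDump's staggered import. Here is a minimal Python sketch of that idea (the function name, chunk size, and file names are my own, not part of BigDump or XOOPS); it only breaks after a line that ends a statement, so no SQL statement is cut in half:

```python
# Split a large .sql dump into smaller chunks that phpMyAdmin can
# import one at a time. We break only after lines ending in ";" so
# that no statement is split across two chunk files.

def split_sql_dump(lines, max_lines=1000):
    """Group dump lines into chunks of roughly max_lines each,
    breaking only at statement boundaries."""
    chunks, current = [], []
    for line in lines:
        current.append(line)
        if len(current) >= max_lines and line.rstrip().endswith(";"):
            chunks.append(current)
            current = []
    if current:
        chunks.append(current)
    return chunks

demo = [
    "CREATE TABLE t (id INT);",
    "INSERT INTO t VALUES (1);",
    "INSERT INTO t VALUES (2);",
]
parts = split_sql_dump(demo, max_lines=2)
print(len(parts))  # two chunks: the first two lines, then the last
```

In practice you would read `backup.sql` with `readlines()` and write each chunk to its own `backup_partN.sql` file, then import the parts one by one.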

2. Files backup
The files on your server can easily be backed up by downloading them to your local hard drive with an FTP program like FileZilla.

Just make backups on a regular basis, especially of your database.

To secure your website you should have Protector installed. Protector is well documented on XoopsWiki.

You should also have a .htaccess file in the root of your website. Here's an example:
#new rule - ban prefetch
RewriteCond %{HTTP_USER_AGENT} "Firefox/1.0.7" [OR]
RewriteCond %{HTTP:x-moz} ^prefetch [OR]
RewriteCond %{HTTP:X-moz} ^prefetch
RewriteRule (.*) noaccess_to_prefetch_accelerator.php [L]

deny from 82.99.30.
deny from 81.177.14.
deny from 81.177.15.

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} ^HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]

RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} viagra-buy\.gotelli\.cn [NC,OR]
RewriteCond %{HTTP_REFERER} niched-movies\.odpv\.cn [NC,OR]
RewriteCond %{HTTP_REFERER} medic911\.xoomwebs\.com [NC,OR]
RewriteCond %{HTTP_REFERER} free-porno-links\.orgfree\.com [NC,OR]
RewriteCond %{HTTP_REFERER} pillsius\.t35\.com [NC,OR]
RewriteCond %{HTTP_REFERER} searchpharm2\.t35\.com [NC,OR]
RewriteCond %{HTTP_REFERER} t35\.com [NC,OR]
RewriteCond %{HTTP_REFERER} today-free-movies\.orkeor\.cn
RewriteRule .* - [F]

More information can be found in the Comprehensive Guide to .htaccess.

Create a file info.txt in the root of your website with the following content:
# Contact info submission


Fill in the details you want to be published on sites like Alexa. Less is better for privacy.
The URL should be written without www. and should end with /
Just find your website on Alexa for all the details.

Re: Backing up my stuffs
  • 2007/9/8 18:17

  • svaha

  • Just can't stay away

  • Posts: 896

  • Since: 2003/8/2 2

For the backup of my database I use MySQLDumper: http://www.mysqldumper.de/en

For the backup of your files there is sometimes an option in your cPanel; my host has an option to pack all files into one zip file.

Re: Backing up my stuffs
  • 2007/12/15 15:42

  • jfmoore

  • Quite a regular

  • Posts: 360

  • Since: 2004/6/6 5


McDonald wrote:
There are 2 things to back up:
1. your database
2. the files on your server

1. Database backup
You can back up your database by using phpMyAdmin. Normally it creates one big file which is too big for phpMyAdmin to restore. The script BigDump is helpful in this case.
Other options are the XOOPS modules BackPack and DB Backup & Restore.
Both modules have BigDump integrated and are also able to back up the database tables separately.

Anybody know if DB Backup & Restore 3.0 is safe to use with XOOPS, or if there is a later version?


Re: Backing up my stuffs
  • 2007/12/20 9:15

  • venose

  • Just popping in

  • Posts: 20

  • Since: 2006/5/23

Not for me any more. At 2.0.16 it works, but after the update to 2.0.17 it didn't. Anybody else?

PHP 5, BackPack 0.83, DB Backup & Restore 3.0, XOOPS 2.0.17

Re: Backing up my stuffs
  • 2008/2/6 13:17

  • JardaM

  • Just popping in

  • Posts: 1

  • Since: 2006/4/18


Even though this thread is quite old, I'll place my question / remark here:

I am very surprised that such an advanced system like XOOPS contains no general and, above all, basic function for backing up.

From my point of view it is not possible to back up a XOOPS-driven website regularly - I mean automatically!

1. I think that, for a proper backup, you should stop the site so that nobody changes the content or setup during the backup. I presume this is not possible to perform automatically.

2. phpMyAdmin can't perform scheduled backups, can it?

3. I believe it is important to back up the database and the files simultaneously to keep them consistent. This might be quite time-consuming. For example, my website is approx. 3.5 GB, so I can't stop the site and back up for 3-7 hours during the day!

The best thing would be to run a scheduled backup regularly from, let's say, 1:00 to 7:00, when the traffic is statistically very low.
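
For what it's worth, scheduled backups like this are usually done outside XOOPS with the server's own scheduler (cron) and mysqldump, so the site never needs to be involved. Below is a minimal Python sketch of such a nightly script; the database name, credentials file path, and schedule are placeholders for illustration, and it assumes mysqldump and gzip are available on the server:

```python
# Sketch of a dump script meant to be started from cron, e.g.
#   0 1 * * * python3 backup_db.py
# It builds a "mysqldump | gzip" command line with a timestamped
# output file. Credentials come from a MySQL options file so the
# password never appears on the command line.
import datetime
import shlex

def build_dump_command(database, defaults_file="/home/user/.my.cnf",
                       when=None):
    """Return (shell command, output filename) for dumping
    `database` into a timestamped, gzip-compressed file."""
    when = when or datetime.datetime.now()
    outfile = f"{database}-{when:%Y%m%d-%H%M}.sql.gz"
    dump = (f"mysqldump --defaults-extra-file={shlex.quote(defaults_file)} "
            f"{shlex.quote(database)}")
    return f"{dump} | gzip > {shlex.quote(outfile)}", outfile

cmd, outfile = build_dump_command(
    "xoops_db", when=datetime.datetime(2008, 2, 6, 1, 0))
print(outfile)  # xoops_db-20080206-0100.sql.gz
```

The actual run would pass `cmd` to `subprocess.run(cmd, shell=True, check=True)`; keeping a dated file per night gives you a simple rotation to prune later.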

Re: Backing up my stuffs
  • 2008/2/6 13:42

  • Anonymous

  • Posts: 0

  • Since:

The most important thing to backup is your database.

With regard to file backups, most files don't change over time. More likely is the scenario where users upload additional files like photos, etc. It is these additional files that you will need to back up periodically. As webmaster, you will know what your site's users are allowed to upload and where, so backing up shouldn't be that onerous a task.

I keep a copy of my website on my PC and back up my database daily via my host's cPanel (which includes phpMyAdmin). Once a week or so I download any newly uploaded files via my FTP client by synching the PC's folders with those on the server.

The database backup takes a few seconds (I save it as a gzip file) and the remote file-synch doesn't take long either.
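
The folder-synch idea described above (copy across only what is new or changed) can be sketched in a few lines. This example assumes direct filesystem access to both sides (a local mirror or mounted share) rather than FTP, and the function name is my own:

```python
# Mirror new/changed files from src into dst, the same way an FTP
# client's "synchronize" feature does: copy a file only when the
# destination is missing it or has an older modification time.
import os
import shutil

def sync_new_files(src, dst):
    """Copy new or updated files from src to dst.
    Returns the list of destination paths that were copied."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)  # copy2 preserves the timestamp
                copied.append(d)
    return copied
```

Because `copy2` preserves modification times, a second run over an unchanged tree copies nothing, which is what makes the weekly synch fast.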

Re: Backing up my stuffs
  • 2008/2/6 16:13

  • paulizaz

  • Not too shy to talk

  • Posts: 120

  • Since: 2007/1/18

How do you back up using phpMyAdmin - using export?
How efficient is this for very large volumes of data?

Re: Backing up my stuffs
  • 2008/2/6 17:05

  • Anonymous

  • Posts: 0

  • Since:

paulizaz wrote:

How do you back up using phpMyAdmin - using export?

Yep!! See this guide:

paulizaz wrote:

How efficient is this for very large volumes of data?

Pretty good, I should think.

Restoring from large .sql backup files is another issue, as many hosts have a maximum file-size limit. Use one of the compression options and phpMyAdmin can read directly from these.
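
To illustrate why the compression options help with file-size limits: a SQL dump is highly repetitive text, so gzip shrinks it dramatically, and the compressed file round-trips losslessly. A small Python sketch (the sample dump is made up):

```python
# A SQL dump is mostly repeated INSERT statements, which gzip
# compresses extremely well -- often by an order of magnitude or
# more. This demo compresses a synthetic dump in memory.
import gzip

def gzip_bytes(data: bytes) -> bytes:
    """Return the gzip-compressed form of `data`."""
    return gzip.compress(data)

dump = b"INSERT INTO posts VALUES (1, 'hello');\n" * 1000
packed = gzip_bytes(dump)
print(len(packed) < len(dump))  # True: repetitive SQL shrinks a lot
assert gzip.decompress(packed) == dump  # lossless round-trip
```

The same applies on disk: exporting from phpMyAdmin as `.sql.gz` keeps you under the host's upload limit while changing nothing about the data.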

Re: Backing up my stuffs
  • 2008/2/12 16:10

  • paulizaz

  • Not too shy to talk

  • Posts: 120

  • Since: 2007/1/18

Thanks for that. How about restoring the data using phpMyAdmin?

I have downloaded and tested the backup module 'XOOPS DB Backup & Restore 3.0' with XOOPS 2.0.16.

1) Downloading to the server did not seem to do anything.
2) Clicking restore shows an empty drop-down box - I don't think it works?

Anyone had any experience with this and/or can shed light?

