For backing up my wiki, I have a simple solution. Just put a script in the bin dir (I actually keep all my local scripts in a scripts/ dir so as not to mix them with the official TWiki ones) named tar-wiki.tgz:
  #!/bin/sh
  cd YOUR_WIKI_DIR || exit 1
  echo "Content-type: application/x-gzip"; echo ""
  tar -cz -f - --atime-preserve .
Thus I only have to get the script url (via wget, curl...) in my crontab to make the backups on another external machine. Here is the script I run in my crontab just before midnight daily:
#!/bin/bash
cd WHERE_YOU_BACKUP_WIKI || exit 1
rm -f tar-wiki.tgz
wget -q http://wiki1.ilog.fr/wiki/bin/scripts/tar-wiki.tgz # URL OF SCRIPT
if [ -s tar-wiki.tgz ]; then
   if [ "$(date +'%d')" = "01" ]; then
      # archive the previous month
      cp tar-wiki.tgz "tar-wiki-$(date +'%B' -d '-1 day').tgz"
   fi
   mv tar-wiki.tgz "tar-wiki-$(date +'%a').tgz"
else
   echo "Problem: cannot get wiki archive from CGI script" >&2
fi
This keeps one wiki backup daily for one week, and a backup for each month for the past year.
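The rotation scheme above can be summarised as two naming helpers (a sketch only; daily_name and monthly_name are hypothetical names, and GNU date is assumed):

```shell
# Sketch of the naming scheme used by the crontab script above.
# daily_name/monthly_name are hypothetical helpers; GNU date is assumed.
daily_name()   { echo "tar-wiki-$(date -d "$1" +%a).tgz"; }        # one slot per weekday
monthly_name() { echo "tar-wiki-$(date -d "$1 -1 day" +%B).tgz"; } # previous month's slot
```

Because the weekday and month names repeat, each new backup silently overwrites the slot from a week (or a year) ago, which is what gives the seven daily and twelve monthly copies.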

I guess it would be feasible to include such a backup script in the distribution, in Perl, forking a configuration-settable archive program...

-- ColasNahaboo - 18 Apr 2002

Interesting idea - took me a while to realise that there's a script with a name ending in .tgz... Might be cleaner to use the PATH_INFO to do this, i.e. use a script called backup, and a URL that goes /bin/scripts/backup/wiki-tar-180402.tgz. This also lets you use a suitable name based on today's date.

PATH_INFO has some issues on some webserver setups, download the latest testenv from CVSget:bin/testenv to diagnose these, and see CobaltRaqInstall.

-- RichardDonkin - 18 Apr 2002

On TWikiIRC we talked about using rsync over ssh as a backup method for a SourceForge project. The recommended command was:

rsync -avz -e ssh --delete username@projectname.sf.net:/home/groups/p/pr/projectname .

Of course you need to replace username, projectname and the alphabetical /p/pr/ parts, as well as the destination, currently set to the current directory.

-- GrantBow - 29 Jan 2003

I use rsync very successfully to maintain a mirror of our intranet TWiki, which contains mission-critical support information (which wouldn't be much use if the TWiki server crashed). I was even wondering whether to trigger the rsync when a topic is saved.
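Short of hooking the save operation itself, a frequent cron entry gets close to save-triggered syncing. This is a hypothetical crontab line; the host and paths are placeholders:

```
# m  h  dom mon dow  command
*/10 *  *   *   *    rsync -az -e ssh --delete twiki@intranet:/var/www/twiki/data /backup/twiki/data
```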

-- VaciKoblizek - 30 Jan 2004

The rsync method is generally useful, and can also be used to allow a Distributed TWiki - this gives you a situation whereby you not only have backups (copied to physically different machines) but also the ability to fail over relatively simply.

One word of warning, though - if you're using the --delete option and syncing often (highly recommended), then you can get into a situation whereby, if the disk on the primary server becomes full, you can lose edits and then lose the content on the rsync slaves as well. (Which means having a regime similar to Colas's is sensible.)

An alternative to both approaches is to implement ReadWriteOfflineWiki - that way you end up with large numbers of backups, any of which can become the primary system at any time.

-- MS - 30 Jan 2004

After having a hosting service delete my entire TWiki installation early in 2002 (resulting in my losing months' worth of work), I became very interested in figuring out a way to back up my TWiki data and other key configuration files. However, since I was working over a dialup connection, it was also very important for me to minimize the size of my downloads. Also, because I'm bad about routine maintenance, I wanted the system to be as automatic as possible. After looking around for options, I found a couple of small scripts that, together, provided a very nice solution. They were:

  • BackerUpper - Creates compressed backup files with easy customization of which directories/files to include/exclude, frequency of backup, type of backup (full or incremental), etc. I set it up to do incremental backups daily and full backups on the first of the month.
  • SendEmail - Sends me the compressed backup archives and then deletes them from the server.

I have a batch file that I invoke via cron that runs BackerUpper and then SendEmail. It also does all my other TWiki maintenance tasks like running mailnotify, deleting session files, etc., and sends me a nice email reporting on everything.

Perhaps someone with more unix knowledge could have fashioned a similar solution from scratch, but I found this set-up relatively easy to put together, and it has been working like a charm for two years now. If anyone is interested in using this approach and needs some help, feel free to contact me and I'll give you the details of my setup. I've thought about packaging it as a TWikiAddOnProduct.

-- LynnwoodBrown - 30 Jan 2004

Lynnwood, I am sure someone will find your script useful. The preferred way to package an add-on is in the Plugins web, AddOnPackage.

-- PeterThoeny - 30 Jan 2004

Lynnwood, did you ever publish your AddOnPackage, which seems to be very useful?

-- BenoitFauvel - 03 Mar 2005

Benoit - I haven't published my backup solution yet on TWiki.org. It is available in "draft" form on my site at http://skyloom.com/Dev/HostedTWikiBackupAddOn. I welcome you to try it out and give me suggestions for polishing it up a bit before I release it "officially" here on TWiki.org.

  • Thanks a lot ! -- BF

-- LynnwoodBrown - 03 Mar 2005

Topic revision: r11 - 2008-08-25 - TWikiJanitor
 