[plug] Backup + 2 gig file limit = error

Russell Clarke clru at ljbc.wa.edu.au
Wed Jun 27 13:11:34 WST 2001


G'day,

I am having a problem backing up large directories. 

The backup is run through a script that just runs tar/gzip over the
directories and stores the backup files on a hard disk in a removable
caddy.
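
For reference, the script boils down to something like this (the
paths and names here are placeholders, not the real ones):

	#!/bin/sh
	# Back up each user directory to its own gzipped tarball on the
	# caddy disk. /mnt/caddy stands in for wherever it is mounted.
	for dir in /home/*; do
		tar czf /mnt/caddy/`basename $dir`.tgz "$dir"
	done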

Now my problem is that some user directories push the backup size
well over the kernel's 2 gig file limit, causing tar to die in mid
flight and leaving me with incomplete backups.

I am really after a simple hack to work around this file limit.
E.g. something that tar's output is piped through, so that when the
file reaches a certain size it starts a new one, e.g. backup.tgz.1,
then backup.tgz.2, and so on.

I've looked at a few other solutions, but they are either not suited
to my situation or involve a bit too much messing around and wasted
time.

If anybody has any simple ideas about this, or knows of any
scripts/programs that do it with little fuss, I would love to hear
them.

cheers.
russ


