[plug] Backup + 2 gig file limit = error
Kim Covil
kimc at ned.dem.csiro.au
Wed Jun 27 13:16:35 WST 2001
> G'day,
>
> I am having a problem backing up large directories.
>
> The backup is run through a script that just runs tar/gzip over the
> directories and stores the backup files on a hard disk in a
> removable caddy.
>
> Now my problem is that some user directories push the backup size way
> over the kernel's 2 gig file limit, resulting in tar dying in
> mid-flight and giving me incomplete backups.
>
> I am really after a simple hack that would work around this file limit
> problem.
> E.g. something that tar's output is piped through, so that when the
> file reaches a certain size it creates a new file, e.g. backup.tgz.1,
> then backup.tgz.2, and so on.
>
> I've looked at a few other solutions, but they are either not suited
> to my situation or involve a bit too much messing around/time wasting.
>
> If anybody has some simple ideas about this, or knows of any
> scripts/programs that do this with little fuss, I would love to hear
> them.
>
> cheers.
> russ
To back up:

    tar cf - directory | bzip2 | split --bytes=2000m - /pathtobackupstorage/directory.tar.bz2.
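
With GNU split, the pieces get alphabetic suffixes appended to the
prefix, so the trailing dot above gives you files like:

    directory.tar.bz2.aa
    directory.tar.bz2.ab
    ...

and --bytes=2000m keeps each piece just under the 2 gig limit.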
To restore:

    cat /pathtobackupstorage/directory.tar.bz2.* | bunzip2 | tar xf -
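
If you want to automate it across all the user directories, something
along these lines should do the trick (a rough sketch; /home and
/pathtobackupstorage are just placeholders for your real locations):

    #!/bin/sh
    # Back up each user directory into its own set of pieces,
    # each piece staying under the 2 gig limit.
    for dir in /home/*; do
        name=`basename "$dir"`
        tar cf - "$dir" | bzip2 | \
            split --bytes=2000m - "/pathtobackupstorage/$name.tar.bz2."
    done

Each directory can then be restored independently with the
cat | bunzip2 | tar line above.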
Cheers
Kim
--
======================================================================
Kim Covil - CSIRO Exploration & Mining E-mail: kim.covil at dem.csiro.au
PO Box 437, Nedlands, Tel: +61 8 9284 8425 ,-_!\
Western Australia 6009 Fax: +61 8 9389 1906 / \
*_,-._/
=================================================================== v
Please direct all personal e-mail to kimbotha at covil.com.au