[plug] Problem with software RAID
Chris Caston
caston at arach.net.au
Sun Mar 7 09:27:07 WST 2004
On Sat, 2004-03-06 at 22:59, Tim Bowden wrote:
> On Sat, 2004-03-06 at 22:05, Craig Ringer wrote:
> > On Sat, 2004-03-06 at 21:46, Tim Bowden wrote:
> >
> > > I am just helping out a friend with a broken IDE software RAID setup on
> > > an RH 9 box. I would like to back up the HDDs as they are before I go
> > > playing with it.
> >
> > Sensible.
> >
> > > I can't `tar -czvf ./backup.tgz /dev/hdc1`, which is what
> > > I really want to do.
> >
> > ... because it just creates a tar file with a single device node in it,
> > yes?
> >
> > > I can of course dd if=/dev/hdc1 but how do I get
> > > the output piped to a zip file?
> >
> > Do you mean a zip file, or "a compressed image file"? I assume you mean
> > the latter, and if so, just `dd if=/dev/hdc1 | gzip > image.gz`. If
> > the former... why?
> >
>
> Exactly what I needed, or at least I thought it was. Given the size of
> the partition (40GB), the backup file is way too big for the Windows
> machine it is being backed up onto (i.e., greater than 2GB). Without a
> sufficiently large HDD available at the moment, the only workaround I
> can think of is to break the gzip output into <2GB chunks as it is
> produced and save a number of files to the Windows machine. Any ideas
> on how this might be done? Perhaps the best option is just to get a
> spare large disk in the morning.
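
One way to chunk the stream as it's produced would be to pipe it through
split as well (just a sketch, untested here, and assuming GNU split from
coreutils is on the RH 9 box):

  # image the partition, compress it, and cut the stream into <2GB pieces
  dd if=/dev/hdc1 bs=64k | gzip -c | split -b 1900m - image.gz.

The 1900m keeps each piece comfortably under the 2GB mark, and split
names the pieces image.gz.aa, image.gz.ab and so on, so they stay in
order.
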
I just bought an 80GB Seagate, but it's currently sitting in a PII with
the 32GB clip on (the BIOS wouldn't detect it as 80GB and needs flashing,
but I'm loath to do such things) and it's formatted with ext2. I assume
you're talking about the 2GB file size limitation in FAT32.
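
If the chunks do end up on the Windows box, getting the image back later
should just be the reverse pipe (again only a sketch, assuming the chunk
names still sort in the right order when the glob expands):

  # reassemble the pieces, decompress, and write the image back
  cat image.gz.* | gunzip | dd of=/dev/hdc1 bs=64k

A `cat image.gz.* | gunzip | md5sum` compared against an `md5sum /dev/hdc1`
taken before you start would confirm the round trip is clean.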
thanks,
Chris
--
Linux is ready for the desktop like a Boeing F-22 is ready for the
runway.