[plug] backups

Richard Meyer meyerri at westnet.com.au
Fri Apr 28 22:33:27 WST 2006


On Fri, 2006-04-28 at 21:24 +0800, Timothy White wrote:
> On 4/22/06, Stuart Midgley <stuart.midgley at ivec.org> wrote:
> > I happily rsync more than 100000 files from scratch.  But then again,
> > I have 320GB of memory to play with :)
> >
> > If you want the sort of performance you see from scp, you should use
> > the -W flag.  No program is going to be great at copying thousands of
> > small individual files, so in that case tar should definitely be
> > used.  BUT, rsync should still work just fine.
> >
> > One major issue is the use of ssh.  You will struggle to see more
> > than 10MB/s via ssh; the encryption costs too much CPU time.  I tend
> > to use rsh where possible, or an rsync server, and happily see 40MB/s
> > over our gigabit network.  When I rsync across the country, I use a
> > highly specialised rsync script which uses the -n flag to get a list
> > of files that need syncing, then breaks the list up and gets 30+
> > rsyncs going simultaneously.  I tend to only see about 1MB/s from
> > Perth to Canberra per file stream, so getting 30 streams going
> > simultaneously gives me 30+MB/s of bandwidth.
> 
> I'd love to see how you do all this.
> 
> Thanks!
> 
> Tim
> --
> Linux Counter user #273956

Seeing that Stu is the director of the local supercomputing centre, I'd
also like to see it, though in my case just so I can say I have, rather
than to express disbelief.

That's a disguised request, Stu ...

Regards
RM
-- 
Richard Meyer <meyerri at westnet.com.au>
There are II types of people - those who can count like Romans and
those who can't.

Linux Counter user #306629



