[plug] backups

Stuart Midgley stuart.midgley at ivec.org
Sat Apr 22 12:46:59 WST 2006


I happily rsync more than 100000 files from scratch.  But then again,  
I have 320GB of memory to play with :)

If you want the sort of performance you see from scp, use the -W
(--whole-file) flag.  No program is going to be great at copying
thousands of small individual files, so in that case tar should
definitely be used.  BUT, rsync should still work just fine.

One major issue is the use of ssh.  You will struggle to see more
than 10MB/s via ssh; the encryption costs too much CPU time.  I tend
to use rsh where possible, or an rsync server, and happily see 40MB/s
over our gigabit network.  When I rsync across the country, I use a
highly specialised rsync script which uses the -n flag to get a list
of files that need syncing, then breaks the list up and gets 30+
rsyncs going simultaneously.  I tend to only see about 1MB/s from
Perth to Canberra per file stream, so running 30 streams
simultaneously gives me 30+MB/s of aggregate bandwidth.

Stu.


On 22/04/2006, at 10:27, W.Kenworthy wrote:

> Yes, it's a great program, but if it won't work then that's that.
> There are some tricks that can extend how many files it can handle,
> but eventually you run out of memory.  It is possible to scp the
> files across the first time; then rsync can work, as it is only
> caching the metadata of files that have changed, but it won't do
> 100000 files from scratch (without manual interventions like running
> multiple partials).  Doesn't help me in this case!  Took some 15 hrs,
> but the job's done via scp/openvpn over wap using bzip2/tar.  The
> limiting factor was that it was CPU bound, not channel bandwidth.
> Next time I'll try gzip and see if it's better than bzip2 for this.
>
> BillK


--
Dr Stuart Midgley
Industry Uptake Program Leader
iVEC, 'The hub of advanced computing in Western Australia'
26 Dick Perry Avenue, Technology Park
Kensington WA 6151
Australia

Phone: +61 8 6436 8545
Fax: +61 8 6436 8555
Email: industry at ivec.org
WWW:  http://www.ivec.org
