[plug] 'best fit' data archiving tools

Ryan ryan at is.as.geeky.as
Mon Sep 19 14:54:52 WST 2005


I'm not sure how to name what I want...

I want to break a large directory of data (600GB+) into smaller chunks
of a specified size limit (~16GB in this case).  In this particular case
the finest granularity will be a 2nd level directory; everything below
that must stay together with its 2nd level parent, so essentially a
'file' from a size perspective is the contents of each 2nd level
directory.
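
For the sake of illustration, sizing each 2nd level directory as a
single unit would go roughly like this in Python (the /data path is
just a placeholder for the real pool):

import os

def dir_size(path):
    # Total size in bytes of every regular file under path
    total = 0
    for root, dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

pool = "/data"   # placeholder for the real 600GB+ pool
units = []       # one (size, path) unit per 2nd level directory
for first in os.listdir(pool):
    first_path = os.path.join(pool, first)
    if not os.path.isdir(first_path):
        continue
    for second in os.listdir(first_path):
        second_path = os.path.join(first_path, second)
        if os.path.isdir(second_path):
            units.append((dir_size(second_path), second_path))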

I envisage something that can move or link the relevant files into
directories representing the many volumes of $MAXIMUM_SIZE that would
result from the data pool.

The idea is to get as close as possible to the size limit by picking the
appropriate files to fill the available space.  Generally this would go
something like: put the largest file that fits into the volume, then the
next largest that still fits, and so on, iterating through a defined
number of files looking for a better size match until the total is as
close to the limit as the iterations allow.
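
Roughly the greedy approach I have in mind, sketched in Python; the
MAX_SIZE value and /staging path are placeholders, and it assumes the
(size, path) units list built above:

MAX_SIZE = 16 * 1024 ** 3   # ~16GB volume limit (placeholder)

def pack(units, limit):
    # Largest-first greedy: drop each unit into the first volume
    # it fits in, opening a new volume when none has room.
    # Assumes no single unit is bigger than the limit.
    volumes = []   # each volume is [space_left, [paths]]
    for size, path in sorted(units, reverse=True):
        for vol in volumes:
            if size <= vol[0]:
                vol[0] -= size
                vol[1].append(path)
                break
        else:
            volumes.append([limit - size, [path]])
    return volumes

# Materialise each volume as a directory of symlinks for review
# (directory basenames are assumed unique here)
for i, (left, paths) in enumerate(pack(units, MAX_SIZE)):
    vol_dir = "/staging/vol%03d" % i   # placeholder staging area
    os.makedirs(vol_dir)
    for p in paths:
        os.symlink(p, os.path.join(vol_dir, os.path.basename(p)))

Picking the fullest volume that still fits, rather than the first,
would be closer to the 'better size match' refinement described above.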

I know some CD-R burning software does this, and I could nut it out and
write one myself, but I'm sure there is already something around that
could at least assist with this.

Another requirement is that it must be able to treat a given depth of
directories as a whole, such that files more than X levels deep can
only be moved together with their entire directory.
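
Against the sizing sketch above, that constraint might look something
like this (again only a rough illustration; loose files shallower than
the cutoff depth aren't handled):

def units_at_depth(pool, depth):
    # Treat each directory exactly `depth` levels below pool as one
    # indivisible unit (depth=2 reproduces the case above)
    units = []
    def walk(path, level):
        for entry in os.listdir(path):
            full = os.path.join(path, entry)
            if not os.path.isdir(full):
                continue
            if level == depth:
                units.append((dir_size(full), full))
            else:
                walk(full, level + 1)
    walk(pool, 1)
    return units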

Does anyone know of such a tool?  Additionally, what is the technical
name for this process? :)

Thanks,

Ryan