[plug] ftp scripting?

Russell Steicke r.steicke at bom.gov.au
Wed May 21 18:34:33 WST 2003


On Wed, May 21, 2003 at 05:47:17PM +0800, Brad Hill wrote:
> sorry to the rsync and unison guys, but it has to be ftp.
> 
> I've used Russel's suggestion so far but i'm wondering about the limitations
> of the amount of files that may wind up in the touch directory.... What's
> linux's limit of files per directory... ?

There's no hard limit on the number of files, but large directories can
be slow to search, since plain ext2/ext3 directory lookups are a linear
scan.  Reiserfs is supposed to handle large directories well; the saner
parts of the concurrent reiserfs discussion may have info on this.  Up
to a hundred or so files I wouldn't worry about it.  Past that I'd only
worry if this is meant to be interactive, or if it puts an unreasonable
load on your system.

There's a race condition between putting the data into your url file and
changing its permissions.  Create the file first, change its
permissions, then put the sensitive data in:

  rm -f url           # start clean
  touch url           # create the file empty,
  chmod 600 url       # lock the permissions down,
  echo stuff > url    # and only then write the sensitive data

Or change your umask before creating the file.  You could also keep the
password in ~/.netrc.  Search for "netrc" in the wget man page, and see
also netrc(5).
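
A sketch of those two alternatives (the host, login and password below
are made up):

  # Alternative 1: set a restrictive umask so the file is created 0600
  # to begin with, and the separate chmod isn't needed.
  umask 077
  echo stuff > url

  # Alternative 2: keep the credentials in ~/.netrc instead, so they
  # never have to go into the url file at all.  Keep it mode 600.
  # An example entry (machine/login/password are made up):
  #   machine ftp.example.com
  #   login someuser
  #   password somepass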


>         for file in `wget -O - --quiet -i url | sed "s/.*\">//" | sed "s/<.*//"`

If you can make this dependent on bash [1], arrays can be convenient
here:

  files=( $(wget ...) )             # output is split on whitespace
  for file in "${files[@]}" ; do    # quoted [@] keeps each element whole
    # stuff
  done
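
For example, feeding Brad's wget/sed pipeline straight into the array
(the url file and the HTML it scrapes come from his script above; this
still assumes the filenames contain no whitespace):

  files=( $(wget -O - --quiet -i url | sed 's/.*">//' | sed 's/<.*//') )
  for file in "${files[@]}" ; do
    echo "$file"    # replace with the real per-file work
  done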


> Still have to handle clearing the touch folder out somehow, but since the

If you have the list of files in an array, you can rm any touch file
that is not in the array.

  for touchfile in touch/* ; do
    keep=no
    for file in "${files[@]}" ; do
      # The glob gives paths like touch/foo, so compare the bare
      # filename (assuming touch files are named after the remote files).
      if [ "$(basename "$touchfile")" = "$file" ] ; then
        keep=yes
        break
      fi
    done
    if [ "$keep" = no ] ; then
      rm -f "$touchfile"
    fi
  done





[1] This is probably where the zsh advocate pops up. :)  And ksh has
arrays as well.


-- 
Russell Steicke

-- Fortune says:
Your domestic life may be harmonious.


