Question : transferring a directory with multiple files in parallel

Hi,

What I am trying to accomplish is a bash script that will look at a folder with multiple subfolders and files and copy that folder to another server in parallel, 10 files at a time, until the entire folder has been completely transferred. I am talking about roughly 200 GB of data.

I would appreciate your input/code.

Regards,

Michael

Answer : transferring a directory with multiple files in parallel

If you insist on using scp, the following may provide a serial means to transfer the files, but note that it will need to negotiate a key-based session before transferring each file, so if the number of files is large, so is the negotiation overhead:

cd /some/local/directory/
# recreate the directory tree on the remote host, then copy the files one by one
# (user@remotehost is a placeholder for your remote login)
find . -type d -exec ssh user@remotehost 'cd /some/remote/directory && mkdir -p "{}"' \;
find . -type f -exec scp {} user@remotehost:/some/remote/directory/{} \;
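
For the "10 files at a time" part of the question, one option is to drive scp from xargs -P rather than find -exec. This is only a sketch: it assumes GNU xargs, that the directory tree has already been recreated on the remote side (as in the find -type d step above), and user@remotehost is again just a placeholder:

cd /some/local/directory/
# run up to 10 scp processes at once, one file per process (GNU xargs assumed)
find . -type f -print0 | xargs -0 -P 10 -I{} scp {} user@remotehost:/some/remote/directory/{}

Each scp still negotiates its own ssh session, so the per-file overhead mentioned above remains; it is simply paid ten connections at a time.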


Personally I'd use rsync over ssh, e.g.:

#!/bin/sh
RSYNC=`which rsync`
SOURCE_DIR='/some/local/directory'
TARGET_DIR="$SOURCE_DIR"
REMOTE_HOST='user@remotehost'   # placeholder: the remote login to copy to
# -r is needed to recurse into subdirectories (and for --delete-after to work)
$RSYNC -optl -r --delete-after --rsh=/usr/bin/ssh --rsync-path=$RSYNC --force "$SOURCE_DIR/" "${REMOTE_HOST}:${TARGET_DIR}"
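
If you want to parallelise rsync as well, one rough approach is to start a separate rsync per top-level entry of the source directory and let xargs cap the number of concurrent processes. Sketch only: it assumes GNU find/xargs, that the target directory already exists on the remote side, and the host and paths are placeholders:

#!/bin/sh
# run up to 10 rsync processes at once, one per top-level file or subdirectory
SOURCE_DIR='/some/local/directory'
REMOTE_HOST='user@remotehost'
cd "$SOURCE_DIR" || exit 1
find . -mindepth 1 -maxdepth 1 -print0 | \
  xargs -0 -P 10 -I{} rsync -optl -r -e ssh {} "${REMOTE_HOST}:${SOURCE_DIR}/"

Note this only helps if the bottleneck is per-connection throughput or latency; on a single saturated link ten rsyncs will not be faster than one.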

Then again, the following tar pipe solution may work, at least for the first couple of GB worth of files:

cd /some/local/directory && tar cf - . | ssh user@remotehost "cd /some/remote/directory && tar xf -"
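
If the network link is slow relative to CPU, a compressed variant of the same pipe may help; the z flag below gzips the stream on the sending side and unpacks it on the receiving side (user@remotehost again being a placeholder):

cd /some/local/directory && tar czf - . | ssh user@remotehost "cd /some/remote/directory && tar xzf -"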

If security is not a concern and there are no pesky firewalls in the way, then the following netcat (port binding) approach may suit:
http://compsoc.dur.ac.uk/~djw/tarpipe.html
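
In the spirit of that page, a minimal netcat tar pipe looks roughly like this (port 7000 and the host name remotehost are arbitrary placeholders, the listen syntax varies between netcat variants -- traditional netcat shown -- and there is no encryption or authentication at all):

# on the receiving box
cd /some/remote/directory && nc -l -p 7000 | tar xf -

# on the sending box
cd /some/local/directory && tar cf - . | nc remotehost 7000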

Note: you could always add an ssh tunnel if there is a firewall in the way.