On Sat, 9 Jul 2011, Robert Nesius wrote:

> I'd use sftp over scp for big files like that.  You might also use wget 
> (through an ssh tunnel or some other means), since wget can resume 
> downloads that didn't complete without retransmitting the bits already 
> received.  (I'm not sure whether sftp does the same - it's worth reading 
> the man page to see if it does.)


It doesn't look like sftp/scp can do that.  wget does it with the -c 
option, but for me it is stalling a lot, roughly every 20 MB.  sftp 
transferred the whole file on the first attempt and at good speed.  I 
kept using wget just to test that I'd get the exact same file after it 
stalled twenty times.  The result just in: the md5sums are identical, so 
"wget -c" did the job.  It would have been a much bigger hassle if I'd 
had to establish an ssh tunnel every time I restarted wget, but this 
transfer didn't need to be secure, so plain wget was fine.
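
For reference, the basic resume-and-verify flow is just this (the URL 
and filename here are placeholders, not the actual file I was fetching):

    # -c tells wget to continue from an existing partial file
    # instead of retransmitting what it already has; re-run the
    # same command after each stall:
    wget -c http://example.edu/pub/bigfile.iso

    # once it finally completes, compare against the checksum
    # published alongside the file:
    md5sum bigfile.iso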
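
And for anyone who does need the ssh tunnel route, here's a rough 
sketch (host names and ports made up for illustration) using a local 
port forward; since the forward stays up across wget restarts, the 
tunnel only has to be set up once:

    # forward local port 8080 to the file server's port 80;
    # -f backgrounds ssh, -N skips running a remote command:
    ssh -f -N -L 8080:fileserver.example.edu:80 user@gateway.example.edu

    # point wget at the local end of the tunnel; restarting wget
    # doesn't require redoing the tunnel while ssh stays up
    # (if the server relies on name-based virtual hosting, add
    # --header='Host: fileserver.example.edu'):
    wget -c http://localhost:8080/pub/bigfile.iso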

I wonder if they changed something in the network at the U.  I never 
used to have this kind of problem, but now it's happening all the time.

Mike