On Sun, Jul 10, 2011 at 3:02 PM, Mike Miller <mbmiller+l at gmail.com> wrote:

> On Sat, 9 Jul 2011, Robert Nesius wrote:
>
>> I'd use sftp over scp for big files like that.  You might also use wget
>> (through an ssh tunnel or some other manner), as wget can resume
>> downloads that didn't complete without retransmitting bits already sent.
>> (Not sure whether sftp does the same; it's worth reading the man page to
>> see if it does.)
>>
>
>
> It doesn't look like sftp/scp can do that.  wget does it with the -c
> option, but for me it is stalling a lot, roughly every 20 MB.  sftp did
> the whole file on the first attempt and with good speed.  I'm using wget
> just to test that I get the exact same file after it stalled 20 times.
> This result just in: the md5sums are identical, so "wget -c" did the job.
> It would be a much bigger hassle if I had to establish an ssh tunnel
> every time I restarted wget, but this wasn't a secure transfer, so wget
> was fine.
>
> I wonder if they changed something in the network at the U.  I didn't
> use to have this kind of problem, but now it's happening all the time.
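
Glad "wget -c" got you there.  If you do need the encrypted path again,
you can keep one ssh tunnel up in the background and just re-run wget
against it, so you aren't rebuilding the tunnel on every restart.  Rough
sketch only: the gateway host, web server, port, and URL below are
made-up placeholders, not your actual machines.

  # one persistent tunnel: local port 8080 forwards to the web server
  ssh -f -N -L 8080:fileserver.example.edu:80 login.example.edu

  # resume as many times as needed; -c picks up where it left off
  wget -c http://localhost:8080/pub/bigfile.iso

The -f -N flags background ssh without running a remote command, so the
forward stays up across wget restarts.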


I'd be less suspicious of the U and more suspicious of the ISPs.  Bummer
that wget kept stalling, but at least you have "wget -c" in your back
pocket now for similar situations in the future.
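
Since it was stalling about every 20 MB, a small shell loop will keep
resuming the download unattended until it finishes (the URL here is just
a placeholder):

  # keep resuming until wget exits successfully
  until wget -c http://example.com/pub/bigfile.iso; do
      sleep 5
  done

wget exits non-zero when a transfer dies, so the loop simply picks it
back up; comparing md5sums at the end, like you did, confirms the pieces
were stitched together correctly.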

-Rob