Hi all,

I have a web product that runs on a Linux host that I maintain at
several different off-site locations.  I use ssh exclusively since I'm
going over the internet to reach them.  Everything works fine for
shell access, running commands, and editing files.  But some of these
sites are behind firewalls that restrict outgoing connections, and they
can't get out to ftp or http sites for system updates, so I gather all
the updates on a local server here and transfer them through ssh.  That
works fine, except for a handful of sites.
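
For what it's worth, the transfer itself is nothing fancy; it's
roughly a plain copy over ssh, something like this (the host name and
paths are just placeholders):

    scp updates/* admin@remote-site:/var/tmp/updates/

or the same data piped through ssh with tar or dd.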

What seems to happen is that ssh will begin transferring the data, and
somewhere around 100k to 300k in, the connection stalls and finally
times out.  This makes transferring the files very annoying, as I have
to "tar | split" the files and send them in a "for F in *; do scp"
loop.  It doesn't matter how I transfer the file (scp, dd | ssh dd,
etc.); every method fails at a random point in the 100-300k range.  I
was just wondering if anyone has run into this issue before.
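
In case it clarifies anything, the workaround I'm using looks roughly
like this (host name, paths, and chunk size are just examples):

    # bundle the updates and chop them into small pieces locally
    tar czf - updates/ | split -b 64k - updates.part.

    # push the pieces one at a time; small transfers seem to make it
    for F in updates.part.*; do
        scp "$F" admin@remote-site:/var/tmp/updates/
    done

    # then on the remote host: reassemble and unpack
    cat /var/tmp/updates/updates.part.* > updates.tar.gz
    tar xzf updates.tar.gz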

I've noticed this at a few other sites with less restrictive
firewalls as well.  Since these are off-site, I don't have control of
the firewalls.  I can ask the site admins to make a change to their
configuration, but I need to know what change to ask for first.

Thanks
Chris Frederick