On Fri, 22 Oct 2004, John Sanborn wrote:

> Anyone have an idea where to find (or find out how to write) a script 
> that will save a URL to a text file? The URL is on a Windows 2k3 
> server and requires a login.
>
> Text files are written daily to a directory on a web server and are 
> accessible via https and a login. I'd like to automate moving those 
> files to a remote location, and it seems to me it would be much easier 
> to pull them from the web server at the remote end than to automate 
> pushing them out by secure ftp or email from the web server.


If the files always have the same names, you can use lynx:

lynx -auth=login:passwd -source URL > filename

You'd have to write one line per file.
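
If there are more than a couple of files, a small shell loop keeps that 
manageable. A rough sketch, with the server name, login, and filenames 
all made up for illustration:

#!/bin/sh
# Fetch each known file over https; lynx's -auth does HTTP basic auth.
for f in report1.txt report2.txt report3.txt; do
    lynx -auth=login:passwd -source "https://server.example.com/daily/$f" > "$f"
done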

I think wget will grab multiple files in a single command, as long as 
your web server gives directory listings.
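
Untested, and the URL and login below are placeholders, but something 
along these lines should work (older wgets spell the password option 
--http-passwd):

wget -r -np -nd -A '*.txt' --http-user=login --http-password=passwd \
    https://server.example.com/daily/

-r recurses, -np stops it climbing above the starting directory, -nd 
drops everything into the current directory instead of recreating the 
tree, and -A '*.txt' keeps only the .txt files. Once that works by 
hand, a crontab entry runs it unattended, e.g. every morning at 6:15:

15 6 * * * /usr/local/bin/fetch-daily.sh

(fetch-daily.sh being whatever wrapper you put the wget or lynx 
commands in; the name is just an example.)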

Mike

_______________________________________________
TCLUG Mailing List - Minneapolis/St. Paul, Minnesota
Help beta test TCLUG's potential new home: http://plone.mn-linux.org
Got pictures for TCLUG? Beta test http://plone.mn-linux.org/gallery
tclug-list at mn-linux.org
https://mailman.real-time.com/mailman/listinfo/tclug-list