This isn't a complicated matter, but since it's been a while since I've 
dabbled in sed, I'm not sure how best to do this, so I thought I would ask 
the experts. :)
	I ran wget on a large site that has all the links hardcoded. I'd like to 
remove all instances of, say, 'http://www.site.com/directory' so that I can 
view it offline and have all the links work locally. What would be the best 
way to recursively work through the files and strip that string out of each one?
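	Something like the following is roughly what I had in mind -- untested, 
and assuming GNU sed for the -i flag, plus guessing that the pages all end 
in .html:

    # Hypothetical sketch: strip the hardcoded prefix from every .html file, in place
    find . -type f -name '*.html' -exec \
        sed -i 's|http://www\.site\.com/directory||g' {} +

	(Using | as the s||| delimiter just avoids having to escape all the 
slashes in the URL.) Is that roughly the right approach, or is there a 
cleaner way?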
	Thanks,
	:Peter

