On Tue, Jul 29, 2003 at 09:04:30PM -0500, Peter Clark wrote:
> 	I ran wget on a large site that has all the links hardcoded. I'd like to 
> remove all instances of, say, 'http://www.site.com/directory' so that I can 
> view it offline and have all the links work locally. So, what would be the 
> best way to recursively work through the files and remove the text? 
> 
(untested)

 find . -type f -print0 | xargs -0 perl -pi -e 's(http://www\.site\.com/directory)(my/new/dir)g'
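The -p loops over each line of each file and -i edits the files in
place; if you'd rather keep copies of the originals in case the
substitution mangles something, give -i an extension (also untested):

 find . -type f -print0 | xargs -0 perl -pi.bak -e 's(http://www\.site\.com/directory)(my/new/dir)g'

That leaves a .bak copy of every modified file alongside the original.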

Or maybe you want a file pattern in the find, like:

 find . -name '*.html' -print0 | ...
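And if you literally want the prefix removed (so the links become
relative instead of pointing at some local directory), just make the
replacement empty, something like (untested as well):

 find . -name '*.html' -print0 | xargs -0 perl -pi -e 's(http://www\.site\.com/directory/?)()g'

Whether the resulting relative links resolve correctly depends on the
directory layout wget produced, so check a page or two before running
it over the whole tree.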

-- 
trammell at el-swifto.com  9EC7 BC6D E688 A184 9F58  FD4C 2C12 CC14 8ABA 36F5
Twin Cities Linux Users Group (TCLUG)      Minneapolis/St. Paul, Minnesota

_______________________________________________
TCLUG Mailing List - Minneapolis/St. Paul, Minnesota
http://www.mn-linux.org tclug-list at mn-linux.org
https://mailman.real-time.com/mailman/listinfo/tclug-list