On July 29, 9:04 pm Peter Clark <peter-clark at bethel.edu> wrote:
>     This isn't a complicated matter, but since it's been a while since
> I've dabbled in sed, I'm not sure how best to do this, so I thought I
> would ask the experts. :)
>     I ran wget on a large site that has all the links hardcoded. I'd
> like to remove all instances of, say, 'http://www.site.com/directory' so
> that I can view it offline and have all the links work locally. So, what
> would be the best way to recursively work through the files and remove
> the text?
>     Thanks,
>     :Peter


Well, sed won't edit the files in place, but you could use perl to do it.
Really you just want the command 's/http:\/\/www\.site\.com//g' (in either
perl or sed).
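
For example, as a perl one-liner on a single file (index.html here is
just a placeholder; using | as the delimiter also saves you escaping
all the slashes):

  perl -pi -e 's|http://www\.site\.com||g' index.html

-p loops over the file line by line, -i rewrites it in place, and -e
takes the substitution right on the command line.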

You can use `find /path/to/files -type f -exec script \{\} \;` to do it
recursively, where `script` is whatever you wrap the above command in.
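
Putting the two together, something like this ought to do the whole
tree in one shot (untested; the -name '*.html' filter is just a guess
at what you want -- drop it to hit every file, binaries included):

  find /path/to/files -type f -name '*.html' \
      -exec perl -pi -e 's|http://www\.site\.com||g' {} \;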


Jay
