You are correct, but when I say slurp I think of:

   undef $/;                          # '' would be paragraph mode; undef slurps the whole file
   open(FILE, "< somefile.txt") or die "can't open somefile.txt: $!";
   $input = <FILE>;                   # entire file in one read
   close(FILE);

similar to what File::Slurp provides. But that might not be the way
most folks would see it. :-|
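
(For comparison, a minimal sketch of the File::Slurp version; read_file
pulls the whole file into a scalar, and the filename is just a placeholder:)

   use File::Slurp;
   my $input = read_file("somefile.txt");   # whole file as one scalar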

>>> jpschewe at mtu.net 04/13/01 01:26PM >>>
Why not?  That's what pipes are for.  The receiving end can start writing to
disk right away, so memory gets freed as the data flows through.  You don't
need to load the whole file into memory to cat it into another one.
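
Something like this rough sketch, copying line by line so only one line is
ever held in memory (the filenames here are just placeholders):

   open(IN,  "< somefile.txt") or die "can't open input: $!";
   open(OUT, "> copy.txt")     or die "can't open output: $!";
   while (my $line = <IN>) {
       print OUT $line;        # only one line in memory at a time
   }
   close(IN);
   close(OUT);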

"Troy Johnson" <Troy.A.Johnson at state.mn.us> writes:

> Not a Perl-dependent one, IIRC. I wouldn't try to slurp up files larger than available memory, however.
> 
> >>> austad at marketwatch.com 04/13/01 02:38AM >>>
> Isn't there a limit to the size of each variable?
> 

-- 
Jon Schewe | http://mtu.net/~jpschewe | jpschewe at mtu.net 
For I am convinced that neither death nor life, neither angels 
nor demons, neither the present nor the future, nor any 
powers, neither height nor depth, nor anything else in all 
creation, will be able to separate us from the love of God that 
is in Christ Jesus our Lord. - Romans 8:38-39

_______________________________________________
tclug-list mailing list
tclug-list at mn-linux.org 
https://mailman.mn-linux.org/mailman/listinfo/tclug-list