On Sun, Feb 26, 2006 at 09:41:04AM -0600, Mike Miller wrote:
> On Sun, 26 Feb 2006, Brian Hurt wrote:
> 
> >On Sun, 26 Feb 2006, Ed Wilts wrote:
> >
> >>On Sun, Feb 26, 2006 at 08:36:39AM -0600, Jordan Peacock wrote:
> >>>Quick question: I have a whole ton of .tar files I need to extract,
> >>>but when I try 'tar -xvf *.tar' it doesn't work.... what am I doing
> >>>wrong?
> >>
> >>Try:
> >>
> >>find . -name '*.tar' -exec tar xvf {} \;
> >
> >That will extract all tar files in the current directory and all
> >directories under it.  Which may be the desired behavior, but may not be.
> >My for loop only extracts the tar files in the current directory.

I could have added -maxdepth 1 to the find command to avoid extracting
tarballs in sub-directories.
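Something along these lines (assumes GNU find, which supports -maxdepth):

        find . -maxdepth 1 -name '*.tar' -exec tar xvf {} \;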
 
> Right.  And you have to be careful with tar files about what they are 
> putting out.  One might have contents that write over those of another 
> one.  Does each one produce a unique directory and put all files in that 
> directory?  The for loop is safer, but you should know before you run it 
> what those tar files contain.

The original for loop is no safer than the find command and will produce
the same result.  However, a new posting mentioned creating a new
directory per tarball.  That works, of course, but it is more
functionality than the original poster asked for (though whether he
understood the consequences of what he asked for is another question).
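For reference, the original loop was presumably something like the first
form below, and the per-tarball-directory variant would look roughly
like the second (just a sketch, assuming a Bourne-style shell and a tar
that supports -C):

        # unpack every tarball into the current directory
        for f in *.tar; do tar xvf "$f"; done

        # unpack each tarball into its own directory named after the file
        for f in *.tar; do
            d="${f%.tar}"
            mkdir -p "$d" && tar xvf "$f" -C "$d"
        done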

        .../Ed

-- 
Ed Wilts, RHCE
Mounds View, MN, USA
mailto:ewilts at ewilts.org
Member #1, Red Hat Community Ambassador Program