On Wed, 12 Jan 2011, Robert Nesius wrote:

> It's line-buffered output - doesn't matter if it's /tmp/foo or a tty...
>
> You can go to "unbufferred I/O" to help mitigate this (an approach with 
> performance implications), or simply make sure the stuff you care about 
> is in the same stream.  Remapping streams to be the same also helps. Its 
> just an issue to be aware of is all...  .  Most people bump into this 
> eventually in unix/linux.  Often it's noticeable when I/O sent to stderr 
> in the midst of writes to stdout is all bunched up at the end of the 
> output, which happens because stdout gets flushed while tthe process is 
> being torn down and filehandles flushed and closed.  This usually 
> motivates one to ask "Why did all of my debug statements get bunched up 
> at the end?"


I first ran into this back in June, when I got stuck on a problem with my 
use of "rm -i" in a bash script:

http://mlug.missouri.edu/pipermail/members/2010-June/014842.html

The prompt from rm -i goes to stderr, so when I redirected stderr to get 
it out of the way, the script would hang (a minimal reproduction is 
sketched after the next link).  I then discovered the useful hilite 
program:

http://mlug.missouri.edu/pipermail/members/2010-June/014846.html
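
Here's that minimal reproduction of the hang (the file name is just an 
example):

$ touch /tmp/testfile
$ rm -i /tmp/testfile 2>/dev/null

rm is sitting there waiting for a y/n answer, but the question went to 
/dev/null, so it looks hung.  Typing y or n (or hitting Ctrl-C) still 
works; you just can't see the prompt.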

And a friend explained buffering to me:

http://mlug.missouri.edu/pipermail/members/2010-June/014850.html


Here's another example of what you're talking about:

First I make a script called foo...

-----foo begins on next line-----
#!/bin/bash
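# three writes from one subshell: stdout, then stderr, then stdout again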
( echo -e "stdout1" ; echo -e "stderr1" 1>&2 ; echo -e stdout2 )
-----foo ends on previous line-----

...and make it executable:

chmod 755 foo

Then I run it and see this:

$ ./foo
stdout1
stderr1
stdout2

But when I run it under hilite, in addition to getting "stderr1" 
highlighted in red, that line gets pushed to the end:

$ hilite ./foo
stdout1
stdout2
stderr1
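
Presumably hilite has to read stderr through a pipe of its own in order 
to know what to color, so the original interleaving between the two 
streams gets lost.  If the ordering matters more than the color, the 
advice above applies: merge the streams yourself, at the cost of leaving 
hilite nothing to highlight:

$ ./foo 2>&1 | cat
stdout1
stderr1
stdout2

That stays in order here because bash flushes after each echo; a program 
that block-buffers its stdout could still come out scrambled even with 
the streams merged.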

This kind of stuff can drive you crazy.  ;-)

Best,
Mike