[CALUG] More I/O buffering foolishness

Jason C. Miller jason.c.miller at gmail.com
Wed Mar 1 08:43:15 CST 2006


I've asked a buffering question here before, and now I've run into a
similar-yet-different issue. I was wondering if anyone has any insight.

Here's the short story...


At the command line (bash) ...
-----------------------------------
(cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log) | \
perl -e 'while (<STDIN>) { print }'
-----------------------------------
... works fine.  All it does is print the stdout from the tar extraction.  


However, once inserted into a script, the exact same string of commands 
is block-buffered instead of line-buffered (i.e., it prints output in 
chunks).  Granted, this isn't unheard of.  It's particularly annoying here 
because the output is being used as stdin for another program (Xdialog).


The odd part is that, inside the script ...
-----------------------------------
(cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log)
-----------------------------------
... will line-buffer just fine.  It's only when I try to pipe that into
anything else AFTER that the buffering issues come up.  I've tried it with
both perl and awk, and they both do the same thing.  I tried forcing a 
flush in the perl, but that doesn't do anything, and I really don't know 
whether that's even the issue.
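
In case it helps, here's roughly what the flush attempt looks like -- the
standard Perl autoflush setting ($| = 1) added to the echo stage (a sketch
of that last stage only, not the full script):
-----------------------------------
(cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log) | \
perl -e '$| = 1; while (<STDIN>) { print }'
-----------------------------------
The $| = 1 turns on autoflush for the currently selected handle (STDOUT),
so each print should go out immediately instead of sitting in perl's output
buffer -- yet the chunked output doesn't change.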


Any ideas?

                                           -jason


