[CALUG] More I/O buffering foolishness
Allbritten, Mark
Mark.Allbritten at grace.com
Wed Mar 1 10:40:38 CST 2006
Hi Jason,
I've run into this several times with other apps' output. I'm on HP-UX at the moment and can't recreate your problem.
I'm fairly confident that if you just add the following, it should work:
tar xvf /tmp/log 2>&1 | while read X
do
    echo "$X"    # or run grep, ll, awk, etc. against $X here
done
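Applied to your pipeline below, that would look something like this (just a sketch, untested on my end, reusing your /blah and /blah2 paths):

(cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log) | \
while read X
do
    echo "$X"    # one write per extracted filename
done

Since echo does one write per line, nothing downstream of the loop should be handed one big stdio-buffered chunk. And if it's GNU grep on the receiving end, its --line-buffered flag may be worth a try too, though I can't vouch for that flag being available on every platform's grep.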
Best Regards,
Mark
-----Original Message-----
From: lug-bounces at calug.com [mailto:lug-bounces at calug.com] On Behalf Of Jason C. Miller
Sent: Wednesday, March 01, 2006 11:17 AM
To: Allbritten, Mark
Cc: lug at calug.com
Subject: Re: [CALUG] More I/O buffering foolishness
Well... if I do ...
---------------------------------
(cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log)
---------------------------------
... stdout from the tar extraction works just fine. Even if I try to pipe
that output into 'grep', it still buffers everything in chunks, and I'm
really perplexed as to why. The output comes out of tar just fine, but
something seems to only let it through to the next command in chunks.
Really weird.
And, for all those who are curious, I've done all the usual
stdout/stderr redirection testing.
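By "usual" I mean variants along these lines (among others), none of which
changed the chunking:
---------------------------------
(cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log) 2>&1 | grep .
(cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>&1 >/tmp/log) | grep .
---------------------------------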
On Wed, 1 Mar 2006, Allbritten, Mark wrote:
> Hi Jason,
> Perl is taking standard input as one chunk. I believe you need to use a while/readline loop... but if that's all you're using perl for, it would be easier to do something like:
> while read X; do echo "$X"; done
> I don't have a perl while readline snippet handy, but I'd expect something along these lines (untested):
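> perl -e 'while (my $line = <STDIN>) { print $line }'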
> Best Regards,
> Mark
>
>
> -----Original Message-----
> From: lug-bounces at calug.com [mailto:lug-bounces at calug.com] On Behalf Of Jason C. Miller
> Sent: Wednesday, March 01, 2006 9:43 AM
> To: lug at calug.com
> Subject: [CALUG] More I/O buffering foolishness
>
> I've asked a buffering question on here before and have now run into a
> similar-yet-different issue; I was wondering if anyone has any insight.
>
> Here's the short story...
>
>
> At the command line (bash) ...
> -----------------------------------
> (cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log) | \
> perl -e 'while (<STDIN>) { print }'
> -----------------------------------
> ... works fine. All it does is print the stdout from the tar extraction.
>
>
> However, once inserted into a script, the exact same string of commands
> will be block-buffered instead of line-buffered (i.e. it prints output in
> chunks). Granted, this isn't unheard of. It's particularly annoying here
> because the output is being used as stdin for another program (Xdialog).
>
>
> The oddness is that, inside the script ...
> -----------------------------------
> (cd /blah ; tar cvf - * 2>/dev/null) | (cd /blah2 ; tar xvf - 2>/tmp/log)
> -----------------------------------
> ... will line-buffer just fine. It's when I try to pipe that into
> anything else AFTER that point that the buffering issues come up. I've
> tried it with both perl and awk, and they both do the same thing. I tried
> forcing the flush in perl, but that doesn't do anything, and I really
> don't know if that's even the issue.
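> (By "forcing the flush" I mean something like this as the final stage;
> the $| = 1 turns on perl's autoflush so each print goes straight out:
>
> perl -e '$| = 1; while (<STDIN>) { print }'
>
> The output still arrives in chunks either way.)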
>
>
> Any ideas?
>
> -jason
>
--
***************************************************
My blog: http://millersplace.blogspot.com/
***************************************************