I think this behaviour is a feature of the shell: the stages of a pipeline are executed in parallel.<p>Also, see `sponge` from `moreutils` (<a href="http://kitenet.net/~joey/code/moreutils/" rel="nofollow">http://kitenet.net/~joey/code/moreutils/</a>):<p><pre><code> $ man sponge
...
sponge reads standard input and writes it out to the specified file.
Unlike a shell redirect, sponge soaks up all its input before opening the output file.
This allows constructing pipelines that read from and write to the same file.</code></pre>
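As a quick sketch of how that helps (assuming sponge is installed and file.txt is a hypothetical file you want to filter in place):<p><pre><code> # A plain redirect truncates file.txt before sort can read it:
 #   sort file.txt > file.txt       # leaves file.txt empty
 # sponge buffers all of its input before opening the file, so this is safe:
 sort file.txt | sponge file.txt</code></pre>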
This is the expected behavior of pipes [0]. A quick look at Lists [1] (together with Job Control) shows how you can control this a little better; see the sketch after the references below. jamesdutc also linked to sponge, which may suit your use case as well.<p><pre><code> [0] https://www.gnu.org/software/bash/manual/bashref.html#Pipelines
[1] https://www.gnu.org/software/bash/manual/bashref.html#Lists</code></pre>
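For example, a minimal sketch using a list instead of a pipeline (file.txt and the temporary-file name are hypothetical, chosen just for illustration):<p><pre><code> # In a pipeline, all stages start at once, so the redirect
 # truncates file.txt before the reader gets to it:
 #   cat file.txt | grep foo > file.txt
 # An && list runs the commands sequentially instead:
 grep foo file.txt > file.txt.tmp && mv file.txt.tmp file.txt</code></pre>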
That was a contrived example.<p>The shell sets up the pipe and file redirections for each process in the pipeline, then lets them go; it has no control over when the child processes read or write. In principle it's a race condition, but in practice the output redirection truncates the file before it can be read virtually 100% of the time.
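A minimal sketch of what that looks like in practice (file.txt is a throwaway file created only for the demonstration):<p><pre><code> $ printf 'b\na\n' > file.txt
 $ cat file.txt | sort > file.txt   # redirection truncates file.txt as the pipeline starts
 $ cat file.txt                     # almost always prints nothing</code></pre>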
NB: If you're going to use CSS styling for code examples, choose foreground and background colors (and text sizes) that are actually legible.<p>Monospace fonts, black on white, at default sizes, are your best bets.