It seems to me that this can be achieved with the following bash-native way of creating extra file descriptor pipes:<p><pre><code> pipe_path="$(mktemp -u)"
mkfifo "$pipe_path"
exec 3<>"$pipe_path"
rm -f "$pipe_path"
</code></pre>
Here, exec associates the file descriptor (3 here; replace with any desired descriptor) with the pipe created by mkfifo. The filesystem path to the pipe is removed immediately after we obtain a file descriptor to it, so that the only remaining reference to the pipe in the system is from this script; when the script dies, the kernel automatically frees the pipe.<p>An example use case would be like so: <a href="https://unix.stackexchange.com/a/216475/585293" rel="nofollow">https://unix.stackexchange.com/a/216475/585293</a>
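For concreteness, a minimal sketch (mine, not from the linked answer) of using the descriptor once it is set up this way:<p><pre><code> echo "hello" >&3   # write into the anonymous pipe via fd 3
read -u 3 line     # read it back from the same descriptor
echo "$line"       # prints: hello
exec 3>&-          # close the descriptor when done
</code></pre>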
Hm, interesting. Also see dgsh, the directed graph shell.<p><a href="https://www2.dmst.aueb.gr/dds/sw/dgsh/" rel="nofollow">https://www2.dmst.aueb.gr/dds/sw/dgsh/</a><p><a href="https://github.com/dspinellis/dgsh">https://github.com/dspinellis/dgsh</a><p><a href="https://news.ycombinator.com/item?id=21700014">https://news.ycombinator.com/item?id=21700014</a><p>dgsh uses Unix domain sockets, not pipes. I don't remember exactly why, but it's in the paper; perhaps to avoid deadlocks compared to pipes.<p>I'd also be interested in more examples with pipexec or dgsh!
This is neat, but outside of a contrived ouroboros example, what’s a real world use case for this?<p>There’s a natural flow of outputs becoming inputs and I’m struggling to identify a situation where I would feed things back into the source. Also, named pipes kind of solve that already.
What a beautifully designed tool. In our Python codebase we end up reaching for inline sh scripting a lot whenever we need to pipe between processes. In a way it feels ok — after all, no one has any qualms about reaching for inline SQL to get things done, so what’s wrong with a little shell script in the middle of a Python module?<p>Just as there are efforts — both wise and misguided — to represent the building of an SQL query with Python syntax, what Python tools are there to build sh pipelines between processes with a more pythonic syntax? Do they provide value in excess of the novelty tax one has to pay for using a non standard library?
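For comparison, a minimal sketch of a two-stage pipeline built with nothing but the standard library's subprocess module (the ls/grep stages are arbitrary placeholders):<p><pre><code> import subprocess

# "ls | grep py" without spawning a shell
ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", "py"], stdin=ls.stdout, stdout=subprocess.PIPE)
ls.stdout.close()  # so grep sees EOF (and SIGPIPE semantics) correctly
out, _ = grep.communicate()
print(out.decode())
</code></pre>
Third-party wrappers such as plumbum or sh put an operator syntax on top of this, though I can't say whether they clear the novelty-tax bar.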
It's an interesting package, and I have some of the use cases it appears to address. However, the documentation is inadequate for quickly understanding how to robustly build some of the more complex cases, in particular bash-style process substitution. Robust here means the pipeline exits non-zero if any of the substituted processes fail, something bash itself does not guarantee, as this example demonstrates:<p><pre><code> #!/bin/bash
set -beEu -o pipefail
cat <(date) <(false)
echo did not exit non-zero
</code></pre>
If this were addressed, it would be worth spending more time on pipexec.
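For reference, one bash-level workaround looks like the sketch below (assuming bash >= 4.4, where wait can reap the most recent process substitution via $!; note it only covers the last substitution, not all of them):<p><pre><code> #!/bin/bash
set -eEu -o pipefail
cat <(false)
wait "$!"  # $! holds the PID of the last process substitution;
           # wait propagates its exit status, so set -e aborts here
echo not reached
</code></pre>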
Considering how trendy airflow/dagster are these days, and concurrency too, I assume a leaner, OS- and language-agnostic solution to this problem might emerge in the not-too-distant future.
In the category of “command line representations of graphs” see also ffmpeg’s filtergraphs [1].<p>[1] <a href="https://ffmpeg.org/ffmpeg-filters.html#Filtering-Introduction" rel="nofollow">https://ffmpeg.org/ffmpeg-filters.html#Filtering-Introduction</a>
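For a taste of the syntax, a small sketch (file names hypothetical) that splits the video into two branches, scales one down, and overlays it picture-in-picture:<p><pre><code> ffmpeg -i in.mp4 \
  -filter_complex "[0:v]split[full][copy];[copy]scale=320:-1[small];[full][small]overlay=W-w-10:H-h-10" \
  out.mp4
</code></pre>
Each [label] names an edge of the graph, so the branching-pipeline idea pipexec targets shows up here as pad labels.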
It would be cool to have a tool similar to this, but for composing a graph of commands similar to aws step functions (or Jenkins pipelines). I’d call it mapexec :)