Excerpt: "layer is a program for doing neural network inference the Unix way. Many modern neural network operations can be represented as sequential, unidirectional streams of data processed by pipelines of filters. The computations at each layer in these neural networks are equivalent to an invocation of the layer program, and multiple invocations can be chained together to represent the entirety of such networks."<p>Another poster commented that performance might not be that great, but I don't care about performance; I care about the essence of the idea, and the essence of this idea is brilliant, absolutely brilliant!<p>That said, I have one minor question: how would backpropagation apply to this apparently one-way model?<p>Then again, I'm sure there's a way to do it... perhaps a higher-level command could run each layer in turn, and then backpropagate to the previous layer, if and when there is a need to do so.<p>But all in all, a brilliant, brilliant idea!
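<p>To make the "higher-level command" idea concrete, here is a minimal in-process sketch (not the actual layer tool, which only does inference): each layer acts as a filter in the forward direction, caches what it saw, and a driver then walks the same layers in reverse to backpropagate. The Layer class, the affine+ReLU choice, and all the numbers are my own hypothetical illustration, not anything from the project.

```python
import numpy as np

# Hypothetical stand-in for one invocation of the "layer" program:
# a filter that consumes an activation vector and emits the next one.
# Here each filter is an affine transform followed by ReLU.
class Layer:
    def __init__(self, w, b):
        self.w, self.b = w, b

    def forward(self, x):
        self.x = x                       # cache input for the backward pass
        self.z = x @ self.w + self.b
        return np.maximum(self.z, 0.0)   # ReLU

    def backward(self, grad_out, lr=0.01):
        grad_z = grad_out * (self.z > 0)   # ReLU gradient
        grad_x = grad_z @ self.w.T         # gradient to pass to previous layer
        self.w -= lr * self.x.T @ grad_z   # update parameters in place
        self.b -= lr * grad_z.sum(axis=0)
        return grad_x

rng = np.random.default_rng(0)
layers = [Layer(rng.normal(size=(4, 8)), np.zeros(8)),
          Layer(rng.normal(size=(8, 2)), np.zeros(2))]

x = rng.normal(size=(1, 4))

# Forward: chain the layers like a pipeline of filters.
out = x
for layer in layers:
    out = layer.forward(out)

# Backward: the "higher-level command" walks the layers in reverse,
# feeding each one the gradient produced by its successor.
grad = out - np.array([[1.0, 0.0]])   # toy target, squared-error gradient
for layer in reversed(layers):
    grad = layer.backward(grad)
```

The key departure from a pure Unix pipeline is the cached input in each filter: backprop needs the forward activations, so a pipe-based version would have to stash them somewhere (a file per layer, say) for the reverse pass to read.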