I don't see any claims about performance, but I would be very surprised if it were anything better than abysmal. In a modern neural network pipeline, just copying data back to CPU memory is treated as a ridiculously expensive operation, let alone serializing it to a delimited text string.

Come to think of it, this is also a problem with the Unix philosophy in general: it trades performance (user productivity) for flexibility (developer productivity), and that trade-off isn't always worth it. I would love to see a low-overhead version of this that can keep data as packed arrays on a GPU during intermediate steps, but I'm not sure that's possible with the Unix interfaces available today.

Maybe there's a use case with very small networks and CPU evaluation, but so much of the power of modern neural networks comes from scale and performance that I'm skeptical that niche is very large.
Excerpt: "layer is a program for doing neural network inference the Unix way. Many modern neural network operations can be represented as sequential, unidirectional streams of data processed by pipelines of filters. The computations at each layer in these neural networks are equivalent to an invocation of the layer program, and multiple invocations can be chained together to represent the entirety of such networks."<p>Another poster commented that performance might not be that great, but I don't care about performance, I care about the essence of the idea, and the essence of this idea is brilliant, absolutely brilliant!<p>Now, that being said, there is one minor question I have, and that is, how would backpropagation apply to this apparently one-way model?<p>But, that also being said... I'm sure there's a way to do it... maybe there should be a higher-level command which can run each layer in turn, and then backpropagate to the previous layer, if/when there is a need to do so...<p>But, all in all, a brilliant, brilliant idea!!!
What's wonderful about this concept (and the Unix concept in general) is the flexibility it gives you. You can, for example, pipe the stream over the network and distribute inference across machines, or tee the output and save each layer's activations to a file. The possibilities are endless here.
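For instance (again guessing at the `layer` invocation syntax; the tee, ssh, and pipe usage is standard), ordinary Unix tools compose directly with the pipeline:

    # Save the first layer's activations to a file while the pipeline runs.
    ./layer weights1.txt < input.txt | tee layer1.out | ./layer weights2.txt > output.txt

    # Distribute inference: run the second layer on another machine,
    # streaming the activations over ssh (assumes layer and weights exist there).
    ./layer weights1.txt < input.txt | ssh host2 ./layer weights2.txt > output.txt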
Great concept. Would like to see more of this idea applied to neural network processing and configuration in general (which in my experience can sometimes be a tedious, hard-coded affair).
I've been thinking about something like this for a long time, but could never quite wrap my head around a good way to do it (especially since I kept getting stuck on making it full-featured, i.e. more than just inference), so thank you for putting it together! I love the concept, and I'll be playing with this all day!
This might not be a great way to build neural networks (as other commenters have noted regarding performance), but it could be an excellent way to learn about them. I always find the command line helpful for understanding a pipeline of information, stage by stage.
Great idea, but an equally great caveat: it's only for (forward) inference. Unix pipelines are fundamentally one-way, so this approach won't work for backpropagation.
See also Trevor Darrell's group's work on neural module networks: https://bair.berkeley.edu/blog/2017/06/20/learning-to-reason-with-neural-module-networks/
Wonderful idea, and the Chicken Scheme implementation looks nice too.

I wrote some Racket code that reads Keras-trained models and does inference, but this is much better: I used Racket's native array/linear-algebra support, whereas this implementation uses BLAS, which should be a lot faster.
Well, I will say I like the general concept. I just wish it weren't implemented in Scheme (only because I'm not familiar with the language; looking at the source, I'm not sure I want to go there: it looks like a mashup of Pascal, Lisp, and RPN).

It seems that today (and maybe I'm wrong) data science and deep learning have pretty much "blessed" Python and C++ as the languages for such tasks. Had this been implemented in either, it might have received a wider audience.

But maybe the concept itself is more important than the implementation? I can see that possibly being the case.

Great job in creating it; the end tool by itself looks fun and promising!
Reading the title, I can't help but think of "The Unix-Haters Handbook" and groan: why would you want to apply the Unix philosophy to neural nets?