Hey HN!<p>This is our latest demo for TensorFire (<a href="https://tenso.rs/" rel="nofollow">https://tenso.rs/</a>), a JavaScript library for running neural networks in the browser, GPU-accelerated by WebGL.<p>Each of these models is a simple two-layer char-rnn (the kind explored in Karpathy's "Unreasonable Effectiveness of RNNs" post <a href="http://karpathy.github.io/2015/05/21/rnn-effectiveness/" rel="nofollow">http://karpathy.github.io/2015/05/21/rnn-effectiveness/</a>). When you hit "Tab", the editor feeds the contents of your document letter by letter into a network that tries to predict the most likely next letter. It samples from that probability distribution and feeds the sample back into the network to generate an arbitrarily long string.<p>We've published code and instructions for training new models on GitHub (<a href="https://github.com/tensorfire/cyborg-writer/tree/master/training" rel="nofollow">https://github.com/tensorfire/cyborg-writer/tree/master/trai...</a>). With about a megabyte of any sort of text, you can train a model and easily embed it in any web application.<p>We (@bijection and @antimatter15) will be answering questions in the comments.
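The predict/sample/feed-back loop described above can be sketched in a few lines. This is a hypothetical illustration, not the TensorFire code: `predict_probs` here is a stand-in for the real two-layer char-rnn, faked with a smoothed bigram count table over a toy corpus.

```python
import numpy as np

# Toy corpus; the real demo trains on ~1 MB of text.
corpus = "hello hacker news hello world "
chars = sorted(set(corpus))
idx = {c: i for i, c in enumerate(chars)}

# Add-one-smoothed bigram counts stand in for the RNN's learned weights.
counts = np.ones((len(chars), len(chars)))
for a, b in zip(corpus, corpus[1:]):
    counts[idx[a], idx[b]] += 1

def predict_probs(ch):
    """Return a probability distribution over the next character."""
    row = counts[idx[ch]]
    return row / row.sum()

def generate(seed, length, rng=np.random.default_rng(0)):
    """Sample a next letter, append it, and feed it back in, repeatedly."""
    out = seed
    for _ in range(length):
        probs = predict_probs(out[-1])                 # predict next-letter distribution
        out += chars[rng.choice(len(chars), p=probs)]  # sample and feed back
    return out

print(generate("h", 40))
```

Swapping the bigram table for a trained RNN's output layer gives the same loop the demo runs; the string can be extended indefinitely because each sampled letter becomes the next input.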
Ha very nice!<p>I made a very similar (but much rougher around the edges) project for fun a few months ago using a smaller French corpus of press releases from <i>La Quadrature du Net</i> (a non-profit lobbying for net neutrality in France and Europe).<p>There is a live demo at <a href="http://jorquera.net/quadra_neural/" rel="nofollow">http://jorquera.net/quadra_neural/</a>. You can press tab to auto-complete word-by-word, or just start typing and the suggestions update on the fly. Instead of "Weirdness" I called my cursor "Drunkenness" :)<p>I used keras-js (<a href="https://github.com/transcranial/keras-js" rel="nofollow">https://github.com/transcranial/keras-js</a>) to run my model in the browser.<p>If you are interested in this kind of experiment, do go for it! It's quite accessible, and projects of this scale do not require heavy hardware (I generated my models using a GTX 770).<p>If you need a kickstart, in addition to the repo from the OP, all the code for my project is Free Software (<a href="https://github.com/tomjorquera/quadra_neural" rel="nofollow">https://github.com/tomjorquera/quadra_neural</a>), and I tried to document the generation process in a Jupyter notebook: <a href="https://github.com/tomjorquera/quadra_neural/blob/master/generation/model_gen.ipynb" rel="nofollow">https://github.com/tomjorquera/quadra_neural/blob/master/gen...</a>. While I did it on a French corpus, it should work with any language.<p>I used two main resources for inspiration:<p>- <a href="http://karpathy.github.io/2015/05/21/rnn-effectiveness/" rel="nofollow">http://karpathy.github.io/2015/05/21/rnn-effectiveness/</a><p>- <a href="https://github.com/fchollet/keras/blob/master/examples/lstm_text_generation.py" rel="nofollow">https://github.com/fchollet/keras/blob/master/examples/lstm_...</a>
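A "Weirdness"/"Drunkenness" knob like the ones in both demos is commonly implemented as a softmax temperature applied to the predicted distribution before sampling (this is how the Keras `lstm_text_generation.py` example does it too). A minimal sketch, assuming a distribution has already been predicted:

```python
import numpy as np

def sample_with_temperature(probs, temperature=1.0, rng=np.random.default_rng(0)):
    """Re-shape a next-character distribution before sampling.

    temperature < 1 sharpens the distribution (safer, more repetitive text);
    temperature > 1 flattens it ("weirder"/"drunker" output).
    """
    logits = np.log(np.asarray(probs) + 1e-12) / temperature
    scaled = np.exp(logits - logits.max())  # subtract max for numerical stability
    scaled /= scaled.sum()
    return rng.choice(len(scaled), p=scaled)

# With a peaked distribution and low temperature, sampling almost
# always picks the most likely index.
probs = [0.7, 0.2, 0.1]
picks = [sample_with_temperature(probs, 0.1, np.random.default_rng(i)) for i in range(100)]
print(picks.count(0))
```

At temperature 1.0 this reduces to sampling from the raw distribution; as temperature grows, every character becomes nearly equally likely.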
Pretty hilarious. I've found Tupac to work the best, e.g. (my starting text in quotes):<p><pre><code> "I was reading Hacker News when" he gladed out
Somebody excuse I went down today
I wonder if I give a fuck, when I creep
</code></pre>
For some reason Donald Trump always seems to produce garbage...
Eh, it's scarcely better than a Markov chain. I was hoping for something a little more useful, like the ability to scroll through many similar words or concepts.
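Surfacing several candidates instead of one sample is a small change on top of the same model: take the top-k entries of the predicted next-token distribution rather than drawing once. A hypothetical sketch (the vocabulary and probabilities are made up):

```python
import numpy as np

def top_k_candidates(probs, vocab, k=3):
    """Return the k most likely next tokens with their probabilities,
    e.g. to populate a scrollable suggestion list."""
    probs = np.asarray(probs)
    order = np.argsort(probs)[::-1][:k]  # indices sorted by descending probability
    return [(vocab[i], float(probs[i])) for i in order]

vocab = ["the", "a", "when", "where", "while"]
print(top_k_candidates([0.4, 0.3, 0.2, 0.05, 0.05], vocab, k=3))
```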
Fun with Shakespeare:<p>It seemed, somehow mock'd; Unbashed as it was my father.<p>My lady shall have him, perchance and reasons will abide it. Upon his unspoken thing, the devil shall see.<p>MISTRESS QUICKLY: Hide amongst you the gods? Wherefore comes here, sir?
We have to be pretty close to someone having a system that generates mountains of convincing/unique "content" using neural nets (or something comparable) for the purposes of web spam/SEO, right?<p>Back in the day, SEOs would use spun content to generate lots of unique variants. But at this point, surely an ML algorithm can crack this nut?