
Tinker with a Neural Network in Your Browser

855 points, by shancarter, about 9 years ago

22 comments

erostrate, about 9 years ago

The swiss roll problem also illustrates nicely the idea behind deep learning.

Before deep learning, people would manually design all these extra features (sin(x_1), x_1^2, etc.) because they thought they were necessary to fit this swiss roll dataset. So they would use a shallow network with all these features, like this: http://imgur.com/H1cvt8d

Then the deep learning guys realized that you don't have to engineer all these extra features: you can use just the basic features x_1, x_2 and let the network learn more complicated transformations in subsequent layers. So they would use a deep network with only x_1, x_2 as inputs: http://imgur.com/XBRjROP

Both approaches work here (loss < 0.01). The difference is that for the first one you have to manually choose the extra features sin(x_1), x_1^2, ... for each problem, and the more complicated the problem, the harder it is to design good features. People in the computer vision community spent years trying to design good features for, e.g., object recognition. But eventually some people realized that deep networks could learn these features themselves, and that's the main idea in deep learning.
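The contrast erostrate describes can be sketched in a few lines. This is a hypothetical illustration (assuming numpy and scikit-learn are available), using the playground's circle dataset rather than the spiral, with a linear classifier standing in for a shallow network: raw x_1, x_2 inputs fail, while the hand-engineered squared and sine features succeed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def make_circle(n=200, seed=0):
    # inner disc (class 0) vs. outer ring (class 1), like the playground's circle dataset
    rng = np.random.default_rng(seed)
    ang = rng.uniform(0, 2 * np.pi, 2 * n)
    rad = np.r_[rng.uniform(0, 1, n), rng.uniform(2, 3, n)]
    X = np.c_[rad * np.cos(ang), rad * np.sin(ang)]
    y = np.r_[np.zeros(n), np.ones(n)]
    return X, y

def engineered(X):
    # the playground's extra hand-designed inputs: squares, product, sines
    x1, x2 = X[:, 0], X[:, 1]
    return np.c_[x1, x2, x1**2, x2**2, x1 * x2, np.sin(x1), np.sin(x2)]

X, y = make_circle()
raw = LogisticRegression(max_iter=2000).fit(X, y).score(X, y)
eng = LogisticRegression(max_iter=2000).fit(engineered(X), y).score(engineered(X), y)
print(f"raw features: {raw:.2f}, engineered features: {eng:.2f}")
```

No line through the plane separates a disc from the ring around it, but in the engineered feature space x_1^2 + x_2^2 makes the two classes linearly separable, which is exactly why those features used to be designed by hand.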
eggy, about 9 years ago

I started reading about ANNs in the 1980s, and had similar confusion to those here, since it was just for fun. I suggest reading a basic book or online resource that covers the basics [1]. I struggled through $200 textbooks and jumped from one to the other as an autodidact. I am now studying TWEANNs (Topology and Weight Evolving Artificial Neural Networks), which are basically what you see here, except that they can change not only their weights but also their topology, that is, how many neurons and layers there are and where. ANNs (Artificial Neural Networks, as opposed to biological ones) can be a lot of fun, and are very relevant to machine learning and big data nowadays. It was exploratory for me; I used them for generative art and music programs. Be careful: soon you'll be reading about genetic algorithms, genetic programming [2], and artificial life ;) Genetic programming can be used to evolve neural networks as well as to generate computer programs that solve a problem in a specified domain. Hint: you'll probably want to use Lisp/Scheme for genetic programming!

[1] http://natureofcode.com/book/chapter-10-neural-networks/
[2] http://www.genetic-programming.com
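A minimal sketch of the evolutionary half of this idea, assuming numpy: a tiny fixed-topology network whose weights are evolved by mutation and selection to fit XOR. A real TWEANN would additionally mutate the topology (adding or removing neurons and connections); this toy evolves weights only.

```python
import numpy as np

def forward(w, X):
    # unpack a flat 17-number genome into a 2-4-1 tanh network
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16].reshape(4, 1), w[16]
    h = np.tanh(X @ W1 + b1)
    return np.tanh(h @ W2 + b2).ravel()

def mse(w, X, y):
    return np.mean((forward(w, X) - y) ** 2)

def evolve(X, y, pop=50, gens=200, seed=0):
    rng = np.random.default_rng(seed)
    population = rng.normal(size=(pop, 17))
    for _ in range(gens):
        errors = np.array([mse(w, X, y) for w in population])
        parents = population[np.argsort(errors)[: pop // 5]]   # keep the fittest 20%
        kids = parents[rng.integers(0, len(parents), pop - len(parents))]
        kids = kids + rng.normal(scale=0.3, size=kids.shape)   # mutate copies of parents
        population = np.vstack([parents, kids])
    errors = np.array([mse(w, X, y) for w in population])
    return population[int(np.argmin(errors))]

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1.0, 1.0, 1.0, -1.0])  # XOR, with tanh-friendly targets
best = evolve(X, y)
print("best MSE:", mse(best, X, y))
```

No gradients anywhere: fitness, selection, and mutation do all the work, which is what lets this family of methods also evolve structure that backprop can't touch.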
nl, about 9 years ago

This is great, but I think they should make it clear that this isn't using TensorFlow.

From the title and domain I thought they had either ported TF to JavaScript(!) or were connecting to a server.
minimaxir, about 9 years ago

When it says "right here in your browser," it's not joking. On my desktop (Safari), the window becomes unresponsive after a few iterations. This does not happen in Chrome.

On my phone (Safari/iOS 9.3), the default neural network doesn't converge at all even after 300 iterations, while it does on the desktop, which is legit weird: https://i.imgur.com/KNaXeHH.png
danielvf, about 9 years ago

In case you are an idiot like me: you have to train your neural network by pressing "play".
gojomo, about 9 years ago

While they don't involve training, these 'confusion matrix' animations of NNs classifying images or digits are fun, too:

http://ml4a.github.io/dev/demos/cifar_confusion.html
http://ml4a.github.io/dev/demos/mnist_confusion.html

Something about the high-speed updating makes me think of WOPR, in 'WarGames', scoring nuclear-war scenarios.
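For anyone wondering what those animations are tallying: a confusion matrix just counts, for each true class, how often each class was predicted. A minimal numpy sketch:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    # rows are true classes, columns are predicted classes
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m

y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
print(confusion_matrix(y_true, y_pred, 3))
# diagonal entries are correct classifications; off-diagonal entries are confusions
```

In the linked demos, that matrix is redrawn after every batch of predictions, which is what produces the flickering-grid effect.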
timroy, about 9 years ago

This demonstration goes really well with Michael Nielsen's http://neuralnetworksanddeeplearning.com/. At the bottom of the page the author gives a shout-out to Nielsen, Bengio, and others.

For someone (like me) who's done a bit of reading but not much implementation, this playground is fantastic!
CGamesPlay, about 9 years ago

Neat stuff, fun to play with. I wasn't able to get a net to classify the swiss roll. Last time I was playing around with this stuff, I found that the single biggest factor in success was the optimizer used. Is this just using simple gradient descent? I would like to see a dropdown for different optimizers.
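The difference an optimizer makes comes down to the update rule. Here is a sketch of plain gradient descent next to a momentum variant, on a toy 1-D quadratic (a hypothetical illustration, not the playground's actual code):

```python
# Minimizing f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).

def sgd_step(w, grad, lr=0.1):
    return w - lr * grad

def momentum_step(w, v, grad, lr=0.1, beta=0.9):
    v = beta * v - lr * grad   # velocity: a decaying sum of past gradients
    return w + v, v

w_sgd, w_mom, v = 0.0, 0.0, 0.0
for _ in range(100):
    w_sgd = sgd_step(w_sgd, 2 * (w_sgd - 3))
    w_mom, v = momentum_step(w_mom, v, 2 * (w_mom - 3))
print(w_sgd, w_mom)  # both approach the minimum at w = 3
```

On this convex toy both rules converge; the practical differences show up on the ravines and plateaus of real loss surfaces, where momentum (and adaptive methods like Adam) can escape regions where plain gradient descent crawls.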
_AllisonMobley, about 9 years ago

Can somebody explain what I'm watching when I press play?
karpathy, about 9 years ago

This is very nice! I think the reason the swiss roll doesn't work as easily might be initialization. In 2 dimensions you have to be very careful with initializing the weights and biases, because small networks get stuck in bad local minima more easily.
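The initialization sensitivity karpathy mentions is easy to demonstrate: with badly scaled weights, activations collapse as depth grows, while a 1/sqrt(fan_in) ("Xavier-style") scale keeps them alive. A numpy sketch (an illustration, not the playground's code):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 100))  # a batch of unit-variance inputs

def activation_std(weight_scale, layers=10, width=100):
    # push the batch through `layers` tanh layers and report the final spread
    h = x
    for _ in range(layers):
        W = rng.normal(scale=weight_scale, size=(width, width))
        h = np.tanh(h @ W)
    return h.std()

tiny = activation_std(0.01)                # too small: activations collapse toward 0
scaled = activation_std(1 / np.sqrt(100))  # ~1/sqrt(fan_in): activations stay alive
print(tiny, scaled)
```

Once activations (and hence gradients) collapse, the network effectively stops learning, which is one way a small net ends up stuck near its starting point.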
danblick, about 9 years ago

Has anyone been able to learn a function for the spiral (swiss roll) data that's as good as a human-designed function would be?
halotrope, about 9 years ago

You could totally optimise network architecture by crowdsourcing topology discovery for different problems into a multiplayer game, with loss as the score.
Your_Creator, about 9 years ago

So glad ANNs are becoming mainstream.

Eventually they will have to be recognized as a new species of life, so I hope programmers, tinkerers, and everyone else keeps that in mind, because all life must be respected.

And this particular form will be our responsibility. We can either embrace it as we continue to merge with our technology, or we can allow ourselves to go extinct like so many other species already have.

For the naysayers: ever notice how attached we are to our phones? Many behave as if they are missing a limb without one. That's because they are; the brain adapts rapidly, and for many, the brain has adapted to outsourcing our cognition. It used to be books, day runners, journals, diaries; now we have devices, and soon they'll be implants or prosthetics.

The writers at Marvel who came up with the idea of calling Iron Man's suit a prosthetic were definitely onto something, and suits like that are probably our best chance of successfully colonizing other planets. We'll need AI to be our friend out there, working with us.
aab0, about 9 years ago

This is a lot of fun. The default dataset is too easy, though; try the Swiss Roll one!
sparky_, about 9 years ago

This is a very cool toy. As someone with no experience in ML, I find this an interesting visual approach to the absolute basics.

And it's great for challenging your friends to an epic battle of convergence!
pkaye, about 9 years ago

I'm not well versed in neural networks, but a lot of the new neural network software stacks coming out seem to be quite plug and play. What kind of expertise will engineers need a few years from now, when the technology is well developed and doesn't need to be rewritten from scratch every time?
plafl, about 9 years ago

Beautiful. The next time someone asks what machine learning is about, I'm going to send them a link to this page.
nxzero, about 9 years ago

"Don't Worry, You Can't Break It. We Promise."

(Nice, but it's completely unclear what's going on.)
nkozyra, about 9 years ago

Is a 50/50 training:test split a normal default ratio for an ANN? I expected to see a higher proportion of training data as the initial setting.
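For reference, a train/test split is just a shuffled slice of the data, and the ratio is a free parameter. A minimal numpy sketch (a hypothetical helper, not the playground's code):

```python
import numpy as np

def train_test_split(X, y, train_ratio=0.5, seed=0):
    # shuffle once, then slice: the first chunk trains, the remainder tests
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(train_ratio * len(X))
    return X[idx[:cut]], X[idx[cut:]], y[idx[:cut]], y[idx[cut:]]

X = np.arange(20).reshape(10, 2)
y = np.arange(10)
Xtr, Xte, ytr, yte = train_test_split(X, y, train_ratio=0.8)
print(len(Xtr), len(Xte))  # 8 2
```

Splits like 70/30 or 80/20 are common defaults elsewhere; a 50/50 split just reserves an unusually large held-out set for estimating generalization.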
hyh1048576, about 9 years ago

One of the finest data visualizations I've seen.
icelancer, about 9 years ago

This is so great. It's an easy way to show my friends WTF I do sometimes for math/CS work. Thank you so much.
imaginenore, about 9 years ago

I wish they had more interesting datasets.