Explore how (stochastic) gradient descent works on a simple linear regression. I built this demo to help me better understand backpropagation by watching the values of the weights as they get updated.

I've also added step-by-step remarks and graph plots of the weight values and the loss.

Things you can play around with: optimiser, learning rate, variable initialiser, loss function, batch size, no. of epochs.

JavaScript libraries used: Dagre-D3 (GraphViz + d3), MathJax, ApexCharts, jQuery.

Any comments on this demo are welcome!
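For readers who want the idea in code rather than in the interactive graph, here is a minimal sketch of mini-batch SGD on a linear regression `y = w*x + b` with mean-squared-error loss. This is not the demo's actual implementation; the data, learning rate, batch size, and epoch count are all hypothetical placeholders for the knobs listed above.

```javascript
// Hypothetical noise-free dataset: y = 3*x + 2, so the true weights are w = 3, b = 2.
const data = Array.from({ length: 100 }, (_, i) => {
  const x = i / 50; // x in [0, 2)
  return [x, 3 * x + 2];
});

let w = 0, b = 0;        // variable initialiser: zeros
const lr = 0.05;         // learning rate
const batchSize = 10;
const epochs = 500;      // no. of epochs

for (let epoch = 0; epoch < epochs; epoch++) {
  // Walk through the data in mini-batches (no shuffling, to keep this deterministic).
  for (let start = 0; start < data.length; start += batchSize) {
    const batch = data.slice(start, start + batchSize);
    let gw = 0, gb = 0;
    for (const [x, y] of batch) {
      const err = w * x + b - y; // prediction error for this sample
      gw += err * x;             // accumulates dL/dw (up to the factor of 2 below)
      gb += err;                 // accumulates dL/db
    }
    // Gradient step on the mean-squared-error over the batch.
    w -= lr * (2 * gw / batch.length);
    b -= lr * (2 * gb / batch.length);
  }
}

console.log(w, b); // both should approach the true values 3 and 2
```

Swapping the loss, the initial values of `w` and `b`, or the `lr`/`batchSize`/`epochs` constants mirrors the knobs the demo exposes; plotting `w`, `b`, and the batch loss after each step reproduces the kind of charts the demo draws.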