TensorFlow has a whole bunch of tutorials, but those are the "Hello World"s of TensorFlow, not of neural networks.

To get started with neural networks, begin by drawing simple neural nets for basic operations like addition, multiplication, and XOR. Just represent boolean truth tables as neural networks.

Once you can do that, move on to implementing the algorithm yourself. A simple 3-layer network is enough to understand how the concept works; 4/2/2 nodes is plenty. Just understand how the calculations work (rough sketch below).

Then move on to a framework, but only after you have understood the math. The machine learning course on Coursera by Andrew Ng explains the algorithms.
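For illustration, here is a minimal NumPy sketch of that "implement it yourself" step: a tiny sigmoid network trained on the XOR truth table with plain backpropagation. The 2/2/1 layer sizes, learning rate, epoch count, and seed are my own illustrative choices, not anything the math forces on you.

    import numpy as np

    # Tiny 2-2-1 sigmoid network trained on XOR with plain backpropagation.
    # Layer sizes, learning rate, and epoch count are arbitrary illustrative choices.

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # XOR truth table: inputs and expected outputs
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Weights and biases for the hidden (2 units) and output (1 unit) layers
    W1 = rng.normal(size=(2, 2))
    b1 = np.zeros((1, 2))
    W2 = rng.normal(size=(2, 1))
    b2 = np.zeros((1, 1))

    lr = 1.0
    for epoch in range(5000):
        # Forward pass
        h = sigmoid(X @ W1 + b1)        # hidden activations
        out = sigmoid(h @ W2 + b2)      # network output

        # Backward pass (squared-error loss; sigmoid derivative is a * (1 - a))
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Gradient-descent updates
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    # Final predictions; should end up close to [[0], [1], [1], [0]]
    print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))

If the outputs get stuck near 0.5, a different seed or more epochs usually fixes it; XOR is notorious for shallow local minima. The point is just to see every matrix multiply and gradient written out by hand before a framework hides them.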
Not trying to be a smartass, but if "Hello, World" is the most basic program, why wouldn't a single-layer perceptron model be what OP is looking for?
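By single-layer perceptron I mean something like the sketch below: one weight vector, a step activation, and the classic perceptron learning rule. The AND target, learning rate, and epoch count are arbitrary illustrative choices (XOR is deliberately avoided here, since a single layer can't separate it).

    import numpy as np

    # Single-layer perceptron trained on AND with the perceptron learning rule.
    # Learning rate and epoch count are illustrative, not tuned.

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 0, 0, 1], dtype=float)  # AND truth table

    w = np.zeros(2)
    b = 0.0
    lr = 0.1

    for _ in range(20):
        for xi, target in zip(X, y):
            pred = 1.0 if xi @ w + b > 0 else 0.0
            # Perceptron update: nudge weights toward misclassified examples
            w += lr * (target - pred) * xi
            b += lr * (target - pred)

    print([1.0 if xi @ w + b > 0 else 0.0 for xi in X])  # -> [0.0, 0.0, 0.0, 1.0]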
I once implemented a simple backpropagation algorithm in Haskell (without any libraries) that could identify the pattern (one among 'A', 'B', 'C', or 'D') represented on an 8x8 matrix.

Here is the code:

https://bitbucket.org/sras/haskell-stuff/src/b58f3fc017ce303fd9733184f599e916739f6ed2/snn.hs?at=default&fileviewer=file-view-default