Tom7, back to his usual madness. This time he exploits floating-point rounding: first to build a neural-network activation function that is linear (except for rounding errors), then escalating step by step until he has a working 6502 emulator and an argument that linear operations plus rounding errors are Turing complete.
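
If you want to poke at the core trick before watching: here's a minimal sketch of how an algebraically linear function picks up nonlinearity from rounding. The constant 1.0 and float16 precision are illustrative choices, not necessarily Tom7's exact construction.

```python
import numpy as np

# "Identity" computed as (x + 1) - 1 in float16. Algebraically this is
# linear, but adding 1.0 snaps small x onto the coarse float16 grid
# around 1.0, so the output is a staircase -- a usable nonlinearity.
def almost_linear(x):
    one = np.float16(1.0)
    return (np.float16(x) + one) - one

for x in [0.0004, 0.0005, 0.001, 0.002]:
    print(f"{x:.4f} -> {float(almost_linear(x)):.10f}")
```

Running this, 0.0004 comes back as exactly 0.0 while 0.0005 and 0.001 both come back as 0.0009765625 (the float16 spacing near 1.0): distinct inputs collapse onto the same output, which is exactly the kind of step behavior a network can use in place of a conventional activation function.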