The "Making it Work, On-device" paragraph makes it sound as if TensorFlow Lite will get your model running fast on-device with little effort. In reality, RNNs aren't currently supported by the TFLite converter, and the TFLiteLSTMCell example is extremely slow to train, so this work is actually based on proprietary code unavailable to mere mortals using open-source TensorFlow. If you tried to reproduce it, you'd need several workarounds, would have to dig deep into the TensorFlow source code, and might still end up with a suboptimal TFLite model.<p>Don't get me wrong: in terms of deployability and flexibility for production use, TensorFlow/TFLite is <i>really good</i>, especially compared to other frameworks. But Google tends to significantly oversell the capabilities of open-source TensorFlow in its marketing material, and you only find out when you go and try it yourself.