It's really difficult to overstate how important embeddings are going to be for ML.<p>Word embeddings have already transformed NLP. When most people I know sit down to work on an NLP task, the first thing they do is use an off-the-shelf library to turn the text into a sequence of embedded tokens. They don't even think about it; it's just the natural first step, because it makes everything so much easier.<p>In the last couple of years, embeddings for other data types (images, whole sentences, audio, etc.) have started to enter mainstream practice too. You can get near-state-of-the-art image classification with a pretrained image embedding, a few thousand labeled examples, and a logistic regression trained on your laptop CPU. It's astonishing.<p>(Note: I work on <a href="https://www.basilica.ai" rel="nofollow">https://www.basilica.ai</a> , an embeddings-as-a-service company, so I'm definitely a little bit biased.)
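<p>The embedding-plus-logistic-regression recipe is simple enough to sketch. Here's a minimal, hedged version in Python with scikit-learn; the 512-dimensional "embeddings" are simulated with synthetic class-clustered vectors, but in practice they'd come from a pretrained model (the dimensionality, class count, and data are all made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulate 512-dim embeddings for two classes. In a real pipeline these
# vectors would come from a pretrained image model's penultimate layer.
n, dim = 2000, 512
centers = rng.normal(size=(2, dim))          # one center per class
labels = rng.integers(0, 2, size=n)          # class label per example
embeddings = centers[labels] + rng.normal(scale=2.0, size=(n, dim))

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.25, random_state=0)

# A plain logistic regression on top of the embeddings -- small enough
# to train in seconds on a laptop CPU.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

The point is that all the heavy lifting lives in the pretrained embedding; the classifier on top can be trivially small.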