As pre-built word vectors go, ConceptNet Numberbatch [1], introduced less flippantly as the ConceptNet Vector Ensemble [2], already outperforms this on all the measures evaluated in its paper: Rare Words, MEN-3000, and WordSim-353.<p>This fact is hard to publicize because the luminaries of the field seem to have decided, back when Rare Words performance was around 0.4, that they no longer care about these evaluations. I have had reviewers dismiss as "incremental improvements" work that raised Rare Words from 0.4 to 0.6 and brought MEN-3000 up to a high estimate of inter-annotator agreement.<p>It is possible to do much, much better than Google News skip-grams ("word2vec"), and one thing that helps get there is lexical knowledge of the kind that's in ConceptNet.<p>[1] <a href="https://blog.conceptnet.io/2016/05/25/conceptnet-numberbatch-a-new-name-for-the-best-word-embeddings-you-can-download/" rel="nofollow">https://blog.conceptnet.io/2016/05/25/conceptnet-numberbatch...</a><p>[2] <a href="https://blog.luminoso.com/2016/04/06/an-introduction-to-the-conceptnet-vector-ensemble/" rel="nofollow">https://blog.luminoso.com/2016/04/06/an-introduction-to-the-...</a>
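<p>For anyone who wants to try this at home: benchmarks like Rare Words and WordSim-353 score embeddings by how well cosine similarity between word vectors correlates with human similarity judgments. Here is a minimal sketch of the cosine-similarity side, assuming the standard word2vec-style text format (one word followed by its vector components per line); the three-word sample here is a toy stand-in, not real Numberbatch data.

```python
import math

def parse_vectors(lines):
    """Parse word2vec-style text lines ('word v1 v2 ...') into a dict."""
    vecs = {}
    for line in lines:
        parts = line.split()
        vecs[parts[0]] = [float(x) for x in parts[1:]]
    return vecs

def cosine(u, v):
    """Cosine similarity: dot product over the product of norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy stand-in for a real (and much larger) embedding file.
sample = [
    "cat 0.1 0.9 0.2",
    "dog 0.2 0.8 0.3",
    "car 0.9 0.1 0.8",
]
vecs = parse_vectors(sample)
print(cosine(vecs["cat"], vecs["dog"]))  # higher: related words
print(cosine(vecs["cat"], vecs["car"]))  # lower: unrelated words
```

The benchmark score is then the Spearman rank correlation between these cosine similarities and the human ratings over the benchmark's word pairs.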