In natural language understanding, there is a hierarchy of lenses through which we can extract meaning: from words to sentences to paragraphs to documents. At the document level, one of the most useful ways to understand text is by analyzing its topics.
Some time ago I gave an introductory talk on word embeddings, and this write-up is an extended version of the part about the philosophical ideas behind word vectors.
ConceptNet Numberbatch consists of state-of-the-art semantic vectors (also known as word embeddings) that can be used directly as a representation of word meanings or as a starting point for further machine learning.
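The core use of such vectors is comparing word meanings by the angle between them. As a minimal sketch, the snippet below computes cosine similarity over small made-up vectors; the words and values are illustrative only (real Numberbatch embeddings are 300-dimensional and are loaded from the distributed data files, not hand-written):

```python
import math

# Toy 4-dimensional vectors standing in for real word embeddings.
# These values are invented for illustration, not taken from Numberbatch.
vectors = {
    "cat": [0.9, 0.1, 0.3, 0.0],
    "dog": [0.8, 0.2, 0.35, 0.05],
    "car": [0.0, 0.9, 0.1, 0.4],
}

def cosine(a, b):
    """Cosine similarity: the standard way to compare embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine(vectors["cat"], vectors["dog"]))  # high: related meanings
print(cosine(vectors["cat"], vectors["car"]))  # low: unrelated meanings
```

With real Numberbatch vectors the pattern is the same: nearby vectors mark words with similar meanings, which is what lets them serve either as a direct representation or as input features for downstream machine learning.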