Word Embedding (Part I)

Intuition and (some) maths to understand the Skip-gram model end to end

Matyas Amrouche
Towards Data Science
4 min read · Feb 27, 2019

Photo by Susan Yin on Unsplash

The original challenge of NLP is encoding a word or sentence into a format a computer can process. Representing words as vectors in a vector space is what allows NLP models to learn from text.
A first, simple way to represent words as vectors is one-hot encoding, as shown in Figure 1.
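To make this concrete, here is a minimal sketch of one-hot encoding in Python. The toy vocabulary and the helper function are illustrative choices, not taken from the article: each word is mapped to a vector whose length equals the vocabulary size, with a single 1 at that word's index.

```python
import numpy as np

# Illustrative toy vocabulary (not from the article)
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot(word: str) -> np.ndarray:
    """Return a vector of zeros with a single 1 at the word's index."""
    vec = np.zeros(len(vocab))
    vec[word_to_index[word]] = 1.0
    return vec

print(one_hot("queen"))  # [0. 1. 0. 0.]
```

Note that the dimensionality of these vectors grows with the vocabulary size, which is one reason denser representations like Skip-gram embeddings are appealing.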
