Neural Language Models
From feedforward through stacked recurrent
May 7, 2021
In NLP, a language model is a probability distribution over sequences of tokens drawn from a fixed alphabet (vocabulary). A central problem in language modeling is to learn such a model from examples — for instance, a model of English sentences estimated from a training set of sentences.
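To make the definition concrete, here is a minimal sketch of learning a language model from examples: a bigram model estimated by counting over a tiny toy corpus (the corpus and the `prob` helper are illustrative assumptions, not from the article). It assigns a probability to a token sequence via the chain rule, approximating each conditional by the preceding token only.

```python
from collections import Counter

# Toy training corpus; a real model would be estimated from far more text.
corpus = [
    ["the", "cat", "sat"],
    ["the", "dog", "sat"],
    ["the", "cat", "ran"],
]

# Count unigrams and bigrams, padding each sentence with a <s> start token.
unigrams = Counter()
bigrams = Counter()
for sent in corpus:
    tokens = ["<s>"] + sent
    unigrams.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

def prob(sentence):
    """P(sentence) under the bigram approximation of the chain rule:
    P(w1..wn) ~= prod_i P(w_i | w_{i-1}), with counts as estimates."""
    p = 1.0
    tokens = ["<s>"] + sentence
    for prev, cur in zip(tokens, tokens[1:]):
        p *= bigrams[(prev, cur)] / unigrams[prev]
    return p

# P(the|<s>) * P(cat|the) * P(sat|cat) = 3/3 * 2/3 * 1/2
print(prob(["the", "cat", "sat"]))  # → 0.3333...
```

Neural language models replace these count-based conditionals with probabilities computed by a network, which is where the rest of the article picks up.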
Language models have many uses, such as…