Deep Dive into the LSTM-CRF Model

With PyTorch code

Alexey Kravets
Towards Data Science
23 min read · Oct 12, 2023


In the rapidly evolving field of natural language processing, Transformers have emerged as the dominant models, demonstrating remarkable performance across a wide range of sequence modelling tasks, including part-of-speech tagging, named entity recognition, and chunking. Before the Transformer era, Conditional Random Fields (CRFs) were the go-to tool for sequence modelling. In particular, linear-chain CRFs represent the label sequence as a chain-structured graph, while CRFs in general can be defined over arbitrary graph structures.
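To make the linear-chain structure concrete, here is a minimal sketch of how such a CRF scores a single tag sequence: each position contributes an emission score (in an LSTM-CRF, these come from the LSTM encoder) and each adjacent pair of tags contributes a transition score. The function name `sequence_score` and the toy tensors are illustrative, not from the article.

```python
import torch

def sequence_score(emissions, tags, transitions):
    # Score of one tag sequence under a linear-chain CRF.
    # emissions:   (seq_len, num_tags) unary scores, e.g. from an LSTM
    # tags:        (seq_len,) integer tag indices
    # transitions: (num_tags, num_tags); transitions[i, j] scores moving
    #              from tag i to tag j between adjacent positions
    score = emissions[0, tags[0]]
    for t in range(1, emissions.size(0)):
        score = score + transitions[tags[t - 1], tags[t]] + emissions[t, tags[t]]
    return score

# Toy example: 3 time steps, 2 tags, zero transition scores,
# so the sequence score is just the sum of the chosen emissions.
emissions = torch.tensor([[1.0, 0.0],
                          [0.0, 1.0],
                          [1.0, 0.0]])
transitions = torch.zeros(2, 2)
tags = torch.tensor([0, 1, 0])
print(sequence_score(emissions, tags, transitions).item())  # 3.0
```

Training a CRF maximizes this score for the gold sequence relative to the log-sum-exp over all possible sequences, which is where the forward algorithm (covered later in the article) comes in.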

