Attention: Sequence-to-Sequence Models with the Attention Mechanism

A detailed explanation of the attention mechanisms proposed by Bahdanau and Luong for sequence-to-sequence models

Renu Khandelwal
Towards Data Science
8 min read · Jan 20, 2020

--

In this article, you will learn:

  • Why do we need attention mechanisms for sequence-to-sequence models?
  • How does Bahdanau’s attention mechanism work?
  • How does Luong’s attention mechanism work?
  • What is local and global attention?
