Is attention what you really need in Transformers?
Exploring promising alternatives and improvements to the self-attention mechanism in Transformer models.
7 min read · Jun 2, 2021
In recent years, there has been an explosion of methods based on self-attention, and in particular Transformers, first in the field of Natural Language Processing…