XLM-RoBERTa: The alternative for non-English NLP

Are multilingual models closing the gap on single-language models?

Tower of Babel (image from Wikipedia)

If you are doing NLP in a non-English language, you will often agonise over the question "Which language model should I use?" While there is a growing number of monolingual models trained by the community, there is also an alternative that seems to get less attention…


Written by Branden Chan, Machine Learning Engineer at deepset.ai developing cutting-edge NLP systems.

Towards Data Science

