BART for Paraphrasing with Simple Transformers

Paraphrasing is the act of expressing something using different words while retaining the original meaning. Let’s see how we can do it with BART, a sequence-to-sequence Transformer model.

Thilina Rajapakse
Towards Data Science
8 min read · Aug 5, 2020

Photo by Alexandra on Unsplash

Introduction

BART is a denoising…
