Classifying Documents with Quantum-enhanced Transfer Learning

Supplementing pre-trained models with variational quantum circuits brings us to the era of quantum-enhanced Natural Language Processing

Riccardo Di Sipio
Towards Data Science
7 min read · Jan 16, 2020


In recent years, one trend has become quite evident: large, pre-trained models are needed to achieve state-of-the-art performance in computer vision and language understanding. In particular, the field of natural language processing (NLP) is seeing an exponential expansion in terms of capabilities but also…
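To make the idea in the subtitle concrete, here is a minimal sketch of what quantum-enhanced transfer learning can look like in code: a frozen, pre-trained BERT encoder feeding a small variational quantum circuit that acts as part of the classification head, built with PennyLane and PyTorch. The library choices, circuit ansatz, and all hyperparameters below are illustrative assumptions, not necessarily the exact setup discussed in this article.

```python
# Sketch: quantum-enhanced transfer learning for text classification.
# Assumptions: PennyLane for the variational circuit, Hugging Face
# transformers for the frozen pre-trained encoder; qubit count, circuit
# depth, and model name are arbitrary illustrative choices.
import torch
import torch.nn as nn
import pennylane as qml
from transformers import AutoModel, AutoTokenizer

n_qubits = 4   # wires in the variational circuit (assumed)
n_layers = 2   # depth of the entangling ansatz (assumed)

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(inputs, weights):
    # Encode classical features as rotation angles, apply a trainable
    # entangling ansatz, and read out one expectation value per wire.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class QuantumTextClassifier(nn.Module):
    def __init__(self, num_classes=2, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        # Transfer learning: keep the pre-trained encoder frozen.
        for p in self.encoder.parameters():
            p.requires_grad = False
        hidden = self.encoder.config.hidden_size
        self.compress = nn.Linear(hidden, n_qubits)   # classical pre-net
        self.qlayer = qml.qnn.TorchLayer(circuit, {"weights": (n_layers, n_qubits)})
        self.head = nn.Linear(n_qubits, num_classes)  # classical post-net

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]             # [CLS] token embedding
        return self.head(self.qlayer(torch.tanh(self.compress(cls))))

# Usage: tokenize a sentence and run a forward pass.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = QuantumTextClassifier()
batch = tokenizer(["Quantum NLP is fun."], return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)   # torch.Size([1, 2])
```

Only the small classical adapter layers and the circuit weights are trained here; the heavy pre-trained encoder stays fixed, which is what makes this a transfer-learning setup rather than full fine-tuning.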


Senior Machine Learning developer at Dayforce. NLP, graph neural networks. Formerly physicist at U Toronto, Bologna, CERN LHC/ATLAS.