Retrieval-Augmented Transformer-XL for Close-Domain Dialog Generation

Authors

  • Giovanni Bonetta, University of Turin
  • Rossella Cancelliere
  • Ding Liu
  • Paul Vozila

DOI:

https://doi.org/10.32473/flairs.v34i1.128369

Abstract

Transformer-based models have demonstrated an excellent ability to capture patterns and structures in natural language generation and have achieved state-of-the-art results in many tasks. In this paper we present a transformer-based model for multi-turn dialog response generation. Our solution is a hybrid approach that augments a transformer-based generative model with a novel retrieval mechanism, which leverages information memorized in the training data via k-Nearest Neighbor search. Our system is evaluated on two datasets of customer/assistant dialogs: Taskmaster-1, released by Google, which contains high-quality, goal-oriented conversational data, and a proprietary dataset collected from a real customer service call center. On both datasets, our model achieves better BLEU scores than strong baselines.
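The abstract's retrieval mechanism rests on k-Nearest Neighbor search over representations memorized from the training data. A minimal sketch of that core operation is below; the datastore contents, vector dimensionality, and function names are illustrative assumptions, not the paper's implementation (which couples the search to a Transformer-XL generator).

```python
import numpy as np

def knn_retrieve(query, keys, values, k=3):
    """Brute-force k-Nearest Neighbor search: return the k stored values
    whose key vectors lie closest (L2 distance) to the query vector."""
    dists = np.linalg.norm(keys - query, axis=1)
    idx = np.argsort(dists)[:k]
    return [values[i] for i in idx], dists[idx]

# Toy datastore: 2-d "context vectors" paired with candidate response tokens.
# A real system would store high-dimensional hidden states from training dialogs.
keys = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0], [5.0, 5.0]])
values = ["yes", "no", "maybe", "goodbye"]

neighbors, _ = knn_retrieve(np.array([0.1, 0.1]), keys, values, k=2)
print(neighbors)  # the two tokens whose keys are nearest the query
```

At generation time, retrieved neighbors like these can be used to bias the model's next-token distribution toward responses seen in similar training contexts.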

Published

2021-04-18

How to Cite

Bonetta, G., Cancelliere, R., Liu, D., & Vozila, P. (2021). Retrieval-Augmented Transformer-XL for Close-Domain Dialog Generation. The International FLAIRS Conference Proceedings, 34. https://doi.org/10.32473/flairs.v34i1.128369

Section

Special Track: Applied Natural Language Processing