Introduction
This work presents ProphetNet, a new sequence-to-sequence pre-training model that introduces a novel self-supervised objective, future n-gram prediction, together with a proposed n-stream self-attention mechanism.
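To make the objective concrete, here is a minimal sketch of how future n-gram targets could be constructed from a token sequence, assuming n=2 and a hypothetical `pad` value for positions past the end; this illustrates the idea of predicting the next n tokens at each step, not the paper's actual implementation.

```python
def future_ngram_targets(tokens, n=2, pad=-1):
    """For each position t, collect the next n tokens (t+1 .. t+n) as
    prediction targets; positions past the sequence end get `pad`."""
    targets = []
    for t in range(len(tokens)):
        row = [tokens[t + k] if t + k < len(tokens) else pad
               for k in range(1, n + 1)]
        targets.append(row)
    return targets

# Toy sequence of token ids.
seq = [5, 8, 3, 9]
print(future_ngram_targets(seq))
# [[8, 3], [3, 9], [9, -1], [-1, -1]]
```

In ProphetNet, each of the n future tokens is predicted by a separate stream of the n-stream self-attention mechanism, and the training loss combines the per-stream prediction losses.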
Code Link
Paper
ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training, EMNLP 2020