Query Focused Abstractive Summarization via Incorporating Query Relevance and Transfer Learning with Transformer Models
Md Tahmid Rahman Laskar (York University), Enamul Hoque (York University) and Jimmy Huang (York University).
In the Query Focused Abstractive Summarization (QFAS) task, the goal is to generate abstractive summaries from a source document that are relevant to a given query. In this paper, we propose a new transfer learning technique that utilizes a pre-trained transformer architecture for the QFAS task on the Debatepedia dataset. We find that the Diversity Driven Attention (DDA) model, the first model applied to this dataset, only performs well when the dataset is augmented by creating additional training instances through random replacement of words with their synonyms. We also demonstrate that the performance of the DDA model degrades significantly when the dataset is not augmented. In contrast, without requiring any in-domain data augmentation, our proposed approach outperforms both the augmented and non-augmented versions of DDA and sets a new state-of-the-art result.
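To illustrate the general idea of incorporating query relevance into a pre-trained transformer, below is a minimal sketch that conditions generation on the query by concatenating it with the source document and feeding the result to a pre-trained seq2seq model. The choice of BART via Hugging Face Transformers and the "query [SEP] document" input format are illustrative assumptions for this sketch, not necessarily the paper's exact model or preprocessing.

```python
# Minimal sketch: query-focused abstractive summarization with a
# pre-trained seq2seq transformer. The query is prepended to the
# document so the decoder's output is conditioned on it.
from transformers import BartTokenizer, BartForConditionalGeneration

model_name = "facebook/bart-base"  # assumed backbone for illustration
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

def build_input(query: str, document: str) -> str:
    # Concatenate query and document, separated by the model's sep token.
    return f"{query} {tokenizer.sep_token} {document}"

def summarize(query: str, document: str, max_len: int = 64) -> str:
    inputs = tokenizer(build_input(query, document),
                       return_tensors="pt", truncation=True, max_length=512)
    summary_ids = model.generate(inputs["input_ids"],
                                 num_beams=4,
                                 max_length=max_len,
                                 early_stopping=True)
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)
```

During fine-tuning on Debatepedia-style (query, document, summary) triples, the same concatenated input would be paired with the reference summary as the decoder target, so no in-domain data augmentation is needed to learn the query conditioning.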