Extractive Text Summarization using Recurrent Neural Networks with Attention Mechanism

#AI #MachineLearning #NeuralNetwork #CSIT #CMLA_2021 #NLPD
Extractive Text Summarization using Recurrent Neural Networks with Attention Mechanism
Shimirwa Aline Valerie and Jian Xu, Nanjing University of Science and Technology, China

Abstract
Extractive summarization aims to select the most important sentences or words from a document to generate a summary. Traditional summarization approaches have relied extensively on manually designed features. In this paper, we propose a data-driven technique based on a recurrent neural network equipped with an attention mechanism. We set up a general framework consisting of a hierarchical sentence encoder and an attention-based sentence extractor, which allows us to build and explore various extractive summarization models. Comprehensive experiments are conducted on two benchmark datasets, and the results show that training extractive models with Reward Augmented Maximum Likelihood (RAML) can improve the model's generalization capability.

Keywords
Extractive summarization, Recurrent neural networks, Attention mechanism, Maximum Likelihood Estimation, Reward Augmented Maximum Likelihood.
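The core idea of an attention-based sentence extractor can be illustrated with a minimal sketch. This is not the authors' model: the paper uses a hierarchical RNN encoder, while the sketch below assumes sentence embeddings are already available and uses simple dot-product attention against a mean document vector to score and pick the top-k sentences. The function and variable names (`attention_extract`, `sentence_vecs`) are illustrative assumptions, not from the paper.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_extract(sentence_vecs, k=2):
    """Score sentences by dot-product attention against the mean
    document vector; return top-k indices in document order plus
    the attention weights. A toy stand-in for a learned extractor."""
    doc_query = sentence_vecs.mean(axis=0)       # crude document representation
    weights = softmax(sentence_vecs @ doc_query) # attention over sentences
    top = np.argsort(weights)[::-1][:k]          # highest-scoring sentences
    return sorted(top.tolist()), weights

# Toy input: 5 sentences as random 8-dim embeddings.
rng = np.random.default_rng(0)
vecs = rng.normal(size=(5, 8))
idx, weights = attention_extract(vecs, k=2)
```

In a trained model, `doc_query` and the scoring function would be learned parameters, and the extractor would be optimized end-to-end (e.g., with maximum likelihood or RAML, as the paper investigates) rather than computed from a fixed mean vector.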
