SEQUENCE-TO-SEQUENCE WITH ATTENTION MECHANISM FOR MARATHI TEXT SUMMARIZATION


Abstract

Automatic text summarization is a data science problem commonly seen in machine learning and natural language processing. Text summarization is the technique of generating concise and precise information from a larger dataset. Extractive summarization and abstractive summarization are the two types of automatic summarization. Extractive summarization generates a summary by providing a verbatim subset of the actual data; as the name suggests, important sentences are extracted using methods such as text ranking and sentence scoring. Abstractive summarization, in contrast, interprets and processes the given data to generate a summary that involves sentence rephrasing and the creation of new sentences, resembling human-written summaries. In this paper, we propose an intelligent system that produces concise and precise abstractive summaries using a sequence-to-sequence recurrent neural network (RNN) model together with an attention layer mechanism. The generated summaries are in the Marathi language, which has not been achieved previously. The system preserves the important content and saves the user's time.

Keywords: - Abstractive summarization, sequence-to-sequence, encoder-decoder, long short-term memory (LSTM), attention layer, soft-max function.
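The architecture named in the abstract and keywords (an encoder-decoder LSTM with an attention layer and a soft-max output) can be sketched as below. This is a minimal illustrative sketch assuming a Keras-style implementation; the vocabulary sizes, sequence lengths, and layer dimensions are placeholder assumptions, not values reported in the paper.

```python
# Minimal sketch of a sequence-to-sequence LSTM with attention for
# abstractive summarization. All sizes below are assumed, not from the paper.
import tensorflow as tf
from tensorflow.keras import layers, Model

SRC_VOCAB = 20000   # assumed Marathi article vocabulary size
TGT_VOCAB = 10000   # assumed summary vocabulary size
MAX_SRC_LEN = 300   # assumed maximum article length (tokens)
MAX_TGT_LEN = 40    # assumed maximum summary length (tokens)
EMB_DIM = 128
LATENT_DIM = 256

# Encoder: embed the source article and run it through an LSTM.
encoder_inputs = layers.Input(shape=(MAX_SRC_LEN,), name="article_tokens")
enc_emb = layers.Embedding(SRC_VOCAB, EMB_DIM, mask_zero=True)(encoder_inputs)
encoder_outputs, state_h, state_c = layers.LSTM(
    LATENT_DIM, return_sequences=True, return_state=True, name="encoder_lstm"
)(enc_emb)

# Decoder: embed the (teacher-forced) summary tokens and initialise the
# decoder LSTM with the encoder's final states.
decoder_inputs = layers.Input(shape=(MAX_TGT_LEN,), name="summary_tokens")
dec_emb = layers.Embedding(TGT_VOCAB, EMB_DIM, mask_zero=True)(decoder_inputs)
decoder_outputs, _, _ = layers.LSTM(
    LATENT_DIM, return_sequences=True, return_state=True, name="decoder_lstm"
)(dec_emb, initial_state=[state_h, state_c])

# Attention layer: each decoder step attends over all encoder time steps.
context = layers.Attention(name="attention_layer")([decoder_outputs, encoder_outputs])
decoder_concat = layers.Concatenate(axis=-1)([decoder_outputs, context])

# Soft-max over the target vocabulary at every decoder step.
summary_probs = layers.TimeDistributed(
    layers.Dense(TGT_VOCAB, activation="softmax")
)(decoder_concat)

model = Model([encoder_inputs, decoder_inputs], summary_probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

At inference time the decoder would be run step by step, feeding back each predicted token until an end-of-summary token is produced; the sketch above shows only the training graph.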

