- In the age of information, the massive amount of data produced every day remains unusable for humans unless new tools and technologies make it accessible.
- Abstractive Text Summarization aims to capture the essential content of a text corpus and compress it into a shorter text while preserving its meaning and maintaining semantic and grammatical correctness.
- Neural architectures are becoming dominant in Abstractive Text Summarization.
- The use of deep learning architectures in natural language processing entered a new era with the introduction of sequence-to-sequence models in the last decade.
- These models are built on a pair of recurrent neural networks that connect the input and output data in an encoder-decoder architecture.
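The encoder-decoder idea above can be illustrated with a minimal sketch: one vanilla RNN compresses the input sequence into a fixed context vector, and a second RNN unrolls from that vector to produce the output. All weights and inputs here are random placeholders, purely for illustration; a real model would learn them and use gated cells such as LSTMs or GRUs.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # hidden size (illustrative)

def rnn_step(W, U, x, h):
    # one vanilla RNN step: h' = tanh(W x + U h)
    return np.tanh(W @ x + U @ h)

# hypothetical, untrained weights for the encoder and decoder RNNs
We, Ue = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wd, Ud = rng.normal(size=(d, d)), rng.normal(size=(d, d))

# encoder: read the input sequence into a single fixed context vector
inputs = [rng.normal(size=d) for _ in range(3)]
h = np.zeros(d)
for x in inputs:
    h = rnn_step(We, Ue, x, h)
context = h  # the last encoder state summarizes the whole input

# decoder: unroll from the context, feeding its own state back in
s = np.zeros(d)
prev = context  # the context seeds the first decoder input
outputs = []
for _ in range(2):  # generate two output steps
    s = rnn_step(Wd, Ud, prev, s)
    outputs.append(s)
    prev = s
```

Note that the entire input must pass through the single `context` vector, which is exactly the bottleneck that the Attention Mechanism was introduced to relieve.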
- Adding the Attention Mechanism on top of the RNN layers made better results possible.
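Attention replaces the single fixed context vector with a weighted sum of all encoder states, recomputed at every decoding step. The sketch below shows simple dot-product attention with random placeholder states; the hidden size, source length, and values are illustrative assumptions, not part of the original text.

```python
import numpy as np

rng = np.random.default_rng(1)
d, T = 4, 3  # hidden size and source length (illustrative)

H = rng.normal(size=(T, d))  # encoder hidden states, one per source token
s = rng.normal(size=d)       # current decoder state

# score each encoder state against the decoder state (dot product)
scores = H @ s

# softmax turns scores into an attention distribution over source tokens
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# the context is a weighted sum of encoder states, rebuilt at each step
context = weights @ H
```

Because `weights` is recomputed for every decoder state, the model can focus on different source tokens at each output step instead of squeezing the whole input through one vector.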
- Many works have shown the competitive performance of these new architectures in Machine Translation (MT).