- In the age of information, the massive amount of data produced every day remains of little use to humans unless new tools and technologies make it accessible.
- Abstractive Text Summarization aims to capture the essential content of a text corpus and compress it into a shorter text while preserving its meaning and maintaining semantic and grammatical correctness.
- Neural architectures are becoming dominant in Abstractive Text Summarization.
- The use of deep learning architectures in natural language processing entered a new era with the introduction of sequence-to-sequence models in the past decade.
- These models are built on a pair of recurrent neural networks that connect the input and output data in an encoder-decoder architecture (a minimal sketch follows this list).
- Adding an Attention Mechanism on top of the RNN layers made better results possible (see the attention sketch below).
- Many works have demonstrated the competitive performance of these architectures in Machine Translation (MT).
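
Below is a minimal sketch of such an encoder-decoder pair in PyTorch. It is illustrative only: it assumes a single-layer GRU-based model, and the class names, dimensions, and hyperparameters are placeholders, not taken from this repository.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Reads the source article and compresses it into a sequence of
    hidden states plus a final state that seeds the decoder."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                    # src: (batch, src_len)
        outputs, hidden = self.rnn(self.embed(src))
        return outputs, hidden                 # outputs: (batch, src_len, hid_dim)

class Decoder(nn.Module):
    """Generates the summary one token at a time, starting from the
    encoder's final hidden state."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, prev_token, hidden):     # prev_token: (batch, 1)
        output, hidden = self.rnn(self.embed(prev_token), hidden)
        return self.out(output.squeeze(1)), hidden  # logits: (batch, vocab_size)
```

At inference time the decoder is run step by step, feeding each predicted token back in as `prev_token` until an end-of-summary token is produced.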
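One common way to add attention on top of such a model is Luong-style dot-product attention, sketched below. This is a generic illustration and not necessarily the scoring function used in this repository; the returned context vector would typically be combined with the decoder state before predicting the next token.

```python
import torch
import torch.nn as nn

class DotAttention(nn.Module):
    """Scores every encoder state against the current decoder state,
    then returns an attention-weighted context vector over the source."""
    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hid_dim); enc_outputs: (batch, src_len, hid_dim)
        scores = torch.bmm(enc_outputs, dec_hidden.unsqueeze(2))    # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)                      # normalize over source positions
        context = torch.bmm(weights.transpose(1, 2), enc_outputs)   # (batch, 1, hid_dim)
        return context.squeeze(1), weights.squeeze(2)

# usage with the sketch above, per decoding step (dec_hidden taken
# from the decoder's GRU state):
#   context, weights = DotAttention()(dec_hidden, enc_outputs)
```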