INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING,
MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)
ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XIV, Issue XI, November 2025
5. Nallapati R, Xiang B, Zhou B, "Sequence-to-Sequence RNNs for Text Summarization", CoRR abs/1602.06023, 2016.
6. Hochreiter S, Schmidhuber J, "Long Short-Term Memory", Neural Computation 9:1735–1780, 1997.
7. Shi T, Keneshloo Y, Ramakrishnan N, Reddy CK, "Neural Abstractive Text Summarization with Sequence-to-Sequence Models", CoRR abs/1812.02303, 2018.
8. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I, "Attention Is All You Need", CoRR abs/1706.03762, 2017.
9. Devlin J, Chang M-W, Lee K, Toutanova K, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", CoRR abs/1810.04805, 2019.
10. Zhang J, Zhao Y, Saleh M, Liu PJ, "PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization", CoRR abs/1912.08777, 2019.
11. Dong L, Yang N, Wang W, Wei F, Liu X, Wang Y, Gao J, Zhou M, Hon H-W, "Unified Language Model Pre-training for Natural Language Understanding and Generation", CoRR abs/1905.03197, 2019.
12. Radford A, "Improving Language Understanding by Generative Pre-Training", 2018.
13. Wolf T, Debut L, Sanh V, Chaumond J, Delangue C, Moi A, Cistac P, Rault T, Louf R, Funtowicz M, "HuggingFace's Transformers: State-of-the-art Natural Language Processing", CoRR abs/1910.03771, 2019.
14. Lin C-Y, "ROUGE: A Package for Automatic Evaluation of Summaries", pages 74–81, Association for Computational Linguistics, 2004.
15. Balaji N, Karthik Pai BH, Bhaskar Bhat B, Praveen Barmavatu, "Data Visualization in Splunk and Tableau: A Case Study Demonstration", Journal of Physics: Conference Series, 2021.
16. Lhoest Q, del Moral AV, Jernite Y, Thakur A, von Platen P, Patil S, Chaumond J, Drame M, Plu J, Tunstall L, Davison J, "Datasets: A Community Library for Natural Language Processing", CoRR abs/2109.02846, 2021.
17. Lewis M, Liu Y, Goyal N, "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension", ACL, 2020.
Page 970