eSummarizer AI Service- Document Summarization Model Using BART


Rongdeep Pathak
Mriganka Mohan Bora
Nelson R Varte

The exponential growth of governmental documentation in India has created an urgent need for efficient automated summarization systems. This research presents eSummarizer AI Service, a custom Transformer-based machine learning model designed specifically for summarizing Indian government documents such as policy papers, circulars, legislative texts, and departmental reports. The study employs advanced Natural Language Processing (NLP) techniques, including BART (Bidirectional and Auto-Regressive Transformers), reinforcement learning, and a custom dataset curated from official government portals. Evaluation using ROUGE metrics demonstrates significant improvements over existing baseline models, achieving high coherence, contextual relevance, and factual consistency. The proposed system has practical applications for policymakers, researchers, and citizens, enhancing the accessibility and comprehension of complex governmental information.
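To make the evaluation concrete, the ROUGE-1 score used to compare a generated summary against a human reference can be computed from clipped unigram overlap in a few lines of Python. This is a minimal sketch of the metric itself, not the authors' evaluation pipeline; the example sentences are invented, and published results would normally rely on Lin's original ROUGE package or an equivalent library.

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> dict:
    """ROUGE-1 precision, recall, and F1 from clipped unigram overlap."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Each reference unigram can be matched at most as many times as it occurs.
    overlap = sum((cand & ref).values())
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    f1 = 2 * precision * recall / (precision + recall) if overlap else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

# Hypothetical candidate/reference pair for illustration.
scores = rouge_1("the policy was approved by the ministry",
                 "the ministry approved the new policy")
print(scores)
```

ROUGE-2 and ROUGE-L follow the same precision/recall structure, substituting bigram overlap and longest-common-subsequence length for the unigram counts.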

eSummarizer AI Service- Document Summarization Model Using BART. (2025). International Journal of Latest Technology in Engineering Management & Applied Science, 14(11), 965-970. https://doi.org/10.51583/IJLTEMAS.2025.1411000093


References

Babar S, "Text Summarization: An Overview", 2013.

Christian H, Agus M, Suhartono D, "Single Document Automatic Text Summarization using Term Frequency-Inverse Document Frequency (TF-IDF)", ComTech: Computer, Mathematics and Engineering Applications, 2016.

Nomoto T, "Bayesian Learning in Text Summarization Models", 2005.

Graves A, "Generating Sequences With Recurrent Neural Networks", CoRR abs/1308.0850, 2013.

Nallapati R, Xiang B, Zhou B, "Sequence-to-Sequence RNNs for Text Summarization", CoRR abs/1602.06023, 2016.

Hochreiter S, Schmidhuber J, "Long Short-Term Memory", Neural Computation 9:1735–1780, https://doi.org/10.1162/neco.1997.9.8.1735, 1997.

Shi T, Keneshloo Y, Ramakrishnan N, Reddy CK, "Neural Abstractive Text Summarization with Sequence-to-Sequence Models", CoRR abs/1812.02303, 2018.

Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I, "Attention is All you Need", arXiv abs/1706.03762, 2017.

Devlin J, Chang M-W, Lee K, Toutanova K, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", CoRR abs/1810.04805, 2019.

Zhang J, Zhao Y, Saleh M, Liu PJ, "PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization", CoRR abs/1912.08777, 2019.

Dong L, Yang N, Wang W, Wei F, Liu X, Wang Y, Gao J, Zhou M, Hon H-W, "Unified Language Model Pre-training for Natural Language Understanding and Generation", CoRR abs/1905.03197, 2019.

Radford A, "Improving Language Understanding by Generative Pre-Training", 2018.

Wolf T, Debut L, Sanh V, Chaumond J, Delangue C, Moi A, Cistac P, Rault T, Louf R, Funtowicz M, "HuggingFace's Transformers: State-of-the-art Natural Language Processing", CoRR abs/1910.03771, 2019.

Lin C-Y, "ROUGE: A Package for Automatic Evaluation of Summaries", pages 74–81, Association for Computational Linguistics, 2004.

Balaji N, Karthik Pai BH, Bhaskar Bhat B, Praveen Barmavatu, "Data Visualization in Splunk and Tableau: A Case Study Demonstration", Journal of Physics: Conference Series, 2021.

Lhoest Q, del Moral AV, Jernite Y, Thakur A, von Platen P, Patil S, Chaumond J, Drame M, Plu J, Tunstall L, Davison J, "Datasets: A Community Library for Natural Language Processing", arXiv abs/2109.02846, 2021.

Lewis M, Liu Y, Goyal N, "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension", ACL, 2020.
