Sinarwati, Mohamad Suhaili and Mohamad Nazim, Jambli (2025) A lightweight neural attention-based model for service chatbots. Scientific Reports, 15 (29688). pp. 1-33. ISSN 2045-2322
Abstract
The growing demand for efficient service chatbots has led to the development of various deep learning techniques, such as generative neural attention-based mechanisms. However, existing attention processes often face challenges in generating contextually relevant responses. This study introduces a lightweight neural attention mechanism designed to enhance the scalability of service chatbots by integrating a scalar function into the existing attention score computation. While inspired by scaling practices in transformer models, the proposed scalar is tailored to seq2seq architectures to optimize the alignment sequences, resulting in improved contextual relevance and reduced resource requirements. To validate its effectiveness, the proposed model was evaluated on a real-world Customer Support Twitter dataset. Experimental results demonstrate a +0.82 BLEU-4 improvement and a 28% reduction in training time per epoch over the baseline. Moreover, the model achieves the target validation loss two epochs earlier, indicating faster convergence and improved training stability. Further experiments investigated activation functions and weight initializers integrated into the proposed model to identify configurations that optimize its performance. Comparative experimental results show that the proposed modifications significantly enhance response accuracy and contextual relevance. This lightweight attention mechanism addresses key limitations of existing attention mechanisms. Future work may extend this approach by combining it with transformer-based architectures to support broader sequence prediction tasks, including machine translation, recommender systems, and image captioning.
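The abstract describes inserting a scalar factor into the attention score computation of a seq2seq model, inspired by transformer-style scaling. As the paper body is not included in this record, the sketch below is only an illustrative assumption: a dot-product alignment score with a hypothetical 1/sqrt(d) scalar (the transformer convention), followed by a softmax to produce alignment weights and a context vector. The function name and default scale are illustrative, not the authors' exact formulation.

```python
import numpy as np

def scaled_attention(decoder_state, encoder_states, scale=None):
    """Dot-product attention with an optional scalar on the raw scores.

    decoder_state:  shape (d,)   current decoder hidden state
    encoder_states: shape (T, d) encoder hidden states for T source steps
    scale:          scalar applied to the scores; defaults to 1/sqrt(d)
                    (a transformer-style choice, assumed here)
    """
    d = decoder_state.shape[-1]
    if scale is None:
        scale = 1.0 / np.sqrt(d)  # hypothetical default, not from the paper
    scores = (encoder_states @ decoder_state) * scale   # (T,) raw scores
    # softmax over source positions -> alignment weights
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    context = weights @ encoder_states                  # (d,) context vector
    return weights, context

# usage: 4 source steps, hidden size 8
rng = np.random.default_rng(0)
enc = rng.standard_normal((4, 8))
dec = rng.standard_normal(8)
weights, context = scaled_attention(dec, enc)
```

The scalar keeps the score magnitudes bounded as the hidden size grows, which is the usual rationale for faster, more stable convergence of the softmax alignment.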
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Attention mechanisms, Activation functions, Deep learning, Scalar functions, Score functions, Sequence-to-sequence, Weight initializer. |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Divisions: | Academic Faculties, Institutes and Centres > Centre for Pre-University Studies |
| Depositing User: | Gani |
| Date Deposited: | 23 Sep 2025 04:11 |
| Last Modified: | 23 Sep 2025 04:11 |
| URI: | http://ir.unimas.my/id/eprint/49542 |
