Sinarwati, Mohamad Suhaili and Mohamad Nazim, Jambli (2025) A lightweight neural attention-based model for service chatbots. Scientific Reports, 15 (29688). pp. 1-33. ISSN 2045-2322
PDF: A lightweight neural attention-based model for service chatbots.pdf (8MB)
Abstract
The growing demand for efficient service chatbots has led to the development of various deep learning techniques, such as generative neural attention-based mechanisms. However, existing attention processes often struggle to generate contextually relevant responses. This study introduces a lightweight neural attention mechanism designed to enhance the scalability of service chatbots by integrating a scalar function into the existing attention score computation. While inspired by scaling practices in transformer models, the proposed scalar is tailored to sequence-to-sequence (seq2seq) architectures to improve alignment over input sequences, yielding greater contextual relevance and reduced resource requirements. To validate its effectiveness, the proposed model was evaluated on a real-world Customer Support Twitter dataset. Experimental results demonstrate a +0.82 BLEU-4 improvement and a 28% reduction in training time per epoch over the baseline. Moreover, the model reaches the target validation loss two epochs earlier, indicating faster convergence and improved training stability. Further experiments investigated activation functions and weight initializers integrated into the proposed model to identify configurations that maximize its performance. Comparative experimental results show that the proposed modifications significantly enhance response accuracy and contextual relevance. This lightweight attention mechanism addresses key limitations of existing attention mechanisms. Future work may extend this approach by combining it with transformer-based architectures to support broader sequence prediction tasks, including machine translation, recommender systems, and image captioning.
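The abstract describes inserting a scalar function into the attention score computation of a seq2seq model. The paper's exact scalar is not given here, so the sketch below is only illustrative: it applies a transformer-style 1/√d factor (an assumption) to Luong-style dot-product alignment scores. The function name `scaled_attention` and all shapes are hypothetical.

```python
import numpy as np

def scaled_attention(decoder_state, encoder_states, scale=None):
    """Luong-style dot-product attention with an added scalar on the scores.

    decoder_state : (d,) current decoder hidden state
    encoder_states: (T, d) encoder hidden states, one per source position
    scale         : optional scalar; defaults to 1/sqrt(d) (an assumed
                    transformer-inspired choice, not the paper's exact scalar)
    """
    d = decoder_state.shape[-1]
    if scale is None:
        scale = 1.0 / np.sqrt(d)
    # raw alignment scores, one per source position, scaled before softmax
    scores = (encoder_states @ decoder_state) * scale          # shape (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                   # softmax
    context = weights @ encoder_states                         # shape (d,)
    return context, weights

# toy example: 4 source positions, hidden size 8
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 8))
dec = rng.normal(size=(8,))
ctx, w = scaled_attention(dec, enc)
```

Scaling the scores before the softmax keeps their variance bounded as the hidden size grows, which is the usual motivation for this factor in transformers.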
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Attention mechanisms; Activation functions; Deep learning; Scalar functions; Score functions; Sequence-to-sequence; Weight initializer; Lightweight AI models; Service automation |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Divisions: | Academic Faculties, Institutes and Centres > Faculty of Cognitive Sciences and Human Development |
| Depositing User: | Mohamad Suhaili |
| Date Deposited: | 01 Oct 2025 23:35 |
| Last Modified: | 01 Oct 2025 23:35 |
| URI: | http://ir.unimas.my/id/eprint/49728 |