Law, Irene Xin Lin (2023) Hand Gestures Interaction in Augmented Reality (AR) Learning Application. [Final Year Project Report] (Unpublished)
PDF: Irene Law Xin Lin (24pgs).pdf (345kB)
PDF (restricted to registered users; request the password by email to repository@unimas.my, or call ext: 3914 / 3942 / 3933): Irene Law Xin Lin (fulltext).pdf (2MB)
Abstract
Augmented Reality (AR) is a technology that has become popular in recent years; it blends computer-generated virtual objects into the real world. Hand gesture recognition enables a computer to recognize gestures drawn from human body language. AR and hand gesture recognition face numerous problems, such as difficulties in target recognition, a lack of interaction in AR applications, and the high cost of the required hardware. The project Hand Gesture Interaction in Augmented Reality (AR) Learning Application was developed to address these problems. Its objectives are to identify hand gesture techniques for learning applications, to design an interactive AR learning application using hand gestures, and to develop a prototype of an AR learning application that uses hand gesture techniques. The application was developed using an agile methodology, within which the process of AR hand gesture interaction using ManoMotion is described. With the resulting application, users can learn astronomy topics such as the Sun and the Moon through AR, and can use hand gestures to resize the AR objects, making their learning more interactive. Testing of the gestures detected to trigger actions shows that Right Grab recorded the highest percentage at 36%, followed by Left Grab at 29% and Right Pick at 21%, while Left Pick recorded the lowest percentage at 14%. The limitations of the application include a limited set of hand gestures, hand gesture detection accuracy being affected by lighting conditions and background noise, non-standard hand gesture performance, and hardware limitations. Future work includes exploring new hand gestures and integrating alternative SDKs, improving recognition under various lighting conditions, enhancing hand detection accuracy, and leveraging smartphones with better hardware.
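The report's prototype performs gesture-driven resizing with the ManoMotion SDK inside Unity, which is not reproduced here. As a minimal, hedged sketch of the same idea, the Python example below uses MediaPipe Hands (an assumed stand-in, not the project's SDK) to detect a rough "grab" (closed fist) and map it to a scale factor, analogous to shrinking or enlarging an AR object; the `is_grab` heuristic and the scale step sizes are illustrative choices, not values from the report.

```python
# Illustrative sketch only: approximates gesture-driven resizing with MediaPipe
# Hands, as an assumed stand-in for the report's ManoMotion + Unity setup.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands


def is_grab(hand_landmarks) -> bool:
    """Rough 'grab' check: all four fingertips lie below their PIP joints in
    image coordinates, i.e. the fist is closed (heuristic, not ManoMotion's)."""
    tips = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
    pips = [6, 10, 14, 18]   # corresponding PIP joints
    lm = hand_landmarks.landmark
    return all(lm[t].y > lm[p].y for t, p in zip(tips, pips))


def main():
    scale = 1.0                       # stand-in for the AR object's scale
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.6) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                hand = results.multi_hand_landmarks[0]
                # A detected grab shrinks the object; an open hand enlarges it.
                if is_grab(hand):
                    scale = max(0.5, scale - 0.01)
                else:
                    scale = min(2.0, scale + 0.01)
            cv2.putText(frame, f"scale: {scale:.2f}", (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
            cv2.imshow("gesture demo", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```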
| Item Type: | Final Year Project Report |
|---|---|
| Additional Information: | Project report (B.Sc.) -- Universiti Malaysia Sarawak, 2023. |
| Uncontrolled Keywords: | technology, learning applications |
| Subjects: | Q Science > Q Science (General); T Technology > T Technology (General) |
| Divisions: | Academic Faculties, Institutes and Centres > Faculty of Computer Science and Information Technology |
| Depositing User: | Patrick |
| Date Deposited: | 10 Jan 2024 08:43 |
| Last Modified: | 10 Jan 2024 08:43 |
| URI: | http://ir.unimas.my/id/eprint/44053 |