
Document Type: Research Paper

Authors

1 Professor, Department of Knowledge and Information Science, Tonekabon Branch, Islamic Azad University, Tonekabon, Iran.

2 Ph.D. Candidate, Department of Knowledge and Information Science, Tonekabon Branch, Islamic Azad University, Tonekabon, Iran.

Abstract

Purpose: The purpose of this study is to explain the capability of hand movement for interacting with the user interfaces of websites.
Methodology: This is an applied study with a qualitative method. The population consists of library resources, comprising 2 books and 14 articles, selected by convenience (available) sampling until theoretical saturation was reached. Data were analyzed through source analysis using a coding method, and the collected data were processed with MaxQDA software.
Findings: Analysis of 9 propositions yielded 18 features organized under 3 basic concepts. The individual concept covers 7 features: ease of use, satisfaction, user-friendliness, low cost, greater willingness to use, adaptability, and psychological control. The functional concept covers 9 features: usefulness, recognition of interaction, understandability, computer processing capability, coverage of the functions necessary for data retrieval, minimal need for hardware devices, usability at the same level as the interface equipment, input of movement commands, and the ability to recognize hands. The support concept covers 2 features: support programs on the computer for better interaction, and optimization of the relationship between the user's cognitive and physical processes.
Conclusion: Hand movement is a notable and promising capability for interacting with the user interfaces of websites. It parallels the eye movement capability for interacting with website user interfaces and lends itself to deeper design work and study within the country. Given the vast volume of information, users' needs, and the demand for rapid access to comprehensive and useful information in Iran, effective hand movement for interacting with the user interfaces of websites has always been valuable and deserves the attention of expert designers.

Keywords
