Hall, E. T. (1973). The Silent Language (Reissue edition). New York: Anchor.
Hendrik, B., Masril, M., Wijaya, Y. F., & Andini, S. (2019, December). Implementation and Design User Interface Layout Use Leap Motion Controller with Hand Gesture Recognition. In Journal of Physics: Conference Series (Vol. 1339, No. 1, p. 012058). IOP Publishing.
Kumar, P., Rautaray, S. S., & Agrawal, A. (2012, March). Hand data glove: A new generation real-time mouse for human-computer interaction. In 2012 1st International Conference on Recent Advances in Information Technology (RAIT) (pp. 750-755). IEEE.
Lee, U., & Tanaka, J. (2013, September). Finger identification and hand gesture recognition techniques for natural user interface. In Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction (pp. 274-279).
Mitra, S., & Acharya, T. (2007). Gesture recognition: A survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 37(3), 311-324.
Muhammad, P., & Devi, S. A. (2016). Hand gesture user interface for smart devices based on MEMS sensors. Procedia Computer Science, 93, 940-946.
Rautaray, S. S., & Agrawal, A. (2015). Vision based hand gesture recognition for human computer interaction: A survey. Artificial Intelligence Review, 43(1), 1-54.
Rempel, D., Camilleri, M. J., & Lee, D. L. (2014). The design of hand gestures for human–computer interaction: Lessons from sign language interpreters. International Journal of Human-Computer Studies, 72(10-11), 728-735.
Ren, Z., Meng, J., Yuan, J., & Zhang, Z. (2011, November). Robust hand gesture recognition with Kinect sensor. In Proceedings of the 19th ACM International Conference on Multimedia (pp. 759-760).
Seong, J. H., & Choi, Y. (2018, October). Design and implementation of user interface through hand movement tracking and gesture recognition. In 2018 International Conference on Information and Communication Technology Convergence (ICTC) (pp. 552-555). IEEE.
Stokoe Jr, W. C. (2005). Sign language structure: An outline of the visual communication systems of the American deaf. Journal of Deaf Studies and Deaf Education, 10(1), 3-37.
Sungkur, R. K., Antoaroo, M. A., & Beeharry, A. (2016). Eye tracking system for enhanced learning experiences. Education and Information Technologies, 21(6), 1785-1806.
Torres, R. D. S., Medeiros, C. B., Gonçalves, M. A., & Fox, E. A. (2004, June). An OAI compliant content-based image search component. In Proceedings of the 2004 Joint ACM/IEEE Conference on Digital Libraries (p. 418). IEEE.
Tsai, T. H., & Tsai, Y. R. (2017, April). Design and implementation of a 3D hand gesture architecture system under complicated environment. In 2017 International Symposium on VLSI Design, Automation and Test (VLSI-DAT) (pp. 1-4). IEEE.
Wang, J., & Payandeh, S. (2017). Hand motion and posture recognition in a network of calibrated cameras. Advances in Multimedia, 2017.
Wigdor, D., & Wixon, D. (2011). Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Elsevier.
Yang, C., Jang, Y., Beh, J., Han, D., & Ko, H. (2012, January). Gesture recognition using depth-based hand tracking for contactless controller application. In 2012 IEEE International Conference on Consumer Electronics (ICCE) (pp. 297-298). IEEE.
Zahedi Nougabi, M., Fatahi, R., Salehi Faderdi, J., & Nokarizi, M. (2019). Analysis of users' eye movements and the role of their abilities when interacting with the user interface of websites. Pajohesh nameh Modiryat Etelat, 37(33), 1-31. (In Persian)
Zeresaz, M., & Fatahi, R. (2006). Basic considerations in designing the user interface of computer systems and databases. Librarianship and Information Organization Studies, 17(2), 251-268. (In Persian)