Windows Operating System’s Volume Controller Using Hand Gestures
Keywords:
Hand gesture, OpenCV Python Module, Volume Controller, MediaPipe Library, NumPy Module, Hand Landmark Model, Human-Computer Interface, Palm Detection Model
Abstract
In this study, we develop a volume controller that takes hand gestures as input to control the system. The implementation relies primarily on OpenCV for gesture capture: a web camera records video frames, and the application adjusts the system volume based on that input. Hand gestures offer a natural, intuitive way to perform basic computer tasks such as volume adjustment, replacing cumbersome machine-like interaction, and their potential in human-computer interaction is vast. The goal of hand gesture recognition is to identify particular human gestures and use them to control a device. Our project demonstrates this by building a gesture-controlled volume system in OpenCV and Python that lets a user control the computer from video input with hand gestures alone; no keyboard or mouse is required.
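As a minimal sketch of the core mapping step, the snippet below converts a hand pose to a volume percentage. It assumes MediaPipe's Hand Landmark Model output: 21 normalized (x, y) points per hand, where index 4 is the thumb tip and index 8 is the index-finger tip. The distance bounds `min_dist` and `max_dist` are illustrative values that would need tuning for a real camera setup; the webcam capture loop and the OS-specific volume call are omitted.

```python
import numpy as np

# MediaPipe hand landmark indices for the two fingertips used by the gesture.
THUMB_TIP, INDEX_TIP = 4, 8

def gesture_to_volume(landmarks, min_dist=0.03, max_dist=0.25):
    """Map the thumb-to-index fingertip distance to a 0-100 volume level.

    `landmarks` is a sequence of 21 (x, y) tuples in normalized image
    coordinates, as produced by MediaPipe's hand landmark model. The
    distance bounds are illustrative, not calibrated values.
    """
    x1, y1 = landmarks[THUMB_TIP]
    x2, y2 = landmarks[INDEX_TIP]
    dist = np.hypot(x2 - x1, y2 - y1)
    # np.interp clamps outside the given range: pinched fingers give 0,
    # fully spread fingers give 100, with linear interpolation between.
    return float(np.interp(dist, [min_dist, max_dist], [0, 100]))
```

In the full pipeline, this value would be computed once per captured frame and passed to a platform volume API (e.g. pycaw on Windows).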
