
Controlling computers using human gestures

  • Writer: Praxis Business School
  • Apr 21, 2020
  • 2 min read

Updated: Apr 24, 2020

Recall the times when you are watching a video on your computer and feel too lazy to reach for the mouse or the keyboard to control the video player. Well, if that sounds like you, then we have a solution for you. This problem was first taken up by a team from the batch of July 2018 at Praxis Bangalore. The project was designed to make a computer recognize a set of special gestures that let one control a video player simply by showing them. For example, showing one's palm in front of the system pauses and un-pauses the video. One can also control the volume, fast forward a video or rewind it, all just by showing a few gestures. And that's not all… the project was also extended to a wide range of other tasks, like changing the slides of a PPT, changing pages, scrolling, etc., without ever grabbing the mouse or the keyboard.
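The post does not go into the implementation details, but the basic idea can be pictured like this: each camera frame is classified into a gesture, and every recognized gesture is translated into a key press for the active player. The Python sketch below is purely illustrative, assuming OpenCV for the webcam and pyautogui for the key presses; classify_gesture(), the gesture names and the key bindings are all placeholders for illustration, not the team's actual code.

```python
# Illustrative sketch: map recognized gestures to media-player key presses.
# classify_gesture() is a hypothetical placeholder; the original project's
# model and gesture vocabulary are not described in the post.
import cv2          # webcam capture
import pyautogui    # simulated key presses sent to the active window

# Assumed gesture labels and key bindings (for illustration only)
GESTURE_TO_KEY = {
    "palm": "space",          # pause / un-pause
    "thumbs_up": "volumeup",
    "thumbs_down": "volumedown",
    "swipe_right": "right",   # fast forward
    "swipe_left": "left",     # rewind
}

def classify_gesture(frame):
    """Placeholder: return a gesture label or None. Replace with a trained model."""
    return None

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gesture = classify_gesture(frame)
    if gesture in GESTURE_TO_KEY:
        pyautogui.press(GESTURE_TO_KEY[gesture])
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```

With a real classifier plugged in, the same loop would cover slides and scrolling too, just with different key bindings.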


This project was done by Rohan Issac, Parthipan V, Praveen P, Gontla Praveen and Sai Dinesh from the batch of July 2018, and was mentored by Prof. Gourab Nath of Praxis Business School.

The members of the Gesture Recognition Team - Batch of July 2018, Praxis Business School, Bangalore

As a part of their project, the team also designed two wonderful product demo videos. The first video showed how gestures are used to control keyboard functionalities, while in the second they showed how they took over the mouse functionalities using gestures. The second video is given below. You may want to take a look at it.



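Taking over the mouse can be pictured in a similar way: a detected hand position in the camera frame is mapped onto screen coordinates. Again, this is only a hypothetical sketch; detect_hand_center() is a placeholder for a real hand detector and is not how the team actually did it.

```python
# Illustrative sketch: move the mouse pointer from a detected hand position.
# detect_hand_center() is a hypothetical placeholder, not the team's method.
import cv2
import pyautogui

screen_w, screen_h = pyautogui.size()

def detect_hand_center(frame):
    """Placeholder: return (x, y) in [0, 1], or None if no hand is seen."""
    return None

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    pos = detect_hand_center(frame)
    if pos is not None:
        x, y = pos
        # Mirror horizontally so moving the hand right moves the cursor right.
        pyautogui.moveTo((1 - x) * screen_w, y * screen_h)
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```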
Below, you will find a short clip of the team having some fun with their professor during one of the testing phases of the algorithms. The team was testing the performance of the system they had designed for detecting the gestures that control the keyboard functionalities. This was a fun moment! The person in the video is Prof. Gourab Nath, the guide of this project.



The team put in an excellent effort in designing the system. However, they missed the winning award by a very slight margin. The project was later extended by one of the teams from their junior batch (Batch of Jan 2019), consisting of Neha Maheshwari, Adithi Nairy, Aishwaria Aljapur and Anushree G. They did the project under the guidance of the same professor and won the "Best Project Award" in their batch. Note that all these projects are vetted by academic and industry experts. The team added some extra functionality, the most notable being zoom-in and zoom-out. They targeted high accuracy in the levels of zooming, based on the inward and outward movement of both hands.
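One way to picture such a zoom gesture is to track the distance between the two hand centres and fire a zoom-in or zoom-out shortcut whenever that distance grows or shrinks past a threshold. The sketch below is a rough, hypothetical illustration along those lines; detect_hand_centers(), the threshold and the Ctrl +/- shortcuts are all assumptions, not the winning team's implementation.

```python
# Illustrative sketch: two-hand zoom based on the distance between the hands.
# detect_hand_centers(), the threshold and the Ctrl +/- shortcuts are
# assumptions for illustration only.
import cv2
import pyautogui

def detect_hand_centers(frame):
    """Placeholder: return [(x1, y1), (x2, y2)] in pixels, or None."""
    return None

def distance(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

STEP = 40          # pixel change in hand distance needed to trigger one zoom step
prev_dist = None

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    hands = detect_hand_centers(frame)
    if hands and len(hands) == 2:
        d = distance(hands[0], hands[1])
        if prev_dist is not None:
            if d - prev_dist > STEP:
                pyautogui.hotkey("ctrl", "+")   # hands moving apart -> zoom in
            elif prev_dist - d > STEP:
                pyautogui.hotkey("ctrl", "-")   # hands moving together -> zoom out
        prev_dist = d
    else:
        prev_dist = None
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```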

Photo of the Winning Team from January 2019 Batch

