
Real Time Hand Gesture Recognition For Mouse Controlling Function





Real Time Hand Gesture Recognition For Mouse Controlling Function


Author : Suhas Baliram
language : en
Publisher:
Release Date : 2022-07-25

Real Time Hand Gesture Recognition For Mouse Controlling Function was written by Suhas Baliram and released on 2022-07-25. It is available in PDF, TXT, EPUB, Kindle, and other formats.


Hand gestures are an easy-to-use and natural way of interaction. Using the hands as an input device can help people communicate with computers in a more intuitive and natural way. When we interact with other people, our hand movements play an important role, and the information they convey is very rich in many ways. We use our hands for pointing at a person or at an object, conveying information about space, shape, and temporal characteristics. We constantly use our hands to interact with objects: move them, modify them, and transform them. In the same unconscious way, we gesticulate while speaking to communicate ideas ('stop', 'come closer', 'no', etc.). Hand movements are thus a means of non-verbal communication, ranging from simple actions (pointing at objects, for example) to more complex ones (such as expressing feelings or communicating with others). In this sense, gestures are not only an ornament of spoken language, but essential components of the language generation process itself [1].



Real Time 2d Static Hand Gesture Recognition And 2d Hand Tracking For Human Computer Interaction


Author : Pavel Alexandrovich Popov
language : en
Publisher:
Release Date : 2020

Real Time 2d Static Hand Gesture Recognition And 2d Hand Tracking For Human Computer Interaction was written by Pavel Alexandrovich Popov and released in 2020. It is available in PDF, TXT, EPUB, Kindle, and other formats.


The topic of this thesis is hand gesture recognition and hand tracking for user interface applications. Three systems were produced, as well as datasets for recognition and tracking, along with UI applications to prove the concept of the technology. These represent significant contributions to resolving the hand recognition and tracking problems for 2D systems. The systems were designed to work in video-only contexts, be computationally light, provide recognition and tracking of the user's hand, and operate without user-driven fine-tuning and calibration. Existing systems require user calibration, use depth sensors and do not work in video-only contexts, or are computationally heavy, requiring a GPU to run in live situations. A two-step static hand gesture recognition system was created which can recognize three different gestures in real time. A detection step detects hand gestures using machine learning models; a validation step rejects false positives. The gesture recognition system was combined with hand tracking: it recognizes and then tracks a user's hand in video in an unconstrained setting. The tracking uses two collaborative strategies. A contour tracking strategy guides a minimization-based template tracking strategy and makes it real-time, robust, and recoverable, while the template tracking provides stable input for UI applications. Lastly, an improved static gesture recognition system addresses the drawbacks due to stratified colour sampling of the detection boxes in the detection step. It uses the entire presented colour range and clusters it into constituent colour modes, which are then used for segmentation; this improves the overall gesture recognition rates. One dataset was produced for static hand gesture recognition, which allowed for the comparison of multiple different machine learning strategies, including deep learning. Another dataset was produced for hand tracking, which provides a challenging series of user scenarios to test the gesture recognition and hand tracking system. Both datasets are significantly larger than other available datasets. The hand tracking algorithm was used to create a mouse cursor control application, a paint application for Android mobile devices, and an FPS video game controller. The latter in particular demonstrates how the collaborating hand tracking can fulfill the demanding nature of responsive aiming and movement controls.
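The listing gives no code for the colour-mode segmentation step, but a minimal sketch of the general idea, assuming OpenCV and NumPy, might look like the following; the cluster count k and the distance threshold max_dist are illustrative guesses, not values from the thesis.

```python
# Minimal sketch: cluster sampled hand colours into modes with K-means,
# then segment a frame by distance to the nearest colour mode and keep
# the largest contour as the hand. Parameter values are illustrative
# assumptions, not values taken from the thesis.
import cv2
import numpy as np

def colour_modes(samples_bgr, k=3):
    """Cluster sampled hand pixels (N x 3, BGR) into k colour modes."""
    data = samples_bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, _, centers = cv2.kmeans(data, k, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
    return centers  # k x 3 array of BGR colour modes

def segment_hand(frame_bgr, centers, max_dist=40.0):
    """Mask pixels close to any colour mode and return the largest contour."""
    pixels = frame_bgr.reshape(-1, 3).astype(np.float32)
    # Distance from every pixel to every colour mode.
    dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
    mask = (dists.min(axis=1) < max_dist).reshape(frame_bgr.shape[:2]).astype(np.uint8) * 255
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask, None
    return mask, max(contours, key=cv2.contourArea)
```

Sampling pixels from the detected hand region, clustering them, and thresholding on distance to the nearest cluster centre is one plausible way to use the full presented colour range rather than a single stratified sample.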



Real Time Hand Gesture Detection And Recognition For Human Computer Interaction


Author : Nasser Hasan Abdel-Qader Dardas
language : en
Publisher:
Release Date : 2012

Real Time Hand Gesture Detection And Recognition For Human Computer Interaction was written by Nasser Hasan Abdel-Qader Dardas and released in 2012 in the Gesture category. It is available in PDF, TXT, EPUB, Kindle, and other formats.


This thesis focuses on bare hand gesture recognition by proposing a new architecture to solve the problem of real-time vision-based hand detection, tracking, and gesture recognition for interaction with an application via hand gestures. The first stage of our system allows detecting and tracking a bare hand in a cluttered background using face subtraction, skin detection and contour comparison. The second stage allows recognizing hand gestures using bag-of-features and multi-class Support Vector Machine (SVM) algorithms. Finally, a grammar has been developed to generate gesture commands for application control. Our hand gesture recognition system consists of two steps: offline training and online testing. In the training stage, after extracting the keypoints for every training image using the Scale Invariant Feature Transform (SIFT), a vector quantization technique maps the keypoints from every training image into a unified-dimensional histogram vector (bag-of-words) after K-means clustering. This histogram is treated as an input vector for a multi-class SVM to build the classifier. In the testing stage, for every frame captured from a webcam, the hand is detected using our algorithm. Then the keypoints are extracted for every small image that contains the detected hand posture and fed into the cluster model to map them into a bag-of-words vector, which is fed into the multi-class SVM classifier to recognize the hand gesture. Another hand gesture recognition system was proposed using Principal Component Analysis (PCA). The most significant eigenvectors and weights of the training images are determined. In the testing stage, the hand posture is detected for every frame using our algorithm. Then the small image that contains the detected hand is projected onto the most significant eigenvectors of the training images to form its test weights. Finally, the minimum Euclidean distance is determined between the test weights and the training weights of each training image to recognize the hand gesture. Two applications of gesture-based interaction with a 3D gaming virtual environment were implemented. The exertion video game makes use of a stationary bicycle as one of the main inputs for game playing. The user can control and direct left-right movement and shooting actions in the game by a set of hand gesture commands, while in the second game the user can control and direct a helicopter over the city by a set of hand gesture commands.
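As a rough illustration of the offline-training/online-testing pipeline described above (SIFT keypoints, K-means vector quantization into a bag-of-words histogram, multi-class SVM), here is a hedged sketch assuming OpenCV and scikit-learn; the vocabulary size and SVM parameters are illustrative, not the thesis's values.

```python
# Minimal sketch of a bag-of-features + multi-class SVM pipeline, assuming
# OpenCV (with SIFT) and scikit-learn; dictionary size and SVM settings are
# illustrative assumptions, not the values used in the thesis.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def sift_descriptors(gray_images):
    """Extract SIFT descriptors from a list of grayscale hand images."""
    per_image = []
    for img in gray_images:
        _, desc = sift.detectAndCompute(img, None)
        per_image.append(desc if desc is not None else np.empty((0, 128), np.float32))
    return per_image

def bow_histogram(desc, kmeans):
    """Map an image's descriptors to a normalized bag-of-words histogram."""
    hist = np.zeros(kmeans.n_clusters, np.float32)
    if len(desc):
        for word in kmeans.predict(desc):
            hist[word] += 1
        hist /= hist.sum()
    return hist

def train(gray_images, labels, vocab_size=100):
    per_image = sift_descriptors(gray_images)
    kmeans = KMeans(n_clusters=vocab_size, n_init=5).fit(np.vstack(per_image))
    X = np.array([bow_histogram(d, kmeans) for d in per_image])
    svm = SVC(kernel="rbf", C=10.0).fit(X, labels)  # multi-class handled one-vs-one
    return kmeans, svm

def predict(gray_hand_image, kmeans, svm):
    """Classify a small image containing the detected hand posture."""
    _, desc = sift.detectAndCompute(gray_hand_image, None)
    desc = desc if desc is not None else np.empty((0, 128), np.float32)
    return svm.predict([bow_histogram(desc, kmeans)])[0]
```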



Robust Hand Gesture Recognition For Robotic Hand Control


Author : Ankit Chaudhary
language : en
Publisher:
Release Date : 2018

Robust Hand Gesture Recognition For Robotic Hand Control was written by Ankit Chaudhary and released in 2018 in the Robot hands category. It is available in PDF, TXT, EPUB, Kindle, and other formats.


This book focuses on light-invariant bare hand gesture recognition with no restriction on the types of gestures. Observations and results have confirmed that this research work can be used to remotely control a robotic hand using hand gestures. The system developed here is also able to recognize hand gestures in different lighting conditions. The pre-processing is performed by an image-cropping algorithm that ensures only the area of interest is included in the segmented image. The segmented image is compared with a predefined gesture set which must be installed in the recognition system. These images are stored and feature vectors are extracted from them. These feature vectors are subsequently represented using an orientation histogram, which provides a view of the edges in the form of frequency. Thereby, if the same gesture is shown twice in different lighting intensities, both repetitions will map to the same gesture in the stored data. The matching of the segmented image's orientation histogram is first done using the Euclidean distance method; secondly, a supervised neural network is trained for the same task, producing better recognition results. An approach to controlling electro-mechanical robotic hands using dynamic hand gestures is also presented using a robot simulator. Such robotic hands have applications in commercial, military or emergency operations where human life cannot be risked. For such applications, an artificial robotic hand is required to perform real-time operations. This robotic hand should be able to move its fingers in the same manner as a human hand. For this purpose, hand geometry parameters are obtained using a webcam and also using a Kinect. The parameter detection is direction-invariant in both methods. Once the hand parameters are obtained, the fingers' angle information is obtained by performing a geometrical analysis. An artificial neural network is also implemented to calculate the angles. These two methods can be used with only one hand, either right or left. A separate method that is applicable to both hands simultaneously is also developed, and the finger angles are calculated. The contents of this book will be useful for researchers and professional engineers working on robotic arm/hand systems.
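A minimal sketch of the orientation-histogram idea, assuming OpenCV and NumPy; the bin count and gradient-magnitude threshold are illustrative assumptions rather than the book's settings.

```python
# Minimal sketch: orientation-histogram features from a segmented hand image,
# matched against a stored gesture set by Euclidean distance. Bin count and
# the gradient-magnitude threshold are illustrative assumptions.
import cv2
import numpy as np

def orientation_histogram(gray, bins=36, mag_thresh=20.0):
    """Histogram of edge orientations, normalized so that lighting-scaled
    versions of the same gesture map to similar feature vectors."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    mag, ang = cv2.cartToPolar(gx, gy, angleInDegrees=True)
    ang = ang[mag > mag_thresh]            # keep orientations at strong edges only
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, 360.0))
    hist = hist.astype(np.float32)
    return hist / (hist.sum() + 1e-6)

def match_gesture(gray, gesture_db):
    """gesture_db: dict mapping gesture name -> stored orientation histogram."""
    query = orientation_histogram(gray)
    return min(gesture_db, key=lambda name: np.linalg.norm(query - gesture_db[name]))
```

A supervised neural network trained on the same histograms, as the book describes, would replace the nearest-neighbour matching step shown here.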



Challenges And Applications For Hand Gesture Recognition


Author : Kane, Lalit
language : en
Publisher: IGI Global
Release Date : 2022-03-25

Challenges And Applications For Hand Gesture Recognition was written by Kane, Lalit and published by IGI Global. It was released on 2022-03-25 in the Computers category and is available in PDF, TXT, EPUB, Kindle, and other formats.


Due to the rise of new applications in electronic appliances and pervasive devices, automated hand gesture recognition (HGR) has become an area of increasing interest. HGR developments have come a long way from the traditional sign language recognition (SLR) systems to depth and wearable sensor-based electronic devices. While the former are more laboratory-oriented frameworks, the latter are comparatively realistic and practical systems. Based on various gestural traits, such as hand postures, gesture recognition takes different forms. Consequently, different interpretations can be associated with gestures in various application contexts. A considerable amount of research is still needed to introduce more practical gesture recognition systems and associated algorithms. Challenges and Applications for Hand Gesture Recognition highlights the state-of-the-art practices of HGR research and discusses key areas such as challenges, opportunities, and future directions. Covering a range of topics such as wearable sensors and hand kinematics, this critical reference source is ideal for researchers, academicians, scholars, industry professionals, engineers, instructors, and students.



Real Time Hand Gesture Recognition System Using Neural Network


Author : Herman Khalid Omer
language : en
Publisher:
Release Date : 2015

Real Time Hand Gesture Recognition System Using Neural Network was written by Herman Khalid Omer and released in 2015 in the Human-computer interaction category. It is available in PDF, TXT, EPUB, Kindle, and other formats.




Novel Methods For Robust Real Time Hand Gesture Interfaces


Author : Nathaniel Sean Rossol
language : en
Publisher:
Release Date : 2015

Novel Methods For Robust Real Time Hand Gesture Interfaces was written by Nathaniel Sean Rossol and released in 2015 in the Computer vision category. It is available in PDF, TXT, EPUB, Kindle, and other formats.


Real-time control of visual display systems via mid-air hand gestures offers many advantages over traditional interaction modalities. In medicine, for example, it allows a practitioner to adjust display values, e.g., contrast or zoom, on a medical visualization interface without the need to re-sterilize the interface. However, there are many practical challenges that make such interfaces non-robust, including poor tracking due to frequent occlusion of fingers, interference from hand-held objects, and complex interfaces that are difficult for users to learn to use efficiently. In this work, various techniques are explored for improving the robustness of computer interfaces that use hand gestures. This work is focused predominantly on real-time markerless Computer Vision (CV) based tracking methods, with an emphasis on systems with high sampling rates. First, we explore a novel approach to increase hand pose estimation accuracy from multiple sensors at high sampling rates in real time. This approach is achieved through an intelligent analysis of pose estimations from multiple sensors in a way that is highly scalable because raw image data is not transmitted between devices. Experimental results demonstrate that our proposed technique significantly improves the pose estimation accuracy while still maintaining the ability to capture individual hand poses at over 120 frames per second. Next, we explore techniques for improving pose estimation for the purposes of gesture recognition in situations where only a single sensor is used at high sampling rates without image data. In this situation, we demonstrate an approach where a combination of kinematic constraints and computed heuristics is used to estimate occluded keypoints and produce a partial pose estimation of a user's hand, which is then used with our gesture recognition system to control a display. The results of our user study demonstrate that the proposed algorithm significantly improves the gesture recognition rate of the setup. We then explore gesture interface designs for situations where the user may (or may not) have a large portion of their hand occluded by a hand-held tool while gesturing. We address this challenge by developing a novel interface that uses a single set of gestures designed to be equally effective for fingers and hand-held tools without the need for any markers. The effectiveness of our approach is validated through a user study on a group of people given the task of adjusting parameters on a medical image display. Finally, we examine improving the efficiency of training for our interfaces by automatically assessing key user performance metrics (such as dexterity and confidence) and adapting the interface accordingly to reduce user frustration. We achieve this through a framework that uses Bayesian networks to estimate values for abstract hidden variables in our user model, based on analysis of data recorded from the user during operation of our system.
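The thesis's multi-sensor analysis is not spelled out in this abstract; as a purely illustrative stand-in (not the author's algorithm), a confidence-weighted fusion of per-sensor keypoint estimates, which likewise avoids transmitting raw image data between devices, could be sketched as follows.

```python
# Minimal sketch of fusing per-sensor hand pose estimates without sharing raw
# images: each sensor reports 3D keypoints plus per-keypoint confidences, and
# the fused pose is a confidence-weighted average. This is an illustrative
# stand-in for the thesis's analysis, not its actual method.
import numpy as np

def fuse_poses(poses, confidences):
    """poses: list of (K, 3) keypoint arrays, one per sensor.
    confidences: list of (K,) arrays in [0, 1]."""
    poses = np.stack(poses)            # (S, K, 3)
    weights = np.stack(confidences)    # (S, K)
    weights = weights / (weights.sum(axis=0, keepdims=True) + 1e-6)
    return (poses * weights[..., None]).sum(axis=0)   # (K, 3) fused keypoints

# Example: two sensors reporting 21 hand keypoints each, with sensor A trusted more.
rng = np.random.default_rng(0)
sensor_a = rng.normal(size=(21, 3))
sensor_b = sensor_a + rng.normal(scale=0.01, size=(21, 3))
fused = fuse_poses([sensor_a, sensor_b], [np.full(21, 0.9), np.full(21, 0.5)])
```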



Real Time Dynamic Hand Gesture Recognition System Based On 3d Depth Sensing And Fingertip Features


Author : 陳郁元
language : en
Publisher:
Release Date : 2011

Real Time Dynamic Hand Gesture Recognition System Based On 3d Depth Sensing And Fingertip Features was written by 陳郁元 and released in 2011. It is available in PDF, TXT, EPUB, Kindle, and other formats.




Image Based Gesture Recognition With Support Vector Machines


Author : Yu Yuan
language : en
Publisher: ProQuest
Release Date : 2008

Image Based Gesture Recognition With Support Vector Machines was written by Yu Yuan and published by ProQuest. It was released in 2008 in the Human activity recognition category and is available in PDF, TXT, EPUB, Kindle, and other formats.


Recent advances in various display and virtual technologies, coupled with an explosion in available computing power, have given rise to a number of novel human-computer interaction (HCI) modalities, among which gesture recognition is undoubtedly the most grammatically structured and complex. However, despite the abundance of novel interaction devices, the naturalness and efficiency of HCI have remained low. This is due in particular to the lack of robust sensory data interpretation techniques. To address the task of gesture recognition, this dissertation establishes novel probabilistic approaches based on support vector machines (SVM). Of special concern in this dissertation are the shapes of contact images on a multi-touch input device, for both 2D and 3D. Five main topics are covered in this work. The first topic deals with the hand pose recognition problem. To perform classification of different gestures, a recognition system must attempt to leverage between-class variations (semantically varying gestures) while accommodating potentially large within-class variations (different hand poses used to perform certain gestures). For recognition of gestures, a sequence of hand shapes should be recognized. We present a novel shape recognition approach using Active Shape Model (ASM) based matching and SVM-based classification. First, a set of correspondences between the reference shape and the query image is identified through ASM. Next, a dissimilarity measure is created to measure how well any correspondence in the set aligns the reference shape and the candidate shape in the query image. Finally, SVM classification is employed to search through the set to find the best match under the kernel defined by the dissimilarity measure above. The results presented show better recognition than conventional segmentation and template matching methods. In the second topic, dynamic time alignment (DTA) based SVM gesture recognition is addressed. In particular, the proposed method combines DTA and SVM by establishing a new kernel. The gesture data is first projected into a common eigenspace formed by principal component analysis (PCA) and a distance measure is derived from the DTA. By incorporating DTA in the kernel function, general classification problems with variable-sized sequential data can be handled. In the third topic, a C++ based gesture recognition application for the multi-touchpad is implemented. It uses the proposed gesture classification method along with a recursive neural network approach to recognize definable gestures in real time and then run an associated command. This application can further enable users with different disabilities or preferences to custom-define gestures and enhance the functionality of the multi-touchpad. Fourthly, an SVM-based classification method that uses dynamic time warping (DTW) to measure the similarity score is presented. The key contribution of this approach is the extension of trajectory-based approaches to handle shape information, thereby enabling the expansion of the system's gesture vocabulary. It consists of two steps: converting a given set of frames into fixed-length vectors and training an SVM from the vectorized manifolds. Using shape information not only yields discrimination among various gestures, but also enables gestures that cannot be characterized solely by their motion information to be classified, thus boosting overall recognition scores. Finally, a computer vision based gesture command and communication system is developed. This system performs two major tasks: the first is to utilize the 3D traces of laser pointing devices as input to perform common keyboard and mouse control; the second is supplement-free continuous gesture recognition, i.e., data gloves or other assistive devices are not necessary for 3D gesture recognition. As a result, gestures can be used as a text entry system in wearable computers or mobile communication devices, though the recognition rate is lower than that of approaches using assistive tools. The purpose of this system is to develop new perceptual interfaces for human-computer interaction based on visual input captured by computer vision systems, and to investigate how such interfaces can complement or replace traditional interfaces. Original contributions of this work span the areas of SVMs and interpretation of computer sensory inputs, such as gestures, for advanced HCI. In particular, we have addressed the following important issues: (1) ASM-based kernels for shape recognition; (2) DTA-based sequence kernels for gesture classification; (3) recurrent neural networks (RNN); (4) exploration of a customizable HCI; (5) computer vision based 3D gesture recognition algorithms and systems.
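As a hedged sketch of the dynamic-time-alignment kernel idea (not the dissertation's actual formulation), variable-length gesture trajectories can be compared with DTW and the resulting similarities fed to an SVM as a precomputed kernel in scikit-learn; note that such DTW-derived kernels are not guaranteed to be positive definite, and gamma here is illustrative.

```python
# Minimal sketch of an SVM over variable-length gesture trajectories using a
# DTW-derived similarity as a precomputed kernel, exp(-gamma * DTW distance).
# This illustrates the general dynamic-time-alignment kernel idea only.
import numpy as np
from sklearn.svm import SVC

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW between two (T, D) trajectories."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def dtw_kernel(seqs_a, seqs_b, gamma=0.1):
    """Gram matrix of DTW-based similarities between two sets of trajectories."""
    return np.array([[np.exp(-gamma * dtw_distance(a, b)) for b in seqs_b] for a in seqs_a])

def train_dtw_svm(train_seqs, labels, gamma=0.1):
    K = dtw_kernel(train_seqs, train_seqs, gamma)
    return SVC(kernel="precomputed").fit(K, labels)

def predict_dtw_svm(svm, train_seqs, test_seqs, gamma=0.1):
    # Precomputed kernel at test time must be (n_test, n_train).
    return svm.predict(dtw_kernel(test_seqs, train_seqs, gamma))
```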



Dual Sensor Approaches For Real Time Robust Hand Gesture Recognition


Author : Kui Liu
language : en
Publisher:
Release Date : 2015

Dual Sensor Approaches For Real Time Robust Hand Gesture Recognition was written by Kui Liu and released in 2015 in the Gesture category. It is available in PDF, TXT, EPUB, Kindle, and other formats.


The use of hand gesture recognition has been steadily growing in various human-computer interaction applications. Under realistic operating conditions, it has been shown that hand gesture recognition systems exhibit recognition rate limitations when using a single sensor. Two dual-sensor approaches have thus been developed in this dissertation in order to improve the performance of hand gesture recognition under realistic operating conditions. The first approach involves the use of image pairs from a stereo camera setup, merging the image information from the left and right cameras, while the second approach involves the use of a Kinect depth camera and an inertial sensor, fusing the differing-modality data within the framework of a hidden Markov model. The emphasis of this dissertation has been on system building and practical deployment. More specifically, the major contributions of the dissertation are: (a) improvement of hand gesture recognition rates when using a pair of images from a stereo camera compared to using a single image, by fusing the information from the left and right images in a complementary manner, and (b) improvement of hand gesture recognition rates when using a dual-modality sensor setup consisting of a Kinect depth camera and an inertial body sensor compared to the situations when each sensor is used individually on its own. Experimental results obtained indicate that the developed approaches generate higher recognition rates in different backgrounds and lighting conditions compared to the situations when an individual sensor is used. Both approaches are designed such that the entire recognition system runs in real time on a PC platform.
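The dissertation fuses the Kinect and inertial streams inside a hidden Markov model; the following much simpler score-level fusion is only an illustrative sketch of why two complementary sensors can outperform either one alone, and is not the author's method.

```python
# Minimal sketch of score-level fusion of two per-sensor gesture classifiers
# (e.g., a depth-image classifier and an inertial-signal classifier). The
# dissertation performs fusion within a hidden Markov model; this weighted
# probability average is only an illustrative stand-in.
import numpy as np

def fuse_scores(depth_probs, inertial_probs, w_depth=0.5):
    """Each argument is a (num_gestures,) array of class probabilities."""
    fused = w_depth * np.asarray(depth_probs) + (1.0 - w_depth) * np.asarray(inertial_probs)
    return int(np.argmax(fused)), fused

# Example: the depth camera is unsure between gestures 0 and 1, the inertial
# sensor strongly favours gesture 1, and the fused decision picks gesture 1.
label, fused = fuse_scores([0.45, 0.40, 0.15], [0.10, 0.80, 0.10])
```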