Design a System for Hand Gesture Recognition with Neural Network

Vaishali M. Gulhane1, Dr. Amol Kumbhare2

1Student, Electronics & Communication Dept., Dr. A.P.J. Abdul Kalam University, Indore, MP, India
2Associate Professor, Electronics Dept., Dr. A.P.J. Abdul Kalam University, Indore, MP, India

ABSTRACT:

Effective human-computer interaction (HCI) or human alternative and augmentative communication (HAAC) is vital in today's intelligent computing environment. One of the most essential approaches for developing a gesture based interface system for HCI or HAAC applications is hand gesture recognition. As a result, in order to create an advanced hand gesture recognition system with successful applications, it is necessary to establish an appropriate gesture recognition technique. Human activity and gesture detection are crucial components of the rapidly expanding area of ambient intelligence, which includes applications such as robots, smart homes, assistive systems, virtual reality, and so on. We propose a method for recognizing hand movements from surface electromyography (sEMG) using an artificial neural network (ANN). The CapgMyo dataset, based on the Myo wristband (an eight channel sEMG device), is used to assess participants' forearm sEMG signals. The original sEMG signal is preprocessed to remove noise and detect muscle activity regions, and then time and frequency domain features are extracted from the signals. An ANN classification model is used to predict the various gesture output classes. Finally, the suggested model was tested on these movements and recognized them with an accuracy of 87.32 percent.

Keywords—Hand Gesture Recognition, Human Computer Interaction, Electromyography, Artificial Neural Networks.

I. INTRODUCTION

Human-computer interaction (HCI) or human alternative and augmentative communication (HAAC) is becoming increasingly essential in our daily lives in the current intelligent computing environment. One of the most important research techniques in HCI or HAAC applications is gesture recognition. Gestures are communicative, meaningful body motions or body language expressions that involve physical actions of the fingers, hands, arms, face, head, or torso with the aim of transmitting meaningful information or engaging with the environment in general. Hands are used to perform the majority of essential body language expressions since they are more flexible and controllable parts of the human body. As a result, hand gestures are appropriate for communicating information such as expressing a sentiment, indicating a number, or pointing out an object. Sign language and gesture based computer control both employ hand gestures as a main interface method [1][7]. Simple mechanical devices such as a keyboard and mouse are used for man-machine interaction in a conventional HCI system. These devices, however, limit the pace and spontaneity with which man and machine communicate. In contrast, because of their natural interaction ability, interaction methods based on hand gestures and computer vision have become a popular alternative communication modality for man-machine contact in recent years. For successful HCI or HAAC applications such as robotics, sign language communication, virtual reality, and so on, a proper design of a hand gesture recognition framework may be used to build an advanced hand gesture based interface system. The method of recognizing the movements made by a human hand or hands is known as hand gesture recognition [13]. Because hand gestures vary in time and location, as well as across ethnic backgrounds, it is nearly impossible to define any single one. Any gesture, broadly defined, is a deliberate hand movement that involves movements of the fingers, palm, and wrist to communicate information. In the field of gesture recognition, gestures are regarded to extend to the arms as well as the hands when employed to generate a range of postures or motions. A computer or other machine that detects human gestures is one that can be operated efficiently using the user's hands and arms. Because gestures come naturally to humans and are an essential element of how we communicate, using gestures for human-computer interaction is a natural process. Users should be able to instruct computers to perform complicated activities using only a single posture or a few basic, continuous, dynamic hand gestures [23]. Hand gesture recognition technologies are divided into two groups depending on the sensing technique: (i) glove, sensor, or wearable band based approaches, and (ii) vision based techniques.

EMG signals are used in a variety of applications, including neuromuscular illness diagnosis, prosthetic/orthotic device control, human-machine interfaces, virtual reality gaming, and the creation of muscle oriented exercise equipment. EMG signals have been used in a variety of research projects to interface with machines and computers. The goal of gesture recognition research is to develop a system that can recognize distinct human gestures
and utilize them to communicate information or control devices [4][6][8][14]. Gestures can come from any body action or state, although they are most often made with the hands or the face. EMG signals, which are used to assess muscle activity, are used to record hand movements. The development of EMG based control has received a lot of attention in the last three decades, since it can improve the social acceptability of handicapped and elderly people by enhancing their quality of life. However, pattern identification of EMG data is the most difficult element of designing myoelectric control based interfaces [21][30]. This is due to substantial differences in EMG signals, which have various characteristics based on age, muscle activity, motor unit pathways, skin fat layer, and gesture style. EMG signals also include more complex kinds of noise than other biosignals, generated by intrinsic equipment and ambient noise, electromagnetic radiation, motion artefacts, and the interaction of various tissues. It can be difficult to extract valuable characteristics from an amputee's or handicapped person's remaining muscles, and when dealing with a multiclass classification problem, this problem becomes even more complex to solve. Many studies have explored various types of EMG signal classification techniques with satisfactory recognition results. The goal of this study is to create a system architecture that can detect hand movements that communicate specific information using an EMG signal.

The following is a breakdown of the paper's structure. The work of several researchers on the categorization of the EMG signal is discussed in Section II. The suggested technique is presented in Section III, which includes the signal processing, namely segmentation, feature extraction, and classification. In Section IV, the findings are examined, and in Section V, the conclusion is presented.

II. RELATED WORK

The majority of the studies utilized hand gesture recognition methods that may be divided into two categories: i) glove, sensor, or wearable band based techniques, and ii) vision based techniques, as detailed below.

P. N. Huu et al. [1] surveyed human hand gesture recognition, covering the main processes of hand gesture recognition as well as the most often used approaches and methodologies. A convolutional neural network based gesture recognition technique is presented in [2], in which training and testing are carried out with various convolutional neural networks, as opposed to other known methods and designs. For the problem of dynamic hand gesture detection, an efficient strategy for utilizing knowledge from various modalities in training unimodal 3D convolutional neural networks (3D CNNs) is described in [3]. A. Chahid et al. [4] devised the Quantization based position Weight Matrix (QuPWM) feature extraction approach for multiclass classification to improve the interpretation of biomedical signals; it has shown promise in extracting important features from a variety of biomedical signals, including EEG and MEG signals. A technique [5] was suggested to categorize hand motions with a single smartphone using inaudible high frequency sound, and the model classified 8 hand gestures with 94.25 percent accuracy. A. Devaraj et al. [6] utilized SVM and KNN machine learning classifiers to recognize a specific hand motion from an EMG signal recorded by a sensor based band. B. Besma et al. [7] presented a method for identifying two movements (wrist flexion and wrist extension) by decoding surface electromyography (EMG) signals of amputees, which has proved to have an overall accuracy of about 80% and may be used for prostheses. A comparison is presented between a custom-built band and the Myo Armband, which recognizes gestures using surface electromyography (sEMG) [8].

Based on spatio-temporal characteristics such as the Histogram of Oriented Gradients (3DHOG) and the Histogram of Oriented Optical Flow (3DHOOF), a gesture recognition system was described in [9]. An effective deep convolutional neural network technique for hand gesture detection was suggested in [10], using transfer learning to overcome the need for a big labelled dataset. A machine learning technique was developed in [11] for real time identification of 16 user hand movements using the Kinect sensor; it achieved an overall performance of 98 percent under the area under the ROC curve metric. Hand movement and finger detection in still pictures and video sequences are the subject of T. Bravenec et al. [12]. Using a wrist mounted tri-axial accelerometer, a computational solution for human-robot interaction was given in [13]. Surface electromyography (sEMG) was used to identify hand movements using a technique based on support vector machines (SVM) [14]. A model [15] for real time hand gesture identification takes as input electromyography (EMG) data recorded on the forearm, using an auto-encoder for automated feature extraction and an artificial feed forward neural network for classification, with the commercial Myo Armband sensor. A novel 3D hand gesture identification technique [16] is based on a deep learning model in which sequences of hand skeletal joint locations are analyzed by parallel convolutions in a new Convolutional Neural Network (CNN). Experimental results are consistent with theoretical estimates and illustrate the benefits of the suggested gesture recognition system design [17], which uses a hand detector to identify hands in the frame and then switches to a gesture classifier if a hand is found. A hand gesture recognition solution based on LSTM-RNNs and 3D skeleton features is presented in [18], with
experimental results showing that the suggested technique has a resilience of 92.196 percent on a self-defined dataset. The use of the temporal inter-frame pattern for the identification of both static and dynamic hand gestures is explored in a three level system [19]. A sensor based system that deciphers the sign language of hand gestures for English alphabets has been created [20]. A hand gesture identification model [21] was developed that used surface electromyography in the transient state, support vector machines (SVM), and the discrete wavelet transform (DWT) to recognize hand motions that lasted a brief period (i.e., short term gestures). The author of [22] provided a successful dynamic hand gesture and movement trajectory recognition system that may be used in real time for effective HRI interaction. E. Kaya et al. [23] developed a hand gesture recognition method based on surface electromyography (EMG) signals collected from a wearable device, the Myo armband, to classify and recognize numbers from 0 to 9 in Turkish Sign Language; to recognize the hand gestures, they used machine learning techniques such as kNN, SVM, and ANN. The suggested approach by J. Kim et al. [24] transforms reflected and recorded sound data into an image using a short time Fourier transform, and then applies the acquired data to a convolutional neural network (CNN) model to identify hand motions. On three datasets, a neural network design consisting of two types of recurrent neural network (RNN) cells was built [25], revealing that this very modest network beats state of the art hand gesture identification techniques that depend on multi-modal data by a wide margin. K. N. Krisandria et al. [26] use hand motions to create interactions between humans and computers, recognized from the palm of the hand, which is derived from the findings of human skeletal segmentation using the Kinect camera. The framework in [27] combines incoming signals at the semantic level, a method similar to that used in multi-agent systems, where modals give local semantics before entering the fusion module. A novel human hand gesture dataset is given, which was collected using a low cost, IoT based wearable device with accelerometer and gyroscope sensors. A real time hand gesture recognition accelerator based on hand skeleton extraction has been suggested. Surface electromyography (sEMG) was collected from six hand and forearm muscles and categorized using three distinct techniques.

III. PROPOSED WORK

Processing, analyzing, and recognizing the hand gesture signal is the goal of the EMG based hand gesture recognition system. The whole process of a hand gesture recognition system may be broken down into two phases: training and testing. Figure 1 shows a schematic representation of a hand gesture recognition system. The preprocessing, feature extraction, and feature selection processes are the same in both the training and testing phases. Preprocessing, feature extraction, feature reduction, and classification are the phases of a hand gesture recognition system in general.

Fig 1: The schematic diagram of a hand gesture recognition system.

1. Hand Gesture Dataset

The CapgMyo is a benchmark database of high density sEMG (HD-sEMG) recordings of diverse participants' hand gestures, based on an 8x16 electrode array and an acquisition equipment, as illustrated in Fig 2.

Fig 2: Original surface electromyography (sEMG) signals recorded by the MYO armband.
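As a rough illustration of how such a recording can be prepared for the later processing stages, the following MATLAB sketch loads one sEMG trial and segments it into fixed length analysis windows. The file name, the variable layout (a [samples x channels] matrix named emg), the sampling rate, and the window sizes are illustrative assumptions and are not taken from the CapgMyo specification or from this paper.

% Minimal sketch (assumed file layout): load one sEMG trial and
% segment it into overlapping analysis windows for feature extraction.
fs      = 1000;                      % assumed sampling rate in Hz
winLen  = round(0.200 * fs);         % 200 ms analysis window (assumed)
winStep = round(0.050 * fs);         % 50 ms step, i.e., overlapping windows

trial = load('capgmyo_subject01_gesture01.mat');   % hypothetical file name
emg   = trial.emg;                   % assumed [samples x channels] matrix

nSamples = size(emg, 1);
starts   = 1:winStep:(nSamples - winLen + 1);
windows  = cell(numel(starts), 1);

for k = 1:numel(starts)
    idx        = starts(k):(starts(k) + winLen - 1);
    windows{k} = emg(idx, :);        % one analysis window, all channels
end

fprintf('Segmented %d samples into %d windows of %d samples.\n', ...
        nSamples, numel(windows), winLen);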

2. Preprocessing

Because of its sensitivity, EMG signals are often polluted by external noise sources and artefacts. Using these tainted signals will also result in a poor classification result, which is undesirable. Electrode noise, motion artefacts, power line noise, ambient noise, and intrinsic noise in electrical and electronic equipment are the most common sources of noise, artefacts, and interference that can contaminate EMG signals.

The first three forms of noise can be removed by employing standard filtering techniques such as a band pass filter or a band stop filter, or by using high quality equipment with correct electrode placement. Other noises, artefacts, and random interference that fall within the main frequency range of the EMG are difficult to eliminate. Wavelet based approaches are useful for studying many forms of non-stationary data, such as EMG. For example, the Discrete Wavelet Transform (DWT) scales and shifts the mother wavelet and decomposes a discrete time signal x[k] into a collection of signals. Finding the appropriate number of wavelet decomposition levels is the first step in the DWT decomposition. The signal x[k] passes through high pass and low pass filters; in wavelet decomposition, the detail (D) coefficients represent the signal at high frequency, while the approximation (A) coefficients represent the low frequency component. The similarity between the signal and the wavelet functions is measured by these coefficients. After downsampling the resulting filtered signal by two, this procedure is repeated on the low pass approximation coefficients obtained at each level. This research focuses on frequency decomposition levels: at each decomposition level of the DWT, the resulting detail coefficients reflect distinct frequency bands of the EMG signal. In this experiment, we found that the DWT with a DB (Daubechies) wavelet provided the best results. Wavelet based feature extraction creates a vector that is far too large to be used directly as a classifier input, so the approach below reduces the number of characteristics extracted from the wavelet coefficients. The chosen characteristics of the EMG signals are extracted using the DWT: after obtaining the DWT coefficients, statistical characteristics are computed for each of the five DWT sub-bands.
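A minimal filtering sketch for the first, removable, noise types is shown below. The 20-450 Hz pass band, the 50 Hz notch, and the 1 kHz sampling rate are typical textbook values and are assumptions, not values stated in the paper; emg is the [samples x channels] matrix from the previous sketch.

% Minimal sketch: band pass filtering to keep the main EMG band and a
% band stop (notch) filter for power line interference (assumed cutoffs).
fs = 1000;                                   % assumed sampling rate in Hz

[bBand, aBand] = butter(4, [20 450] / (fs/2), 'bandpass');   % 20-450 Hz band
[bStop, aStop] = butter(2, [48 52]  / (fs/2), 'stop');       % 50 Hz notch

emgFiltered = filtfilt(bBand, aBand, emg);   % zero phase band pass filtering
emgFiltered = filtfilt(bStop, aStop, emgFiltered);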
3. Features Extraction and Selection

Because of the complexity of EMG signals, effective feature selection is critical for successful classification. The characteristics used to represent the raw EMG signals have a huge impact on the pattern classification system's performance. Because it is difficult to extract a single feature parameter that completely reflects the unique characteristics of the recorded EMG signals for a motion instruction, many feature parameters are required for EMG signal categorization. For the categorization of EMG signals, traditional characteristics derived from the time domain, frequency domain, time-frequency domain, and time-scale domain are used. After the wavelet transformation, i.e., a 3 level DB decomposition of the signal, we employed hand crafted (manually derived) features in the time and frequency domains.
Mean absolute value (MAV), waveform length (WL), zero crossings (ZC), slope sign changes (SSC), RMS, and standard deviation are computed in the time domain. Skewness, mean frequency, and kurtosis are computed in the frequency domain.
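The sketch below illustrates how such features could be computed per DWT sub-band in MATLAB (Wavelet, Signal Processing, and Statistics toolboxes assumed). The 'db4' wavelet is an illustrative choice, since the paper only states that a DB wavelet was used; note also that a 3 level decomposition yields four sub-bands (A3, D3, D2, D1), whereas the text mentions five, so the exact sub-band count here is an assumption. emgFiltered comes from the previous sketch.

% Minimal sketch: 3-level DWT with a Daubechies wavelet, then a few of the
% statistical features named above for each sub-band of one EMG channel.
x = emgFiltered(:, 1);                       % one channel from the previous sketch

[C, L] = wavedec(x, 3, 'db4');               % 3-level decomposition
subBands = {appcoef(C, L, 'db4', 3), ...     % A3 (low frequency approximation)
            detcoef(C, L, 3), ...            % D3
            detcoef(C, L, 2), ...            % D2
            detcoef(C, L, 1)};               % D1 (highest frequency detail)

features = [];
for k = 1:numel(subBands)
    s   = subBands{k};
    mav = mean(abs(s));                      % mean absolute value
    wl  = sum(abs(diff(s)));                 % waveform length
    zc  = sum(abs(diff(sign(s))) > 0);       % zero crossings
    ssc = sum(diff(sign(diff(s))) ~= 0);     % slope sign changes
    features = [features, mav, wl, zc, ssc, rms(s), std(s), ...
                skewness(s), meanfreq(s), kurtosis(s)]; %#ok<AGROW>
end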
4. Classification (train, validate and test)

After collecting the feature dataset for training and testing, with its corresponding output labels, hand gesture classification was performed using an Artificial Neural Network via the train, validate, and test stages to produce the anticipated output as a hand gesture.

The ANN employed in this study is a dynamic and strong back propagation (BP) type network. Its state evolves over time until it reaches the final equilibrium point, which is attained by successful training. The Widrow-Hoff learning rule is applied to a multiple layer network with a nonlinear differentiable transfer function to generate BP. The learning rule for neural network propagation determines how the weights between the layers change. It is an ANN in which the hidden layer, activation function, epochs, error rate, and learning rate are used as hyper-parameters to adjust or train the network for the optimal validation of the training process. Where the training data has already been labelled, classifiers from the supervised learning model are utilized. In machine learning, there are many different types of classification algorithms. The Multi Layer Perceptron (MLP) Artificial Neural Network (ANN) based on the Levenberg-Marquardt algorithm is used for classification in this study, since it is a robust approach in this particular instance. According to the literature, the accuracy of an artificial neural network classification depends on the feature set, network topology, and training technique chosen. An ANN is a series of input and output units linked together to form a network. There are three layers in the network: an input layer, a tan-sigmoid hidden layer, and a linear output layer. The advantages of neural networks are primarily their high tolerance for noisy input and their ability to classify untrained patterns. They might also be beneficial if there isn't enough information about the relationships between characteristics and classes. The hidden layer and the output layer were activated using a hyperbolic tangent sigmoid function and a linear function, respectively. The acquired characteristics of the sEMG signals are fed into the ANN, and the output is the categorization, i.e., the estimated movement. The network's overall diagram is depicted in Fig. 3.
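A minimal sketch of such a network in MATLAB is given below, assuming the Deep Learning Toolbox. X is assumed to be a [features x samples] matrix built from the feature vectors above, and labels a [1 x samples] vector of positive integer gesture ids; the 5 hidden neurons correspond to the value reported later in the paper, and the default patternnet output layer (softmax) differs slightly from the purely linear output described here, so this is a sketch rather than the authors' exact configuration.

% Minimal sketch: MLP trained with Levenberg-Marquardt back propagation,
% with the 70/15/15 train/validation/test split described in the paper.
hiddenNeurons = 5;

net = patternnet(hiddenNeurons, 'trainlm');  % MLP with Levenberg-Marquardt
net.divideParam.trainRatio = 0.70;           % 70% training
net.divideParam.valRatio   = 0.15;           % 15% validation
net.divideParam.testRatio  = 0.15;           % 15% testing

T = full(ind2vec(labels));                   % one-hot target matrix
[net, tr] = train(net, X, T);                % back propagation training

predicted = vec2ind(net(X(:, tr.testInd)));  % predicted classes on the test split
actual    = labels(tr.testInd);
accuracy  = mean(predicted == actual) * 100;
fprintf('Test accuracy: %.2f%%\n', accuracy);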

Fig 3: General architecture of the Artificial Neural Network.

The hidden layer size is an essential parameter for an ANN, since it contributes to network accuracy. The weights linking the hidden layer to the output get smaller as the number of neurons in the hidden layer grows. Increasing this value typically increases the network's training performance, but it doesn't always assist with generalization. Underfitting occurs when a network's weights and biases are improperly adjusted during training due to the use of a small number of neurons in the hidden layers. If the number of hidden neurons is increased beyond a certain point, the network's accuracy may suffer; storing a high number of network variables also requires a lot of memory, and thus the training process becomes complicated. The right number of neurons in the hidden layer is therefore chosen to balance the network's efficiency and complexity. The number of neurons in a neural network is not determined by a rule of thumb, so for the selection of neurons in the neural network in this study, a trial and error technique is employed. To minimize overfitting, the input data were randomly separated into three sets, with 70% of the samples assigned to the training set, 15% to the validation set, and 15% to the test set. To explore the influence of hidden layer size on class estimation accuracy, we raised the number of neurons from 1 to 15. Underfitting is seen in neural networks with 1 hidden neuron, while overfitting is seen in neural networks with 15 hidden neurons, with 5 being the ideal number of hidden neurons. In the input layer, four neurons correspond to the input feature, while two neurons in the output layer represent the two classes. During training, the back propagation method is used to modify the weights and biases while reducing the difference between the target and the neural network output. More information isn't necessarily better in machine learning applications, as feature selection approaches demonstrate. Following the feature extraction procedure, it was discovered that a specific collection of features may impair or provide no value to the classifier's performance. Counting the number of times a feature divides a tree can be a useful metric for feature selection.

IV. RESULTS AND DISCUSSION

In this experimentation, we used MATLAB R2018b to construct the recommended architecture and assess the proposed model. The suggested model was tested on a desktop computer with an Intel Core i5 CPU and 8 GB of RAM. The CapgMyo dataset is a benchmark collection of high density sEMG (HD-sEMG) recordings of hand gestures made by diverse individuals (able bodied people ranging in age from 23 to 26 years) using an 8x16 electrode array and newly built acquisition equipment. We employed eight hand motions in this project, as shown in Fig. 4. Each gesture was performed 10 times and held for 3 to 10 seconds each time (10 trials).

Fig 4: The eight hand gestures used from the CapgMyo dataset [22].

The performance of the classifier model is described by a confusion matrix, also called an error matrix. It is a matrix in which each row represents examples from an actual class and each column represents instances from a predicted class, or vice versa. The confusion matrix is used to evaluate the performance using the accuracy, sensitivity, and specificity criteria:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
Sensitivity = TP / (TP + FN)
Specificity = TN / (TN + FP)

where TP = true positive, TN = true negative, FP = false positive, and FN = false negative.
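The sketch below computes the confusion matrix and these three metrics from the actual and predicted test labels of the earlier training sketch. Since the paper does not state how sensitivity and specificity are aggregated over the eight gesture classes, the per-class (one-vs-rest) macro-average used here is an assumption.

% Minimal sketch: confusion matrix and the three metrics above, computed
% per class (one-vs-rest) and then averaged across classes.
cm = confusionmat(actual, predicted);        % rows: actual, columns: predicted
nClasses = size(cm, 1);
sens = zeros(nClasses, 1);
spec = zeros(nClasses, 1);

for c = 1:nClasses
    TP = cm(c, c);
    FN = sum(cm(c, :)) - TP;
    FP = sum(cm(:, c)) - TP;
    TN = sum(cm(:)) - TP - FN - FP;
    sens(c) = TP / (TP + FN);                % sensitivity for class c
    spec(c) = TN / (TN + FP);                % specificity for class c
end

overallAccuracy = sum(diag(cm)) / sum(cm(:));
fprintf('Accuracy %.2f%%, sensitivity %.2f%%, specificity %.2f%%\n', ...
        100*overallAccuracy, 100*mean(sens), 100*mean(spec));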


Figure 5 displays a sample of the test signal waveform after discrete wavelet transform (DWT) preprocessing. Training, validation, and testing are the three steps of our system's evaluation. 70% of the data samples are used in the training and validation phases, whereas 30% are used in the testing phase. After the training data has been validated, test samples are analyzed to determine the proper hand gesture as the predicted output. Table 1 lists the parameters that were assessed during testing and compares them with existing techniques used by researchers; the comparison is also shown graphically in Fig. 6.

Fig 5: Decomposition of the test signal using DWT.

Table 1: Comparative results

Parameters        Ref [7]   Ref [31]   Ref [32]   Proposed Method
Accuracy (%)      81.25     83.1       86.0       87.32
Sensitivity (%)   70.48     -          -          75.44
Specificity (%)   59.72     -          -          70.35

Fig 6: Comparative result performance (accuracy, sensitivity, and specificity of Ref [7], Ref [31], Ref [32], and the proposed method).













V. CONCLUSION

This paper presented a hand gesture detection algorithm based on the CapgMyo dataset from the Myo armband device. The approach first preprocesses the data and then extracts features from it using time and frequency domain statistics. Finally, using 70% of the dataset feature vectors, an artificial neural network with a feed forward back propagation network is used to create a classifier. In this experiment, we test the remaining 30% of the feature vectors. Instead of merely obtaining the results recognized for any gesture, every test feature vector must be categorized by the classifier so that each feature vector may be recognized correctly; a feature vector is classed as no gesture if it is not detected. Our suggested model has an accuracy of 87.32 percent, which is greater than existing approaches. In the future, we will try to implement the approach in a real time application.

VI. REFERENCES
[1] P. N. Huu and H. L. The, "Proposing Recognition Algorithms For Hand Gestures Based On Machine Learning Model," 2019 19th International Symposium on Communications and Information Technologies (ISCIT), Ho Chi Minh City, Vietnam, 2019, pp. 496-501, doi: 10.1109/ISCIT.2019.8905194.

[2] R. Pinto, C. Braga Borges, A. Almeida and I. Paula Jr., "Static Hand Gesture Recognition Based on Convolutional Neural Networks," Journal of Electrical and Computer Engineering, 2019, pp. 1-12, doi: 10.1155/2019/4167890.

[3] M. Abavisani, H. R. V. Joze and V. M. Patel, "Improving the Performance of Unimodal Dynamic Hand Gesture Recognition With Multimodal Training," 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 2019, pp. 1165-1174, doi: 10.1109/CVPR.2019.00126.

[4] A. Chahid, R. Khushaba, A. Al-Jumaily and T. M. Laleg-Kirati, "A Position Weight Matrix Feature Extraction Algorithm Improves Hand Gesture Recognition," 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 2020, pp. 5765-5768, doi: 10.1109/EMBC44109.2020.9176097.

[5] J. Cheon and S. Choi, "Hand Gesture Classification based on Short Time Fourier Transform of Inaudible Sound," 2020 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), Fukuoka, Japan, 2020, pp. 472-475, doi: 10.1109/ICAIIC48513.2020.9065201.

[6] A. Devaraj and A. K. Nair, "Hand Gesture Signal Classification using Machine Learning," 2020 International Conference on Communication and Signal Processing (ICCSP), Chennai, India, 2020, pp. 0390-0394, doi: 10.1109/ICCSP48568.2020.9182045.

[7] B. Besma, K. Malika, Z. Hadjer, B. Yasmina, A. Sarra and E. Hammoudi, "Development of an Electromyography Based Hand Gesture Recognition System for Upper Extremity Prostheses," 2019 6th International Conference on Image and Signal Processing and their Applications (ISPA), Mostaganem, Algeria, 2019, pp. 1-6, doi: 10.1109/ISPA48434.2019.8966886.

[8] K. Subramanian, C. Savur and F. Sahin, "Using Photoplethysmography for Simple Hand Gesture Recognition," 2020 IEEE 15th International Conference of System of Systems Engineering (SoSE), Budapest, Hungary, 2020, pp. 307-312, doi: 10.1109/SoSE50414.2020.9130489.

[9] S. e. Agab and F. z. Chelali, "HOG and HOOF Spatio-Temporal Descriptors for Gesture Recognition," 2018 International Conference on Signal, Image, Vision and their Applications (SIVA), Guelma, Algeria, 2018, pp. 1-7, doi: 10.1109/SIVA.2018.8661127.

[10] M. Al-Hammadi, G. Muhammad, W. Abdul, M. Alsulaiman, M. A. Bencherif and M. A. Mekhtiche, "Hand Gesture Recognition for Sign Language Using 3DCNN," IEEE Access, vol. 8, pp. 79491-79509, 2020, doi: 10.1109/ACCESS.2020.2990434.

[11] M. Benmoussa and A. Mahmoudi, "Machine learning for hand gesture recognition using bag of words," 2018 International Conference on Intelligent Systems and Computer Vision (ISCV), Fez, Morocco, 2018, pp. 1-7, doi: 10.1109/ISACV.2018.8354082.

[12] T. Bravenec and T. Fryza, "Multiplatform System for Hand Gesture Recognition," 2019 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Ajman, United Arab Emirates, 2019, pp. 1-5, doi: 10.1109/ISSPIT47144.2019.9001762.

[13] D. O. Anderez, L. P. Dos Santos, A. Lotfi and S. W. Yahaya, "Accelerometer-based Hand Gesture Recognition for Human-Robot Interaction," 2019 IEEE Symposium Series on Computational Intelligence (SSCI), Xiamen, China, 2019, pp. 1402-1406, doi: 10.1109/SSCI44817.2019.9003136.

[14] W. Chen and Z. Zhang, "Hand Gesture Recognition using sEMG Signals Based on Support Vector Machine," 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), Chongqing, China, 2019, pp. 230-234, doi: 10.1109/ITAIC.2019.8785542.

[15] E. A. Chung and M. E. Benalcázar, "Real-Time Hand Gesture Recognition Model Using Deep Learning Techniques and EMG Signals," 2019 27th European Signal Processing Conference (EUSIPCO), A Coruna, Spain, 2019, pp. 1-5, doi: 10.23919/EUSIPCO.2019.8903136.

[16] G. Devineau, F. Moutarde, W. Xi and J. Yang, "Deep Learning for Hand Gesture Recognition on Skeletal Data," 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi'an, China, 2018, pp. 106-113, doi: 10.1109/FG.2018.00025.

[17] R. Golovanov, D. Vorotnev and D. Kalina, "Combining Hand Detection and Gesture Recognition Algorithms for Minimizing Computational Cost," 2020 22nd International Conference on Digital Signal Processing and its Applications (DSPA), Moscow, Russia, 2020, pp. 1-4, doi: 10.1109/DSPA48919.2020.9213273.

[18] H. Guo, Y. Yang and H. Cai, "Exploiting LSTM-RNNs and 3D Skeleton Features for Hand Gesture Recognition," 2019 WRC Symposium on Advanced Robotics and Automation (WRC SARA), Beijing, China, 2019, pp. 322-327, doi: 10.1109/WRC-SARA.2019.8931937.

[19] K. Hu, L. Yin and T. Wang, "Temporal Interframe Pattern Analysis for Static and Dynamic Hand Gesture Recognition," 2019 IEEE International Conference on Image Processing (ICIP), Taipei, Taiwan, 2019, pp. 3422-3426, doi: 10.1109/ICIP.2019.8803472.

[20] A. B. Jani, N. A. Kotak and A. K. Roy, "Sensor Based Hand Gesture Recognition System for English Alphabets Used in Sign Language of Deaf-Mute People," 2018 IEEE SENSORS, New Delhi, India, 2018, pp. 1-4, doi: 10.1109/ICSENS.2018.8589574.

[21] A. Jaramillo-Yanez, L. Unapanta and M. E. Benalcázar, "Short-Term Hand Gesture Recognition using Electromyography in the Transient State, Support Vector Machines, and Discrete Wavelet Transform," 2019 IEEE Latin American Conference on Computational Intelligence (LA-CCI), Guayaquil, Ecuador, 2019, pp. 1-6, doi: 10.1109/LA-CCI47412.2019.9036757.

[22] R. Kabir, N. Ahmed, N. Roy and M. R. Islam, "A Novel Dynamic Hand Gesture and Movement Trajectory Recognition model for Non-Touch HRI Interface," 2019 IEEE Eurasia Conference on IOT, Communication and Engineering (ECICE), Yunlin, Taiwan, 2019, pp. 505-508, doi: 10.1109/ECICE47484.2019.8942691.

[23] E. Kaya and T. Kumbasar, "Hand Gesture Recognition Systems with the Wearable Myo Armband," 2018 6th International Conference on Control Engineering & Information Technology (CEIT), Istanbul, Turkey, 2018, pp. 1-6, doi: 10.1109/CEIT.2018.8751927.

[24] J. Kim, J. Cheon and S. Choi, "Hand Gesture Classification using Non-Audible Sound," 2019 Eleventh International Conference on Ubiquitous and Future Networks (ICUFN), Zagreb, Croatia, 2019, pp. 729-731, doi: 10.1109/ICUFN.2019.8806145.

[25] P. Koch, M. Dreier, M. Maass, M. Böhme, H. Phan and A. Mertins, "A Recurrent Neural Network for Hand Gesture Recognition based on Accelerometer Data," 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 2019, pp. 5088-5091, doi: 10.1109/EMBC.2019.8856844.

[26] K. N. Krisandria, B. S. B. Dewantara and D. Pramadihanto, "HOG-based Hand Gesture Recognition Using Kinect," 2019 International Electronics Symposium (IES), Surabaya, Indonesia, 2019, pp. 254-259, doi: 10.1109/ELECSYM.2019.8901607.

[27] J. J. Lamug Martinez and S. Senorita Dewanti, "Multimodal Interfaces: A Study on Speech-Hand Gesture Recognition," 2019 International Conference on Information and Communications Technology (ICOIACT), Yogyakarta, Indonesia, 2019, pp. 196-200, doi: 10.1109/ICOIACT46704.2019.8938421.
