

International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 p-ISSN: 2395-0072

Volume: 12 Issue: 09 | Sep 2025 www.irjet.net

Hand gesture motion controlled pick and place object robotic vehicle

Abdul Hadi Akhlaque Khatib1, Siddharth Sanjay Salvi2, Karan Raju Barde3, Aniket Chandrakant Shigvan4

1Abdul Hadi Akhlaque Khatib, Mangaon, Raigad, 402103.

2Siddharth Sanjay Salvi, Mangaon, Raigad, 410205.

3Karan Raju Barde, Roha, Raigad, 402109.

4Aniket Chandrakant Shigvan, Mangaon, Raigad, 402104.

Dept. of Electrical Engineering, DBATU, Lonere, Raigad, Maharashtra, India

Abstract - The development of intuitive human-robot interaction systems has become crucial in modern robotics applications. This paper presents the design and implementation of a hand gesture motion-controlled robotic vehicle capable of pick and place operations using accelerometer sensors and RF communication. The system consists of two main units: a transmitter unit equipped with an ADXL335 accelerometer and Arduino Nano for gesture detection, and a receiver unit with an Arduino Uno controlling DC gear motors through L298N motor drivers. The proposed system eliminates the need for traditional button-based controllers, providing a more natural and accessible interface for robotic control. Experimental results demonstrate successful operation with 200-300 ms response time, 44 meters operational range, and 1100 grams payload capacity. The system achieves 96-98% gesture recognition accuracy with 2-hour battery life, making it suitable for industrial automation, medical assistance, and hazardous environment applications.

Key Words: Hand gesture recognition, Robotic vehicle, Pick and place, Accelerometer, RF communication, Arduino, Human-robot interaction, ADXL335.

1. INTRODUCTION

The evolution of robotics technology has consistently focused on improving human-machine interaction interfaces. Traditional robotic control systems rely heavily on complex joystick controllers and button-based interfaces, which can be challenging for users with disabilities and create operational barriers in time-critical applications. Hand gesture recognition has emerged as a natural and intuitive alternative, allowing operators to control robotic systems through simple hand movements without physical contact with control devices.

Recent advances in microelectromechanical systems (MEMS) technology and wireless communication protocols have made gesture-based control systems more accessible and cost-effective. The integration of accelerometer sensors with microcontroller platforms provides a practical solution for developing responsive gesture recognition systems that can be implemented in various robotics applications.

The proposed system addresses several limitations of conventional robotic control methods, including physical strain during prolonged operation, limited accessibility for differently-abled users, safety concerns in hazardous environments, and complex learning requirements for traditional interfaces. The development of this gesture-controlled robotic vehicle aims to provide an intuitive, wireless, and efficient solution for pick and place operations in industrial and domestic environments.

1.1 Problem Statement

Current robotic control systems face significant challenges in terms of user accessibility and operational efficiency. Wired button-controlled robots become bulky and limit operational distance, while wireless controllers still require physical button presses that can cause finger strain during extended use. Additionally, these systems are not suitable for users with physical disabilities and pose safety risks in hazardous environments where direct operator presence is dangerous.

1.2 Objectives

The primary objectives of this research include: measuring hand gestures using accelerometer sensors, identifying movement directions through signal processing, encoding gesture data for RF transmission, receiving and decoding control signals, and implementing precise robot movement control based on decoded commands. The system aims to achieve reliable wireless communication, intuitive gesture recognition, and efficient pick and place operations.

2. LITERATURE REVIEW

Anderez et al. (2019) demonstrated accelerometer-based hand gesture recognition for human-robot interaction, achieving 95.85% accuracy using a wrist-mounted tri-axial accelerometer with adaptive segmentation techniques. Their research established the foundation for reliable gesture recognition in robotics applications using computational solutions with gesture sets including directional movements.

Neto et al. (2019) proposed a gesture-based HRI framework using inertial measurement units (IMUs) for manufacturing assistance applications. Their system implemented artificial neural networks for static and dynamic gesture classification, demonstrating efficiency in assembly operations with a parameterization robotic task manager (PRTM) interface.

Celik and Kuntalp (2012) developed robotic arm controllers using image processing techniques for hand gesture recognition in Human-Machine Interaction (HMI) applications. Their work compared template matching algorithms with signature signal analysis for fingertip detection and counting, providing insights into different approaches for gesture detection and classification.

These studies demonstrate the growing interest in gesture-based robotic control systems, but most focus on complex image processing techniques or require extensive computational resources. The proposed approach emphasizes simplicity, cost-effectiveness, and reliable performance using readily available components.

3. SYSTEM DESIGN AND METHODOLOGY

3.1 System Architecture

The proposed hand gesture motion-controlled robotic vehicle consists of two main subsystems operating in a master-slave configuration through RF communication.

Transmitter Unit (Gesture Controller):

• ADXL335 three-axis accelerometer for gesture detection
• Arduino Nano microcontroller for data processing
• RF transmitter module (433 MHz) for wireless communication
• 9V battery for portable operation
• Mode selection switches for operational control

Receiver Unit (Robotic Vehicle):

• Arduino Uno as main control processor
• RF receiver module (433 MHz) for command reception
• L298N motor driver modules for motor control
• DC gear motors for vehicle movement and arm operations
• 12V lithium-ion battery for system power supply

3.2 Hardware Implementation

3.2.1 Gesture Detection Module

The ADXL335 accelerometer detects hand orientation changes in the X, Y, and Z axes, producing analog voltage outputs (0-5V) proportional to gravitational and dynamic acceleration forces. The Arduino Nano processes these analog signals using predefined threshold values to identify specific gesture patterns.

Gesture mapping includes forward tilt (X-axis > 2.5V) for forward movement, backward tilt (X-axis < 1.5V) for reverse movement, left tilt (Y-axis > 2.5V) for left turn, and right tilt (Y-axis < 1.5V) for right turn commands.
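The threshold mapping above can be sketched as follows. The 1.5 V / 2.5 V thresholds come directly from the text; the function name, command strings, and the "level hand means stop" behavior are illustrative assumptions, not taken from the actual firmware.

```python
# Illustrative sketch of the tilt-threshold gesture mapping described above.
# Thresholds (1.5 V / 2.5 V) are from the paper; all names are hypothetical.

def classify_gesture(x_volts: float, y_volts: float) -> str:
    """Map ADXL335 X/Y analog outputs (0-5 V) to a movement command."""
    if x_volts > 2.5:
        return "FORWARD"   # forward tilt
    if x_volts < 1.5:
        return "REVERSE"   # backward tilt
    if y_volts > 2.5:
        return "LEFT"      # left tilt
    if y_volts < 1.5:
        return "RIGHT"     # right tilt
    return "STOP"          # hand roughly level: no movement command
```

For example, `classify_gesture(3.1, 2.0)` yields `"FORWARD"`, while a level hand at `(2.0, 2.0)` yields `"STOP"`. Checking X before Y gives forward/reverse priority when the hand is tilted on both axes, which is one simple way to resolve ambiguous tilts.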

3.2.2 Robotic Vehicle Design

The vehicle chassis measures 40 cm × 24 cm × 8 cm, constructed from 1.5 mm mild steel, providing structural strength while maintaining a lightweight design. The robotic arm consists of two 30 cm sections with parallel plates separated by 7 cm gaps to accommodate internal mechanical components, including worm gears and drive motors.

The mechanical design incorporates four wheel motors (12V, 0.2A, 60 RPM) for vehicle locomotion and four specialized motors (12V, 0.2A, 3.5 RPM) for arm articulation and gripper operations. Worm gear and spur gear mechanisms provide precise control, with gear ratios optimized for torque multiplication and positional accuracy.

3.3 Control Algorithm Implementation

The control system implements multi-mode operation, allowing seamless switching between vehicle movement, arm positioning, and gripper control functions. The software architecture processes gesture commands in real time, with safety limits preventing motor over-rotation and system damage.
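One way the mode switching and over-rotation safety limit could work is sketched below. The mode names, step size, and angle limit are assumptions for illustration; the paper does not specify these details.

```python
# Hypothetical sketch of multi-mode gesture dispatch with a safety clamp.
# Mode names, the 5-degree step, and the 90-degree limit are assumptions.

class ModeController:
    MODES = ("DRIVE", "ARM", "GRIPPER")

    def __init__(self, arm_limit_deg: int = 90):
        self.mode = "DRIVE"
        self.arm_angle = 0
        self.arm_limit = arm_limit_deg  # safety limit against over-rotation

    def switch_mode(self) -> str:
        """Cycle DRIVE -> ARM -> GRIPPER -> DRIVE on a mode-switch input."""
        i = self.MODES.index(self.mode)
        self.mode = self.MODES[(i + 1) % len(self.MODES)]
        return self.mode

    def handle(self, gesture: str):
        """Route a decoded gesture to the subsystem selected by the mode."""
        if self.mode == "ARM":
            step = {"FORWARD": 5, "REVERSE": -5}.get(gesture, 0)
            # Clamp to [0, limit] so the arm motors cannot over-rotate.
            self.arm_angle = max(0, min(self.arm_limit, self.arm_angle + step))
            return ("ARM", self.arm_angle)
        return (self.mode, gesture)
```

The clamp implements the "safety limits preventing motor over-rotation" behavior: in ARM mode, repeated reverse gestures stop having any effect once the angle reaches zero, rather than driving the motor past its mechanical stop.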

Power management algorithms optimize battery utilization through selective motor activation and sleep mode implementation during idle periods. The communication protocol ensures reliable data transmission with error detection and automatic retry mechanisms for critical commands.
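The error-detection idea can be sketched with a minimal framed packet and an additive checksum. The frame layout (start byte, command byte, checksum) is an assumption for illustration; the paper does not document the actual on-air protocol.

```python
# Minimal sketch of error detection on the 433 MHz link.
# The 3-byte frame layout and additive checksum are assumptions.

START = 0xAA

def encode_packet(cmd: int) -> bytes:
    """Frame a one-byte command with a checksum for transmission."""
    checksum = (START + cmd) & 0xFF
    return bytes([START, cmd, checksum])

def decode_packet(frame: bytes):
    """Return the command byte, or None if the frame is corrupted.
    On None, the receiver would ignore the frame and the transmitter's
    automatic retry would resend the command."""
    if len(frame) != 3 or frame[0] != START:
        return None
    start, cmd, checksum = frame
    if (start + cmd) & 0xFF != checksum:
        return None
    return cmd
```

A single-byte checksum catches most single-bit corruption typical of cheap ASK/OOK 433 MHz links, at the cost of only one extra byte per frame, which keeps latency within the reported 200-300 ms budget.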

4. RESULTS AND ANALYSIS

4.1 Performance Evaluation

Comprehensive testing was conducted to evaluate system performance across multiple operational parameters. The communication system demonstrated reliable operation within a 50-meter range, with a 200-300 millisecond response time between gesture input and robotic action.

Mechanical performance testing confirmed a maximum payload capacity of 1100 grams with a vehicle speed of 2.26 km/h (0.628 m/s) on flat surfaces. The robotic arm achieved 60 cm total extension with positioning accuracy suitable for pick and place operations.

Power consumption analysis revealed a total system current draw of 1.6-2.2 A during active operation, providing 2-2.5 hours of continuous runtime with the 12V, 4.5 Ah lithium-ion battery. The transmitter unit consumed approximately 80 mA, giving 4-6 hours of battery life with a standard 9V battery.
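As a quick sanity check, the reported runtimes follow from capacity divided by current draw, under the simplifying assumption of constant draw and fully usable capacity (real runtime is somewhat shorter due to conversion losses and the battery's cutoff voltage):

```python
# Rough runtime estimate: hours ~= capacity (Ah) / current draw (A).
# Assumes constant draw and fully usable capacity, so it is an upper bound.

def runtime_hours(capacity_ah: float, current_a: float) -> float:
    return capacity_ah / current_a

# Receiver: 12 V, 4.5 Ah pack at the measured 1.6-2.2 A draw
low = runtime_hours(4.5, 2.2)    # worst-case draw
high = runtime_hours(4.5, 1.6)   # best-case draw
print(f"receiver runtime: {low:.1f}-{high:.1f} h")
```

This prints roughly 2.0-2.8 h, bracketing the reported 2-2.5 hours; the measured upper end falling below the ideal 2.8 h is consistent with the losses the simple formula ignores.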

4.2 Gesture Recognition Accuracy

Experimental validation demonstrated reliable gesture recognition across different user hand sizes and movement patterns. Calibration procedures optimized threshold values for individual users, resulting in forward/backward gesture recognition accuracy of 98%, left/right gesture accuracy of 96%, and mode switching reliability of 100%. The false positive detection rate remained below 3% under normal operating conditions.


4.3 Operational Testing Results

Field testing validated practical applicability in simulated warehouse environments, with successful picking and placing of 50 objects during continuous operation tests. Outdoor terrain navigation demonstrated effective operation on flat surfaces, with consistent performance maintained for 1.5 hours of continuous operation.

User acceptance evaluation indicated positive feedback regarding the intuitive control interface and its minimal learning curve. Multiple operators successfully controlled the system after brief orientation sessions, confirming the accessibility advantages of gesture-based control.

Table -1: System Performance Parameters

Parameter | Value
Response time | 200-300 ms
Operational range | 44-50 m
Payload capacity | 1100 g
Vehicle speed | 2.26 km/h (0.628 m/s)
Arm extension | 60 cm
Gesture recognition accuracy | 96-98%
Receiver runtime (12V, 4.5 Ah) | 2-2.5 h
Transmitter runtime (9V) | 4-6 h

5. APPLICATIONS AND ADVANTAGES

5.1 Applications

Industrial: Warehouse inventory management, manufacturing assembly assistance, hazardous environment operations, and quality control in contaminated areas.

Medical: Surgical instrument delivery, laboratory sample handling, and rehabilitation assistance for mobility-impaired individuals.

Military: Bomb disposal operations, reconnaissance missions, and remote material handling in dangerous zones.


5.2 Advantages

• Intuitive gesture-based control eliminating the learning curve
• Wireless operation ensuring safety in hazardous environments
• Cost-effective design using readily available components
• Low power consumption with 2-hour battery life
• Accessibility for differently-abled users
• Minimal maintenance requirements

6. FUTURE ENHANCEMENTS

Technical Improvements:

• Machine learning algorithms for adaptive gesture recognition
• Computer vision integration for spatial awareness
• Haptic feedback for enhanced user experience
• Voice command integration for multi-modal control

Hardware Upgrades:

• Increased payload capacity with stronger actuators
• Extended battery life through power optimization
• Weather-resistant design for outdoor operations
• Modular end-effector attachment system

Software Development:

• Mobile app for system monitoring and configuration
• Cloud connectivity for remote operation
• Advanced path planning algorithms
• Integration with warehouse management systems

7. CONCLUSIONS

This research successfully demonstrates the feasibility and effectiveness of hand gesture motion-controlled robotic vehicles for pick and place operations. The developed system achieves the primary objectives of providing intuitive wireless control while maintaining reliable performance and cost-effectiveness. The experimental results confirm successful implementation with 200-300 ms response time, 44 m operational range, 1100 g payload capacity, and 96-98% gesture recognition accuracy. These performance parameters make the system suitable for various real-world applications, including industrial automation, medical assistance, and hazardous environment operations. The intuitive gesture-based interface significantly reduces operator training requirements while improving safety through wireless operation. Future developments focusing on advanced AI integration and enhanced autonomous capabilities will further expand the system's practical applications and effectiveness.

Fig -1: Vehicle design

Fig -2: Applications

ACKNOWLEDGEMENT

The authors express sincere gratitude to Prof. Ajinkya M. Bhaware, Department of Electrical Engineering, Dr. Babasaheb Ambedkar Technological University, Lonere, for invaluable guidance and supervision throughout this research project. Appreciation is extended to the faculty and technical staff for their support and assistance during the experimental phases.

REFERENCES

[1] Anshu D. O. Anderez, L. P. Dos Santos, A. Lotfi and S. W. Yahaya, "Accelerometer based Hand Gesture Recognition for Human-Robot Interaction," IEEE, 2019.

[2] Pedro Neto, Miguel Simão, Nuno Mendes and Mohammad Safeea, "Gesture-based human-robot interaction for human assistance in manufacturing," The International Journal of Advanced Manufacturing Technology, vol. 101, no. 1, pp. 119-135, 2019.

[3] I. B. Celik and M. Kuntalp, "Development of a robotic-arm controller by using hand gesture recognition," International Symposium on Innovations in Intelligent Systems and Applications (INISTA), pp. 1-5, July 2012.

[4] Suyog Patil, Gaurav Chauhan and Rishikesh Kalge, "Automatic Floor Cleaning Robot," IRJET, Apr 2021.

[5] Prateek Pal, Amrit Mishra, M. C. Srivastava and Ashwani Sharma, "Gesture Controlled Pick and Place Robot," Mechanical and Automation Engineering, Amity University, Lucknow, India, 2018.

BIOGRAPHIES

Name: Abdul Hadi Akhlaque Khatib

Qualification: B.Tech Electrical Engineer from Dr. Babasaheb Ambedkar Technological University, Lonere.

Other projects from coursework:
1. Smoke detector machine with alarm system
2. Automatic load ON/OFF control system

Name: Siddharth Sanjay Salvi

Qualification: B.Tech Electrical Engineer from Dr. Babasaheb Ambedkar Technological University, Lonere.

Other projects from coursework:
1. Touchless switch
2. Automatic doorbell with motion detector system

Name: Karan Raju Barde

Qualification: B.Tech Electrical Engineer from Dr. Babasaheb Ambedkar Technological University, Lonere.

Other projects from coursework:
1. Metal detector using IC 555
2. Radar system using Arduino

Name: Aniket Chandrakant Shigvan

Qualification: B.Tech Electrical Engineer from Dr. Babasaheb Ambedkar Technological University, Lonere.

Other projects from coursework:
1. Temperature and humidity measurement system
2. Automatic delay timer circuit
