The Department of Robotics Engineering is pleased to present the following Major Qualifying Projects as part of the 2025 Undergraduate Research Project Showcase.
A Mergeable Infrastructure for Swarm Programming
Team Members
Lorenzo Manfredi Segato
Filippo Marcantoni
Emma Pollak
Advisors
Carlo Pinciroli
Abstract
We present PiELo, a novel programming language built from the ground up for swarm robotics. PiELo explores two behaviors that are often cumbersome to express with contemporary programming infrastructure: reactivity and consensus. Reactivity is the ability of a robot to respond to updates automatically; consensus, which is unique to swarms, is the ability of the robots to agree on common values. In our paper, we show the benefits and drawbacks of making these features core to the language.
A System for Watering and Autonomously Monitoring Plants (SWAMP)
Team Members
Emilia Gutman
Lauren Harrison
Jessica Hart
Isabella Lucas
Colleen Mullane
Advisors
Gregory C. Lewin
To alleviate the challenges of plant care, GRL PWR developed SWAMP, a mobile robotic System for Watering and Autonomously Monitoring Plants. SWAMP replicates the plant care a human provides with minimal user input. From stakeholder analysis, the team identified four key needs of the system:
1. Decrease the amount of work required to take care of household plants.
2. Navigate to plants in dynamic, enclosed areas while avoiding obstacles.
3. Provide the necessary amount of water for a diverse set of plants.
4. Access plants in a variety of common household locations.
SWAMP autonomously navigates its environment using a LiDAR to localize within its surroundings and safely avoid obstacles. To accurately align with each plant, SWAMP uses a camera to detect AprilTags placed on the pots. Once aligned, SWAMP uses an articulated arm to reach plants in a variety of common household locations. SWAMP accommodates various plant types by allowing user input of each plant's unique specifications, such as target moisture, pot volume, and location. The pot volume and target moisture are used to calculate the amount of water SWAMP needs to provide. These design choices allow SWAMP to accurately meet the watering needs of each plant, reducing the time commitment required from plant owners. GRL PWR will demonstrate the system in the WPI Gordon Library, which has approximately 40 plants that require watering. The library provides a real-world environment to assess SWAMP's ability to map its surroundings, navigate around obstacles, and accurately deliver water to plants.
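As a simple illustration of the watering calculation described above, the sketch below estimates a watering volume from a pot volume and the gap between target and measured soil moisture. The variable names and the linear moisture-to-volume relationship are illustrative assumptions, not the team's exact formula.

# Hypothetical sketch of a moisture-based watering calculation (not SWAMP's exact formula).
def watering_volume_ml(pot_volume_l, target_moisture, measured_moisture):
    """Estimate water to add, treating moisture as a 0-1 volumetric fraction of the pot."""
    deficit = max(0.0, target_moisture - measured_moisture)
    return deficit * pot_volume_l * 1000.0  # liters of soil volume -> milliliters of water

# Example: a 2 L pot at 20% moisture with a 35% target needs roughly 300 mL.
print(watering_volume_ml(2.0, 0.35, 0.20))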
Advanced Autonomous Tour Guide Robot
Team Members
Jacob Ellington
Aashi Goel
Sukriti Kushwaha
Vivek Voleti
Advisors
Nitin Sanket
Greg Lewin
Fiona Yuan
Jing Xiao
Abstract
Tori is an autonomous tour guide robot that enhances WPI visitor engagement by guiding users to their desired destinations in Unity Hall. The navigation system dynamically adjusts the robot's path to avoid obstacles and plan safe routes. A voice command system and touchscreen provide seamless, engaging human-robot interaction. In addition, an onboard large language model lets users ask questions about WPI's campus and receive accurate, informative responses. Tori traverses all five floors, complete with a custom elevator-button detection model that allows her to use the elevators.
Advancing Humanoid Robots: Development of Balancing and Assisted Walking Along With Improved Hardware
Team Members
Gabriel Bohorquez
Ethan Glasby
Jai Jariwala
Lily Jones
Sahil Mirani
Alana Reid
Jessica Wong
Advisors
Pradeep Radhakrishnan
Taylor Andrews
This project marks the latest iteration of WPI's endeavor to design a reliable, open-source, 3D-printed humanoid robot, named Ava. Our team aimed to improve the assisted walking and self-balancing capabilities through sensor integration, control methods, and mechanical redesigns. We introduced new hardware and software systems, including a multiple Inertial Measurement Unit feedback system and thin-film pressure sensors for real-time posture correction. We transitioned from joint-space to task-space trajectory generation using Inverse Kinematics for Python, allowing for refined end-effector control. We also developed a Model Predictive Control framework based on a Linear Inverted Pendulum Model that can anticipate and adjust Ava's movements dynamically. In CoppeliaSim, Ava demonstrated improved dynamic stability, showcasing waving and squatting motions while balancing. On the physical robot, Ava successfully demonstrated balancing through squatting and waving routines. These developments represent a significant step toward unassisted humanoid walking, laying the groundwork for more advanced bipedal behaviors in future iterations.
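For readers unfamiliar with the Linear Inverted Pendulum Model mentioned above, the sketch below rolls out the discrete-time dynamics an MPC predictor of this kind typically uses. The center-of-mass height, timestep, and initial state are generic assumptions, not Ava's actual parameters.

# Minimal Linear Inverted Pendulum Model rollout, as commonly used inside an MPC
# predictor. Parameters are illustrative, not Ava's.
G = 9.81       # gravity, m/s^2
Z_C = 0.30     # assumed constant center-of-mass height, m
DT = 0.02      # prediction timestep, s

def lipm_step(x, xdot, p_foot):
    """One Euler step of x_ddot = (g / z_c) * (x - p_foot)."""
    xddot = (G / Z_C) * (x - p_foot)
    return x + DT * xdot, xdot + DT * xddot

# Roll out half a second with the support foot fixed at the origin; the CoM drifts
# away from the foot, which is exactly what the MPC counteracts by choosing footholds.
x, xdot = 0.02, 0.0  # CoM starts 2 cm off the foot
for _ in range(25):
    x, xdot = lipm_step(x, xdot, p_foot=0.0)
print(round(x, 3), round(xdot, 3))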
Ball Catching Drone
Team Members
Connor Tompson
Enrique Pohl
Advisors
Nitin Sanket
Abstract
This MQP presents the design, development, and evaluation of an autonomous quadcopter drone capable of intercepting and catching a ball mid-flight. The system integrates real-time trajectory prediction and control algorithms to track a moving ball and calculate a path for interception. The project uses a motion capture system to track the ball and robot, and physics modeling to estimate the ball's trajectory and compute the control actions needed for optimal interception. In developing this procedure, the team iterated on trajectory estimation methods and control implementations to determine the most reliable and rapid technique for this aerial robotic system. This work is the culmination of a year of collaboration among three Robotics Engineering seniors and of the experience gained over their undergraduate degrees.
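The sketch below illustrates the kind of ballistic prediction described above: given a motion-capture position and velocity estimate, it extrapolates the ball under gravity and finds where it crosses a chosen catch height. The drag-free model and catch-height value are assumptions for illustration, not the team's final estimator.

import math

G = 9.81  # m/s^2

def predict_catch_point(p0, v0, catch_z):
    """Drag-free ballistic prediction of where and when the ball reaches catch_z (meters).

    p0 and v0 are (x, y, z) position and velocity, e.g. from motion capture."""
    x0, y0, z0 = p0
    vx, vy, vz = v0
    # Solve z0 + vz*t - 0.5*g*t^2 = catch_z for the later (descending) root.
    a, b, c = -0.5 * G, vz, z0 - catch_z
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ball never reaches that height
    t = (-b - math.sqrt(disc)) / (2 * a)
    return (x0 + vx * t, y0 + vy * t, catch_z, t)

# Example: ball tossed from 1 m up at 3 m/s forward and 4 m/s upward, caught at 1.2 m.
print(predict_catch_point((0.0, 0.0, 1.0), (3.0, 0.0, 4.0), 1.2))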
Team Members
Wyatt Harris
Adam Kalayjian
Sean Lendrum
Jared Morgan
Kai Nakamura
Owen Sullivan
Advisors
Mohammad Mahdi Agheli Hajiabadi
Jing Xiao
Guanrui Li
Legged robots are designed to tackle environments where wheels would otherwise fail. Quadrupedal robots, or "robot dogs," offer unique advantages in stability over humanoid, bipedal robots.
Developments from organizations like Boston Dynamics and Unitree demonstrate wide-ranging applications for quadrupeds, including inspection, surveying, and logistics. However, complex terrains that require precision, such as stairs, remain difficult. Our work presents a comprehensive framework for real-time stair climbing, integrating terrain perception, footstep planning, and contact estimation. It uses LiDAR for SLAM, improving odometry and reducing map drift. Hardware upgrades expand the robot's capabilities, and a novel integration of mapping confidence values improves terrain interaction. A dynamic footstep planner maps footholds onto convex planes, providing initial guesses that speed up trajectory optimization. Contact estimation combines foot height, force, gait timing, and terrain data, improving robustness on challenging terrains. Validated on the Unitree Go1, this framework demonstrates capability in navigating stairs, showcasing its suitability for complex environments.
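As a toy illustration of the foothold-to-plane mapping used to seed the trajectory optimizer, the sketch below projects a nominal foothold onto a rectangular stair tread (a convex planar patch) by clamping against its bounds with a safety margin. The tread geometry and margin are assumptions, not the team's planner.

# Toy version of seeding a foothold: project a nominal 2D foothold onto an
# axis-aligned rectangular stair tread. Geometry and margin are illustrative only.
def project_to_tread(foothold, tread_min, tread_max, margin=0.02):
    fx, fy = foothold
    x = min(max(fx, tread_min[0] + margin), tread_max[0] - margin)
    y = min(max(fy, tread_min[1] + margin), tread_max[1] - margin)
    return (x, y)

# A nominal foothold that overshoots the tread edge gets pulled back inside,
# giving the optimizer a feasible initial guess.
print(project_to_tread((0.33, 0.10), tread_min=(0.0, -0.2), tread_max=(0.30, 0.2)))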
Developing an Autonomous Drone for Avalanche Search and Rescue
Team Members
Kevin Chin
Samuel Honor
Chad Nguyen
Brianna Sahagian
Advisors
Kevin Leahy
Abstract
While state-of-the-art beacon technology has been developed to aid current avalanche search and rescue strategies, there is still a need to reduce search time in order to increase the number of live rescues. In an effort to create an efficient alternative to ground-team search and rescue, the MQP team developed a drone that can autonomously locate buried targets using Pieps beacon technology and a three-phase search that mirrors industry-standard techniques. The resulting prototype was analyzed for operational safety, predictability, accuracy, and reliability through a series of flight tests. Overall, the system is designed as a supplemental tool that can assist rescue teams in search and rescue efforts.
Development of Elbow Exoskeleton with Printed Stretchable Electronics and Sensors
Team Members
Connor Ehrensperger
Ethan Lilley
Panayotis (Perry) Pesiridis
Advisors
Pratap Rao
Lane Harrison
Abstract
This MQP focused on testing and integrating stretchable-electronics printing, developed in Professor Rao's lab, into a soft elbow exoskeleton to enhance wearability and ease of use. The team developed EMG muscle sensing, positional tracking, and control systems for use with the stretchable electronics. The combination of a soft pneumatic actuator and stretchable printed electrodes could provide a prototype elbow exoskeleton that is easy to wear, without excessive fasteners or the need to constantly replace disposable electrodes.
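As a rough illustration of EMG-driven control of this kind, the sketch below rectifies and low-pass filters a raw EMG stream and compares the resulting envelope to a threshold to decide when to pressurize the actuator. The filter constant, threshold, and on/off decision are assumptions, not the team's implementation.

# Hypothetical EMG envelope detector driving an on/off actuation decision.
# The smoothing factor and threshold are illustrative only.
class EmgEnvelope:
    def __init__(self, alpha=0.3):
        self.alpha = alpha      # low-pass smoothing factor (0..1)
        self.envelope = 0.0

    def update(self, raw_sample):
        rectified = abs(raw_sample)                       # full-wave rectification
        self.envelope += self.alpha * (rectified - self.envelope)
        return self.envelope

def should_inflate(envelope, threshold=0.3):
    """True when smoothed muscle activity exceeds the activation threshold."""
    return envelope > threshold

env = EmgEnvelope()
for sample in [0.05, -0.1, 0.8, -0.9, 0.7, -0.75]:        # fake normalized EMG samples
    level = env.update(sample)
print(should_inflate(level))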
Disc Golf Inventory Automation Machine
Team Members
Anthony Gonzales
Aiden Higuera
Jenna Marcinkowski
Katy Stuparu
Cassie Youn
Keyla Zelaya
Advisors
Greg Lewin
Jing Xiao
Ziming Zhang
Abstract
Automation helps streamline processes, making them more efficient and reducing human error. With Maple Hill Disc Golf aiming to become the first virtual disc golf retailer, automation is the next step to bring the in-person buying experience to online customers. The fourth iteration of this project continued the development of an automated inventory machine with a modular design. An automated machine with the functionality to record disc shape, flexibility, weight, and cosmetic photos would allow Maple Hill to provide the desired information to their online customer base.
Expanding Human-Robot Interaction in an Open-Source, Toddler-Sized 3D-Printed Humanoid Robot: YOLO-Based Vision, Voice Commands, and Gripper-Based Manipulation
Team Members
Shiivek Agarwal
Elowyn Akers
David Alex
Gabriel Bohorquez
An Phan
Preston Van Fleet
Advisors
Taylor Andrews
Pradeep Radhakrishnan
We are enhancing Finley, an open-source, toddler-sized 3D-printed humanoid robot, to improve its human-robot interaction capabilities. The robot now supports voice commands, allowing users to control its functions verbally for more intuitive operation. A notable feature is the swapping station, which enables Finley to exchange its end-effectors, such as transitioning between different hands, to perform diverse tasks. Equipped with an infrared (IR) sensor, Finley can measure forehead temperatures, demonstrating its potential for health monitoring applications. Additionally, the robot can assess distances to objects, facilitating effective interaction within its environment. For object manipulation, Finley utilizes a gripper-based system, supported by inverse kinematics algorithms from the IkPy library, to perform precise pick-and-place tasks. The integration of the YOLOv4 object detection model enhances Finley's visual perception, enabling real-time identification of objects and human poses. Communication between Finley and its peripherals, such as the swapping station, is managed through the 0MQ messaging protocol, ensuring seamless and efficient data exchange. These advancements collectively contribute to Finley's versatility and effectiveness in various assistive and interactive scenarios. Ultimately, this work aims to develop capabilities that support Finley's role as a helpful companion in assisted living environments.
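The abstract above names two concrete building blocks, IkPy for inverse kinematics and 0MQ for peripheral messaging. The sketch below shows how these libraries are commonly wired together; the URDF filename, endpoint address, and message fields are placeholders, not Finley's actual configuration.

# Minimal sketch combining IkPy inverse kinematics with a ZeroMQ request to a
# peripheral such as the swapping station. Paths, addresses, and the message
# schema are placeholders, not Finley's real setup.
import zmq
from ikpy.chain import Chain

# Load the arm's kinematic chain from a URDF description (placeholder path).
arm = Chain.from_urdf_file("finley_arm.urdf")

# Solve for joint angles that place the end effector at a Cartesian target (meters).
target_xyz = [0.15, 0.0, 0.20]
joint_angles = arm.inverse_kinematics(target_xyz)

# Ask the peripheral to prepare the matching end-effector over ZeroMQ (REQ/REP).
ctx = zmq.Context()
sock = ctx.socket(zmq.REQ)
sock.connect("tcp://192.168.1.50:5555")      # placeholder station address
sock.send_json({"command": "prepare_gripper", "target": target_xyz})
reply = sock.recv_json()
print(reply, joint_angles[:3])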
General Robot Assistant for Common Errands (GRACE)
Team Members
Brent Weiffenbach
Alexander Beck
Gwenaelle Deleo
Advisors
Jing Xiao
Reza Ebadi
Constantinos Chamzas
Abstract
Mobile manipulation enables robots to perform complex tasks using a mobile base and one or more manipulators. The TurtleBot2 is an open-source mobile base platform that uses an RGB-D camera for exploration and navigation. Our project, GRACE, is a mobile manipulator that adds a 3D-printed 6-DOF arm to the TurtleBot2 to perform common household errands. Our work equips GRACE with software for simultaneous localization and mapping (SLAM), perception of semantic objects using YOLO object detection, and a vision-guided pick-and-place arm with custom motion planning and control. GRACE navigates unknown environments, searches for a target object, picks up the object, and navigates to a semantic goal location to place the object.
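As an illustration of the YOLO-based semantic perception step described above, the sketch below runs a pretrained detector on one camera frame and extracts candidate pick targets. The ultralytics package, weight file, and "bottle" class filter are assumptions; GRACE's actual detector and ROS integration may differ.

# Hedged sketch of YOLO-based object search for a pick target. The package,
# weights, class filter, and confidence cutoff are assumptions, not GRACE's
# exact pipeline.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                 # small pretrained COCO model
results = model("camera_frame.jpg")        # one RGB frame from the robot's camera

pick_candidates = []
for box in results[0].boxes:
    label = model.names[int(box.cls)]
    if label == "bottle" and float(box.conf) > 0.5:
        # Pixel box; its center can be deprojected with the RGB-D depth image.
        pick_candidates.append(box.xyxy[0].tolist())

print(pick_candidates)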
HexaFlex: Design and Testing of a Hexapod with a Flexible Origami-Inspired Spine
Team Members
Dongquan Ji
Yanbo Hua
Jiaming Du
ZeHai Li
Advisors
Cagdas Onal
Robin Hall
Gabrielle Conard
Abstract
While legged robots have taken great strides in navigating complex terrain, the traditional rigid bodies used often limit their ability to adapt to highly uneven terrain and narrow passageways. Incorporating a flexible spine can address these issues by enhancing the robot's adaptability to its environment. This paper introduces HexaFlex, a hexapod robot with a flexible origami-inspired spine. We detail the mechanical design, control architecture, and gait generation methods. To quantify the benefits of the flexible spine, the robot was tested in confined spaces and over obstacles. HexaFlex achieved a maximum speed of 0.92 BL/s (0.25 m/s) with a cost of transport (COT) of 5.2 on flat terrain, and the flexible spine enabled the robot to navigate turns with a minimum radius of 0.34 BL (0.09 m). It also operated in an outdoor environment to test its robustness in a real-world scenario. The flexible spine improves locomotion efficiency and maneuverability over traditional rigid robots.
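The figures above can be cross-checked with the standard definitions: speed in body lengths per second implies a body length, and cost of transport is COT = P / (m g v). The sketch below reproduces the body-length conversion and states the COT formula; mass and power are not given in the abstract, so that function is left as a placeholder rather than evaluated.

# Consistency check of the reported HexaFlex figures, plus the standard
# cost-of-transport definition COT = P / (m * g * v).
speed_mps, speed_bls = 0.25, 0.92
body_length = speed_mps / speed_bls            # ~0.27 m implied body length
turn_radius_bl = 0.09 / body_length            # ~0.33 BL, matching the reported 0.34 BL

def cost_of_transport(power_w, mass_kg, speed):
    """COT = P / (m * g * v); mass and power are not stated in the abstract."""
    return power_w / (mass_kg * 9.81 * speed)

print(round(body_length, 2), round(turn_radius_bl, 2))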
Integration of Open-Source Controls and Sensors in a Low-Cost Desktop CNC Mill
Team Members
Michael Primavera
Camren Chraplak
Echo Baumer
Perrin Kristoff
Andrew Petro
Daniel Petro
Dante Uccello
James Carroll
JR Thaiprayoon
Michael Doucette
Rafael Caballero
Advisors
Radhakrishnan
Kadam
Cuneo
Abstract
Our team's goal was to integrate an improved control architecture and sensor suite into an open-source computer-numerical-control (CNC) mill for students and hobbyists. The open-source software Universal G-Code Sender (UGS) was selected to give users an intuitive graphical interface for controlling the machine and the additional functionality our team implemented: an automatic tool changer, a fourth axis, and robust coolant and safety systems. The UGS source code is editable, enabling custom interfaces for the PicoCNC control board chosen for the hardware. The 32-bit grblHAL package was selected for firmware, as it is also open-source and customizable. The CNC mini mill was designed with a dedicated Raspberry Pi computer that runs UGS with grblHAL on the open-source Linux operating system. Furthermore, various sensors, such as accelerometers and temperature sensors, were used during development and testing of the enhancements, producing data that verified the effectiveness of these functionalities.
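To make the control chain above concrete, the sketch below shows the usual way a host such as the Raspberry Pi streams G-code to a grbl/grblHAL controller over a serial link, waiting for each "ok" acknowledgement. The port name, baud rate, and sample program are assumptions; UGS normally performs this step for the user.

# Minimal G-code sender for a grbl/grblHAL board over serial (roughly what UGS
# does under the hood). Port, baud rate, and the sample program are assumptions.
import serial

def stream_gcode(port, lines, baud=115200):
    with serial.Serial(port, baud, timeout=2) as ser:
        ser.write(b"\r\n\r\n")                            # wake the controller
        ser.reset_input_buffer()
        for line in lines:
            ser.write((line.strip() + "\n").encode())
            reply = ser.readline().decode().strip()       # expect "ok" or "error:N"
            print(line, "->", reply)

stream_gcode("/dev/ttyACM0", ["G21 G90", "G0 X10 Y10", "G1 Z-1 F100"])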
Lunabotics
Team Members
Sam Rooney
Jonathon Tran
Matthew Copeland
Advisors
Kenneth Stafford
Loris Fichera
Abstract
The National Aeronautics and Space Administration (NASA) Lunabotics Challenge tasks university students with developing, testing, and fabricating lunar robots that compete in simulated lunar mission challenges. Student teams compete against one another through the design and operation of their robots and are scored on autonomous navigation, obstacle avoidance, and the efficiency of regolith berm construction. This year, the WPI team inherited last year's robot and elected to enhance its design and autonomous behavior. This enabled team members to develop diverse skills within their respective majors, beyond their majors, and outside the classroom to meet testing goals and development objectives for this year's Lunabotics Challenge and robot, Mooncake.
Meissa Microgrid | Multi-Tenant Renewable Energy Monitoring Platform with Solar PV Tracker Controls
Team Members
Edward Dang
Vien Le
Yinuo Zhao
Andrew Qi
Aliaa Hussein
Advisors
Berk Calli
Seyed Zekavat
Abstract
This project aims to expand EV charging infrastructure in Indonesia with a technologically driven solution in collaboration with 360energy, an energy-as-a-service startup. Our solution features an automated dual-axis solar tracking system, built at both reduced and large scale, that captures sunlight and stores the harvested energy in a battery. Static and dynamic analyses, as well as mathematical models of the system, were used to select components and enable tracking control. Sensors were integrated into the system to detect solar and weather conditions. The system is connected to a centralized web application dashboard that monitors and controls the solar tracking system and a simulated battery swapping station for EV drivers and 360energy employees. The dashboard provides real-time data visualizations, including energy intake, temperature, and wind speed collected from the solar panel control monitor, allowing users to track environmental conditions and system performance effectively. The dashboard also lets users view different project locations on an interactive map showing all sites in Indonesia, and it displays a battery swapping station interface, mirroring the EV stations, where users can track battery swaps along with information such as battery health, charging schedules, locking status, and availability.
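One common way to derive dual-axis tracker set-points of the kind described above is from computed solar position. The sketch below uses the pvlib library for that calculation; the coordinates, timestamps, and tilt convention are illustrative assumptions and may not reflect the team's actual control method or site.

# Hedged sketch of computing dual-axis tracker set-points from solar position
# using pvlib. Location (near Jakarta) and times are illustrative only.
import pandas as pd
from pvlib import solarposition

times = pd.date_range("2025-03-01 06:00", "2025-03-01 18:00",
                      freq="1h", tz="Asia/Jakarta")
solpos = solarposition.get_solarposition(times, latitude=-6.2, longitude=106.8)

for t, row in solpos.iterrows():
    if row["apparent_elevation"] > 0:                 # only track while the sun is up
        azimuth_cmd = row["azimuth"]                  # pan-axis set-point, degrees
        tilt_cmd = 90.0 - row["apparent_elevation"]   # tilt from horizontal, degrees
        print(t, round(azimuth_cmd, 1), round(tilt_cmd, 1))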
Motion Control Photography
Team Members
Matthew Winchell
Michael Conroy
Advisors
Andre Rosendo
Ralph Sutter
Abstract
Although the technique of combining images to create movement is no longer novel, the assemblies that manipulate the camera and subject can be improved upon with modern hardware, electronics, and software. The system must move the camera and subject precisely and repeatably during the exposure, across multiple iterations of the same movement. Each layer focuses on a different aspect of the subject, typically the lighting. All layers of footage are then combined into one.
PRIMO (Mobile Printer)
Team Members
Warwick Barker
Colin McGinty
Luke Sanneman
Marc Wehbe
Advisors
Mahdi Agheli
Abstract
Primo explores a method for printing construction materials using a mobile robot platform. The small robot is equipped with a pump capable of extruding various materials through a nozzle to print layers on the ground. These materials include construction-grade substances such as clay and cement, which are transported using custom-designed pumps driven by DC motors. The robot navigates its environment using LiDAR, allowing it to operate autonomously. Due to its compact size, Primo can print in locations and configurations that are typically inaccessible or impossible for large commercial concrete 3D printers.
SACRED: Strain-Aware
Continuum Robot for Estimating Deformation
Team Members
Nikesh Walling
Cameron Wian
Shivangi Sirsiwal
Advisors
Loris Fichera
Cagdas Onal
Abstract
This research aims to develop a cost-effective and accurate method for real-time shape sensing of continuum robotic arms through embedded strain sensors. Continuum robots are a class of flexible manipulators that, unlike traditional rigid-link robots, deform continuously along their length to achieve complex shapes and navigate confined spaces. These robots are particularly valuable in surgery, manufacturing, search and rescue operations, and geological research. However, tracking the precise shape of continuum arms presents significant challenges, as conventional sensing techniques such as vision-based systems require line-of-sight, and fiber optic solutions like Fiber Bragg Grating sensors are expensive and difficult to integrate. To address these limitations, we utilize multi-material 3D printing to fabricate a Yoshimura-style continuum arm with stretchable strain sensors embedded directly within its structure during manufacturing. This approach eliminates the need for post-production sensor installation and provides distributed sensing capabilities throughout the arm's body. Our method enables real-time pose tracking and visualization from the base to the end effector, offering a practical solution for applications requiring precise control in complex environments.
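As a simplified illustration of recovering shape from embedded strain readings, the sketch below converts per-segment strain into curvature under a piecewise-constant-curvature assumption and chains the resulting planar arcs into tip poses. The sensor offset, segment length, and planar simplification are assumptions, not SACRED's calibration.

import math

# Simplified planar shape reconstruction from embedded strain readings, assuming
# piecewise-constant curvature. Sensor offset and segment length are illustrative.
SENSOR_OFFSET = 0.01   # m, distance of the strain sensor from the neutral axis
SEGMENT_LEN = 0.05     # m, length of each Yoshimura segment

def reconstruct(strains):
    """Return (x, y, heading) of each segment tip from per-segment strain values."""
    x, y, heading = 0.0, 0.0, 0.0
    poses = []
    for eps in strains:
        kappa = eps / SENSOR_OFFSET            # curvature implied by the strain
        dtheta = kappa * SEGMENT_LEN
        if abs(kappa) > 1e-9:
            # chord of a circular arc with length SEGMENT_LEN and curvature kappa
            chord = 2.0 / kappa * math.sin(dtheta / 2.0)
        else:
            chord = SEGMENT_LEN                # straight segment
        x += chord * math.cos(heading + dtheta / 2.0)
        y += chord * math.sin(heading + dtheta / 2.0)
        heading += dtheta
        poses.append((round(x, 3), round(y, 3), round(heading, 3)))
    return poses

print(reconstruct([0.00, 0.02, 0.04, 0.02]))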
SailBot 24-25
Team Members
Max Berman
Bryce McKinley
James Purnell
Gavin Tingley
Advisors
Kenneth Stafford
William Michalson
Abstract
The goal of our Major Qualifying Project (MQP) is to develop an autonomous robotic sailboat to compete in the International Robotic Sailing Regatta. We will build upon the work of previous teams to improve the boat, focusing on creating a new sail, improving the vision system, and adding ergonomic features such as a magnetic power switch and internal system feedback.
SMAC 6.0
Team Members
Sakshi Gauro
Al Jarmoszko
Minh (Mo) Nguyen
Katarzyna Racka
Jingxu (Rick) Wang
Tracy Wang
Advisors
Carlo Pinciroli
Gregory Lewin
Xinming Huang
Multi-robot construction systems offer an approach to automating construction, especially in dangerous and unstable environments. Symbiotic Multi-Agent Construction (SMAC) 6.0 proposes a collaborative, decentralized system of inchworm robots and discrete smart blocks that coordinate construction through stigmergy, a method in which agents communicate indirectly by leaving traces of information to guide subsequent actions. In the system, the inchworms rely on the blocks' navigational cues, while the blocks depend on the inchworms for locomotion to their designated positions. The system collectively constructs a predetermined structure defined by a blueprint. Additionally, the system's robustness and scalability are shown through its capacity to handle disruptions, agent failures, and collisions. Through a redesign of the communication topology and connection mechanisms and through dynamic inchworm path planning, we demonstrate stigmergy both physically and in simulation. This work advances swarm robotics by exploring decentralized coordination in autonomous construction.
Soft Aerial Robot (SoAR)
Team Members
Andrew Roush
Ao Jiang
Chris Walczak
Advisors
Cagdas Onal
Timothy Jones
The field of aerial robotics has grown rapidly over the past decade due to its diverse real-life applications, such as package delivery, environmental inspection, and even military use. To increase the flight maneuverability and efficiency of aerial robots, this project focused on developing a tilt-wing flying robot platform with a deformable body that achieves high flight efficiency while retaining maneuverability, including sharp turns. The Soft Aerial Robot (SoAR) contains a soft, origami-inspired body connecting front and rear sections, each fitted with tilt-rotor wings. This design gives SoAR the ability to take off and land vertically like a conventional multi-rotor drone while achieving long-duration, energy-efficient flight like a conventional fixed-wing plane. In this MQP, we rebuilt the first generation of SoAR with a renovated wing design and split soft bodies for better maneuverability. Additionally, by improving the mechanical design and reliability while integrating modern electronic hardware, we developed a modular platform that supports testing and scalability for future research.
Soft Robotic Eel
Team Members
Natalie Essig
Christopher Hunt
Pranav Jain
Dexter Stark
Advisors
Cagdas Onal
Robin Hall
Soft robotics offers a compelling approach to underwater locomotion, combining flexibility with bioinspired movement capabilities to navigate environments challenging for traditional robotic systems. This project seeks to advance the design, control architecture, and modularity of a cable-driven soft robotic eel to achieve swimming performance and maneuverability comparable to that of biological organisms. The robot consists of modular, 3D-printed accordion segments fabricated from thermoplastic elastomer filament, actuated via fishing line routed through each segment and wound around servo-driven spools, enabling precise and adaptable motion control. The modular design facilitates easy customization, repair, or extension of the eel’s body, and includes a watertight head enclosure to protect onboard electronics, along with soft fins and a compliant tail to enhance natural swimming behavior. The system was tested in tank and pool environments to evaluate swimming speed, directional control, and responsiveness to control inputs. This soft robotic eel offers potential applications in bioinspired underwater exploration, environmental monitoring, and research into aquatic locomotion.
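A common way to drive a cable-actuated undulating body like the one above is a traveling sine wave across the segments. The sketch below generates per-segment spool commands of that form; the amplitude, frequency, phase lag, and segment count are assumptions rather than the team's tuned gait.

import math

# Hypothetical traveling-wave gait generator for a segmented, cable-driven body.
# Amplitude, frequency, phase lag, and segment count are illustrative only.
NUM_SEGMENTS = 6
AMPLITUDE_DEG = 30.0     # peak spool/servo deflection per segment
FREQ_HZ = 1.0            # undulation frequency
PHASE_LAG = math.pi / 3  # phase offset between neighboring segments

def segment_commands(t):
    """Servo angle command (degrees) for each segment at time t (seconds)."""
    return [AMPLITUDE_DEG * math.sin(2 * math.pi * FREQ_HZ * t - i * PHASE_LAG)
            for i in range(NUM_SEGMENTS)]

# Example: commands a quarter of the way through one undulation cycle.
print([round(a, 1) for a in segment_commands(0.25)])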
Terrawarden Drone Cleanup
Team Members
Mark Caleca
Zephyr Conley
Jakub Jandus
Samuel Markwick
Kevin Siegall
S. Taylor
Advisors
Berk Calli
Kevin Leahy
Guanrui Li
Abstract
Roadside trash on highway medians causes significant environmental and economic harm in the United States. While prevention and public awareness remain the best long-term solutions, current methods of cleaning up existing litter are ineffective, dangerous, and expensive. Although ground robots have been successful at collecting trash in other conditions, the inaccessible and uneven terrain of highway medians is better suited for an autonomous aerial system. The Terrawarden drone, a one-meter-wide quadrotor with a 3-DOF manipulator and compliant gripper, addresses the problem of roadside litter by autonomously detecting and retrieving trash using on-board perception. To validate the feasibility of this approach, a controlled experiment was designed in which the drone was tasked with detecting a beverage container, navigating to it, collecting it without landing, and then transporting it to a designated collection point. The drone successfully completed each of these tasks, demonstrating the effectiveness of the proposed system and its potential for future research and deployment.
The Design and Prototyping of a Low-Cost & Efficient Ocean Cleanup Robot
Team Members
Yan Acevado
Samantha Booher
Evan Carmody
John Hall
Renata Kaplan
Joshua Keselman
Cooper Mann
Advisors
Vincent Aloi
Selcuk Guceri
Abstract
Over the past year, approximately 16.5 million tons of plastic waste entered the world's oceans (Oceana). This waste poisons ecosystems and wildlife, and degrades into microplastics, entering the human food supply. Current strategies to address this issue are costly, require extensive human support, and focus mainly on preventing additional accumulation rather than removing existing debris. This project proposes the development of a cost-effective, scalable robotic system for efficient trash collection along shorelines. Our goal is to create an autonomous robot that matches the performance of existing solutions, with reduced cost and a limited need for human intervention. We designed and built a prototype robot with a catamaran-inspired design. The robot has a durable trash collection system, two brushless motors, and a variety of sensors for object detection and navigation. While operating autonomously, the robot identifies and collects surface waste using a custom computer vision system optimized for aquatic environments.
UAV/UGV Docking and Charging System
Team Members
Myrrh Khan
Albert Lewis
Michael Monda
Istan Slamet
Advisors
Kevin Leahy
Nitin Sanket
Abstract
UAV-UGV collaborations are used in many applications that are hazardous to humans, such as search and rescue, industrial site monitoring, reconnaissance missions, and agricultural processes. A charging collaboration between a quadcopter UAV and an unmanned ground vehicle (UGV) significantly extends the quadcopter's operating time by acting as a mobile charging station. This paper outlines a UAV charging dock mounted on a Husky A100 UGV that reliably charges a quadcopter's 4-cell lithium-ion or lithium-polymer battery at a 1C charging rate. The charging dock holds the quadcopter in a secure position during Husky movement via electromagnetic locks and is designed to work with UAV vision systems for precision landing.
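To make the "1C on a 4-cell pack" figure concrete, the sketch below works through the implied charge current and nominal charging power for an example pack; the 5000 mAh capacity is an assumed value for illustration, since the abstract does not state the actual pack size.

# Worked example of a 1C charge on a 4-cell (4S) pack. The 5000 mAh capacity is
# an assumption; the abstract does not state the actual pack size.
capacity_mah = 5000
cells = 4
nominal_cell_v = 3.7                               # Li-ion/LiPo nominal cell voltage

charge_current_a = capacity_mah / 1000.0           # 1C = the capacity in amp-hours, per hour
pack_voltage_v = cells * nominal_cell_v            # ~14.8 V nominal
charge_power_w = charge_current_a * pack_voltage_v
full_charge_hours = 1.0                            # a 1C charge takes roughly one hour

print(charge_current_a, pack_voltage_v, round(charge_power_w, 1), full_charge_hours)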
Ultrasound-Guided Needle Insertion Device for Percutaneous Nephrolithotomy
Team Members
Ethan Zhong
Advisors
Yichuan Tang
Vishali Baker
Emerson Shatouhy
Haichong Zhang
Percutaneous nephrolithotomy (PCNL) is a surgical procedure that ablates kidney stones when they are too large to be passed naturally. It requires inserting a needle into the patient that reaches the kidney stones; surgical instruments are then inserted through the needle hole to complete the procedure. Physicians performing urological procedures such as PCNL often use fluoroscopy to aid in viewing organs. While this does not harm the patient because of the low radiation dose, the physician's exposure to radiation builds over time. The need for radiation-free PCNL procedures has been explored and several ultrasound-based solutions have been offered, but none offer the same imaging standards as fluoroscopy. The purpose of this project is to develop an ultrasound-guided needle insertion device for PCNL procedures that allows physicians to see the needle in the insertion plane of view. We plan to account for needle bending by allowing the physician to redirect the image +/- 15 degrees, which will prevent issues involving reinsertion. For the design, we considered several ultrasound gel materials that create strong image contrast while remaining contained within the device. We also needed a fully encapsulated device that still allows for mirror rotation. After weighing contrast against consistency, a hybrid 3% agarose and olive oil solution was chosen to keep the device waterproof while providing a viscous medium the mirror can move through. We then considered several design options that allow for clinical precision and accuracy while maintaining a natural feel for the clinician. The final design holds the ultrasound probe in a position similar to a normal ultrasound procedure, and the device is easy to use, with a simple locking mechanism that requires no screws or hardware to attach the probe. The needle slot is designed so that, after the procedure, the device can be pulled away easily while leaving the needle in place, so the needle does not need to be removed immediately. The needle-detection algorithm searches the ultrasound images for the needle, helping the physician ensure the needle stays on the correct path toward the kidney stone. It will also feed a tracking algorithm that automatically rotates the mirror to keep the needle tip in the image's plane of view.
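As a simplified illustration of the mirror-tracking behavior described above, the sketch below takes an estimated needle-tip offset from the imaging plane, converts it to a corrective mirror rotation, and clamps the command to the +/- 15 degree range. The gain and offset-to-angle mapping are assumptions, not the project's detection algorithm.

# Toy mirror-tracking step: steer the mirror toward the detected needle tip while
# respecting the +/- 15 degree mechanical range. Gain and geometry are assumed.
MAX_MIRROR_DEG = 15.0
GAIN_DEG_PER_MM = 1.5   # assumed mapping from tip offset (mm) to mirror angle (deg)

def mirror_command(current_angle_deg, tip_offset_mm):
    """Return the next mirror angle given the needle tip's out-of-plane offset."""
    desired = current_angle_deg + GAIN_DEG_PER_MM * tip_offset_mm
    return max(-MAX_MIRROR_DEG, min(MAX_MIRROR_DEG, desired))

# Example: a 4 mm offset from 12 degrees saturates at the 15 degree limit.
print(mirror_command(12.0, 4.0))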
VR Telepresence Robot
Team Members
Maxwell Friedman
Luis Garcia Valecillos
Prahladh Raja
Tristin Youtz
Advisors
Andre Rosendo
Fabricio Murai
Abstract
The goal of this project was to develop a method to extend human sensory perception and physical embodiment through virtual-reality (VR) enabled robotic telepresence. The telepresence platform was designed to replicate the movement of a user's perspective with a 360-degree camera, streaming the video to a VR headset. This project shows a proof of concept for a VR telepresence system that, according to users, can provide a more immersive experience than traditional telepresence systems.