2026 Robotics Engineering MQPs
|
404LensNotFound: Lensless Dragonfly-Sized Aerial Robot Navigation Abstract: Micro aerial robots require lightweight onboard sensing and perception systems to navigate cluttered, low-light environments, whereas traditional lens-based cameras and LiDARs are often bulky and power-intensive. In this work, we explore several lensless computational imaging approaches, including phase masks, multi-lens arrays, and diffuser-based cameras. Ultimately, we opted for a lensless diffuser, which replaces the conventional lens with a piece of Scotch tape to capture encoded images of the scene. We calibrate the imaging system by capturing its Point Spread Function (PSF) and reconstruct the scene from these encoded measurements. Rather than aiming for high-quality reconstructions, our approach is driven by a key insight: the system passively emphasizes foreground structures while suppressing background information. We developed a parsimonious obstacle detection pipeline to enable aerial navigation while dodging obstacles. We demonstrated autonomous navigation for micro aerial robots in the dark at speeds of up to 5 m/s. Team Members: Vivek Reddy Kasireddy, Hudson Kortus, Jahnavi Prudhivi Advisors: Nitin Sanket |
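The PSF-based reconstruction described above can be sketched as a regularized (Wiener-style) inverse filter in the Fourier domain. This is a generic illustration of diffuser-camera deconvolution, not the team's actual pipeline; the function name and regularization constant are assumptions.

```python
import numpy as np

def wiener_deconvolve(measurement, psf, reg=1e-3):
    """Recover a scene estimate from a diffuser-encoded measurement.

    The lensless camera model is measurement = scene (*) psf, so a
    regularized inverse filter in the Fourier domain gives a fast
    scene estimate. `reg` damps frequencies where the PSF is weak.
    """
    # Center the PSF at the origin before transforming.
    H = np.fft.fft2(np.fft.ifftshift(psf), s=measurement.shape)
    Y = np.fft.fft2(measurement)
    # Wiener-style regularized inverse: conj(H) Y / (|H|^2 + reg).
    X = np.conj(H) * Y / (np.abs(H) ** 2 + reg)
    return np.real(np.fft.ifft2(X))
```

A sanity check with an idealized delta-function PSF (for which the "encoded" image equals the scene) recovers the input up to the regularization factor.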
|
Agile MAV Navigation through Cluttered Environments Abstract: Search and rescue applications require robots that are small and agile, since the environments are space-constrained and highly cluttered. Micro aerial vehicles' (MAVs') small size and maneuverability make them perfect for these environments. However, autonomous navigation through cluttered environments remains a challenge, with current methods burdened by heavy payloads, requiring bulky cameras and heavy compute modules. We turn to event cameras, which have high dynamic range, low latency, nearly no motion blur, and are extremely power efficient. In this work, we propose a modular quadrotor platform designed to be highly extensible and agile. An event camera-based perception stack, global planner, and model predictive controller running on an onboard hardware coprocessor provide low-latency autonomy in uncertain environments. To achieve demanding motions and compensate for external forces, we utilize a lower-level adaptive attitude controller running at high frequency. The platform is designed to support a variety of hardware configurations while retaining strength and interoperability with existing research structures. We test our solution in a photorealistic Software-in-the-Loop (SITL) simulation framework that utilizes Gaussian splatting to mimic real-world performance. We benchmark our event-based perception stack against commercial RGB-D cameras at varying levels of depth quality and event sensor parameters to prove robustness. Our platform outperforms existing frameworks and demonstrates superior modularity, allowing perception modules and controllers to be hot-swapped without retraining or redesign. Our research platform will enable future developments in high-performance quadrotor control and neuromorphic perception research. 
Unlike monolithic end-to-end approaches, our modular design retains the benefits of event-based vision while providing the high-level control required for safety-critical search and rescue missions. Team Members: Colin Balfour, Evan Kaba, Rohan Inamdar Advisors: Nitin Sanket, Guanrui Li |
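As a rough illustration of how an event-based perception stack consumes its input, the sketch below accumulates a stream of (timestamp, x, y, polarity) events into a signed frame over a short time window. This is a generic technique, not the team's implementation; the function name and windowing scheme are assumptions.

```python
import numpy as np

def events_to_frame(events, height, width, window):
    """Accumulate (t, x, y, polarity) events into a signed frame.

    Events within `window` seconds of the most recent event are summed
    per pixel: +1 for ON events, -1 for OFF events. The resulting frame
    is a cheap, low-latency input for downstream obstacle detection.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    if len(events) == 0:
        return frame
    t_latest = max(t for t, _, _, _ in events)
    for t, x, y, p in events:
        if t_latest - t <= window:
            frame[y, x] += 1 if p > 0 else -1
    return frame
```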
|
Autonomous Robotic Design for Strategic Adversarial Games Abstract: Worldwide, over 10,000 high school robotics teams compete in a fast-paced, annual competition using teleoperated robots. Our goal is to demonstrate that a fully autonomous robot can compete at this same level. Autonomous performance in dynamic, adversarial environments remains a major challenge in robotics, requiring systems that can continuously interpret the environment and respond in real time. This project aims to develop a fully autonomous system that integrates multi-sensor perception, real-time state estimation, and a strategic decision-making model to determine how to respond to an opponent. By translating advanced robotics into a familiar and engaging context, we aim to inspire the next generation of engineers. Team Members: Kylie Herbstzuber, Leo Riesenbach, Luca Dang, Elene Kajaia, Mason Sakai, Vishesh Konduru, Dylan Schmit, Graham Mack Advisors: Griffin Tabor, Bradley Miller, Xinming Huang |
|
Designing a Blind Quantum Algorithm for Secure Robot Path Planning Abstract: As the capabilities of robotics expand, the computational demands for complex tasks grow as well. In response to this demand, cloud robotics is often introduced to offload intensive tasks from a robot's limited hardware to a powerful remote server, yet this delegation introduces a critical vulnerability: sensitive operational data must be exposed to an untrusted server. In this work, I explore the transformation of cloud computing for robotics problems into a quantum environment while ensuring the server executes the algorithm without ever learning the client's sensitive information or computation. Specifically, I adapt the quantum path planning algorithm proposed by Chella et al. [Mathematics 2022, 10, 2475] to a newly developed hybrid light-matter blind quantum computing framework [Science 388, 509-513 (2025), arXiv:2505.21621], proposing a blind circuit design strategy that conceals the client's sensitive information while preserving the correctness of the computation. I provide a runtime estimate to demonstrate practical feasibility and analyze the trade-off between the information security and the quantum resources required. Team Members: Sunny Kang Advisors: Raisa Trubko, Carlo Pinciroli |
|
Flarebot: Unmanned Firefighting Reconnaissance Robot Abstract: Flarebot is a solution designed to perform real-time reconnaissance within structure fires. It works in parallel with firefighters to minimize human risk and provide first responders with the critical information they require to assess the environment and conduct rescues. To ensure survivability in such extreme conditions, every mechanical component was custom-designed and manufactured; additional thermal mitigation methods were implemented to allow for extended operations. The system contains an array of sensors for wireless monitoring of internal health and external environmental conditions, while integrating a computer vision model to identify trapped individuals within smoke-filled environments. Flarebot serves as a cost-effective, user-friendly, and compact alternative to current fire reconnaissance robots, demonstrating the advancements of firefighting technology in an emerging field. Flarebot was developed to save lives. Team Members: Nicholas Carignan, Aidan Carter-Frem, Max Gosselin, Trajen Masner, Henry Wagg, TJ Weeden Advisors: Mustapha Fofana, Griffin Tabor, Jacob Whitehall |
|
General Purpose Continuum Kinematically Adaptable Origami Robot (GeCKO) Abstract: GeCKO is a continuum-bodied hexapod robot, inspired by a series of continuum-bodied mobile robots developed in the WPI Soft Robotics Lab. In particular, it builds on two of its predecessors, CLARA and Lizard, which were designed to navigate pipe systems and travel over uneven terrain, respectively. These robots all utilize origami-inspired Yoshimura modules: accordion-like modules that expand and contract along their vertices, controlled by winch cables. This allows for controlled bending, opening up numerous unique locomotion opportunities. The main body of GeCKO is heavily based on CLARA in order to retain its pipe navigation capabilities. By replacing CLARA's variable-diameter suspension with tripodal leg modules, GeCKO is able to reconfigure itself in numerous ways. This adaptable design allows for locomotion through a myriad of environments, making the robot capable of navigating pipe systems, rough terrain, and highly constrained spaces, and it opens up many new avenues for future research across a number of fields. Possible future applications include pipe inspection, search and rescue, and cave exploration. Team Members: Ben Proctor Advisors: Cagdas Onal |
|
Abstract: This Major Qualifying Project presents the design, prototyping, and simulation of GrowBot (GR-0X), a humanoid robotic platform featuring variable-length limbs capable of linear extension and retraction. Unlike conventional humanoid robots with fixed dimensions, GrowBot employs prismatic actuation in its limbs, enabling dynamic height adjustment and extended reach. Each limb achieves linear actuation through a mechanism driven by three lead screws machined to withstand high axial loads. The frame is primarily 3D-printed, with metal components inserted at key structural interfaces such as lead screw assemblies and bearings. Finite element analysis in SolidWorks validated the structural integrity of the lower body under representative loading conditions. For control and simulation, the architecture was built on ROS 2 Humble with Gazebo Harmonic, utilizing a modular ros2_control framework to interface joint trajectory controllers, PID loops, and state broadcasters. Hardware integration centers on a Jetson Nano for high-level processing, an FPGA for real-time motor feedback, and CAN bus communication for Robstride actuators. Team Members: Elliot Ghidali, Fisal Qutubzad, Yael Whitson Advisors: Mahdi Agheli |
|
Implementing Tabletop Pick-and-Place on a 3D Printed Humanoid Robot Abstract: This report presents the contributions made by the 2025-2026 Major Qualifying Project (MQP) team on WPI’s toddler-sized, 3D printed humanoid robot. The team focused on creating a structured framework for arm motion control and integrated computer vision models for tabletop pick-and-place tasks, specifically playing chess, picking up pill bottles, and measuring temperature. The motion control framework, tested and implemented in the Genesis simulation platform, handles inverse kinematics and path planning using the Open Motion Planning Library (OMPL) and provides joint positions for use on the hardware, shrinking the sim-to-real gap. The robot leverages YOLO-based vision models and DepthAI to detect and recognize objects for various pick-and-place tasks such as chess and pill bottle grabbing. This enhances the robot’s ability to perform basic arm motion tasks and provides a scalable framework for related tabletop activities. Numerous tests were carried out on actual hardware, demonstrating the robot’s ability to play chess and carry out other pick-and-place tasks. Team Members: Max Williams, Arjun Vyavaharkar, Aziel Habtemichael, Nicolas Graham, Ethan Ford, Nathaniel Caughron Advisors: Taylor Andrews, Pradeep Radhakrishnan |
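For a flavor of the geometry involved in the chess task, the hypothetical helper below maps an algebraic square name (e.g. "e2") to board-frame coordinates, the kind of target that would then feed into inverse kinematics. All names and parameters here are illustrative assumptions, not the team's API.

```python
def chess_square_to_xy(square, board_origin, square_size):
    """Map a chess square like 'e2' to board-frame (x, y) coordinates.

    `board_origin` is the (x, y) of the a1 square's center and
    `square_size` the edge length of one square; both are hypothetical
    parameters for illustration.
    """
    file = ord(square[0]) - ord('a')   # column index 0..7 (a..h)
    rank = int(square[1]) - 1          # row index 0..7 (1..8)
    return (board_origin[0] + file * square_size,
            board_origin[1] + rank * square_size)
```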
|
MARS: Reducing Hardware Integration Overhead in Robotic Systems through Language Design Abstract: Robotic systems development remains constrained by the complexity of hardware integration, where adapting software to new or reconfigured hardware introduces substantial overhead in configuration, validation, and iterative testing. Widely adopted middleware systems such as Robot Operating System (ROS) emphasize flexibility and dynamic composition, but defer most compatibility checks to runtime, limiting their ability to provide strong correctness guarantees during development. We present MARS (Modular Abstraction for Robotics Systems), a domain-specific programming language that elevates hardware abstraction to the language level through programmatic hardware configuration, a structured component model with constrained inheritance and composition rules, static compatibility checking between hardware interfaces and user-defined requirements, and automatic unit management for physical quantities. We evaluate MARS on two heterogeneous robotic platforms: Phobos, a custom modular robot built on a Raspberry Pi with a quick-swappable chassis, and Deimos, an AWS DeepRacer running ROS2-Foxy. Team Members: Reilly Desai, Olivia Olsen, Colin Masucci Advisors: Carlo Pinciroli |
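The automatic unit management described above can be illustrated in miniature: track units as exponent maps, reject additions with mismatched units, and combine units under multiplication. MARS performs such checks statically in its own language; the Python sketch below only mimics the idea at runtime and is not MARS code.

```python
class Quantity:
    """Toy runtime analogue of MARS-style unit management.

    Units are tracked as exponent dictionaries, e.g. {"m": 1, "s": -1}
    for meters per second. Addition requires matching units and
    multiplication combines exponents, so unit errors surface as
    exceptions instead of silent numeric bugs.
    """
    def __init__(self, value, units):
        self.value = value
        self.units = dict(units)

    def __add__(self, other):
        if self.units != other.units:
            raise TypeError(f"unit mismatch: {self.units} vs {other.units}")
        return Quantity(self.value + other.value, self.units)

    def __mul__(self, other):
        units = dict(self.units)
        for u, p in other.units.items():
            units[u] = units.get(u, 0) + p
            if units[u] == 0:
                del units[u]  # drop cancelled dimensions
        return Quantity(self.value * other.value, units)
```

For example, multiplying a speed by a time yields a distance, while adding them raises a type error.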
|
Medical Emergency Rescue Drone Abstract: For frontline medics, the modern battlefield poses a difficult challenge: locate the wounded, determine who requires immediate care, deliver life-saving treatment, and coordinate evacuation, all while operating with scarce resources, inaccessible terrain, and relentless time pressure. To address this, we developed a non-contact active perception sensing device that can be hand-held, mounted on a drone, or integrated into a robot to scan a patient and determine their injuries, accelerating the diagnosis process during mass casualty events. This platform is built around three complementary sensing technologies: an RGB-NIR camera, an infrared thermal camera, and a millimeter-wave ultra-wideband radar. Together, these sensors form a comprehensive diagnostic suite capable of external wound detection, internal trauma identification, and physiological monitoring. The sensor data are fused to enable targeted analysis of patient conditions and support rapid triage decisions. Experimental results demonstrate reliable human detection, accurate identification of blood and elevated temperature regions, and baseline vital sign measurements, supporting the feasibility of rapid, non-contact triage in high-risk and resource-constrained environments. Team Members: Caitlin Murphy, Ian Hagglund Advisors: Giovanni Pittiglio, Guanrui Li |
|
Abstract: The NASA Lunabotics Challenge was created to help identify innovative robotic designs for the Artemis program. This two-semester competition invites higher education students to design and prototype a lunar robot by applying NASA’s systems engineering process. In previous years, teams were tasked with designing a robot capable of excavating in-situ resources below surface-level regolith simulant. Recently, NASA presented teams with a challenge that involves excavation, navigation, and berm construction. Therefore, the goal of this project was to design, manufacture, and test a semi-autonomous lunar excavation vehicle that meets these functional requirements. The 2025-2026 WPI Lunabotics Team developed the WPI Moon Tracks Rover to fulfill these project objectives. Contributing to this process provided a unique opportunity for undergraduate students to acquire hands-on engineering experience while applying concepts critical to systems engineering. Team Members: Christopher Adzima, Luis Alzamora, Jesse Dawson, Jeremy De La Cruz, Abraham Dionne, John Larochelle, Alana Moretti, Michael Napoleone, Piotr Skoczylas Advisors: Kenneth Stafford, Suat Ay, Loris Fichera, Ibrahim Bozyel |
|
Radio, Robot, Recycle: Employing Multimodal Sensing and Actuation for Recycling Abstract: Recycling is a large-scale, repetitive, and hazardous task, making the employment of robotics for waste sorting an attractive prospect. However, traditional robotic techniques struggle with the variety of waste materials. To address these issues, our team constructed a recycling robotic system that employs multimodal sensing and manipulation techniques to sort waste. By implementing this system, our team highlights the need for multiple adaptive sensing and manipulation techniques in robotic recycling domains. Team Members: Stephen Wojcik, Maxwell McCalla, Alec Norton, Karyn Carrion, Kylie Solecki Advisors: Giovanni Pittiglio, Bashima Islam |
|
Robotic Plectrum System for Stringed Instrument Characterization Abstract: The tonal response of stringed instruments changes based on factors including construction material and hardware or electrical components, but at its core, human playing technique causes the greatest variability in the sound created. This project aimed to create a robotic system that could emulate human playing techniques while holding variables such as distance from the string and picking speed constant. A system was developed with the ability to accurately locate the strings of any guitar placed on it and use that positioning to play individual strings with a high degree of repeatability. The system was evaluated through comparisons between human and robotic playing on a variety of guitars, analyzing the spectral response of the notes played. Team Members: Theo Barnes-Cole, PJ Aubin Advisors: Scott Barton |
|
Robotic Solution for Automated Loading of Centrifuges Abstract: This project develops a low-cost robotic system that automates microplate loading and unloading in standard benchtop centrifuges, making laboratory automation more accessible to small and cost-sensitive organizations. By integrating with an existing SCARA-type robotic arm, the system overcomes the limitations of traditional plate handling by enabling the precise vertical and deep rotor placement required by most centrifuges. The solution combines a custom mechanical interface, control electronics, and user-friendly software to deliver a reliable, safe automation platform. It supports simple high-level commands, such as loading, unloading, and spinning, implemented in Python within a ROS2 environment. The system also includes imbalance detection and built-in safety features to ensure dependable operation. By retrofitting widely available manual centrifuges rather than relying on expensive automated systems, this project delivers a compelling, task-specific form factor that significantly reduces costs while maintaining professional-grade performance. The result is a scalable, affordable automation solution that expands access to advanced laboratory workflows and drives innovation across the biotech industry. Team Members: Grace Amlicke, Tejas Balcha, and Nicholas Palumbo Advisors: Constantinos Chamzas, Mahdi Agheli Industry Sponsor: Aera Therapeutics |
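The high-level command flow (load, spin, unload) can be pictured as a small guarded state machine, where guards mirror the safety logic: a plate is only spun after loading, and unloading is refused in the wrong state. This is a hedged sketch of the idea, not the project's actual ROS2 interface; all class and method names are assumptions.

```python
from enum import Enum, auto

class CentrifugeState(Enum):
    IDLE = auto()
    LOADED = auto()

class CentrifugeController:
    """Toy model of a guarded load/spin/unload command flow."""
    def __init__(self):
        self.state = CentrifugeState.IDLE

    def load(self):
        assert self.state is CentrifugeState.IDLE, "rotor not empty"
        # ... command the arm to place the plate in the rotor ...
        self.state = CentrifugeState.LOADED

    def spin(self, seconds):
        assert self.state is CentrifugeState.LOADED, "nothing loaded"
        # ... start the spin, wait `seconds`, monitor for imbalance ...
        # state returns to LOADED once the rotor stops

    def unload(self):
        assert self.state is CentrifugeState.LOADED, "cannot unload now"
        # ... command the arm to retrieve the plate ...
        self.state = CentrifugeState.IDLE
```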
|
Abstract: The Sailbot 25-26 program is WPI’s 10th annual entry for the International Robotic Sailing Regatta. The competition was founded to promote developments in autonomous sailing, and features multiple events. Along with general upkeep and operator quality-of-life changes, the team has iterated on six primary systems: the sail’s trim tab and accompanying counterbalance have been redesigned, aiming for a more modular and efficient approach. New rudders have been fabricated, along with a new fuzzy logic controller to support them, improving performance in low-wind environments. The vision system architecture was overhauled, pivoting from blob detection to neural network recognition with a custom YOLO11n model. A new keel has been fabricated and installed to reduce weight while preserving righting moment. An existing damper system has been reworked to more effectively reduce oscillation in mast rotation. Numerous upgrades have been made to the mobile controller app to provide more detailed information to users, enabling superior control. Team Members: Arshia Balaji, Benjamin Laster, Bolong Li, Brianna Meisser, Henry Pharris, Steven Vovcsko Advisors: William Michalson, Kenneth Stafford |
|
Seacoast Mushrooms - Automated Mycelium Inoculator Abstract: Small-scale mushroom production is important to local economies and supply-chain independence, while providing communities with food of high nutritional value. Growing mushrooms is a unique and sensitive process that requires precise environmental control with high levels of sterility. Large-scale mushroom production relies on expensive, proprietary equipment, while 'home grower' solutions are tedious to scale and have high error rates, since minimal contact with the mycelium must be maintained. This project served to design and produce an automated mycelium inoculator for Seacoast Mushrooms that provides industry-level repeatability, precision, and sterility at a price point accessible to small-scale producers. The system incorporated affordable, off-the-shelf components in a novel structure to open a sealed bag of sterilized substrate, dispense a precise amount of spawn material without direct contact, and deliver the inoculated bag to an existing sealing mechanism. A prototype of the system demonstrated tighter dosing accuracy and fewer contaminated inoculations than the current manual process while saving operator cost and time. Team Members: Elliot Reese, Daniel Raymond, Robert Mellen, Ricardo Croes-Ball Advisors: Koksal Mus, Fiona Yuan |
|
Surgical Control Assistant: Live-tablet Platform for an Enhanced Laser-Integrated Robot (SCALPEL-IR) Abstract: Primo explores a method for printing construction materials using a mobile robot platform. The small robot is equipped with a pump capable of extruding various materials through a nozzle to print layers on the ground. These materials include construction-grade substances such as clay and cement, which are transported using custom-designed pumps driven by DC motors. The robot navigates its environment using LiDAR, allowing it to operate autonomously. Due to its compact size, Primo can print in locations and configurations that are typically inaccessible or impossible for large commercial concrete 3D printers. Team Members: Kimberly Cummings, Leonel Strangman, Diego Pena-Stein, Kang Zhang Advisors: Loris Fichera |
|
A System for Watering and Monitoring Plants, S.W.A.M.P. 2.0 Abstract: Maintaining healthy indoor plants requires consistent care that many individuals and businesses struggle to provide. Existing automated solutions, such as stationary smart sprinklers or single-pot monitoring systems, fail to address the full scope of indoor plant care. S.W.A.M.P. 2.0 (System for Watering and Autonomously Monitoring Plants) is an autonomous robot designed to monitor and water multiple indoor plants with minimal user interaction. Building on the first iteration, S.W.A.M.P. 2.0 introduces several key improvements: a user-friendly companion application that lowers the barrier to entry for consumers; an upgraded SLAM-based navigation system using a depth camera for mapping and path planning; an object detection model for plant identification; and a redesigned mechanical system for improved structural rigidity. To evaluate system performance, S.W.A.M.P. 2.0 was tasked with mapping and navigating an unknown environment while actively avoiding obstacles and correctly identifying and watering plants accordingly. Completion of these tasks indicates that S.W.A.M.P. 2.0 is a viable autonomous indoor plant care system for residential and commercial environments. Team Members: Cristian Oliveira, Kaloyan Dimitrov, Randy Zhang, Steven Phan Advisors: Greg Lewin, Loris Fichera |
|
Think Again: Improving Plan Safety through LLM-Informed Risk Analysis Abstract: As Large Language Models (LLMs) continue to advance in ability and scope, their role in the next generation of robotic systems is rapidly expanding. Such systems are increasingly being deployed in real-world environments with tangible consequences, and the ability to generate safe and reliable plans that account for environmental uncertainties is critical to ensuring the wellbeing of both humans and robots. LLMs have shown promising capabilities in generating safe and feasible plans for robots, and additional research has begun to investigate the use of LLMs to verify the safety of generated plans. Formal methods such as Linear Temporal Logic (LTL) have successfully been used to ensure that generated plans meet mission requirements; however, they tend not to generalize well to unseen or unpredicted scenarios due to their rigid and exact definitions. LLMs have shown an impressive capability to provide workable answers and broader contextual understanding across a large spectrum of tasks, but their probabilistic nature means they cannot be relied upon alone to provide sound solutions. In this MQP, we combine the strengths of classical and data-driven planning methods, using the generalized problem-solving abilities of LLMs to inform classical LTL-based planning and create workable, safe plans that account for the wider context of a scenario while fulfilling mission specifications. To demonstrate the capabilities of this method, we implemented an LLM-informed planning system, dubbed ‘Paranoia,’ that takes a plan and augments it by weighting decisions based on predicted environmental hazards. We evaluate our method in several environments with hazards of varying severity and compare the output and computational requirements against other pipelines that involve LLMs in the verification process to differing extents. 
Team Members: Benjamin Cruse, Colette Scott, Gavin Hamburg Advisors: Kevin Leahy |
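The hazard-weighting idea can be illustrated with a generic risk-aware shortest-path search, where each node carries a penalty produced by some risk analysis. This is a minimal sketch of risk-weighted planning in general, not the team's 'Paranoia' system or its LTL machinery; the graph encoding and function name are assumptions.

```python
import heapq

def safest_path(edges, hazard, start, goal):
    """Dijkstra search where edge cost is distance plus a hazard penalty.

    `edges` maps node -> list of (neighbor, distance); `hazard` maps
    node -> penalty from a risk analysis (0 for benign regions), so the
    planner trades extra distance for avoiding predicted hazards.
    Returns (total cost, path) or (inf, []) if the goal is unreachable.
    """
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, dist in edges.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue, (cost + dist + hazard.get(nbr, 0.0),
                                       nbr, path + [nbr]))
    return float("inf"), []
```

On a toy graph, a short route through a hazardous node loses to a longer but safer detour.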
|
Toward Automated Parachute Packing: A Multi-Arm Robotic Approach to Vision-Guided Line Stowing Abstract: Parachute packing is a largely manual process due to the complexity of working with the canopy and suspension lines, which are made of flexible and deformable materials. The specific task of stowing suspension lines into loops on the back of the parachute bag requires consistency and controlled tension, which makes it difficult to automate with a traditional robotic manipulator. The result is a process that is time-consuming, labor-intensive, and operator dependent. To address this challenge, we developed a robotic system that can manipulate flexible lines and guide them through two arrays of constrained elastic loops. It performs the entire stowing task by coordinating a primary manipulator and two secondary tensioning arms through a state machine to position, tension, and insert the lines. The system successfully completed the stowing process in testing, demonstrating repeatable line insertion while eliminating the need for manual effort. The results suggest that robotic handling of these deformable materials, specifically long flexible lines, can be performed reliably. This work contributes to the advancement of robotic manipulation of textiles, which is essential for bringing automation to industries that face issues with variability, compliance, and repeatability. Team Members: Chenhan Guan, Zheren Li, Fiona Prendergast, Gunnar Rorapaugh, Oliver Van Campen Advisors: Constantinos Chamzas, Connor McCann, Mahdi Agheli, Jing Xiao |
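The coordination described above can be caricatured as a fixed state sequence executed once per loop. This is a deliberately minimal sketch; the real system's state machine, arm commands, and sensing are far richer, and all names below are assumptions.

```python
# Illustrative states for one stow cycle; in the real system each state
# would command the primary manipulator and the two tensioning arms.
STOW_SEQUENCE = ["POSITION", "TENSION", "INSERT", "RELEASE"]

def run_stow_cycle(actions, loops):
    """Execute one stow cycle per loop by walking a fixed state sequence.

    `actions` maps state name -> callable(loop_index); hardware commands
    would live inside those callables.
    """
    for i in range(loops):
        for state in STOW_SEQUENCE:
            actions[state](i)
```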
|
Abstract: In recent years, we have seen great advancements in prosthetic devices. However, traditional rigid prostheses suffer a high abandonment rate due to their lack of anthropomorphism and compliance. As such, researchers have turned to soft robotics. Usually inspired by nature, these robots utilize soft materials to create flexible joints that can adapt to different scenarios. Over the course of this year, we designed a semi-rigid robotic hand intended to act as a prosthetic device. The hand features fingers made from TPU that can deform, allowing users to grasp objects more easily. Soft Hall effect sensors are integrated into the fingers to detect force cues. Future work could include integrating this gripper with other systems to develop a full prosthesis. Team Members: Aidan Connolly, Jason Gee, Abigail Mansour, Veronika Nowakowski Advisors: Cagdas Onal, Connor McCann, Neehal Sharrma |
|
UAV-UGV Collaboration for Sustained Mapping and Navigation Abstract: Effective search and rescue missions require first responders to understand their field environment and to operate safely within it. UAV-UGV collaboration offers a complementary relationship for search and rescue operations: UGVs have substantially higher battery capacity and can carry payloads, while UAVs are able to quickly survey large areas. The UAV assists in computer-vision-based map creation by capturing aerial imagery, and the UGV utilizes path planning algorithms to complete its field traversal task. Repeatability of the collaborative system’s function across sustained missions is ensured by autonomous landing and recharging of the UAV on a UGV-mounted charging bed. Team Members: Kerry Xiao, Nico Paoli, Jason Albrecht, Achintya Sanjay Advisors: Kevin Leahy, Greg Lewin Industry Sponsor: The Charles Stark Draper Laboratory, Inc. |
|
Ultrasound-Based Gesture Recognition for Human-Machine Interaction Abstract: In prosthetic control and muscle-sensing applications, electromyography (EMG) remains the current gold standard. While this method can sense movement, it is indirect and subject to noise and signal impedance from tissue, skin, and other muscle movements. Ultrasound sensing (sonomyography) can directly monitor muscle movements, offering a promising alternative. Ultrasound has seen limited use in superficial muscle sensing applications due to the time-consuming and processing-heavy nature of interpreting image data. Additionally, noise and shifting sensors can impact model predictions in real-world settings. Previous projects succeeded in creating accurate gesture-classification models, but did not adapt them for real-time and real-world usage. We explored the application of ultrasound in human-machine interfaces; our goal was to create a model and testing setup that simulates prosthetic applications. This required developing an ultrasound sensing platform, designing a physical mounting solution, collecting a dataset of ultrasound measurements, developing a machine learning pipeline, evaluating model performance, and developing a graphical user interface. Our device can interface with a game, allowing users to play rock-paper-scissors using ultrasound gesture classification. Team Members: Aditri Thakur, Dylan Serreyn, Rachel Ellison, Ryan McQuillan Advisors: Haichong (Kai) Zhang |
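As a toy stand-in for a gesture-classification pipeline, a nearest-centroid classifier over flattened ultrasound features shows the basic shape of the problem. The feature representation and function names here are assumptions for illustration, not the team's model.

```python
import numpy as np

def train_centroids(features, labels):
    """Fit a nearest-centroid gesture classifier.

    `features` is an (n_samples, n_features) array of e.g. flattened
    ultrasound intensity profiles; one mean vector is stored per gesture.
    """
    return {g: features[labels == g].mean(axis=0) for g in np.unique(labels)}

def classify(centroids, x):
    """Return the gesture whose centroid is closest to sample x."""
    return min(centroids, key=lambda g: np.linalg.norm(x - centroids[g]))
```

In practice a real pipeline would add normalization, a stronger model, and drift handling for shifting sensors; the sketch only conveys the train/predict structure.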
|
Voice-Driven Instruction to Symbolic Task Execution for Automation (VISTA) Abstract: This Major Qualifying Project presents a voice-driven framework for robotic block manipulation using a UR10 robotic arm, exploring a more intuitive way for users to communicate high-level tasks while maintaining reliability for structured manipulation by combining language understanding with robot planning and execution. The system interprets natural language instructions, converting them into a structured intermediate world-state representation, which is then translated into a deterministic Planning Domain Definition Language (PDDL) problem for symbolic planning. The plans are then executed through a task and motion planning pipeline, ensuring each action is physically feasible and can be carried out safely and reliably.
Team Members: Khyat Sharma Advisors: Jing Xiao, Constantinos Chamzas |
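The translation from a world-state representation to a PDDL problem can be sketched for a toy blocks-world domain. The predicate and domain names below follow the classic blocks-world convention and are assumptions; the team's actual domain file and intermediate representation may differ.

```python
def make_pddl_problem(blocks, goal_on):
    """Emit a toy blocks-world PDDL problem from a parsed instruction.

    `goal_on` is a list of (upper, lower) pairs extracted from a
    world-state representation, e.g. "put red on blue" -> ("red", "blue").
    All blocks start clear on the table with the gripper empty.
    """
    objs = " ".join(blocks)
    init = " ".join(f"(ontable {b}) (clear {b})" for b in blocks)
    goal = " ".join(f"(on {a} {b})" for a, b in goal_on)
    return (f"(define (problem stack) (:domain blocksworld)\n"
            f"  (:objects {objs})\n"
            f"  (:init {init} (handempty))\n"
            f"  (:goal (and {goal})))")
```

The resulting string is deterministic, so the same instruction always yields the same planning problem, which is what makes the downstream symbolic planning reliable.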