RBE Colloquium Series Presents STEM Launch Participants: Vaibhav Unhelkar and Tapomayukh Bhattacharjee

Friday, October 05, 2018
11:15 am to 12:15 pm
Floor/Room #: GP 2001

RBE Colloquium Series Welcomes STEM Launch Participant
Tapomayukh Bhattacharjee

Inferring Contact Properties Using Multimodal Sensing During
Non-Prehensile Manipulation

Abstract: Multimodal sensing can enable a robot to infer properties of contact with its surroundings. Recent research has focused on robots that haptically perceive the world through exploratory behaviors lasting tens of seconds. During manipulation, however, many opportunities arise for robots to gather information about the environment from brief (<= 2 seconds) contact resulting from simple non-prehensile motions. The goal of our work is to enable robots to infer contact properties under these conditions using force, motion, thermal, and visual sensing.

We used a data-driven approach with various machine learning methods. Key challenges were obtaining adequate haptic data for training and developing methods that performed well on haptic data that differed from the training data due to common real-world variations, such as changes in robot velocity, stiffness, and sensor temperature. To collect suitable data, we used a variety of platforms, including simplified robots, handheld human-operated devices, and a mobile robot. We also generated synthetic data with physics-based models. Through careful empirical evaluation, we identified machine learning methods that better handled common signal variations. We also used physics-based models to characterize common perceptual ambiguities and to predict the performance of data-driven methods.

Finally, we extended our work to show the role of multimodal sensing for non-prehensile manipulation of deformable food items during assistive feeding. Overall, our research demonstrates the feasibility of robots inferring properties of the world from brief contact with objects in human environments. By using multimodal sensing, our methods rapidly recognized materials, detected when objects moved, detected contact with people, and inferred other properties of the robot’s surroundings.

Bio: Tapomayukh "Tapo" Bhattacharjee is a postdoctoral research associate in Computer Science & Engineering at the University of Washington, working with Professor Siddhartha Srinivasa in the Personal Robotics Lab. He completed his Ph.D. in Robotics at the Georgia Institute of Technology under the supervision of Professor Charlie Kemp, received his M.S. from the Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea, and his B.Tech. from the National Institute of Technology, Calicut, India. He also worked as a Visiting Scientist with the Interaction and Robotics Research Center at the Korea Institute of Science and Technology (KIST), Seoul, South Korea. His research interests include haptic perception, tactile sensing, robotic manipulation, machine learning, human-robot interaction, and teleoperation systems.


RBE Colloquium Series Welcomes STEM Launch Participant

Vaibhav Unhelkar

Algorithms for Enabling Fluent Interaction
Between Humans and Robots

Abstract: Traditionally, robots have excelled in structured and predictable settings, such as safety cages in factories. Slowly but surely, robots are transitioning from these predictable settings to uncertain environments that include humans, e.g., homes, hospitals, roads, and offices. While noteworthy progress has been made in recent years, achieving fluent interaction between robots and humans remains an active and challenging problem. I believe that novel learning algorithms that require little data, and novel decision-making algorithms that require little planning time, are essential to enabling fluent human-robot interaction in complex, uncertain environments. My research focuses on developing these algorithms and builds upon tools from planning under uncertainty, Bayesian inference, and insights from human factors. In this talk, I will discuss these algorithms for interaction and share recent results from introducing collaborative robots into teams of humans.

Bio: Vaibhav is a Ph.D. candidate in the Interactive Robotics Group at the Massachusetts Institute of Technology (MIT), studying machine learning and human-robot interaction. He is interested in achieving better interaction between humans and machines (robots, software agents, and algorithms) in order to enable human-machine teams to solve complex real-world problems. Toward this vision, he has developed algorithms to predict human decisions from limited data and to generate robot decisions in collaborative time-critical tasks. Vaibhav has collaborated with industry partners, and his algorithms have been applied to collaborative robots at BMW. His work has been published in several selective venues, including the ACM/IEEE Conference on Human-Robot Interaction (HRI), the AAAI Conference on Artificial Intelligence (AAAI), and the IEEE International Conference on Robotics and Automation (ICRA). He completed his Bachelor's studies at the Indian Institute of Technology (IIT) Bombay, where he worked on nanosatellites and aerial vehicles. For links to papers and research, see: http://people.csail.mit.edu/unhelkar/

Friday, October 5, 2018
11:15 a.m. - 12:00 p.m.
60 Gateway Park, GP 1002

Teresa Hemple