Berk's research primarily focuses on problems in robotic manipulation, a key capability largely missing from the current state of the art in robotics for unstructured environments such as homes, modern warehouses, and collaborative manufacturing stations. He develops multi-modal robotic manipulation strategies, focusing mainly on the role of visual feedback in coping with the uncertainties of unstructured environments. He integrates advanced control methods, active vision frameworks, machine learning, and intelligent mechanical design to provide robust, dexterous manipulation capabilities.
Prior to WPI, Berk worked in the Grab Lab at Yale University on robust within-hand manipulation techniques. He is also one of the founders and the main administrator of the Yale-CMU-Berkeley (YCB) object set project, which facilitates benchmarking efforts worldwide for robotic manipulation.
Berk’s current focus is on applying robots to sustainability projects (e.g., sorting for recycling) by solving the complex manipulation problems that arise in them.