Our main goal is to develop intelligent yet human-in-command robots that can collaborate with humans and assist them in their daily activities. We take a multidisciplinary approach, spanning the theoretical, technical, and practical dimensions of modern collaborative systems. On the one hand, HRI² research focuses on developing advanced control frameworks for mobile and fixed-base collaborative robots and wearable assistive devices, in order to boost their interaction autonomy. On the other hand, we develop cutting-edge techniques and kinodynamic models to anticipate human socio-physical states. The development of intelligent interfaces with multimodal sensory processing capabilities is also central to the HRI² lab's strategic vision, with the aim of creating timely and comforting (ergonomic) robot actions in response to human intentions and environmental variations.
Human-Robot Interfaces and Interaction (HRI²)
- Openings
- Technician Positions on Vision-based Human Intention Estimation and Activity Recognition using Machine Learning
- Two Fellow Researcher positions on learning by demonstration and foundation models for robot programming and control in dynamic and unpredictable environments
- Two Postdoc positions on foundation models for robot programming and control in dynamic and unpredictable environments
IIT OpenTalk Magazine - Lab Highlights