Reactive Human-to-Robot Handovers of Arbitrary Objects


Human-robot object handovers have been an actively studied area of robotics over the past decade. However, very few techniques and systems have addressed the challenge of handing over diverse objects with arbitrary appearance, size, shape, and rigidity.

We present a vision-based system that enables reactive human-to-robot handovers of unknown objects. Our approach combines closed-loop motion planning with real-time, temporally consistent grasp generation to ensure reactivity and motion smoothness. Our system is robust to different object positions and orientations, and can grasp both rigid and non-rigid objects.
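To illustrate the idea of temporally consistent grasp selection inside a closed-loop controller, the sketch below shows one common pattern: among the grasp candidates produced each frame, prefer the one closest to the previously tracked grasp, and reject large jumps so the robot's target stays smooth. This is a minimal illustration under assumed simplifications (grasps reduced to 3D positions, a hypothetical `select_consistent_grasp` helper, and straight-line end-effector servoing); it is not the paper's actual pipeline.

```python
import numpy as np

def select_consistent_grasp(candidates, prev_grasp, max_jump=0.05):
    """Pick the candidate closest to the previous grasp target.

    Hypothetical helper: keeps the tracked grasp temporally consistent
    by rejecting candidates that jump more than `max_jump` meters.
    """
    if prev_grasp is None:
        return candidates[0]
    dists = [np.linalg.norm(g - prev_grasp) for g in candidates]
    best_idx = int(np.argmin(dists))
    # If even the nearest candidate jumped too far, keep the old target.
    return candidates[best_idx] if dists[best_idx] < max_jump else prev_grasp

def step_toward(ee_pos, target, step=0.02):
    """Move the end-effector a fixed increment toward the target
    (stand-in for a real closed-loop motion planner)."""
    delta = target - ee_pos
    dist = np.linalg.norm(delta)
    if dist <= step:
        return target.copy()
    return ee_pos + step * delta / dist

# Closed-loop sketch: re-select the grasp every frame, then servo toward it.
ee = np.zeros(3)
grasp = None
for _ in range(20):
    # In a real system these would come from a perception/grasp network.
    candidates = [np.array([0.10, 0.00, 0.00]), np.array([0.50, 0.50, 0.00])]
    grasp = select_consistent_grasp(candidates, grasp)
    ee = step_toward(ee, grasp)
```

Because selection is re-run every control cycle, the target adapts as the human moves the object, while the jump threshold prevents the arm from oscillating between distant candidates.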


Chris Paxton (NVIDIA)
Maya Cakmak (NVIDIA, University of Washington)


Best Paper in Human-Robot Interaction, ICRA 2021