Non-Prehensile Tool-Object Manipulation by Integrating Large Language Model-Based Planning and Manoeuvrability-Driven Controls

Hoi-Yin Lee, Peng Zhou, Anqing Duan, Wanyu Ma, Chenguang Yang, and David Navarro-Alarcon


Motivation




Humans and animals have long demonstrated adeptness at using tools to tackle everyday challenges.
But what about robots?
Imagine a dual-arm robot effortlessly wielding tools to complete complex tasks with precision, mirroring human capabilities.
Can such a machine truly match human performance?



What We Do



Our research delves into this intriguing question, focusing on three key areas:
  • Manoeuvrability-driven Tool-Object Manipulation Method
  • Non-prehensile Tool-Object Manipulation with Environmental Constraints
  • Determining Subtask Sequences and Collaboration


System flowchart

(a) The task environment includes a camera for real-time top-view capture, a dual-arm robot, tool(s), and a blue block to be manipulated to the target location. (b) The architecture of our system: an LLM-based symbolic task planner converts unstructured data input into a subtask list, and a manoeuvrability-driven planner computes the tool's manoeuvrability and generates an affordance-oriented motion and path. (c) Execution of the result given by the system: the dual-arm robot's arms take turns pushing the blue block from one side to the other via collaboration.
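The two-stage pipeline above can be sketched in code. This is a minimal, hypothetical illustration, not the paper's implementation: the LLM stage is stubbed with a hard-coded plan, and `manoeuvrability_score`, `choose_arm`, and the arm-base coordinates are invented for demonstration.

```python
def symbolic_task_planner(instruction):
    """Stub for the LLM stage: map a free-form instruction to an ordered
    subtask list. A real system would prompt an LLM here."""
    return [
        {"action": "grasp_tool", "arm": "left"},
        {"action": "push_block", "arm": "left", "until": "handover_region"},
        {"action": "handover_tool", "from": "left", "to": "right"},
        {"action": "push_block", "arm": "right", "until": "target"},
    ]

def manoeuvrability_score(arm, block_xy):
    """Toy proxy for manoeuvrability: prefer the arm whose base is closer
    to the block (negated distance, so higher is better)."""
    arm_base = {"left": (-0.3, 0.0), "right": (0.3, 0.0)}[arm]
    dx, dy = block_xy[0] - arm_base[0], block_xy[1] - arm_base[1]
    return -((dx * dx + dy * dy) ** 0.5)

def choose_arm(block_xy):
    """Pick the arm with the higher manoeuvrability score for the next push."""
    return max(("left", "right"), key=lambda a: manoeuvrability_score(a, block_xy))

plan = symbolic_task_planner("Move the blue block to the target.")
print([t["action"] for t in plan])
print(choose_arm((-0.2, 0.1)))  # block on the left side -> "left"
```

In the real system, the manoeuvrability measure drives both arm selection and the affordance-oriented motion and path, rather than a simple distance heuristic.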



Experiment Setup


Close-range Manipulation




Long-horizon Tasks



Task 1: Move the block to the top-left side of the left arm.




Task 2: Move the block to the target.



Tool Sharing



Can the robot share the tool between its arms?


Task 3: Move the block to the top-left side of the left arm.



Non-prehensile Manipulation with Environmental Constraints



Our research investigates how robots adapt to environmental constraints, showcasing their ability to manipulate objects even within confined areas.

C-shaped wall




U-shaped wall


