
Force-based organization and control scheme for the non-prehensile cooperative transportation of objects

Published online by Cambridge University Press:  18 December 2023

Mario Rosenfelder
Affiliation:
Institute of Engineering and Computational Mechanics, University of Stuttgart, Stuttgart, Germany
Henrik Ebel*
Affiliation:
Institute of Engineering and Computational Mechanics, University of Stuttgart, Stuttgart, Germany
Peter Eberhard
Affiliation:
Institute of Engineering and Computational Mechanics, University of Stuttgart, Stuttgart, Germany
*
Corresponding author: Henrik Ebel; Email: henrik.ebel@itm.uni-stuttgart.de

Abstract

Over decades of robotics research, cooperative object transportation has served as a meaningful model problem for robotic networks because it combines several crucial challenges. Although these challenges are demanding, the cooperation of multiple robots has the potential to solve automation problems that are beyond the scope of an individual robot. So far, the model problem has mostly been addressed by explicitly controlling the robots’ positions. However, the position-based approach suffers from intrinsic drawbacks, for example, the lack of explicit feedback between the robots and the object. Moreover, it remains an open question how many robots should be employed to ensure successful transportation. This paper aims to overcome these challenges with a novel force-based approach that takes into account the robots’ actual manipulation capabilities, that is, the forces they exert. Using cost-efficient hardware, the interaction forces are measured and, moreover, explicitly controlled by a highly responsive onboard controller. Employing a tailored software architecture, the novel force-based scheme, which is useful for robotic manipulation beyond the benchmark problem, is arguably the most flexible of its kind regarding the number of robots and the object’s shape. The controller’s functionality and performance, as well as the scheme’s versatility, are demonstrated in several hardware experiments.

Information

Type
Research Article
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press

Figure 1. Photograph of the employed omnidirectional mobile robot equipped with a force-sensing unit.


Figure 2. Left: Illustration of an exemplary polygonal object (dark gray) and its dilated edges (dark blue). Middle: Potential placement of the mobile robots around the object. Right: Projection of the obtained zonotope at $\tau =\tau _{\text{des}}$ (blue) and rhombus (green) for constraint softening for the configuration depicted in the middle. (a) Dilated edges of the object. (b) Contact normals and lever arms. (c) Constraint softening for $^{\text{S}}\boldsymbol{f}_{\text{des}}\notin \mathcal{G}(\boldsymbol{w})$.


Figure 3. Employed hybrid position-force control concept.


Figure 4. Still images from three exemplary hardware experiments. (a) Four mobile robots transporting a rectangular object along a square path. (b) Five mobile robots transporting a +-shaped object along a sinusoidal path. (c) Five mobile robots transporting a ⊤-shaped object along a path with sharp corners consisting of pure translation and pure rotation segments.


Figure 5. Positional path tracking errors for the hardware experiments depicted in Fig. 4.


Figure 6. Left: Overlaid results of ten exemplary experiments in which three mobile robots transport an $\llcorner$-shaped object along a circular path. Right: Trajectories of the measured contact forces (colors correspond to the robots’ colors on the left) and the corresponding desired force (in red) for each robot.