
Interception of an aerial manoeuvring target using monocular vision

Published online by Cambridge University Press:  03 August 2022

Shuvrangshu Jana*
Affiliation:
Guidance, Control, and Decision Systems Laboratory (GCDSL), Department of Aerospace Engineering, Indian Institute of Science, Bangalore-12, India
Lima Agnel Tony
Affiliation:
Guidance, Control, and Decision Systems Laboratory (GCDSL), Department of Aerospace Engineering, Indian Institute of Science, Bangalore-12, India
Aashay Anil Bhise
Affiliation:
Guidance, Control, and Decision Systems Laboratory (GCDSL), Department of Aerospace Engineering, Indian Institute of Science, Bangalore-12, India
Varun VP
Affiliation:
Robert Bosch Center for Cyber Physical Systems, Indian Institute of Science, Bangalore-12, India
Debasish Ghose
Affiliation:
Guidance, Control, and Decision Systems Laboratory (GCDSL), Department of Aerospace Engineering, Indian Institute of Science, Bangalore-12, India
*Corresponding author. E-mail: shuvra.ce@gmail.com

Abstract

In this work, a vision-based guidance algorithm for the interception of an aerial manoeuvring target is proposed. Vision-based aerial engagement in 3D space in an outdoor environment is challenging due to uncertainty in estimating the target location. The guidance algorithm proposed here generates the desired velocity commands for the interceptor drone based on the position of the target pixel in the image plane. The desired velocity at the interceptor's centre of gravity is computed such that the velocity at the camera centre is aligned with the line joining the target pixel and the camera centre. It is shown that the proposed guidance strategy drives the target pixel towards the centre of the image plane and leads to a low miss distance. The algorithm is first tested on the ROS-Gazebo platform and then validated in field experiments.
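The core idea in the abstract, aligning the camera-centre velocity with the line joining the camera centre and the target pixel, can be sketched with a pinhole camera model. The following is a minimal illustration, not the authors' exact formulation: the intrinsics (`fx`, `fy`, `cx`, `cy`) and the commanded speed are assumed values for a 640 $\times$ 480 image.

```python
import numpy as np

def desired_velocity(u, v, fx=600.0, fy=600.0, cx=320.0, cy=240.0, speed=5.0):
    """Map a target pixel (u, v) to a desired camera-frame velocity.

    The unit vector from the camera centre through the pixel is taken
    as the desired velocity direction, scaled by a fixed speed.
    Intrinsics (fx, fy, cx, cy) are assumed, for illustration only.
    """
    # Back-project the pixel to a ray in the camera frame (pinhole model).
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray /= np.linalg.norm(ray)
    # Commanded velocity lies along that ray; driving the vehicle this way
    # pushes the target pixel towards the image centre as range closes.
    return speed * ray

# Target at the image centre: velocity is purely along the optical axis.
v_cmd = desired_velocity(320.0, 240.0)
```

In the paper this command is expressed at the interceptor's centre of gravity rather than at the camera, which requires accounting for the camera's mounting offset and the vehicle's angular rate; that transformation is omitted here.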

Information

Type: Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press
Figure 1. Target-interceptor engagement geometry.

Figure 2. Guidance and control architecture.

Figure 3. Engagement scenario in new frame.

Figure 4. Trajectory in $V_{\theta }, V_{\phi }, V_{r}$ space.

Figure 5. Capturability zone of the guidance algorithm.

Figure 6. Trajectories of the interceptor UAV and target.

Figure 7. Variation in LOS and commanded yaw rate: target in straight-line path.

Figure 8. Commanded velocities and corresponding desired accelerations: target in straight-line path.

Figure 9. Tracking of guidance commands.

Figure 10. Commanded yaw rate and variation in LOS: target in circular path.

Figure 11. Commanded velocities and corresponding desired accelerations: target in circular path.

Figure 12. Hardware architecture.

Figure 13. Snapshots from flight tests showing the frames where target capture is accomplished.

Figure 14. Interception of manoeuvring target: hardware implementation. (a) Variation of target pixel (image plane: 640 $\times$ 480) and (b) variation of ball depth.

Figure 15. Commanded vs actual states.

Figure 16. Attitude of UAV during the mission.

Figure 17. Snapshots of engagement from camera attached to UAV: (a) ball depth = 8.92 m, (b) ball depth = 4.16 m, (c) ball depth = 0.77 m and (d) after interception.

Supplementary material: Jana et al. supplementary material (PDF, 6.9 MB)