
An Augmented Reality Visualization System for Simulated Multirotor Aerial Vehicles

Abstract

Multirotor Aerial Vehicles are a special class of Unmanned Aerial Vehicles with many practical applications. The growing demand for this class of aircraft requires tools that speed up their development. Simulated environments have gained increasing importance because they facilitate the testing and prototyping of solutions: virtual environments allow real-time interaction with simulated models whose behavior is similar to that of real systems. More recently, the use of Augmented Reality has enabled an increasing degree of immersion and integration between the virtual world and a real scenario. This work proposes the use of Augmented Reality technology and a simulated model of a multirotor to create an interactive flight environment, aiming to improve the user experience in the analysis of simulated models. For this purpose, a smartphone is adopted as the hardware platform, a game engine is used as the basis for the development of the Augmented Reality application, which renders a numerical simulation of the flight dynamics and control system of a multirotor, and a game controller is adopted for user interaction. The resulting system demonstrates that Augmented Reality is a viable technology for extending the possibilities of evaluating simulated systems.

Key words
augmented reality; mav; uav; simulation; quaternion attitude control

INTRODUCTION

Multirotor Aerial Vehicles (MAVs), a special class of Unmanned Aerial Vehicles (UAVs), are rotary-wing aircraft capable of hovering and of vertical take-off and landing (Bresciani 2008, Mahony et al. 2012, Idrissi et al. 2022). Their use is already common in civil and military applications, and the growing demand for new developments requires more and better tools for the rapid prototyping of new ideas and uses. As detailed in Mairaj et al. (2019), simulation environments are consolidating as a suitable tool for this scenario, since they allow the rapid evaluation of dynamic behavior and control strategies under different conditions.

Scientific and industrial research has used numerical simulation combined with high-fidelity models for the analysis of theoretical and practical aspects of dynamical systems (Jalon & Bayo 1994). The growth of computational processing power allows the real-time simulation of increasingly complex systems and their visualization through Computer Generated Images (CGI). In general, this strategy speeds up the understanding of system behavior (Radu 2014, Khandelwal et al. 2019), minimizes the risk of damage to the device and reduces the total cost of the project.

In the CGI field, Extended Reality (XR) is the term that embraces the Augmented Reality (AR), Mixed Reality (MR) and Virtual Reality (VR) technologies. It is becoming common practice to incorporate this class of tools for visualization of and interaction with simulated systems into the research and development pipeline. The development and use of XR technologies requires CGI aided by specialized hardware to allow the integration and interaction of virtual objects in real-world scenarios (Carmigniani & Furht 2011, Asaad 2021). To distinguish between the XR variants: in VR the user is immersed in an interactive three-dimensional (3D) world; AR provides an indirect live view of the physical world with 3D objects added to it; and MR is an AR environment in which the 3D objects comprehend and interact with the real world.

AR enhances the user's perception of reality and is characterized by three properties (Azuma 1997): i) it combines real and virtual; ii) it is interactive in real time; and iii) it is registered in 3D. Its long-term development encompasses several strategies for blending virtual objects with the real world. In the early stages, the available computer processing power imposed restrictions on the task of 3D registration (Azuma 1997, Azuma et al. 2001, Billinghurst et al. 2015). These techniques have evolved from approximate formulations or the use of artificial markers to robust computer vision strategies (Hoff et al. 1996, Kato & Billinghurst 1999, Kuppala et al. 2020).

The development of AR-related technologies requires hardware devices suited to the type of user experience desired, in conjunction with a software platform capable of integrating signal and image processing with CGI processing tools. The hardware required for the AR experience depends on the application, ranging from a single camera to environments prepared with motion sensors and specialized accessories (e.g. glasses, game controllers and motion capture systems). Consisting of cameras, an accelerometer, a rate gyro, a magnetometer, a high-quality display and a powerful processing unit, all encapsulated in a thin and lightweight device, the smartphone has become a prominent platform for the development of AR applications. Game engines provide tools for manipulating 3D objects and were created to streamline the creation of games, but they have also gained importance in the creation of interactive virtual environments in other areas, such as architecture, entertainment and engineering. More recently, game engines have added tools for developing AR applications (Linowes & Babilinski 2017, Khandelwal et al. 2019, Nowacki & Woda 2019, Asaad 2021), such as Vuforia (PTC 2022), Apple AR (Apple 2022) and Google AR (Google 2022), which is helping to popularize AR through the availability of standardized development tools and the ease of exporting an application to various hardware platforms (e.g. desktop, mobile and embedded computers).

In this way, combining the possibilities offered by AR tools with the demand for new UAV applications, the scope of this work is to develop an AR visualization system for a simulated MAV, enabling interactive, real-time user control of the simulation, using a smartphone as the hardware platform and a game engine to support the AR application.

MATERIALS AND METHODS

This section describes the proposed system. It begins with an overview of the modules that compose it, presents the notation adopted for the mathematical formulation, details the MAV Simulation Module and, finally, explains the Augmented Reality Module and how it is used to present the simulation.

System description

The system is composed of two modules: the MAV Simulation Module and the AR Visualization Module. The MAV Simulation Module comprises the MAV dynamics and its attitude and position control, and receives the external game controller commands. The AR Visualization Module maintains the estimate of the user's position in the scenario, updates the relative pose of the 3D objects, and merges the virtual elements with the real scenario into a single coherent image to be presented to the user.

For the MAV simulation, a MATLAB/Simulink® (version 2020a) environment runs on a desktop computer under a (soft) real-time execution constraint, to ensure a user experience compatible with a real MAV. The AR visualization, developed with the ARToolkit® integrated into the Unity 3D® game engine (version 2020.2.1f1), runs on a smartphone. The communication between the two modules is via a WiFi connection, and the gamepad controller is connected to the desktop computer by a 2.4 GHz wireless adapter, as shown in Figure 1a, b.

Figure 1
General description of the interaction and control of the AR system. a) Diagram of interaction between the AR visualization module, the simulation module and the game controller. b) Description of MAV navigation commands adopted on the game controller.

The smartphone is fixed on top of the game controller and its screen is used to view the AR projection. The AR projection is made on images of the local environment, captured with the rear camera of the smartphone. The game controller establishes wireless communication with the simulation, allowing the user to move freely in the real world.

Reliable communication between the Simulation Module and the AR Visualization Module is established through the TCP protocol: a TCP server is started by the AR application (on the smartphone), listening on a configurable TCP port.
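To make this link concrete, the sketch below shows a minimal simulation-side TCP client in Python. The wire format (a comma-separated pose string), the host address and the port number are illustrative assumptions; the paper does not specify the message layout.

```python
import socket

# Assumed wire format: "rx,ry,rz,phi,theta,psi\n" per simulation step.
# The smartphone runs the TCP server; host and port below are placeholders.
SMARTPHONE_IP = "192.168.43.1"   # assumption: phone acting as WiFi hotspot
TCP_PORT = 5555                  # assumption: the configurable port

def send_pose(sock: socket.socket, r, alpha) -> None:
    """Serialize position r = (x, y, z) [m] and attitude
    alpha = (phi, theta, psi) [rad] and push them to the AR module."""
    msg = ",".join(f"{v:.6f}" for v in (*r, *alpha)) + "\n"
    sock.sendall(msg.encode("ascii"))

if __name__ == "__main__":
    with socket.create_connection((SMARTPHONE_IP, TCP_PORT)) as sock:
        # One update per simulation step (Ts = 1 ms in the paper's tests).
        send_pose(sock, (0.0, 0.0, 1.0), (0.0, 0.0, 0.0))
```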

The hardware used for the development and execution of the tests is composed of: a notebook with an i5 processor and 16 GB of RAM, running the Windows 11 operating system; a smartphone, model Poco X3 NFC, running the Android operating system (version 11); and an Ípega gamepad controller, model PG-9076.

Notation

The following conventions and notations are adopted. Scalars are represented by italic letters ($a \in \mathbb{R}$), while algebraic column vectors ($\mathbf{a} \in \mathbb{R}^n$) and matrices ($\mathbf{A} \in \mathbb{R}^{p \times q}$) are represented by bold letters. The transpose of a matrix is indicated by $\mathbf{A}^T$ and its inverse by $\mathbf{A}^{-1}$. The identity matrix $\mathbf{I}_n \in \mathbb{R}^{n \times n}$ is a square matrix with ones on the main diagonal and zeros elsewhere. A Cartesian Coordinate System (CCS) with origin $A$ is denoted by $\mathcal{S}_A$, whilst a Direction Cosine Matrix (DCM) $\mathbf{D}^{B/A} \in SO(3)$ converts a vector projection representation from $\mathcal{S}_A$ to $\mathcal{S}_B$ (Markley & Crassidis 2014, p. 45). $SO(3)$ is the special orthogonal group of order 3, which implies that $\mathbf{D}^{A/B} = (\mathbf{D}^{B/A})^T = (\mathbf{D}^{B/A})^{-1}$. Furthermore, $\mathbf{q} \triangleq [\mathbf{q}_{1:3}^T \; q_4]^T \in \mathbb{R}^4$ is a quaternion of unitary norm adopted for attitude parametrization (Markley & Crassidis 2014, p. 28), for which: $\mathbf{q}^T \mathbf{q} = 1$; $\mathbf{q}_{1:3} \triangleq [q_1 \; q_2 \; q_3]^T = \boldsymbol{\varepsilon} \sin(\vartheta/2) \in \mathbb{R}^3$; and $q_4 \triangleq \cos(\vartheta/2) \in \mathbb{R}$; with $\boldsymbol{\varepsilon} \in \mathbb{R}^3$ the unitary Euler axis vector and $\vartheta \in \mathbb{R}$ the Euler angle of rotation (Markley & Crassidis 2014, p. 45). The parametrization of a DCM by $\mathbf{q}$ is defined by:

$$\mathbf{D}^{B/A}(\mathbf{q}) \triangleq \left(q_4^2 - \mathbf{q}_{1:3}^T \mathbf{q}_{1:3}\right) \mathbf{I}_3 - 2 q_4 [\mathbf{q}_{1:3} \times] + 2\, \mathbf{q}_{1:3} \mathbf{q}_{1:3}^T. \quad (1)$$

The product of a pair of quaternions $\mathbf{q}^A$ and $\mathbf{q}^B$ can be represented by the following matrix product (Markley & Crassidis 2014):

$$\mathbf{q}^A \otimes \mathbf{q}^B = \begin{bmatrix} q_4^A \mathbf{I}_3 - [\mathbf{q}_{1:3}^A \times] & \mathbf{q}_{1:3}^A \\ -(\mathbf{q}_{1:3}^A)^T & q_4^A \end{bmatrix} \begin{bmatrix} q_1^B \\ q_2^B \\ q_3^B \\ q_4^B \end{bmatrix}, \quad (2)$$

where the skew-symmetric matrix $[\mathbf{u} \times]$ of a vector $\mathbf{u} = [u_1 \; u_2 \; u_3]^T \in \mathbb{R}^3$ is given by:

$$[\mathbf{u} \times] \triangleq \begin{bmatrix} 0 & -u_3 & u_2 \\ u_3 & 0 & -u_1 \\ -u_2 & u_1 & 0 \end{bmatrix} \in \mathbb{R}^{3 \times 3}. \quad (3)$$

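For reference, the following NumPy sketch implements Eqs. (1)-(3) under the convention above (vector part first, scalar component last). It is an illustrative sketch, not code from the system described here.

```python
import numpy as np

def skew(u: np.ndarray) -> np.ndarray:
    """Skew-symmetric matrix [u x] of Eq. (3)."""
    return np.array([[0.0, -u[2], u[1]],
                     [u[2], 0.0, -u[0]],
                     [-u[1], u[0], 0.0]])

def dcm_from_quat(q: np.ndarray) -> np.ndarray:
    """DCM D^{B/A}(q) of Eq. (1); q = [q1, q2, q3, q4], scalar last."""
    v, s = q[:3], q[3]
    return (s**2 - v @ v) * np.eye(3) - 2.0 * s * skew(v) + 2.0 * np.outer(v, v)

def quat_mul(qa: np.ndarray, qb: np.ndarray) -> np.ndarray:
    """Quaternion product qa (x) qb, written as the matrix product of Eq. (2)."""
    va, sa = qa[:3], qa[3]
    m = np.zeros((4, 4))
    m[:3, :3] = sa * np.eye(3) - skew(va)
    m[:3, 3] = va
    m[3, :3] = -va
    m[3, 3] = sa
    return m @ qb
```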
We also present some properties for manipulating quaternions, required for further development: (P1) $(\mathbf{q}^A \otimes \mathbf{q}^B) \otimes \mathbf{q}^C = \mathbf{q}^A \otimes (\mathbf{q}^B \otimes \mathbf{q}^C)$; (P2) $\mathbf{I}_q \triangleq [0\;0\;0\;1]^T$ is the identity quaternion, for which $\mathbf{I}_q \otimes \mathbf{q} = \mathbf{q} \otimes \mathbf{I}_q = \mathbf{q}$; (P3) $\|\mathbf{q}\| \triangleq \sqrt{q_1^2 + q_2^2 + q_3^2 + q_4^2}$ is the quaternion norm; (P4) $\mathbf{q}^* = [-\mathbf{q}_{1:3}^T \; q_4]^T$ is the conjugate quaternion; and (P5) $\mathbf{q}^{-1} \triangleq \mathbf{q}^* / \|\mathbf{q}\|^2$ is the inverse of any quaternion with nonzero norm, for which $\mathbf{q}^{-1} \otimes \mathbf{q} = \mathbf{q} \otimes \mathbf{q}^{-1} = \mathbf{I}_q$. We also consider the Euler angles vector $\boldsymbol{\alpha} = [\phi \; \theta \; \psi]^T \in \mathbb{R}^3$, in the sequence 1-2-3, to obtain the game controller commands and to interact with the AR Visualization Module. These angles represent, respectively, the roll, pitch and yaw of the MAV; they are converted to the quaternion representation, and recovered from a quaternion, by the following relations¹ (Markley & Crassidis 2014):

$$\mathbf{q} \triangleq \begin{bmatrix} \sin(\tfrac{\phi}{2})\cos(\tfrac{\theta}{2})\cos(\tfrac{\psi}{2}) + \cos(\tfrac{\phi}{2})\sin(\tfrac{\theta}{2})\sin(\tfrac{\psi}{2}) \\ \cos(\tfrac{\phi}{2})\sin(\tfrac{\theta}{2})\cos(\tfrac{\psi}{2}) - \sin(\tfrac{\phi}{2})\cos(\tfrac{\theta}{2})\sin(\tfrac{\psi}{2}) \\ \cos(\tfrac{\phi}{2})\cos(\tfrac{\theta}{2})\sin(\tfrac{\psi}{2}) + \sin(\tfrac{\phi}{2})\sin(\tfrac{\theta}{2})\cos(\tfrac{\psi}{2}) \\ \cos(\tfrac{\phi}{2})\cos(\tfrac{\theta}{2})\cos(\tfrac{\psi}{2}) - \sin(\tfrac{\phi}{2})\sin(\tfrac{\theta}{2})\sin(\tfrac{\psi}{2}) \end{bmatrix}, \quad \text{and} \quad (4)$$

$$\begin{bmatrix} \phi \\ \theta \\ \psi \end{bmatrix} = \begin{bmatrix} \operatorname{arctan2}\!\left(2(q_2 q_3 + q_1 q_4),\; -q_1^2 - q_2^2 + q_3^2 + q_4^2\right) \\ \arcsin\!\left(2(q_2 q_4 - q_1 q_3)\right) \\ \operatorname{arctan2}\!\left(2(q_1 q_2 + q_3 q_4),\; q_1^2 - q_2^2 - q_3^2 + q_4^2\right) \end{bmatrix}. \quad (5)$$
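The conversions of Eqs. (4) and (5) transcribe directly to code, using atan2 as recommended in the footnote. The sketch below is again illustrative; the clipping of the arcsin argument is a numerical safeguard added here, not part of the original formulation.

```python
import numpy as np

def quat_from_euler(phi: float, theta: float, psi: float) -> np.ndarray:
    """Eq. (4): roll/pitch/yaw (1-2-3 sequence, radians) to quaternion."""
    cf, sf = np.cos(phi / 2), np.sin(phi / 2)
    ct, st = np.cos(theta / 2), np.sin(theta / 2)
    cp, sp = np.cos(psi / 2), np.sin(psi / 2)
    return np.array([sf * ct * cp + cf * st * sp,
                     cf * st * cp - sf * ct * sp,
                     cf * ct * sp + sf * st * cp,
                     cf * ct * cp - sf * st * sp])

def euler_from_quat(q: np.ndarray) -> tuple:
    """Eq. (5): quaternion (scalar last) to roll/pitch/yaw, using atan2."""
    q1, q2, q3, q4 = q
    phi = np.arctan2(2 * (q2 * q3 + q1 * q4), -q1**2 - q2**2 + q3**2 + q4**2)
    theta = np.arcsin(np.clip(2 * (q2 * q4 - q1 * q3), -1.0, 1.0))
    psi = np.arctan2(2 * (q1 * q2 + q3 * q4), q1**2 - q2**2 - q3**2 + q4**2)
    return phi, theta, psi
```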

MAV simulation module

The Simulation Module comprises the simulation of the dynamics and of the control system, using the inputs given by the user with the game controller. To evaluate the AR Visualization Module, an altitude and attitude control strategy is proposed; a schematic view is presented in Figure 2.

Figure 2
Representation of the MAV Simulation Module, composed of the dynamics simulation and the control computed from the game controller commands.

The MAV dynamics modeling adopts two CCSs: $\mathcal{S}_R$ is the reference frame, fixed at the initial position, with the x-axis pointing forward, the z-axis pointing upward and the y-axis completing the right-hand rule; and $\mathcal{S}_B$ is the body-fixed frame, positioned at the MAV center of mass and initially aligned with $\mathcal{S}_R$. Figure 3a presents the 3D MAV model used in the simulation. All the 3D objects were modeled with Blender 3D (release 3.1.2). Figure 3b shows the alignment of $\mathcal{S}_B$ with the vehicle body, from a top view perspective.

Figure 3
The model of the quadrotor MAV in the X symmetrical configuration. a) 3D model used in the AR environment to represent the real drone. b) Top view of the MAV with the representation of the rotor distances to the x- and y-axes, and the adopted positive direction of rotation.

MAV Dynamics

According to Stevens et al. (2015), the flat-Earth equations of motion for any rigid body adopt the following simplifications: (i) the Earth frame is an inertial reference frame; (ii) position is measured in a tangent-plane coordinate system; and (iii) the gravity vector is normal to the tangent plane and constant in magnitude. The MAV dynamics is modeled by a set of nonlinear differential equations, and this work adopts the following model (Bresciani 2008, Oliveira 2011, Mahony et al. 2012, Stevens et al. 2015):

$$\begin{bmatrix} \dot{\mathbf{w}} \\ \dot{\mathbf{r}}_R \\ \dot{\mathbf{v}}_B \\ \dot{\mathbf{q}} \\ \dot{\boldsymbol{\omega}}_B \end{bmatrix} = \begin{bmatrix} \frac{1}{\tau_p}\left(-\mathbf{w} + k_p \bar{\mathbf{w}}\right) \\ \mathbf{D}^{B/R}(\mathbf{q})^T \mathbf{v}_B \\ \frac{1}{m}\left(\mathbf{F}_B^C + \mathbf{D}^{B/R}(\mathbf{q})\,\mathbf{F}_R^g - \mu_d \mathbf{v}_B\right) - \boldsymbol{\omega}_B \times \mathbf{v}_B \\ \frac{1}{2}\,\boldsymbol{\Xi}(\mathbf{q})\,\boldsymbol{\omega}_B \\ \mathbf{J}^{-1}\left(\mathbf{T}_B^C - \boldsymbol{\omega}_B \times \mathbf{J}\boldsymbol{\omega}_B\right) \end{bmatrix}, \quad (6)$$

with the state vector and the input vector being defined by:

$$\mathbf{x} \triangleq \left[\mathbf{w}^T \;\; \mathbf{r}_R^T \;\; \mathbf{v}_B^T \;\; \mathbf{q}^T \;\; \boldsymbol{\omega}_B^T\right]^T, \quad (7)$$
$$\mathbf{u} \triangleq \bar{\mathbf{w}} \in \mathbb{R}^4, \quad (8)$$

where $\mathbf{w} \triangleq [w_1 \; w_2 \; w_3 \; w_4]^T \in \mathbb{R}^4$ are the instantaneous rotation speeds of the propellers, with $w_j \in [0, w_{max}]$ for $j = 1, \ldots, 4$; $\bar{\mathbf{w}} \triangleq [\bar{w}_1 \; \bar{w}_2 \; \bar{w}_3 \; \bar{w}_4]^T \in \mathbb{R}^4$ are the rotation speed input commands for the propellers; $\mathbf{r}_R \triangleq [r_x \; r_y \; r_z]^T \in \mathbb{R}^3$ and $\mathbf{v}_B \triangleq [v_x \; v_y \; v_z]^T \in \mathbb{R}^3$ are the linear position and velocity of $\mathcal{S}_B$ w.r.t. $\mathcal{S}_R$; $\mathbf{q}$ and $\boldsymbol{\omega}_B \triangleq [\omega_x \; \omega_y \; \omega_z]^T \in \mathbb{R}^3$ are the attitude and angular velocity of $\mathcal{S}_B$ w.r.t. $\mathcal{S}_R$. The constants $\tau_p$ and $k_p$ are, respectively, the settling time and the proportional gain of the motor and propeller assembly; $\mu_d$ is a friction constant due to air drag; $m \in \mathbb{R}$ and $\mathbf{J} \in \mathbb{R}^{3 \times 3}$ are, respectively, the mass and the rotational inertia matrix, a diagonal matrix defined by:

$$\mathbf{J} = \begin{bmatrix} J_{xx} & 0 & 0 \\ 0 & J_{yy} & 0 \\ 0 & 0 & J_{zz} \end{bmatrix}. \quad (9)$$

The rotation of the propellers results in thrust forces $\mathbf{f} \triangleq [f_1 \; f_2 \; f_3 \; f_4]^T \in \mathbb{R}^4$ and, due to the drag effect, reaction torques $\mathbf{d} \triangleq [d_1 \; d_2 \; d_3 \; d_4]^T \in \mathbb{R}^4$, where:

$$f_i = k_f w_i^2, \quad (10)$$
$$d_i = (-1)^{(i+1)} k_d w_i^2, \quad (11)$$

for $i = 1, 2, 3, 4$, with $k_f$ and $k_d$ being experimentally determined constants (Vasconez et al. 2016, Piljek et al. 2020). Thus, the modulus of the thrust force $F^C \in \mathbb{R}$, along the z-axis of $\mathcal{S}_B$, and the control torque $\mathbf{T}_B^C \in \mathbb{R}^3$ are obtained by the following relations:

$$F^C = \begin{bmatrix} 1 & 1 & 1 & 1 \end{bmatrix} \mathbf{f}, \quad (12)$$
$$\mathbf{T}_B^C = \begin{bmatrix} l_1 & -l_1 & -l_1 & l_1 \\ -l_2 & l_2 & -l_2 & l_2 \\ k_{df} & -k_{df} & k_{df} & -k_{df} \end{bmatrix} \mathbf{f}, \quad (13)$$

where, as shown in Figure 3b, $l_1$ is the distance of the rotors to the x-axis of $\mathcal{S}_B$; $l_2$ is the distance of the rotors to the y-axis of $\mathcal{S}_B$; and the fractional constant $k_{df} = k_d / k_f$ scales the reaction torques. The remaining vectors are $\mathbf{F}_B^C \triangleq [0 \; 0 \; F^C]^T \in \mathbb{R}^3$ and $\mathbf{F}_R^g \triangleq -m\,[0 \; 0 \; g]^T \in \mathbb{R}^3$, where the latter represents the force due to the action of gravity on the center of mass of the MAV.

The attitude kinematics is defined in terms of the following quaternion-dependent matrix:

$$\boldsymbol{\Xi}(\mathbf{q}) = \begin{bmatrix} q_4 \mathbf{I}_3 + [\mathbf{q}_{1:3} \times] \\ -\mathbf{q}_{1:3}^T \end{bmatrix} \in \mathbb{R}^{4 \times 3}. \quad (14)$$
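To make the model concrete, the following sketch collects Eqs. (6)-(14) into a single state-derivative function, reusing `skew` and `dcm_from_quat` from the sketch after Eq. (3). The parameter values are placeholders standing in for Table I, and the sign pattern of the first two rows of the mixer of Eq. (13) is an assumption about the rotor layout of Figure 3b; only the third row follows directly from Eq. (11).

```python
import numpy as np

# Illustrative parameters (placeholders; the paper's values are in Table I).
m, g = 1.2, 9.81                    # mass [kg], gravity [m/s^2]
J = np.diag([0.012, 0.012, 0.024])  # inertia [kg m^2], Eq. (9)
tau_p, k_p = 0.15, 1.0              # motor settling time [s] and gain
k_f, k_d, mu_d = 1.0e-5, 1.0e-7, 0.1
l1, l2 = 0.16, 0.16                 # rotor distances to x- and y-axes [m]
k_df = k_d / k_f

# Mixer of Eqs. (12)-(13); rows 1-2 signs are an assumed X layout.
MIX = np.array([[l1, -l1, -l1, l1],
                [-l2, l2, -l2, l2],
                [k_df, -k_df, k_df, -k_df]])

def xi(q: np.ndarray) -> np.ndarray:
    """Quaternion kinematics matrix of Eq. (14)."""
    out = np.zeros((4, 3))
    out[:3, :] = q[3] * np.eye(3) + skew(q[:3])
    out[3, :] = -q[:3]
    return out

def mav_dynamics(x: np.ndarray, w_bar: np.ndarray) -> np.ndarray:
    """State derivative of Eq. (6); x = [w, r_R, v_B, q, omega_B]."""
    w, v_B, q, omega = x[0:4], x[7:10], x[10:14], x[14:17]
    f = k_f * w**2                        # Eq. (10)
    F_C = np.array([0.0, 0.0, f.sum()])   # Eq. (12), assembled as F_B^C
    T_C = MIX @ f                         # Eq. (13)
    F_g = np.array([0.0, 0.0, -m * g])    # gravity expressed in S_R
    D = dcm_from_quat(q)                  # D^{B/R}(q), Eq. (1)
    w_dot = (-w + k_p * w_bar) / tau_p
    r_dot = D.T @ v_B
    v_dot = (F_C + D @ F_g - mu_d * v_B) / m - np.cross(omega, v_B)
    q_dot = 0.5 * xi(q) @ omega
    omega_dot = np.linalg.solve(J, T_C - np.cross(omega, J @ omega))
    return np.concatenate([w_dot, r_dot, v_dot, q_dot, omega_dot])
```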

The modeled MAV is based on a UAV built with the F450 frame and Table I contains the parameters adopted in the simulation.

Table I
Parameters adopted for the MAV simulation.

MAV Altitude and Attitude Control System

The altitude and attitude controllers are implemented as discrete-time approximations of Proportional-Integral-Derivative (PID) controllers designed around a linearized operating point, since the system is governed by nonlinear equations. The discrete-time PID controller is formulated by the following equation:

$$u_k = P\, e_k + I \sum_{j=0}^{k} T_s\, e_j + D\, \frac{e_k - e_{k-1}}{T_s}, \quad (15)$$

where $e_k$ is the computed error; $k$ represents the $k$-th discrete-time sample; $T_s$ is the sampling time; and $P$, $I$ and $D$ are the tuned constants of the controller. In this work, the gains of the PID controllers were manually tuned.
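A minimal implementation of Eq. (15) is sketched below. Saturation and anti-windup handling are not detailed in the text and are therefore omitted.

```python
class DiscretePID:
    """Discrete-time PID of Eq. (15):
    u_k = P*e_k + I*Ts*sum(e_j) + D*(e_k - e_{k-1})/Ts."""

    def __init__(self, P: float, I: float, D: float, Ts: float):
        self.P, self.I, self.D, self.Ts = P, I, D, Ts
        self.acc = 0.0       # running sum of errors (integral term)
        self.e_prev = 0.0    # previous error (derivative term)

    def step(self, e: float) -> float:
        self.acc += e
        u = (self.P * e
             + self.I * self.Ts * self.acc
             + self.D * (e - self.e_prev) / self.Ts)
        self.e_prev = e
        return u
```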

The equilibrium point, about which the system is linearized, corresponds to the MAV at constant position and attitude, with zero linear and angular velocities and with the plane of the propellers parallel to the xy plane of $\mathcal{S}_R$. For this, all the propellers must have the same rotation speed and the sum of the forces produced must be equal, in modulus, to the total weight of the aircraft (Bouabdallah et al. 2004, Hussein & Abdallah 2018, Hasseni et al. 2019). Therefore, the total thrust at the equilibrium implies that:

$$\sum_{i=1}^{4} f_i = m g, \quad (16)$$

where, for equal thrust from each propeller, $f_i = \frac{1}{4} m g$. Consequently, from Eq. (10), the rotation command vector of the propellers at the equilibrium point is computed by:

$$\mathbf{w}_{eq} = \sqrt{\frac{m g}{4 k_f}} \begin{bmatrix} 1 & 1 & 1 & 1 \end{bmatrix}^T \in \mathbb{R}^4. \quad (17)$$

The altitude control aims to follow the reference $\bar{h}$ given by the game controller, as shown in Figure 2. The altitude error is given by:

$$e_{h,k} = \bar{h}_k - r_{z,k} \in \mathbb{R}, \quad (18)$$

and the control signal $u_{h,k}$ is given by Eq. (15), with $P_h = 5$, $I_h = 10$ and $D_h = 50 \times 10^{-3}$, which results in:

$$\delta \mathbf{w}_h = u_{h,k} \begin{bmatrix} 1 & 1 & 1 & 1 \end{bmatrix}^T \in \mathbb{R}^4. \quad (19)$$
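Under the same assumptions, the altitude loop of Eqs. (16)-(19) reduces to a few lines, reusing `DiscretePID` and the parameters above (the exponent of $D_h$ is reconstructed from a garbled source and should be read accordingly):

```python
import numpy as np

Ts = 1e-3                                        # simulation step of 1 ms
w_eq = np.sqrt(m * g / (4 * k_f)) * np.ones(4)   # Eq. (17)
pid_h = DiscretePID(P=5.0, I=10.0, D=50e-3, Ts=Ts)

def altitude_command(h_ref: float, r_z: float) -> np.ndarray:
    """Eqs. (18)-(19): the same speed increment applied to all four rotors."""
    u_h = pid_h.step(h_ref - r_z)
    return u_h * np.ones(4)
```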

Attitude control uses the command references $\bar{\boldsymbol{\alpha}}$ given by the game controller, as shown in Figure 2. These commands are the user inputs obtained by manipulating the game controller, shown in Figure 1b, and have the following physical interpretation: $\bar{\phi}$ commands the lateral movement; $\bar{\theta}$ commands the forward/backward movement; and $\bar{\psi}$ defines the MAV orientation. The MAV lateral and forward/backward movement commands must be considered in alignment with the vehicle's current orientation.

Calculating the difference between two attitudes is not a simple subtraction of two values, as in the case of the difference between altitudes. Operating with differences of attitude representations requires preserving the properties of the DCM and, in this work, this operation is performed with both values converted to quaternions. Thus, the attitude reference $\bar{\boldsymbol{\alpha}}$ needs to be converted to the quaternion representation $\bar{\mathbf{q}}$ with Eq. (4). Taking $\mathbf{q}$ as the actual attitude of the drone and $\bar{\mathbf{q}}$ as the reference attitude to be followed, and using the definition given by Eq. (2) with property (P5), the attitude error in quaternion form is obtained by (Younes & Mortari 2019):

$$\mathbf{q}_{e,k}^{\alpha} = \bar{\mathbf{q}}_k \otimes \mathbf{q}_k^{-1}. \quad (20)$$

The quaternion error $\mathbf{q}_e^{\alpha}$, given by Eq. (20), is converted to the Euler angles representation using Eq. (5), resulting in $\mathbf{e}_{\alpha,k} \triangleq [e_{\phi,k} \; e_{\theta,k} \; e_{\psi,k}]^T \in \mathbb{R}^3$. The attitude error $\mathbf{e}_{\alpha,k}$ is used to compute the attitude control vector $\mathbf{u}_{\alpha,k} \in \mathbb{R}^3$, using Eq. (15). The attitude control vector $\mathbf{u}_{\alpha}$ represents the torque necessary to align the MAV according to the command $\bar{\boldsymbol{\alpha}}$. The following gains are adopted for the attitude controller: $P_{\alpha\phi} = P_{\alpha\theta} = P_{\alpha\psi} = 10^4$, $I_{\alpha\phi} = I_{\alpha\theta} = I_{\alpha\psi} = 500$ and $D_{\alpha\phi} = D_{\alpha\theta} = D_{\alpha\psi} = 500$. The force variation $\delta \mathbf{f}$ of each propeller is then obtained by isolating $\mathbf{f}$ in Eq. (13), setting $\mathbf{T}_B^C = \mathbf{u}_{\alpha,k}$. Thus, from Eq. (10), the attitude control input is given by:

$$\delta \mathbf{w}_{\alpha} = \frac{1}{k_f} \begin{bmatrix} \delta f_1 & \delta f_2 & \delta f_3 & \delta f_4 \end{bmatrix}^T \in \mathbb{R}^4. \quad (21)$$

Finally, using Eqs. (17), (19) and (21), the command $\bar{\mathbf{w}}$ is computed by:

$$\bar{\mathbf{w}} = \mathbf{w}_{eq} + \delta \mathbf{w}_h + \delta \mathbf{w}_{\alpha}, \quad (22)$$

as shown in Figure 2. The commands $\delta \mathbf{w}_h$ and $\delta \mathbf{w}_{\alpha}$ are limited to 30% of $w_{max}$.
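The complete command of Eqs. (20)-(22) can then be sketched as follows, reusing the earlier helpers. Isolating $\delta\mathbf{f}$ from the non-square mixer of Eq. (13) is done here with a Moore-Penrose pseudo-inverse, which is one possible reading of the text and is flagged as an assumption:

```python
import numpy as np

pid_att = [DiscretePID(P=1e4, I=500.0, D=500.0, Ts=Ts) for _ in range(3)]

def quat_inv(q: np.ndarray) -> np.ndarray:
    """Property (P5); for unit quaternions this reduces to the conjugate (P4)."""
    return np.array([-q[0], -q[1], -q[2], q[3]]) / (q @ q)

def control_update(alpha_ref, h_ref, q, r_z, w_max):
    """One control step combining Eqs. (17)-(22)."""
    # Attitude error quaternion, Eq. (20), mapped to Euler angles via Eq. (5).
    q_ref = quat_from_euler(*alpha_ref)
    e_att = euler_from_quat(quat_mul(q_ref, quat_inv(q)))
    u_att = np.array([pid.step(e) for pid, e in zip(pid_att, e_att)])
    # Isolate delta-f from Eq. (13) (assumption: pseudo-inverse of the mixer),
    # then map forces to speed increments as in Eq. (21).
    delta_f = np.linalg.pinv(MIX) @ u_att
    delta_w_alpha = delta_f / k_f
    delta_w_h = altitude_command(h_ref, r_z)
    # Saturate both increments to 30% of w_max, then compose Eq. (22).
    delta_w_h = np.clip(delta_w_h, -0.3 * w_max, 0.3 * w_max)
    delta_w_alpha = np.clip(delta_w_alpha, -0.3 * w_max, 0.3 * w_max)
    return w_eq + delta_w_h + delta_w_alpha
```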

Augmented Reality Visualization Module

The AR Visualization Module is software developed with the ARToolKit, which is integrated into the Unity 3D game engine. It runs on a smartphone device and promotes real-time integration between the 3D objects and the images captured by the smartphone camera, according to a programmed logic. The smartphone screen works as a camera that lets the user see the mixed world provided by the AR experience.

The AR technology provided by the ARToolkit has robust accuracy in the registration of 3D objects in the real world. This makes it possible for the user to move in the real environment and see the projection just as if a real object were present in the environment. In this way, the user can change the point of view of the 3D objects, registered in the real world, by moving the smartphone in position and orientation. This ability to perceive the relative movement of the user in the real world results from the sensory fusion of the signals from the inertial sensors available on the smartphone with the image processing, which uses Artificial Intelligence resources. The user interacts with the 3D objects by touching the smartphone screen or by manipulating the game controller. All of these features are provided as part of Unity 3D.

The construction of the 3D world within Unity 3D uses the metric system for positioning and measuring objects, and the attitude is expressed in degrees, using Euler angles. To integrate the simulation with the AR software, the simulation needs to send the MAV position and attitude data to the AR Visualization Module (running on the smartphone) at every instant of time, so that the spatial configuration of the 3D environment is updated. Unity 3D adopts a left-handed CCS, with the x-axis pointing forward, the y-axis pointing upward and the z-axis completing the left-hand rule. The angles are also computed using the left-hand rule.

The virtual environment uses two CCSs to define the relative positions between the scenery and the 3D objects: 1) the global reference is given by the CCS $\mathcal{S}_O$, which is fixed in the real world and is defined at the position of the smartphone when the AR module starts; 2) the movable CCS $\mathcal{S}_V$ represents the reference position to which all 3D objects are related, and can be repositioned at the user's preference. The CCSs $\mathcal{S}_O$ and $\mathcal{S}_V$ are initially aligned and share the same origin, but $\mathcal{S}_V$ can be moved by user action in order to change the relative pose of the virtual objects in the real world.

To place the simulation in the virtual environment, it is necessary to establish a relationship between the coordinate systems of the simulation and of the virtual environment. In the simulation, the MAV pose ($\mathcal{S}_B$) is related to the reference $\mathcal{S}_R$ and, to allow the repositioning of the simulated system in the real world, an equivalence between $\mathcal{S}_R$ and $\mathcal{S}_V$ is defined. Nevertheless, the position and orientation of the 3D objects in the AR world must be related to $\mathcal{S}_O$. In fact, as $\mathcal{S}_V$ represents $\mathcal{S}_R$ in the virtual environment, each position or attitude received from the simulation needs to be converted from $\mathcal{S}_V$ to $\mathcal{S}_O$. Thus, the simulated MAV position $\mathbf{r}_R$ is converted to the virtual environment global position by:

$$\mathbf{p}_O = \mathbf{s}_O + \mathbf{D}^{O/V} \mathbf{r}_V, \quad (23)$$

where $\mathbf{p}_O \triangleq [p_x \; p_y \; p_z]^T \in \mathbb{R}^3$ is the composed position of the MAV in the AR environment, expressed in $\mathcal{S}_O$; $\mathbf{r}_V \triangleq [r_x \; r_z \; r_y]^T \in \mathbb{R}^3$ is the converted MAV simulated position ($\mathbf{r}_R$), expressed in $\mathcal{S}_V$; $\mathbf{s}_O$ is the position of $\mathcal{S}_V$ w.r.t. $\mathcal{S}_O$, expressed in $\mathcal{S}_O$; and $\mathbf{D}^{O/V}$ is the DCM that converts from $\mathcal{S}_V$ to $\mathcal{S}_O$. The $\mathbf{s}_O$ and $\mathbf{D}^{O/V}$ parameters are dynamically computed by the game engine and can be accessed by the application.

The MAV attitude $\boldsymbol{\alpha}$ is converted to the virtual environment global attitude $\boldsymbol{\gamma} \triangleq [\gamma_x \; \gamma_y \; \gamma_z]^T \in \mathbb{R}^3$ by:

$$\boldsymbol{\gamma} = \boldsymbol{\beta} + \boldsymbol{\alpha}', \quad (24)$$

where $\boldsymbol{\alpha}' \triangleq \frac{180}{\pi}\,[\phi \; \psi \; \theta]^T \in \mathbb{R}^3$ is the MAV simulated attitude ($\boldsymbol{\alpha}$), converted to the left-handed representation and expressed in degrees; and $\boldsymbol{\beta}$ is the attitude of $\mathcal{S}_V$ w.r.t. $\mathcal{S}_O$, obtained as a property from the game engine. The vector relationships that result in the global position and attitude are presented in Figure 4.

Figure 4
Vector relationship of the CCSs for positioning the 3D objects, according to the relationship established by the game engine and the simulation system. Illustration of the possibility of user movement in the real world, with respect to $\mathcal{S}_V$ and $\mathcal{S}_O$.
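On the AR side, the hand-off of Eqs. (23) and (24) can be sketched as below. The names `s_O`, `D_OV` and `beta` stand for the pose properties exposed by the game engine; they are placeholders, not Unity API identifiers.

```python
import numpy as np

def to_unity_pose(r_R, alpha, s_O, D_OV, beta):
    """Convert the simulated pose (right-handed S_R, radians) to the
    AR global frame S_O (left-handed, degrees), per Eqs. (23)-(24)."""
    rx, ry, rz = r_R
    phi, theta, psi = alpha
    r_V = np.array([rx, rz, ry])                 # swap y/z for the left-handed CCS
    p_O = s_O + D_OV @ r_V                       # Eq. (23)
    alpha_prime = np.degrees([phi, psi, theta])  # reordered and in degrees
    gamma = beta + alpha_prime                   # Eq. (24)
    return p_O, gamma
```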

Summarizing the workflow, the program execution consists of: 1) choosing a real environment for the projection of the simulation; 2) loading both applications, the AR Visualization Module (with the smartphone at a convenient start position) and the MAV Simulation Module; 3) setting the AR visualization port for the external connection; 4) setting this TCP port in the MAV Simulation Module; 5) starting the simulation; 6) navigating the MAV using the game controller; and, if necessary, 7) adjusting the origin $\mathcal{S}_R$ to a convenient position during the execution, by changing the $\mathcal{S}_V$ position.

RESULTS

The evaluation of the AR Visualization Module consists of analyzing the consistency of the simulation through its virtual representation and the user experience. In the tests carried out, the simulation (numerical integration and control) was executed with a fixed step of $T_s = 1$ ms.

To evaluate the consistency between the simulated model and the visualization, the first step was to determine the dimensional congruence of the AR environment. Predefined positions and attitudes were applied to the MAV w.r.t. $\mathcal{S}_R$, and the 3D model response was evaluated. To this end, positions of ±0.5 m, ±1.0 m and ±2.0 m were set along each axis, and attitudes of ±30°, ±45°, ±90° and ±180° were set around each axis. The results showed that the AR environment is consistent with the idealized values. However, in more practical situations, positions too far from the camera's location can cause distortions in the image's perspective. Furthermore, setting positions below the local ground level may produce unrealistic projections.

Figure 5a to f demonstrates the proposed AR system used in indoor and outdoor environments. Each image is the composition of three elements of the AR system: the real-world scenario, the 3D MAV and a system of coordinate axes representing the origin $\mathcal{S}_R$ of the simulated system. In general, the simulation and visualization systems presented satisfactory results with respect to usability, dynamic response and interaction capacity.

Figure 5
Real environment with projected 3D objects, composing the Augmented Reality visualization system. a) Indoor initial position. b) Indoor flight scene. c) Indoor change of scene point of view. d) Outdoor initial position. e) Outdoor flight scene. f) Outdoor change of scene point of view.

The use of this AR system in indoor environments generally provides better lighting uniformity for the 3D objects, since, in this situation, the light projection tends to be better distributed. The lighting system in the Unity 3D-ARToolkit estimates the direction and intensity of the ambient light to define the lighting of the 3D objects in the scene, so an open environment tends to be more difficult for calibrating and positioning the lights of the 3D world.

Although the indoor environment has better lighting uniformity, in many cases it has smooth textures on the floor, walls and objects, which jeopardizes the tracking process. The estimation of the smartphone motion by the game engine and its ARToolkit depends on the environment texture for the image processing algorithm. A rough texture improves the results, while fine-textured environments can cause an uncontrolled drift in the estimated position of the 3D objects in relation to the real world.

For the evaluation of the user experience, it is desirable that it be compatible with the use of an equivalent real device. Seeing the environment through the smartphone screen gives the user a real sense of integration between the real and the virtual. This capacity was achieved especially in textured environments, where the size and position of the virtual objects registered in the real world proved to be more coherent. With this tool it is possible, for example, to move around the objects and see them from different points of view.

The communication delay can affect the experience of the simulation. To minimize this effect, the smartphone is configured as a hotspot and the desktop computer connects directly to it.

In general, the proposal met the desired operational requirements, and the final system allowed the simulation of a drone in augmented reality through the interaction between a simulation, which incorporates the physics and electronics of a drone, and a visualization system with the virtual representation integrated into the real environment. The use of augmented reality still has some limitations; however, recent developments in this area have made it possible to use it beyond research laboratories, allowing this technology to be incorporated into various applications without the need for a specially prepared environment.

DISCUSSION

The growth of augmented reality technologies and tools has allowed their use in several areas and has been the focus of investments by several companies. Augmented Reality presents itself as a disruptive interface paradigm, whose flexibility allows combining the possibilities arising from computer graphics with immersion in a real environment. This work combined the use of this tool with the growing demand for drones, providing a visualization application that can be useful in several stages of the design of MAVs.

The proposed application fulfilled the initial objectives: the AR system allows the exploration of a simulation not only from the point of view of numerical evaluation and performance criteria, but also provides an understanding of the system's behavior through an immersive experience in the real world.

The results demonstrated the feasibility of using Augmented Reality tools to extend the possibilities of evaluating MAV projects based only on simulation. The set of development tools presented by the Unity 3D game engine accelerated the software production process, and the availability and quality of the sensors incorporated into smartphones made them an excellent hardware alternative for simpler applications. The combination of both technologies, accessible to the general public at a relatively low cost, allowed the development of an easy-to-use tool to aid the development of applications with MAVs, with possibilities of extension to other areas, such as MAV flight training and education. In addition, its interactive interface allows the inclusion of non-technical professionals in the development process, providing a usage experience that resembles the real product.

This work can also be expanded and adapted to similar projects. In future versions, we intend to adapt this project to equipment dedicated to Augmented Reality immersion, with more resources. A comparison of the simulated system behavior with a real device is also planned, evaluating both performance and user experience.

ACKNOWLEDGMENTS

The author E. A. Moura acknowledges the financial support of Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), through doctoral scholarship number 88882.180838/2018-01 from the Programa de Excelência Acadêmica (PROEX).

  • 1
    To calculate the values of $\arctan(\cdot)$, it is preferable to use numeric functions of the type atan2(numerator, denominator).

REFERENCES

  • APPLE. 2022. IOS - Augmented Reality. Available at: https://www.apple.com/augmented-reality/
    » https://www.apple.com/augmented-reality/
  • ASAAD RR. 2021. Virtual reality and augmented reality technologies: A closer look. In: Int Res J Sci Tech Educ Mgmt 1: 10. doi.org/10.5281/zenodo.5652296.
    » https://doi.org/10.5281/zenodo.5652296.
  • AZUMA RT. 1997. A survey of augmented reality. Pres Teleoper Virtual Environ 6(4): 355-385. doi: 10.1162/pres.1997.6.4.355.
  • AZUMA RT, BAILLOT Y, BEHRINGER R, FEINER S, JULIER S & MACINTYRE B. 2001. Recent advances in augmented reality. IEEE Comput Graph Appl 21(6): 34-47. ISSN: 1558-1756. DOI: 10.1109/38.963459.
  • BILLINGHURST M, CLARK A & LEE G. 2015. A Survey of Augmented Reality. In: Found Trends Hum-Comput Interact 8(2-3): 73-272. doi: 10.1561/1100000049.
  • BOUABDALLAH S, NOTH A & SIEGWART R. 2004. PID vs LQ control techniques applied to an indoor micro quadrotor. IEEE/RSJ IROS 2004 3: 2451-2456. doi: 10.1109/iros.2004.1389776.
  • BRESCIANI T. 2008. Modelling, identification and control of a quadrotor helicopter. Master’s thesis, Lund University. ISSN: 0280-5316. Available at: https://lup.lub.lu.se/student-papers/record/8847641/file/8859343.pdf
    » https://lup.lub.lu.se/student-papers/record/8847641/file/8859343.pdf
  • CARMIGNIANI J & FURHT B. 2011. Augmented Reality: An Overview. In: FURHT B (Ed), Handbook of Aug Real 3-6. ISBN: 978-1-4614-0064-6. doi.org/10.1007/978-1-4614-0064-6_1.
  • GOOGLE. 2022. Google AR & VR. Available at: https://arvr.google.com/
    » https://arvr.google.com/
  • HASSENI SEI, ABDOU L & GLIDA HE. 2019. Parameters tuning of a quadrotor PID controllers by using nature-inspired algorithms. Evolutionary Intelligence 14(1): 61-73. doi: 10.1007/s12065-019-00312-8.
  • HOFF WA, NGUYEN K & LYON T. 1996. Computer-Vision-Based Registration Techniques for Augmented Reality. Intel Robots and Comp Vision XV: Algorithms, Techniques, Active Vision, and Materials Handling 2904: 538-548. International Society for Optics and Photonics. doi: 10.1117/12.256311.
  • HUSSEIN A & ABDALLAH R. 2018. Autopilot design for a quadcopter. Khartoum: University Of Khartoum. doi: 10.13140/RG.2.2.10309.91364.
  • IDRISSI M, SALAMI M & ANNAZ F. 2022. A review of quadrotor unmanned aerial vehicles: Applications, architectural design and control algorithms. J Intell Robot Syst 104: 22. doi: 10.1007/s10846-021-01527-7.
  • JALON JG & BAYO E. 1994. Kinematic and Dynamic Simulation of Multibody Systems. Springer-Verlag. ISBN: 978-1-4612-7601-2. doi: 10.1007/978-1-4612-2600-0.
  • KATO H & BILLINGHURST M. 1999. Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In: Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR’99), p. 85-94. doi: 10.1109/IWAR.1999.803809.
  • KHANDELWAL P, SRINIVASAN K & ROY SS. 2019. Surgical education using artificial intelligence, augmented reality and machine learning: A review. 2019 IEEE Int Conf Consumer Electr 1-2. doi.org/10.1109/ICCE-TW46550.2019.8991792.
  • KUPPALA K, BANDA S & BARIGE TR. 2020. An overview of deep learning methods for image registration with focus on feature-based approaches. Inter J Image and Data Fusion. doi: 10.1080/19479832.2019.1707720.
  • LINOWES J & BABILINSKI K. 2017. Augmented Reality for Developers. Packt Publishing, 548 p. ISBN: 9781787286436.
  • MAHONY R, KUMAR V & CORKE P. 2012. Multirotor aerial vehicles: Modeling, estimation, and control of quadrotor. IEEE Robot Autom Mag 19(3): 20-32. ISSN: 1558-223X. doi: 10.1109/MRA.2012.2206474.
  • MAIRAJ A, BABA AI & JAVAID AY. 2019. Application specific drone simulators: Recent advances and challenges. Sim Mod Prac & Theory 94: 100-117. ISSN: 1569-190X. doi: 10.1016/j.simpat.2019.01.004.
  • MARKLEY FL & CRASSIDIS JL. 2014. Fundamentals of Spacecraft Attitude Determination and Control. Springer New York, XV, 486 p. ISBN: 978-1-4939-0801-1. doi: 10.1007/978-1-4939-0802-8.
  • NOWACKI P & WODA M. 2019. Capabilities of ARcore and ARkit platforms for AR/VR applications. Eng Depend Comp Syst & Net 987: 358-370. ISBN 978-3-030-19501-4. doi: 10.1007/978-3-030-19501-4_36.
  • OLIVEIRA MDLC. 2011. Modeling, identification and control of a quadrotor aircraft. Master’s thesis, Czech Technical University in Prague. doi: 10.13140/RG.2.1.1344.2409.
  • PILJEK P, KOTARSKI D & KRZNAR M. 2020. Method for characterization of a multirotor UAV electric propulsion system. Appl Sci 10(22): 18. doi: 10.3390/app10228229.
  • PTC - DIGITAL TRANSFORMS PHYSICAL. 2022. Vuforia enterprise augmented reality (AR) software: PTC. Available at: http://vuforia.com/
    » http://vuforia.com/
  • RADU I. 2014. Augmented reality in education: a meta-review and cross-media analysis. Pers Ubiquitous Comput 18(6): 1533-1543. DOI: 10.1007/s00779-013-0747-y.
  • STEVENS BL, LEWIS FL & JOHNSON EN. 2015. Aircraft control and simulation. J Wiley & Sons, 3 ed., 749 p. DOI: 10.1002/9781119174882.
  • VASCONEZ J, GREWAL JP, LEONARDO E, MARWA M, GUZMAN CA & MANKBADI RR. 2016. Propulsion system design for micro aerial vehicles. In: AIAA Atmospheric Flight Mechanics Conference. American Institute of Aeronautics and Astronautics 1: 18. DOI: 10.2514/6.2016-3714.
  • YOUNES AB & MORTARI D. 2019. Derivation of all attitude error governing equations for attitude filtering and control. Sensors 19(21): 4682. doi.org/10.3390/s19214682.
    » https://doi.org/10.3390/s19214682.

Publication Dates

  • Publication in this collection
    27 May 2024
  • Date of issue
    2024

History

  • Received
    04 Oct 2022
  • Accepted
    09 July 2023