
Intuitive Drone Control using Motion Matching between a Controller and a Drone
  • Huhn Kim : Department of Mechanical System Design Engineering, Professor, Seoul National University of Science and Technology, Seoul, Korea
  • Wonjoo Chang : Department of Design and Engineering, Student, Seoul National University of Science and Technology, Seoul, Korea

Background In recent times, drones have been widely utilized for various purposes. In particular, drone control from a first-person view (FPV), in which the pilot controls the drone as if riding in the drone's cockpit by using a display device or a head-mounted display, is becoming increasingly popular. Therefore, a controller for safe and convenient FPV drone control is necessary.

Methods This study investigates the effectiveness of a motion controller, which maps its own movements onto the drone's movements, compared with that of a conventional joystick controller. We designed and developed a motion-matching controller and drone for experimental evaluation. In the experiment, participants performed the task of maneuvering a drone from origin to destination on a given course with the developed motion and joystick controllers.

Results The experimental results showed that the motion-matching controller was superior to the joystick controller in terms of task success rate, number of collisions, task completion time, number of failed attempts, and subjective evaluation, particularly in the FPV mode. Notably, participants could perform complex manipulations that require controlling two or more axes simultaneously.

Conclusions The motion controller can be employed to enable improved intuitiveness and usability for personal or industrial applications that require drones to be operated in the FPV mode.

Keywords:
Drone Controller, Motion Matching, First-person View, Reference-frame Misalignment.
pISSN: 1226-8046
eISSN: 2288-2987
Publisher: Korean Society of Design Science (한국디자인학회)
Received: 23 Jul, 2021
Revised: 30 Sep, 2021
Accepted: 23 Oct, 2021
Printed: 28 Feb, 2022
Volume: 35 Issue: 1
Page: 93 ~ 113
DOI: https://doi.org/10.15187/adr.2022.02.35.1.93
Corresponding Author: Wonjoo Chang (wonjoo.chang@n15.asia)

Citation: Kim, H., & Chang, W. (2022). Intuitive Drone Control using Motion Matching between a Controller and a Drone. Archives of Design Research, 35(1), 93-113.

Copyright : This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted educational and non-commercial use, provided the original work is properly cited.

1. Introduction

“Drone” is the name given by the military to an unmanned aerial vehicle (UAV). In recent times, helicopter-type drones equipped with multiple rotors have been commercialized and used in various fields such as photography, agriculture, distribution, and disaster management, as well as for leisure (Shin et al., 2014). However, with the widespread use of these devices, drone-related accidents are increasing. The accident rate and the possibility of property damage and injury from drones are known to be approximately 10 and 100 times greater, respectively, than those for aircraft (Weibel & Hansman, 2006). The most frequent cause of drone accidents is poor handling by pilots (Williams, 2004; Hing & Oh, 2009). Several cases of deaths or serious injuries caused by pilot error have been reported overseas (Cho et al., 2016). Therefore, it is very important to design a controller that pilots can use easily and safely to control drones and reduce the risk of accidents (Hing & Oh, 2009; Cho et al., 2016).

There are three main factors that contribute to the difficulty of controlling drones. First, it is difficult for pilots to maintain situational awareness owing to the limited availability of sensory information. Because the pilot is physically separated from the drone, the drone must be manipulated using visual information alone, without information from the vestibular, tactile, and auditory senses (McCarley & Wickens, 2005; Lam et al., 2007; Hing et al., 2009). Lam et al. (2007) showed that supplementing visual information with a haptic interface increases workload and control activity compared with relying exclusively on visual information, but effectively reduces collision frequency and improves stability. In addition, providing vestibular feedback integrated with visual feedback on the degree of drone rotation helps the pilot's situational awareness (Giordano et al., 2010).

Second, control difficulties arise from inconsistencies between the viewpoints of the drone and the pilot. The drone pilot generally has access to one of two viewpoints. One is the third-person view (TPV), in which the pilot controls the drone while visually observing it from the ground. The other is the first-person view (FPV), in which the pilot controls the drone as if riding in the drone's cockpit. The FPV corresponds to the concept of egocentric view or internal piloting, whereas the TPV corresponds to that of allocentric view or external piloting (Brewer & Pears, 1993; Soechting et al., 1996). Drone manipulation from the FPV is challenging in that the pilot's perception of the drone's situation may be insufficient or inaccurate owing to limited visibility, slow control response and feedback, and lack of sensory information (Williams, 2006; Hing & Oh, 2009).

The pilot attempts to manipulate the drone assuming that his/her forward direction and the forward direction of the drone are identical. However, when the drone rotates in the TPV mode, this is not the case. Because the forward direction of the drone then differs from that assumed by the pilot, the pilot's intended direction of manipulation and the drone's actual movement often differ. This is called the reference-frame misalignment problem between controllers and drones (Williams, 2006). For example, if a drone is at an angle of 90˚ to the left with respect to the forward direction of the pilot, as shown in Figure 1, the drone will move toward the left of the pilot when the pilot commands it to move forward. In this case, the pilot must control the drone by consciously considering its perspective (Cho et al., 2016). Thus, if the pilot wants the drone to move straight in his/her forward direction, he/she must command it to move to the right. Such mental rotation is known to be cognitively demanding and can cause numerous operation errors (Arthur & Hancock, 2001; Gugerty & Brooks, 2004; McCarley & Wickens, 2005; Williams, 2006). In particular, the greater the angle of the required mental rotation, the worse the drone operation performance (Aretz & Wickens, 1992; Gugerty & Brooks, 2004). The reference-frame misalignment problem can be solved by automating the drone control, by a function that always aligns the front of the drone with the front of the pilot (i.e., a headless function), or by drone manipulation from the FPV (Williams, 2006).
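To make the misalignment concrete, the following minimal sketch (illustrative only, not part of the original study) rotates a command given in the drone's body frame by the drone's heading offset to show the motion the pilot actually observes; pre-rotating the command by the opposite angle corresponds to the headless compensation mentioned above.

    #include <math.h>
    #include <stdio.h>

    #define PI 3.14159265358979323846

    /* A joystick command (fwd, right) is interpreted in the drone's body frame.
     * If the drone's heading is offset from the pilot's by yaw_deg (positive to
     * the pilot's right), the motion the pilot sees is the command rotated by
     * that offset. */
    static void seen_by_pilot(double fwd, double right, double yaw_deg,
                              double *pilot_fwd, double *pilot_right)
    {
        double yaw = yaw_deg * PI / 180.0;
        *pilot_fwd   = fwd * cos(yaw) - right * sin(yaw);
        *pilot_right = fwd * sin(yaw) + right * cos(yaw);
    }

    int main(void)
    {
        double f, r;

        /* Drone rotated 90 degrees to the pilot's left (Figure 1):
         * a pure "forward" command moves the drone to the pilot's left. */
        seen_by_pilot(1.0, 0.0, -90.0, &f, &r);
        printf("forward command -> fwd=%.1f right=%.1f\n", f, r);  /* 0.0, -1.0 */

        /* Headless-style compensation: to go straight ahead in the pilot's
         * frame, the pilot must command "right" in the drone's frame. */
        seen_by_pilot(0.0, 1.0, -90.0, &f, &r);
        printf("right command   -> fwd=%.1f right=%.1f\n", f, r);  /* 1.0, 0.0 */
        return 0;
    }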


Figure 1 Reference-frame misalignment problem when controlling a drone

Third, owing to its lack of intuitiveness, the joystick controller, which is the most common type of drone controller, shown in Figure 2 (left), is difficult to operate. Drones move by propeller propulsion, and their attitude is controlled by rotation about three axes. A joystick controller uses the left stick to control the yaw and throttle for rotation and up-and-down movement, while the right stick controls the pitch and roll to move the drone forward, backward, left, or right. However, as mentioned above, drone control using a joystick is not intuitive, and it requires considerable training time for precise flight (Higuchi & Rekimoto, 2013). This is because drone control requires three-dimensional movement, unlike the two-dimensional movement common to other joystick-controlled devices such as remote-controlled (RC) cars and model ships. Essentially, learning and manipulation are challenging because it is difficult to cognitively map the two-dimensional motion of a joystick onto the three-dimensional motion of a drone. In particular, figure-eight flight, which requires the simultaneous operation of three or more axes including the yaw axis while maintaining the drone's altitude, is very difficult and requires considerable learning. Unlike the two control difficulties mentioned above, the joystick controller makes it difficult to manipulate the drone regardless of the pilot's viewpoint, i.e., TPV or FPV. Thus, it is very important to improve the drone controller to allow intuitive control, and several related studies are underway (Vincenzi et al., 2015). The types of intuitive drone controllers that are superior to joysticks can be broadly classified into gesture, pointing, and motion-control methods. Studies have also been conducted to naturally manipulate drones using the human gaze or voice (Hansen et al., 2014; Yu et al., 2014; Fayjie et al., 2017; Menshchikov et al., 2019), or even brain signals (Jeong et al., 2020; Chen et al., 2020).
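For reference, the stick-to-axis assignment described above can be summarized in a short sketch (hypothetical type and field names; stick deflections assumed to be normalized to [-1, 1]).

    /* Illustrative mapping of the conventional two-stick layout described above. */
    typedef struct {
        float left_x, left_y;    /* left stick: horizontal, vertical  */
        float right_x, right_y;  /* right stick: horizontal, vertical */
    } StickInput;

    typedef struct {
        float throttle;  /* climb / descend          */
        float yaw;       /* rotate left / right      */
        float pitch;     /* move forward / backward  */
        float roll;      /* move left / right        */
    } ControlChannels;

    static ControlChannels map_sticks(StickInput s)
    {
        ControlChannels c;
        c.throttle = s.left_y;   /* left stick up/down     -> throttle */
        c.yaw      = s.left_x;   /* left stick left/right  -> yaw      */
        c.pitch    = s.right_y;  /* right stick up/down    -> pitch    */
        c.roll     = s.right_x;  /* right stick left/right -> roll     */
        return c;
    }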


Figure 2 Joystick drone controller

Gesture control is a method of manipulating an unmanned device by mapping specific movements of the operator’s hand, arm, or head to the movement of the device. For example, Higuchi and Rekimoto (2013) proposed a manipulation method in which a drone moves forward when the pilot’s head is lowered, rotates when his or her head is rotated, and descends when his or her body is in a downward position. Several studies have proposed systems for controlling a drone using the pilot's hand and finger gestures recognized by a Leap Motion device (Graff, 2016; Fernández et al., 2016; Sarkar et al., 2016; Gubcsi & Zsedrovits, 2018; Zhao et al., 2018; Bandala et al., 2019). Gesture-based natural controls are more intuitive and efficient than joystick controls in indoor environments (Lambrecht et al., 2011; Sanna et al., 2013; Mashood et al., 2015). Rognon et al. (2018) proposed a soft exoskeleton to control a drone with upper body gestures, and they showed that participants using the exoskeleton felt more immersed, had a greater sensation of flying, and reported less fatigue. DelPreto and Rus (2020) developed a gesture control system for manipulating a drone using wearable muscle and motion sensors. However, while gesture control may be intuitive for specific commands that require simple manipulation, it may not be appropriate for complex manipulation (Pfeil et al., 2013). This is because the more complex the flight trajectory, the more gestures are required, and the greater the flight speed, the more difficult it becomes to change gestures quickly as the situation changes.

Pointing control is a manipulation method in which the pilot maintains a certain distance from the drone and moves it to the desired position by pointing to the corresponding point within a virtual hemisphere in front of the pilot. Chen et al. (2019) proposed an egocentric drone controller that allows pilots to arbitrarily position and rotate a flying drone using pointing interactions on a see-through mobile augmented reality (AR) display. Gromov et al. (2019) developed and validated a system for controlling the position of a drone in 3D space using pointing gestures. It is very easy to point to the desired location, but the greater the distance between the drone and the pilot, the less accurate the operation becomes. In addition, in the FPV, where the pilot's forward view is blocked by a head-mounted display (HMD), the pilot cannot accurately grasp the direction of the drone; thus, this type of control method cannot be applied in that case.

Motion control is a method of mapping the movement of the controller to the motion of the drone; the drone rotates by following the angle of movement of the controller and moves forward as the controller moves forward (Figure 3). This type of operation is natural and intuitive for the manipulation of physical objects such as drones. Shin et al. (2014) implemented a controller system that controls a drone using the motion of a smartphone with a built-in G-sensor. Cho et al. (2016) showed that automatic drone rotation synchronized with the direction of the joystick controller was highly effective because it could eliminate the misalignment problem. Morishita et al. (2016) proposed a control method that matches the movement of a drone with the head motion of a pilot wearing an HMD. Such head-synced drone control can also reduce virtual reality (VR) sickness (Watanabe & Takahashi, 2020). Motion control allows novices to learn more quickly than they would with conventional controls and enables stable operation of drones.


Figure 3 Motion-matching drone controller

Previous studies have matched some movements of the controller (e.g., rotation) to some movements of the drone, but no controller has perfectly matched the three-axis motion of the controller to the three-axis motion of the drone. Therefore, this study proposes and develops a motion-matching drone controller that completely matches the controller's motion to the drone's motion, and confirms that this controller is easy to use and intuitive, even for beginners, in both the TPV and FPV modes.

2. Development of a motion-matching drone controller

In a previous study, Chang and Kim (2018) analyzed appropriate forms of motion-matching drone controllers and presented five representative controller types, as shown in Figure 4. They assessed the usability of each type after implementing physical models with 3D printers. Their experimental results showed that users prefer the type shown in the second image in Figure 4, in which the throttle is manipulated with the thumb. Based on this, in the present study, the shape of the motion controller was improved, and a design and prototype were developed, as shown in Figure 5. The main body of the motion controller is curved so that the wrist sits at a natural angle when the controller is gripped. It is symmetrical for easy operation with either hand, and it remains stable thanks to the finger support between the forefinger and middle finger, even when the grip is loosened. The throttle dial switch is located at the center, and the two buttons responsible for sensor initialization and yaw-axis compensation are placed on either side of the dial switch. All three operating switches are placed along the line of the thumb at an angle of approximately 30˚ so that they can be operated comfortably while holding the controller. The power switch is placed below the thumb's natural reach to prevent accidental pressing, and the battery is placed at the bottom so that the center of gravity is low.


Figure 4 Representative forms of a motion-matching drone controller

Figure 5 Prototype design and final rendering of the proposed motion-matching drone controller

Figure 6 shows the hardware configuration of the developed motion-matching controller. The controller consists of a control chip for motion control, a wheel switch for throttle control, a tactile switch for initializing the motion sensor, a slide switch for power control, a battery, and an antenna. The control chip, from E2BOX Ltd., has a Cortex-M3 processor and an inertial measurement unit (IMU), the MPU6050. It is placed inside the body of the controller and controls the drone through wireless communication. The communication between the controller and the drone uses a 2.4 GHz RF transceiver, the same type of link used by a wireless joystick controller.


Figure 6 Hardware structure of the motion-matching drone controller

The embedded software of the controller and drone, which was written in the C language, functions as follows. When the user turns on the controller or presses the reset button, the controller's posture at that moment is matched to the horizontal posture of the drone facing forward; this determines the initial position shown in Figure 7. Following this, the controller continuously senses the values of the three axes (pitch, yaw, and roll) from the MPU6050 sensor together with the throttle value and transmits them to the drone. The drone then adjusts its speed by adjusting the motor rotation rate based on the received throttle value. In addition, the drone compares its current attitude with the received three-axis values and adjusts the motors accordingly to maintain the same posture as the controller at all times. Thus, the drone's posture always matches the controller's posture (Figure 7).
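The following is a minimal sketch of this control flow, not the actual firmware: the helper functions (read_imu_angles, read_throttle_dial, rf_send, rf_receive, read_current_attitude, set_motor_speeds) and the proportional correction gain are assumptions used only to illustrate the reference capture, transmission, and attitude-matching steps described above.

    /* Hypothetical helpers standing in for the real firmware API. */
    void  read_imu_angles(float *pitch, float *yaw, float *roll);
    float read_throttle_dial(void);
    void  rf_send(const void *data, unsigned len);
    int   rf_receive(void *data, unsigned len);
    void  read_current_attitude(float *pitch, float *yaw, float *roll);
    void  set_motor_speeds(float throttle, float d_pitch, float d_roll, float d_yaw);

    typedef struct {
        float pitch, yaw, roll;   /* degrees, relative to the reference posture */
        float throttle;           /* 0.0 - 1.0 from the thumb dial              */
    } ControlPacket;

    /* ---- Controller side ---- */
    static ControlPacket reference;   /* posture captured at power-on or reset */

    void on_reset_pressed(void)
    {
        /* The controller's current posture becomes the "drone level, facing
         * forward" reference (the initial position in Figure 7). */
        read_imu_angles(&reference.pitch, &reference.yaw, &reference.roll);
    }

    void controller_loop(void)
    {
        ControlPacket pkt;
        read_imu_angles(&pkt.pitch, &pkt.yaw, &pkt.roll);
        pkt.pitch -= reference.pitch;          /* angles relative to reference */
        pkt.yaw   -= reference.yaw;
        pkt.roll  -= reference.roll;
        pkt.throttle = read_throttle_dial();
        rf_send(&pkt, sizeof pkt);             /* 2.4 GHz link to the drone    */
    }

    /* ---- Drone side ---- */
    void drone_loop(void)
    {
        ControlPacket target, current;
        if (!rf_receive(&target, sizeof target))
            return;
        read_current_attitude(&current.pitch, &current.yaw, &current.roll);

        /* Drive each axis toward the controller's posture so the drone always
         * holds the same attitude as the controller (assumed P-control gain). */
        const float kp = 0.02f;
        float d_pitch = kp * (target.pitch - current.pitch);
        float d_roll  = kp * (target.roll  - current.roll);
        float d_yaw   = kp * (target.yaw   - current.yaw);

        set_motor_speeds(target.throttle, d_pitch, d_roll, d_yaw);
    }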


Figure 7 Manipulation of the motion-matching drone controller

Interestingly, the aforementioned reference-frame misalignment problem can be resolved using the motion controller. When rotating the drone, the pilot can either rotate the wrist or arm holding the motion controller or rotate his or her whole body (Figure 8). When the required rotation angle is large, the latter is more comfortable and natural than the former. If the pilot rotates their body to rotate the drone, their forward direction and the forward direction of the drone always coincide (the rightmost picture in Figure 8), thereby eliminating the reference-frame misalignment. This is similar to the headless function.


Figure 8 The relationship between the operator's posture and the solution of the reference-frame misalignment problem
3. Experiment
3. 1. Purpose

The purpose of this experiment was to verify whether the motion controller offers better usability than an existing joystick controller for actual drone steering in both the TPV and FPV modes.

3. 2. Drone and controllers

A common off-the-shelf joystick controller was used; the joystick-controlled drone was the DM002, a six-axis RC quadcopter equipped with a 600TVL FPV camera (Figure 9). The drone for the motion controller was developed by replacing only the flight controller (FC) part of the DM002 with the receiving module of the E2BOX control chip, so that the drone could be controlled by the controller's motion instead of the joystick. The weight difference between the two drones caused by the FC was only about 0.5 g, and neither drone had a hovering function to keep it in place.


Figure 9 Drones and controllers for the experiment (Left: Motion-matching drone; Right: Joystick drone)
3. 3. Participants

Twenty-two people (13 male, 9 female), aged 20-24 years (average age: 21.05 years), participated in this experiment. To accurately evaluate the intuitiveness of the controllers, all participants chosen for the experiment had no prior experience with drones. All participants had corrected visual acuity sufficient for following the driving course, and they faced no major difficulties in handling the controllers. They received $30 as compensation.

3. 4. Experimental design

The primary factors considered in the experiment were the controller (motion, joystick) and the control viewpoint of the drone (TPV, FPV). Both were within-subject factors. Each participant performed four drone control tasks. First, the drone control task in the TPV was performed with one of the two controllers, after which the same task was performed with the other controller. One week later, the same drone control tasks were performed with the two controllers in the FPV. In the FPV mode, the participants controlled the drone while wearing a Samsung Gear VR HMD, as shown in Figure 10. To avoid order effects, the order of the two controllers was counterbalanced across participants. However, the order of TPV and FPV was not counterbalanced, so that participants could first become accustomed to both controllers in the TPV mode; this was done because the FPV mode requires controlling the drone while wearing an HMD, which is a difficult task. Pilot test results showed that, despite any learning resulting from this fixed order, performance in the FPV mode was lower than in the TPV mode.


Figure 10 A participant wearing the HMD and manipulating the drone in FPV mode

The task procedure for participants was as follows. First, they learned basic drone control methods and performed three basic tasks: altitude maintenance, rotation, and linear movement. The altitude maintenance task entailed keeping the drone at a given height for 3 s. The rotation task entailed rotating the drone once to the left and once to the right within the designated rectangular area. During the rotation task, the participants were required to use body rotation, not hand or arm rotation, in order to avoid the reference-frame misalignment problem. The linear movement task entailed moving the drone beyond a designated line on its left or right side and then returning it to the original position. If a participant succeeded in the altitude maintenance task three times, the rotation task once, and the linear movement task once, their basic training was deemed complete, and the course-driving task was started. The course path used in this experiment was structured as shown in Figure 11, with reference to previous studies on drone driving (Hansen et al., 2014; Graff, 2016), such that linear movement, altitude maintenance, and rotation of the drone could all be included. The course was constructed indoors so as not to be influenced by wind, and the driving route was marked with light bars and color cones. In particular, the second corner was equipped with an obstacle so that it could be passed only by keeping the drone above a certain altitude. The course-driving task in the TPV mode could be completed without rotating the drone. However, in the FPV mode, the participants had to rotate the drone to complete the course because only the forward view can be seen through the HMD.


Figure 11 Drone driving course in the experiment

In the course-driving task, each participant attempted to achieve three successful trials within a specified amount of time (TPV: 5 min; FPV: 10 min). The success rate (failure count), driving time, and number of collisions were recorded for each trial, and the data were verified afterward from video recordings. As shown in Figure 12, the driving time was analyzed with respect to two measures. The first is the time taken for an attempt that was declared a success, i.e., the pure task completion time. The second is the total task completion time, which includes the time spent on both failed and successful attempts. The number of collisions with the ground, ceiling, walls, and pillars was also recorded. The number of wall collisions was calculated as the number of times the drone hit the wall or left the marked course for longer than 3 s. For each attempt, if the participant moved in the wrong direction or the experimenter judged that the attempt would not succeed, it was counted as a failure, and the trial was restarted from the starting point. If a participant could not achieve three successful attempts within the time limit, the task was terminated. Most participants took approximately 50 minutes to complete the task.
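As a small illustration (not from the paper), the sketch below shows how the two driving-time measures in Figure 12 can be computed from the sequence of attempts within one trial.

    #include <stdio.h>

    typedef struct {
        double seconds;
        int    success;   /* 1 = reached the goal, 0 = failed attempt */
    } Attempt;

    /* "Total" time sums failed and successful attempts; "pure" time is the
     * duration of the successful attempt alone. */
    static void completion_times(const Attempt *a, int n,
                                 double *total, double *pure)
    {
        *total = 0.0;
        *pure  = 0.0;
        for (int i = 0; i < n; i++) {
            *total += a[i].seconds;       /* failed + successful attempts  */
            if (a[i].success) {
                *pure = a[i].seconds;     /* time of the successful attempt */
                break;                    /* the trial ends at first success */
            }
        }
    }

    int main(void)
    {
        /* Example: two failed attempts followed by a success. */
        Attempt trial[] = { {18.0, 0}, {12.5, 0}, {33.0, 1} };
        double total, pure;
        completion_times(trial, 3, &total, &pure);
        printf("total = %.1f s, pure = %.1f s\n", total, pure); /* 63.5, 33.0 */
        return 0;
    }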


Figure 12 Total and pure task completion times

Surveys and interviews regarding the usability of each controller were conducted at the end of each task. As shown in Table 1, the questionnaire consisted of nine questions, with seven positive questions (1, 2, 4, 5, 6, 8, and 9) and two negative questions (3 and 7). The questionnaire was modified to fit the drone control context by referring to the NASA Task Load Index (NASA-TLX), the System Usability Scale (SUS), and the Usefulness, Satisfaction, and Ease of Use (USE) questionnaires (Hart & Staveland, 1988; Albert & Tullis, 2013). The participants answered each question on a seven-point Likert scale.

Table 1
Questionnaire for measuring drone control usability

No Pos/Neg Questions
1 Positive I was able to quickly learn how to use the controller.
2 Positive It was easy to use the controller.
3 Negative I was confused when I piloted a drone with the controller.
4 Positive It felt natural to control the drone with the controller.
5 Positive I liked to use the controller.
6 Positive It was fun to use the controller.
7 Negative I felt frustrated when I piloted the drone.
8 Positive I was able to steer the drone easily with the controller.
9 Positive The drone always moved as expected when I piloted it.

4. Results
4. 1. Basic operations

Three types of basic operation tasks were performed for training: maintaining altitude, rotating left and right, and moving left and right linearly. Maintaining altitude is the most basic and important manipulation for smooth drone flight. The altitude maintenance task entailed keeping the drone at a given height for 3 s. Figure 13 shows the average number of attempts per trial required to achieve three successes in maintaining a given altitude. The motion controller required significantly fewer attempts to maintain the altitude than the joystick controller (F(1,256) = 5.75, p = 0.017; average: 1.28 vs. 1.56 attempts). Moreover, as the altitude maintenance trials progressed, the number of attempts required decreased owing to the learning effect (F(2,256) = 7.90, p = 0.000; average: 1.72, 1.36, and 1.17 attempts). The interaction effect between the trial and the controller was not significant (F(2,256) = 2.38, p = 0.095). However, as shown in Figure 13, the reduction across trials occurred mainly for the joystick controller, whereas the motion controller required few attempts from the first trial. In other words, participants were good at maintaining altitude from the beginning with the motion controller. This suggests that motion-matching control is more intuitive for altitude maintenance than joystick control.


Figure 13 Number of attempts until three successes for the altitude maintenance task

In terms of differences between the viewpoints, the number of attempts for the FPV was significantly lower than that for the TPV (F(1,256) = 6.54, p = 0.011; average: 1.27 vs. 1.56 attempts). The interaction effect between the viewpoint and the controller was also significant (F(1,256) = 8.28, p = 0.004). The motion controller required few attempts irrespective of the viewpoint, whereas the joystick controller required fewer attempts for the FPV than for the TPV. However, this appeared to be due to the learning effect, because the FPV experiment was performed one week after the TPV experiment.

The rotation task entailed rotating the drone once to the left and then once to the right within the designated rectangular area. The motion controller required significantly fewer attempts for a successful rotation than the joystick controller (F(1,171) = 29.60, p = 0.000; average: 1.08 vs. 1.86 attempts). This can be interpreted as the motion controller being more intuitive and stable for the rotation operation than the joystick controller, possibly because the motion controller is always aligned with the direction of the pilot's body and the drone, so the reference-frame misalignment problem that can arise with the joystick does not occur. Meanwhile, the number of rotation attempts for the FPV tended to be lower than that for the TPV (F(1, 171) = 3.06, p = 0.083; average: 1.35 vs. 1.60 attempts), and the interaction effect between the viewpoint and the controller was significant (F(1, 171) = 4.57, p = 0.034). With the motion controller, there was no significant difference in the number of rotation attempts between the viewpoints; however, with the joystick controller, the number of rotation attempts for the FPV was lower than that for the TPV.

The linear movement task entailed moving the drone beyond a designated line on its left or right side and then returning it to the original position. The motion controller tended to require fewer attempts to succeed than the joystick controller (F(1,84) = 3.57, p = 0.063; average: 1.36 vs. 1.89 attempts), and there were no significant differences between the two viewpoints. However, the time taken to accomplish the task with the motion controller was significantly longer than that with the joystick controller (F(1, 84) = 4.82, p = 0.032; average: 51.6 s vs. 43.5 s). In other words, for the linear movement task, the joystick controller allowed quicker manipulation because it required only the simple operation of moving a stick back and forth. Unlike for the altitude maintenance and rotation tasks, the joystick controller appears to be superior to the motion controller for the linear movement task. On the other hand, it took longer to achieve success in the FPV than in the TPV (F(1, 84) = 7.02, p = 0.01; average: 42.7 s for the TPV vs. 52.4 s for the FPV). Despite the learning effect, participants found it more difficult to manipulate the drone with the HMD in the FPV mode for the linear movement task.

4. 2. Course-driving task
4. 2. 1. Success rate

Figure 14 shows the driving success rate (number of successes / total attempts) for each controller in the TPV and FPV modes. For the TPV, the success rates of the joystick and motion controllers were 58% and 76%, respectively; for the FPV, they were 48% and 78%, respectively. Clearly, the motion controller always showed a higher success rate than the joystick controller (F(1,85) = 9.40, p = 0.003). However, the success rates for the TPV and FPV were not significantly different (F(1,85) = 0.06, p = 0.800), and there was no significant interaction effect between the viewpoint and the controller (F(1,85) = 0.36, p = 0.553).


Figure 14 Success rate of each controller for each viewpoint
4. 2. 2. Task completion time and number of failed attempts

Figure 15 shows the total task completion time, pure task completion time, and total number of failures until a successful trial on the course, depending on the controller and viewpoint. The total task completion time was significantly lower for the motion controller than for the joystick controller (F(1,260) = 8.13, p = 0.005; average: 41.99 s vs. 59.26 s). In addition, the total task completion time for the TPV was significantly lower than that for the FPV (F(1,260) = 13.53, p = 0.000; average: 39.49 s vs. 61.76 s). There was a significant interaction effect between the viewpoint and the controller (F(1,260) = 5.48, p = 0.02): the total task completion time was particularly high when the joystick controller was used in the FPV, as shown in Figure 15.


Figure 15 Total and pure task completion time, and number of failed attempts

Considering only successful attempts, there was no significant difference in the pure task completion time between the motion and joystick controllers (F(1,260) = 0.00, p = 0.945; average: 33.03 s vs. 33.15 s). This indicates that the difference between the total task completion times of the two controllers was due to the time spent on failed attempts. Moreover, the pure task completion time for the TPV was significantly shorter than that for the FPV (F(1,260) = 39.72, p = 0.000; average: 27.82 s vs. 38.36 s). In particular, there was a significant interaction effect between the viewpoint and the controller in the pure task completion time (F(1,260) = 7.62, p = 0.006). The pure completion time of the joystick controller was shorter than that of the motion controller for the TPV but longer for the FPV. This indicates that the joystick controller made it particularly difficult to manipulate the drone in the FPV compared with the motion controller.

The motion controller failed less often than the joystick controller (F(1,260) = 13.12, p = 0.000; average: 0.29 vs. 0.98 failures). Meanwhile, there was no significant difference in the number of failed attempts between the viewpoints (F(1,260) = 1.70, p = 0.194), and there was no interaction effect between the controller and the viewpoint (F(1,260) = 1.96, p = 0.163). Nevertheless, the number of failed attempts for the joystick-FPV combination was very high, as shown in Figure 15, which indicates that the participants had difficulty controlling the drone with the joystick controller in the FPV.

4. 2. 3. Number of collisions

Figure 16 shows the average number of collisions at each collision location while driving the course with the two controllers in the TPV and FPV. Drones often hit the ground, and for all collision locations except the ceiling, the motion controller had fewer collisions than the joystick controller. In terms of the total number of collisions, the motion controller underwent significantly fewer collisions than the joystick controller (F(1,260) = 11.37, p = 0.001; average: 1.44 vs. 2.06 collisions). In addition, the number of collisions for the TPV was lower than that for the FPV (F(1,260) = 15.81, p = 0.000; average: 1.39 vs. 2.12 collisions). In particular, the interaction effect between the controller and the viewpoint was significant (F(1,260) = 15.81, p = 0.000), and the difference between the two controllers was evident in the FPV. This is because operation in the FPV mode was more difficult than in the TPV mode. The number of collisions in the FPV and TPV modes did not differ significantly when using the motion controller; however, for the joystick controller, it increased at all locations in the FPV mode (by approximately 1.6 times for the ground, 6.9 times for the ceiling, and 1.8 times for the pillar). This is consistent with the above-mentioned joystick-FPV combination, for which the number of failed attempts and the task completion time were high. Overall, the motion-matching controller helped manipulate the drone more reliably and with fewer collisions than the joystick controller, particularly in the FPV mode.


Figure 16 Number of collisions with two controllers for the TPV and FPV modes
4. 3. Subjective evaluation

Figure 17 summarizes the total score of the subjective evaluations of the question items in Table 1 (total score = total score of positive questions - total score of negative questions). The subjective evaluation score was significantly higher for the motion controller than for the joystick controller (F(1,84) = 47.32, p = 0.000; average: 32.94 vs. 20.06 points), whereas there was no significant difference between the scores for the two viewpoints (F(1,84) = 1.27, p = 0.264). Overall, the motion-matching controller received more positive reviews than the joystick controller regardless of the viewpoint. Many participants commented that the motion controller was comfortable, natural, and fun. Meanwhile, although the joystick-FPV combination underwent several collisions and its task completion time was longer, the subjective score of the joystick controller was higher for the FPV than for the TPV. This can be attributed to the learning effect one week after operating in the TPV mode.
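A minimal sketch of this composite score, under the assumption that the ratings of negative items 3 and 7 in Table 1 are simply subtracted from the sum of the positive items, is shown below.

    /* Composite usability score: sum of the seven positive items minus the sum
     * of the two negative items, each rated on a 7-point Likert scale. */
    static int subjective_total(const int rating[9])   /* rating[i] = item i+1 */
    {
        const int is_negative[9] = { 0, 0, 1, 0, 0, 0, 1, 0, 0 };  /* items 3, 7 */
        int positive = 0, negative = 0;
        for (int i = 0; i < 9; i++) {
            if (is_negative[i]) negative += rating[i];
            else                positive += rating[i];
        }
        return positive - negative;   /* total = positive total - negative total */
    }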


Figure 17 Total score of the subjective evaluation
5. Discussion and Conclusion

This study proposed a motion-matching controller that controls a drone by perfectly matching the posture of the controller and the drone in three-dimensional space, and demonstrated that the motion controller is more intuitive and usable than the conventional joystick controller. In the experiment, participants performed the task of flying a given course with the actually developed controllers and drones. The results showed that the motion-matching controller achieved a higher success rate than the joystick controller in maintaining altitude and rotating the drone. The motion-matching controller was also superior to the joystick controller in terms of task success rate, number of collisions, task completion time, number of failed attempts, and subjective evaluation, particularly in the FPV mode. Specifically, participants naturally used the motion-matching controller to perform complicated manipulations that require controlling two or more axes simultaneously, whereas these were difficult to perform with the joystick controller.

To the best of our knowledge, no motion controller of the type proposed in this study has been reported, and no prior studies have examined how useful motion-matching controllers are in the two viewpoints (TPV and FPV) when manipulating real drones. The experimental results showed that the motion-matching controller is useful in both the TPV and FPV, and particularly so in the FPV. It appears that the motion-matching controller, which enables position control, can control the rotational position more easily and accurately than the joystick, which enables speed control.

The introduction of this paper listed three reasons why drone control is challenging: low situational awareness, reference-frame misalignment, and the lack of intuitiveness of the joystick controller. The motion-matching controller in this study addresses the latter two difficulties. The motion-matching controller naturally requires the pilot to rotate his or her body to rotate the drone, and thus the drone's forward direction and the body's forward direction always coincide, eliminating the reference-frame misalignment problem. Additionally, the motion-matching controller is highly intuitive because the movements of the pilot or controller exactly match the movements of the drone. The experimental results showed that the motion controller failed less often than the joystick controller. In particular, the number of failed attempts for the joystick-FPV combination was very high, indicating that participants had difficulty controlling the drone in the FPV with the joystick controller. This difference between the two controllers in the FPV is largely due to the rotation manipulation of the drone: with the joystick controller, it is necessary to stop the drone, rotate it, and then move it again, whereas the motion-matching controller allows quicker rotation because the drone can curve naturally without stopping.

However, this study has some limitations. Performance differences between the developed motion-matching controller and the existing joystick controller may have affected the experimental results. In addition, if the drones had had a hovering function that enabled them to maintain altitude easily, the results of the experiment might have been different. Furthermore, because the experimental driving course was considerably simplified in consideration of participant fatigue, it may be difficult to generalize the results to an actual flight environment with a complicated route. Therefore, further experiments are required that consider the hovering function and the complexity of the driving course.

These findings demonstrate that if the motion-matching drones are manufactured using more precise sensors and their sensitivity is adjusted to allow more stable control, they can be highly effective for leisure or in industrial applications utilizing an FPV mode.

Acknowledgments

This work was conducted with the support of the “Project for Nurturing Advanced Design Professionals” initiated by the Ministry of Trade, Industry and Energy of the Republic of Korea, and is based on the Master’s thesis of the corresponding author at Seoul National University of Science and Technology.

References
  1. Albert, W., & Tullis, T. (2013). Measuring the user experience: Collecting, analyzing, and presenting usability metrics. Newnes.
  2. Aretz, A. J., & Wickens, C. D. (1992). The mental rotation of map displays. Human Performance, 5(4), 303-328. [https://doi.org/10.1207/s15327043hup0504_3]
  3. Arthur, E. J., & Hancock, P. A. (2001). Navigation training in virtual environments. International Journal of Cognitive Ergonomics, 5(4), 387-400. [https://doi.org/10.1207/S15327566IJCE0504_2]
  4. Bandala, A. A., Maningo, J. M. Z., Sybingco, E., Vicerra, R. R. P., Dadios, E. P., Guillarte, J. D. D., Salting, J. O. P., Santos, M. J. A. S., & Sarmiento, B. A. E. (2019). Development of Leap Motion capture based-hand gesture controlled interactive quadrotor drone game. 2019 7th International Conference on Robot Intelligence Technology and Applications, 174-179. [https://doi.org/10.1109/RITAPP.2019.8932800]
  5. Chang, W. J., & Kim, H. (2018). Development of an intuitive drone controller based on hand motions. Proceedings of HCI (Human Computer Interaction) Korea, 128-131.
  6. Chen, C., Zhou, P., Belkacem, A. N., Lu, L., Xu, R., Wang, X., Tan, W., Qiao, Z., Li, P., Gao, Q., & Shin, D. (2020). Quadcopter robot control based on hybrid brain-computer interface system. Sensors and Materials, 32(3), 991-1004. [https://doi.org/10.18494/SAM.2020.2517]
  7. Chen, L., Ebi, A., Takashima, K., Fujita, K., & Kitamura, Y. (2019). PinpointFly: An egocentric position-pointing drone interface using mobile AR. SIGGRAPH Asia 2019 Emerging Technologies, 34-35. [https://doi.org/10.1145/3355049.3360534]
  8. Cho, K., Cho, M., & Jeon, J. (2017). Fly a drone safely: Evaluation of an embodied egocentric drone controller interface. Interacting with Computers, 29(3), 345-354. [https://doi.org/10.1093/iwc/iww027]
  9. DelPreto, J., & Rus, D. (2020). Plug-and-play gesture control using muscle and motion sensors. Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 439-448. [https://doi.org/10.1145/3319502.3374823]
  10. Eilan, N., McCarthy, R., & Brewer, B. (1993). Spatial representation: Problems in philosophy and psychology, 25-30.
  11. Fayjie, A. R., Ramezani, A., Oualid, D., & Lee, D. J. (2017). Voice enabled smart drone control. 2017 Ninth International Conference on Ubiquitous and Future Networks, 119-121. [https://doi.org/10.1109/ICUFN.2017.7993759]
  12. Fernandez, R. A. S., Sanchez-Lopez, J. L., Sampedro, C., Bavle, H., Molina, M., & Campoy, P. (2016). Natural user interfaces for human-drone multi-modal interaction. 2016 International Conference on Unmanned Aircraft Systems, 1013-1022. [https://doi.org/10.1109/ICUAS.2016.7502665]
  13. Graff, C. (2016). Drone piloting study. University of Italian Switzerland.
  14. Gromov, B., Abbate, G., Gambardella, L. M., & Giusti, A. (2019). Proximity human-robot interaction using pointing gestures and a wrist-mounted IMU. 2019 International Conference on Robotics and Automation, 8084-8091. [https://doi.org/10.1109/ICRA.2019.8794399]
  15. Gubcsi, G., & Zsedrovits, T. (2018). Ergonomic quadcopter control using the Leap Motion controller. 2018 IEEE International Conference on Sensing, Communication and Networking, 1-5. [https://doi.org/10.1109/SECONW.2018.8396348]
  16. Gugerty, L., & Brooks, J. (2004). Reference-frame misalignment and cardinal direction judgments: Group differences and strategies. Journal of Experimental Psychology: Applied, 10(2), 75. [https://doi.org/10.1037/1076-898X.10.2.75]
  17. Hansen, J. P., Alapetite, A., MacKenzie, I. S., & Møllenbach, E. (2014). The use of gaze to control drones. Proceedings of the Symposium on Eye Tracking Research and Applications, 27-34. [https://doi.org/10.1145/2578153.2578156]
  18. Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Advances in Psychology, 52, 139-183. [https://doi.org/10.1016/S0166-4115(08)62386-9]
  19. Higuchi, K., & Rekimoto, J. (2013). Flying head: A head motion synchronization mechanism for unmanned aerial vehicle control. CHI '13 Extended Abstracts on Human Factors in Computing Systems, 2029-2038. [https://doi.org/10.1145/2468356.2468721]
  20. Hing, J. T., & Oh, P. Y. (2009). Development of an unmanned aerial vehicle piloting system with integrated motion cueing for training and pilot evaluation. Journal of Intelligent and Robotic Systems, 54(1), 3-19. [https://doi.org/10.1007/s10846-008-9252-3]
  21. Hing, J. T., Sevcik, K. W., & Oh, P. Y. (2009). Improving unmanned aerial vehicle pilot training and operation for flying in cluttered environments. 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, 5641-5646. [https://doi.org/10.1109/IROS.2009.5354080]
  22. Jeong, J. H., Lee, D. H., Ahn, H. J., & Lee, S. W. (2020). Towards brain-computer interfaces for drone swarm control. 2020 8th International Winter Conference on Brain-Computer Interface, 1-4. [https://doi.org/10.1109/BCI48061.2020.9061646]
  23. Lam, T. M., Mulder, M., & Van Paassen, M. M. (2007). Haptic interface for UAV collision avoidance. The International Journal of Aviation Psychology, 17(2), 167-195. [https://doi.org/10.1080/10508410701328649]
  24. Lambrecht, J., Kleinsorge, M., & Krüger, J. (2011). Markerless gesture-based motion control and programming of industrial robots. ETFA2011, 1-4. [https://doi.org/10.1109/ETFA.2011.6059226]
  25. Mashood, A., Noura, H., Jawhar, I., & Mohamed, N. (2015). A gesture based Kinect for quadrotor control. 2015 International Conference on Information and Communication Technology Research, 298-301. [https://doi.org/10.1109/ICTRC.2015.7156481]
  26. McCarley, J. S., & Wickens, C. D. (2005). Human factors implications of UAVs in the national airspace. University of Illinois Institute of Aviation Technical Report (AHFD-05-5/FAA-05-1). Savoy, IL: Aviation Human Factors Division.
  27. Menshchikov, A., Ermilov, D., Dranitsky, I., Kupchenko, L., Panov, M., Fedorov, M., & Somov, A. (2019). Data-driven body-machine interface for drone intuitive control through voice and gestures. IECON 2019 - 45th Annual Conference of the IEEE Industrial Electronics Society, 5602-5609. [https://doi.org/10.1109/IECON.2019.8926635]
  28. Morishita, K., Yanagisawa, H., & Noda, H. (2016). Intuitive control for moving drones. SIGGRAPH ASIA 2016 Posters, 1-2. [https://doi.org/10.1145/3005274.3005287]
  29. Pfeil, K., Koh, S. L., & LaViola, J. (2013). Exploring 3D gesture metaphors for interaction with unmanned aerial vehicles. Proceedings of the 2013 International Conference on Intelligent User Interfaces, 257-266. [https://doi.org/10.1145/2449396.2449429]
  30. Robuffo Giordano, P., Deusch, H., Lächele, J., & Bülthoff, H. H. (2010). Visual-vestibular feedback for enhanced situational awareness in teleoperation of UAVs. 66th American Helicopter Society International Annual Forum 2010, 2809-2818.
  31. Rognon, C., Mintchev, S., Dell'Agnola, F., Cherpillod, A., Atienza, D., & Floreano, D. (2018). FlyJacket: An upper body soft exoskeleton for immersive drone control. IEEE Robotics and Automation Letters, 3(3), 2362-2369. [https://doi.org/10.1109/LRA.2018.2810955]
  32. Sanna, A., Lamberti, F., Paravati, G., & Manuri, F. (2013). A Kinect-based natural interface for quadrotor control. Entertainment Computing, 4(3), 179-186. [https://doi.org/10.1016/j.entcom.2013.01.001]
  33. Sarkar, A., Patel, K. A., Ram, R. G., & Capoor, G. K. (2016). Gesture control of drone using a motion controller. 2016 International Conference on Industrial Informatics and Computer Systems, 1-5. [https://doi.org/10.1109/ICCSII.2016.7462401]
  34. Shin, P. S., Kim, S. K., & Kim, J. M. (2014). Intuitive controller based on G-sensor for flying drone. Journal of Digital Convergence, 12(1), 319-324. [https://doi.org/10.14400/JDPM.2014.12.1.319]
  35. Soechting, J. F., Tong, D. C., & Flanders, M. (1996). Frames of reference in sensorimotor integration: Position sense of the arm and hand. In Hand and Brain (pp. 151-168). Academic Press. [https://doi.org/10.1016/B978-012759440-8/50012-8]
  36. Vincenzi, D. A., Terwilliger, B. A., & Ison, D. C. (2015). Unmanned aerial system (UAS) human-machine interfaces: New paradigms in command and control. Procedia Manufacturing, 3, 920-927. [https://doi.org/10.1016/j.promfg.2015.07.139]
  37. Watanabe, K., & Takahashi, M. (2020). Head-synced drone control for reducing virtual reality sickness. Journal of Intelligent & Robotic Systems, 97(3), 733-744. [https://doi.org/10.1007/s10846-019-01054-6]
  38. Weibel, R. E., & Hansman, R. J. (2006). Safety considerations for operation of unmanned aerial vehicles in the national airspace system. MIT International Center for Air Transportation Report No. ICAT 2005-01. Cambridge, MA.
  39. Williams, K. W. (2004). A summary of unmanned aircraft accident/incident data: Human factors implications. Federal Aviation Administration, Civil Aeromedical Institute, Oklahoma City, OK.
  40. Williams, K. W. (2006). Human factors implications of unmanned aircraft accidents: Flight-control problems. Emerald Group Publishing Limited. [https://doi.org/10.1016/S1479-3601(05)07008-6]
  41. Yu, M., Lin, Y., Schmidt, D., Wang, X., & Wang, Y. (2014). Human-robot interaction based on gaze gestures for the drone teleoperation. Journal of Eye Movement Research, 7(4), 1-14. [https://doi.org/10.16910/jemr.7.4.4]
  42. Zhao, Z., Luo, H., Song, G. H., Chen, Z., Lu, Z. M., & Wu, X. (2018). Web-based interactive drone control using hand gesture. Review of Scientific Instruments, 89(1), 014707. [https://doi.org/10.1063/1.5004004]