An interactive gesture control system for collaborative manipulator based on Leap Motion Controller

Gesture control is widely used for robotic manipulators. However, conventional methods typically focus on position control only and place the gesture detection device at a fixed position, which limits control flexibility and the interaction range. This paper presents an interactive gesture control system based on the Leap Motion Controller that overcomes these shortcomings. First, a coordinate transformation is performed between the position of the left palm detected by the Leap Motion Controller and the tool center point (TCP) of the collaborative manipulator to obtain gesture data. These gesture data are used to control the posture of a six-joint collaborative manipulator, the Lite6. The position of the gripper is controlled by the grip level of the right palm. A controller is designed to drive a lift table on which the Leap Motion Controller is placed, enabling adaptive ascending and descending movements that accommodate operators of different heights and arm spans. A modified Kalman filter is used to remove noise and smooth the signal captured by the Leap Motion Controller. Experiments demonstrate that the system operates stably and accurately controls the position, posture, and gripper opening of a collaborative manipulator in real time using palm gestures.


Introduction
The application scenarios for collaborative manipulators have become increasingly extensive in recent years, encompassing assembly, sorting, handling, grinding, etc. Meanwhile, attention has been paid to the safety of cooperation between humans and collaborative manipulators, especially in industrial settings. To ensure safety, the common approach is to maintain a safe distance between the operator and the collaborative manipulator during operation. It is therefore necessary to choose an appropriate method of human-machine interaction, and gesture control is one of the most widely used. The principle of gesture control is to detect the position and movement of the hand with a gesture sensor, identify and interpret it, and convert the results into signals or instructions that control the relevant equipment. In this process, the accuracy of gesture recognition is the prerequisite for effective control of the collaborative manipulator. The most common methods of gesture recognition can be divided into two categories: contact and non-contact.
(1) Myoelectric signal method.[2][3][4] This type of method uses wearable sensors to detect muscle electrical signals to judge gestures and then control the corresponding devices; a representative example is the Myo armband. The Myo armband interacts with the controlled device by detecting the changes in bioelectrical signals generated by the muscles as the arm moves, combined with the physical movements of the arm. Its recognizable gestures include double-clicking, opening, fist clenching, turning, left and right, etc. This type of method has significant limitations: few gestures can be distinguished, and the armband must touch the skin directly to detect gestures with reasonable accuracy. (2) Smart gloves.[7][8] Operators wear smart gloves equipped with flexible sensors, which transmit the detected pressure and strain signals to a controller for processing to control the equipment. Such smart gloves, for example the VIRDYN MHand, SmartGlove, and SenseGlove Nova, can detect many gestures, but they are often loaded with various sensors, signal transmission devices, and even control boards. They are bulky, uncomfortable to wear and use, expensive, and some must be paired with virtual reality devices such as the HTC Vive. (3) Other wearable devices. In the literature,[9] the author designed a wristband device equipped with an accelerometer. The user controls the movement of the device by wearing this wristband; its principle is to detect the movement and rotation of the human hand in the x, y, and z directions through the accelerometer. Because of the inherent characteristics of the accelerometer, the achievable control actions are limited, and since only the sensor's acceleration is measured, other means are needed to keep the sensor moving with the hand (such as wearing it on the wrist).
(1) The RGB camera method.[11][12][13] In this approach, a pre-trained model recognizes gestures in RGB images, and the recognized gestures are used to control the equipment. Although the system structure is simple and the identification equipment is cheap and easy to obtain, the principle of image acquisition gives this method the following disadvantages: it is easily affected by external environmental factors such as lighting, the recognizable gestures are limited, and it is difficult to obtain the spatial position of the palms. (2) The depth camera method. This method fuses 2D and 3D data acquired by a depth camera for gesture judgment. In the literature,[14] the author estimates the pose of 2D hand bones from 2D images acquired by the depth camera and fuses them with depth images of the hands to obtain a 3D hand pose.[16][17] It can obtain more gesture information, but the recognition process is complicated, and the accuracy depends on the fusion results. (3) Structured light methods. Here, a hand skeleton model is constructed from the light pattern emitted by the sensor and its return signal, and gestures are judged from this signal. The most representative such products are the Kinect and the Leap Motion Controller. The Kinect can recognize not only gestures but also arm and body movements, and its detection distance is large. In the literature,[18][19][20][21] the authors use the Kinect for gesture recognition. The Leap Motion Controller can recognize more gesture signals and return the spatial position of the palm. However, its detection range is small, so it is suitable for scenes where the detection distance is short and high detection accuracy is required.[23][24][25][26][27] (4) Lidar-based gesture recognition. In the literature,[28][29][30][31] the authors use lidar to detect gestures. Lidar has a large detection range, so it can detect movements of the human body, arms, and even palms from a long distance, but its ability to judge gestures is limited.
As a typical application of gesture control, the existing literature on gesture sensor-based robot manipulator control can be divided into three categories: (1) Artificial neural network (ANN)-based algorithms. An ANN-based gesture recognition algorithm is an adaptive machine learning algorithm that iterates over a learning criterion to gradually improve the gesture recognition rate. In the literature,[32][33][34][35][36] authors have used this type of algorithm to control the movements of a robot manipulator. A model is first trained by a neural network; human arm or palm data are then fed to the network, and the trained network outputs the corresponding result as control signals. This method typically requires numerous samples to train the network, making the training process time-consuming. Moreover, the accuracy of gesture recognition is highly dependent on the quality of the training data. (2) Hidden Markov Model (HMM)-based algorithms.[38][39][40][41] The HMM is an important probabilistic model for statistical learning and sequence data processing. In an HMM-based gesture recognition algorithm, each gesture class corresponds to an HMM trained on sample data, and the class with the highest probability is taken as the recognition result. HMM-based algorithms are well suited to time-series modeling, achieve good recognition rates for complex dynamic gesture trajectories, run fast, and make it easy to extend and improve the gesture library. However, for the operation control of robot manipulators, the need to train models on large numbers of samples makes the recognition algorithm less convenient. (3) Template matching-based algorithms.[44][45][46] The principle of template matching is to compute the similarity between real-time input gesture features and those in a pre-built template library, then choose the type with the highest similarity as the gesture recognition outcome. Template matching-based gesture control is stable, requires little computation, and the template library is easy to trim and modify. The disadvantage is that it places high demands on the accuracy of gesture segmentation and on how well the template library matches the gestures to be recognized. However, these problems can be mitigated by extracting more robust features.
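As a minimal illustration of the template-matching principle described above, the sketch below matches an input feature vector against a small template library by nearest Euclidean distance. The labels and five-element feature vectors are invented for illustration only; real systems use richer features and more elaborate similarity measures.

```python
import math

def euclidean(a, b):
    # Distance between two equal-length feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_gesture(features, template_library):
    # Return the template label whose feature vector is closest to the input.
    return min(template_library,
               key=lambda label: euclidean(features, template_library[label]))

# Toy template library: label -> feature vector (e.g., normalized finger openings).
templates = {
    "open_palm": [1.0, 1.0, 1.0, 1.0, 1.0],
    "fist":      [0.1, 0.1, 0.1, 0.1, 0.1],
    "pinch":     [0.1, 0.2, 1.0, 1.0, 1.0],
}

print(match_gesture([0.9, 1.0, 0.95, 1.0, 0.9], templates))  # → open_palm
```

Robustness here depends entirely on how discriminative the extracted features are, which is exactly the weakness noted above.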
Most of the research on gesture control for robotic manipulators mentioned above is image-based or focuses mainly on controlling movement in the x, y, z directions in Cartesian coordinates and on controlling the end effector. There are few studies on the combined position, posture, and gripper control of collaborative manipulators. Moreover, most of the Leap Motion Controller literature places the gesture sensor on a fixed surface; this fixed-position method limits the control accuracy and the interaction range. In this paper, an interaction system is designed to control a six-joint collaborative manipulator (including x, y, z, roll/R, pitch/P, yaw/Y in Cartesian coordinates and the gripper position) based on the Leap Motion Controller. The paper makes three main contributions: (1) A system is designed that uses Leap Motion Controller data to control a collaborative manipulator's posture and the state of the gripper in real time. To overcome the limitations of the fixed-position method, a controller is designed to drive a lift table that carries the Leap Motion Controller, achieving adaptive ascending and descending motions.
(2) A constraint condition is designed to avoid the duplicate data caused by the different operating frequencies of the Leap Motion Controller and the collaborative manipulator, and to make the robotic manipulator track the data detected by the Leap Motion Controller effectively. (3) A modified Kalman filter algorithm is used to remove noise and smooth the gesture signal captured by the sensor to achieve better tracking results.
The rest of the paper is organized as follows: (1) System architecture, introducing the functions and characteristics of each component of the system; (2) Method, describing the control principle and coordinate transformation process in detail; (3) Experimental results and discussion, describing the test method and analyzing and discussing the test results; (4) Conclusions, summarizing the research results and looking forward to future work.

System architecture
The designed system consists of the following devices: a six-joint collaborative manipulator, the Lite6, with a servo gripper installed; a Leap Motion Controller; a lift table with controller; and a PC. The PC sends a control signal to launch the Leap Motion Controller and receives its feedback signals, including the palm posture, the opening state of the palm, the distance between the palm and the Leap Motion Controller, etc. The system first needs to be calibrated to calculate an appropriate distance and drive the lift table to the corresponding height. Then coordinate transformation and data mapping are performed to acquire the TCP posture of the collaborative manipulator and the gripper's state; duplicate data also need to be handled in this process. After that, a filtering algorithm is applied to obtain more stable data, which are then used to control the collaborative manipulator and gripper. The system architecture is presented in Figure 1.

Six-joint collaborative manipulator Lite6
The Lite6, manufactured by UFACTORY, was chosen as the controlled object due to its low cost and ease of use. It has a payload of 1 kg, an arm span of up to 440 mm, a repeatability of 0.2 mm, and a maximum velocity of 500 mm/s. In servo mode, the embedded controller has a maximum receiving frequency of 250 Hz. The chosen servo gripper is manufactured by the same company for another product, the xArm, but is compatible with the Lite6. The position/opening of the gripper can be controlled within the range [210, 850]. Users may utilize the SDK supplied by the manufacturer to expedite secondary development. Figure 2 depicts an overview of the Lite6 alongside its operational domain.[47]

Leap Motion Controller
The Leap Motion Controller is a gesture recognition sensor module manufactured by Ultraleap that captures the user's palm and finger movements to interact with digital content. It is small, accurate, and fast, and can be integrated into enterprise hardware solutions or connected to virtual reality/mixed reality devices for design and development. The sensor can track hand movement in a three-dimensional space of 60 cm or more, with a typical field of view of 140° × 120° (as shown in Figure 3). With the accompanying software or SDK, it can identify 27 different hand elements, including bones and joints, and its typical operating frequency is 120 Hz, which fully meets the requirements of real-time gesture signal acquisition.[48]

Lifting table with controller
Most of the research literature on the Leap Motion Controller places the sensor on a fixed surface. However, because operators differ in height and arm span, this fixed-position method usually limits the control accuracy and the operator's palm movement range. To overcome this limitation, a lift table is employed to carry the Leap Motion Controller, as illustrated in Figure 4. The height range of the lift table is 60-110 cm, and the device is powered by a 24 V DC motor. A circuit board is designed to control the lifting action; it contains a DC motor forward/reverse control circuit and an IC socket. The IC socket can be adapted to the Arduino Nano series or MKR series control boards. The motor control circuit is connected to the I/O ports of the Arduino control board via the IC socket, and the Arduino control board is connected to the control PC via a USB cable.

Control PC
A laptop with the Windows 11 operating system is used as the core control device; the Leap Motion Controller Python SDK, the xArm Python SDK (compatible with the Lite6),[49] and other related Python libraries need to be installed on the computer. As Ultraleap no longer maintains its Python SDK for the Leap Motion Controller and the third-party library is unstable, the old version of the Python SDK (supporting Python 2), together with the Redis software and the Redis Python library, is used to ensure the stability of the control system. The data transmission of the Leap Motion Controller is shown in Figure 5.

Method
This section describes the control method in detail. Both palms are used to control the Lite6 and its gripper: the left palm controls the posture (position and attitude) of the Lite6, and the right palm controls the position/opening of the gripper.

Leap Motion Controller placement location setting
As mentioned in the previous section, a system is designed to move the Leap Motion Controller upward and downward. The optimal area for interaction with the Leap Motion Controller is located directly above it, and it is assumed that the most precise data are obtained at the center of the device's interaction zone. It therefore makes sense to set the initial palm position 30 cm directly above the controller. To make the experience as comfortable as possible for different users, a calibration function is provided to control the lift table. When this function is executed, the distance between the operator's palm and the controller is obtained by averaging three measurements taken at 1 s intervals. Since the lift table is driven by a 24 V DC motor (at a velocity of 40 mm/s), the table can be controlled using this distance and the motor's switch-on time. If the distance is less than 30 cm, the lift table is moved up; otherwise it is moved down. The switch-on time of the DC motor can be calculated by formula (1).
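The calibration rule above can be sketched in Python. The function name is ours, and formula (1) is reconstructed from the stated 40 mm/s table speed and 30 cm target distance, so treat this as an assumption rather than the paper's exact code.

```python
TARGET_DISTANCE_MM = 300.0  # desired palm-to-sensor distance (30 cm)
TABLE_SPEED_MM_S = 40.0     # lift-table speed from the 24 V DC motor

def calibrate(distance_samples_mm):
    """Average three distance readings and return the motor direction and
    switch-on time, a reconstruction of formula (1): t = |d - 300| / 40."""
    d = sum(distance_samples_mm) / len(distance_samples_mm)
    direction = "up" if d < TARGET_DISTANCE_MM else "down"  # rule stated in the text
    on_time_s = abs(d - TARGET_DISTANCE_MM) / TABLE_SPEED_MM_S
    return direction, on_time_s

print(calibrate([260.0, 262.0, 258.0]))  # → ('up', 1.0)
```

Averaging three samples at 1 s intervals suppresses single-frame tracking glitches before the table is moved.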

Leap Motion Controller SDK attributes selection
The palms' IDs and the posture of the left palm can be obtained directly through the corresponding attributes provided by the Leap Motion Controller SDK, but there are three attributes that can describe the opening of a palm. The first is ''pinch_strength,'' the strength of a pinch between the thumb and the nearest fingertip as a value in the range [0, 1]. An open, flat palm has a pinch strength of zero; as the thumb's tip approaches a fingertip, the pinch strength increases toward one. The second is ''grab_strength,'' the strength of a grab pose as a value in the range [0, 1]. An open palm has a strength of zero; as the hand closes into a fist, the grab strength increases toward one. The third is ''sphere_radius,'' the radius of a sphere fitted to the curvature of the palm. This sphere is placed roughly as if the palm were holding a ball, so its size decreases as the fingers curl into a fist.[50] The official documentation does not mention its range, but our tests show that most ''sphere_radius'' values fall in the range [30, 60], where 60 corresponds to a flat palm and 30 to a fist. Which attribute should be used to judge the opening of the palm, and thus control the position/opening of the gripper, therefore needs to be determined. A specific experiment was designed to test the stability of the three attributes. In the experiment, a dexterous hand, the Ti5 (produced by Ti5 ROBOT), is used. The hand has six degrees of freedom, can be controlled via the Modbus-RTU protocol, and its fingers can rotate through 90° depending on the state. To improve the ability of the Leap Motion Controller tracking software to track it successfully, the dexterous hand is covered with a tight-fitting surgical glove. An open, flat palm corresponds to zero and a fist corresponds to one. The dexterous hand is placed at a random position in the Leap Motion Controller's interaction area and moved from open to closed in 90 steps, and the data for the three attributes are recorded at each step. The set of data with the smallest deviation (sum of mean square errors) from the actual opening of the dexterous hand determines which attribute is selected as the controlled variable. To facilitate comparison, the recorded data are mapped to values in the range [0, 90] according to the state of the gripper: 0 means the palm is pinched, and 90 means the palm is open and flat. The data obtained from the ''pinch_strength'' and ''grab_strength'' attributes are transformed using formula (2), while the ''sphere_radius'' data are transformed using formula (3). In these formulas, ''pinch_strength,'' ''grab_strength,'' and ''sphere_radius'' refer to the raw data acquired by the Leap Motion Controller SDK functions, and ''pinch,'' ''grab,'' and ''diameter'' refer to the mapped data. ''Diameter'' is used here instead of ''radius'' to make the data comparison clearer.
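Formulas (2) and (3) map the three SDK attributes onto a common [0, 90] scale. The linear mappings below are a plausible reconstruction from the stated endpoint values (strength 0 = flat palm = 90; radius 30 = fist = 0; radius 60 = flat palm = 90), not the paper's exact formulas.

```python
def strength_to_opening(strength):
    """Map pinch_strength or grab_strength in [0, 1] to an opening in [0, 90].
    0 = closed (pinch/fist), 90 = open flat palm; a sketch of formula (2)."""
    return 90.0 * (1.0 - strength)

def radius_to_opening(sphere_radius):
    """Map sphere_radius in [30, 60] (fist..flat palm) via its diameter to
    [0, 90]; a sketch of formula (3)."""
    diameter = 2.0 * sphere_radius
    return (diameter - 60.0) * 90.0 / 60.0

print(strength_to_opening(0.0))  # flat palm → 90.0
print(radius_to_opening(30.0))   # fist → 0.0
print(radius_to_opening(60.0))   # flat palm → 90.0
```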
Tens of tests were performed within this framework, eight of which were randomly selected for comparison. The tracking data are shown in Figure 6, from which it can be seen that the data acquired via ''sphere_radius'' are more stable. Table 1 shows the sum of the mean square errors of the mapped data for the three attributes. It also shows that most deviations of ''sphere_radius'' are smaller than those of the other two attributes, so the ''sphere_radius'' attribute is selected to control the gripper.

Manipulator coordinates transformation and gripper control
The posture of the human hand and the posture of the collaborative manipulator Lite6 are each uniquely determined by six parameters (x, y, z, roll/R, pitch/P, yaw/Y). However, their reference coordinate systems are different, so to accurately control the posture of the Lite6, the hand posture detected by the Leap Motion Controller must be transformed into the posture of the Lite6 in its TCP coordinates.
The Leap Motion Controller uses right-handed Cartesian coordinates, and its origin is the center of its top surface when placed horizontally, defined here as O_L(x_L, y_L, z_L). Roll/R is the angle of rotation around the z-axis, pitch/P around the x-axis, and yaw/Y around the y-axis. The position of the operator's palm is different each time it enters the Leap Motion Controller's interaction area. To ensure that the position of the controlled Lite6 remains constant regardless of the initial palm position, the four-finger direction is aligned with the negative z-axis of the Leap Motion Controller's coordinates during the test, and the current palm position detected by the Leap Motion Controller is defined as the origin of a new coordinate system (hereafter the LeapNew coordinates), expressed as O_H(x_H, y_H, z_H). The directions of R, P, and Y are the same as when the coordinate origin is O_L, as shown in Figure 7.
For the Lite6, the origin of its base coordinate system is O_B(x_B, y_B, z_B). The tool coordinate system is obtained by rotating the base coordinate system 180° around its x-axis. When no end effector is installed, the origin of the tool coordinate system is located at the center of the flange. The origin of the tool coordinate system is therefore defined as O_E(x_E, y_E, z_E), and the corresponding coordinate values are calculated by formula (4).
Suppose a point O′_H in LeapNew coordinates is transformed into a point O′_E in Lite6 tool coordinates. ^H T_L denotes the transformation matrix between the palm reference plane (the plane of the palm is defined as the reference plane) and the Leap Motion coordinate system, ^L T_B is the transformation matrix between the Leap Motion Controller's coordinates and the Lite6 base coordinates, and ^B T_E is the transformation matrix between the Lite6 base coordinates and tool coordinates. According to the coordinate transformation relationship, the transformation between the reference plane of the Lite6 flange and the reference plane of the palm is given by formula (5), where ^H T_E is unchanged; that is, when the palm position changes, the center point of the Lite6's end flange makes a corresponding movement to ensure that ^H T_E remains a fixed value. The question of how the palm-center posture transforms into the Lite6 TCP posture can thus be reduced to a chain of coordinate transformations. According to the above conditions, the transformation matrix ^B T_E can be found by equation (6), and ^B T_{E′} can be calculated by equation (7) accordingly. After substituting the known information, ^B T_{E′} can be expressed as equation (8). Therefore, the position and posture of O′_E can be obtained.
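The transformation chain above can be illustrated with plain-Python homogeneous transforms. The numerical offsets below are invented for illustration; the point is that with a fixed palm-to-TCP transform ^H T_E, any palm displacement in the Leap frame reproduces the same displacement at the TCP.

```python
def matmul4(A, B):
    # 4x4 homogeneous matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def inv_rigid(T):
    """Invert a rigid transform: [R t; 0 1]^-1 = [R^T  -R^T t; 0 1]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # R transposed
    t = [T[i][3] for i in range(3)]
    ti = [-sum(R[i][k] * t[k] for k in range(3)) for i in range(3)]
    return [R[0] + [ti[0]], R[1] + [ti[1]], R[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

def translation(x, y, z):
    return [[1.0, 0, 0, x], [0, 1.0, 0, y], [0, 0, 1.0, z], [0, 0, 0, 1.0]]

# Illustrative fixed chain: base <- Leap mounting offset, initial palm pose
# in the Leap frame, and initial TCP pose in the base frame.
B_T_L = translation(400.0, 0.0, -300.0)
L_T_H = translation(0.0, 300.0, 0.0)    # palm 30 cm above the sensor
B_T_E = translation(250.0, 0.0, 150.0)

# Constant palm->TCP transform, fixed at the first detection.
H_T_E = matmul4(inv_rigid(matmul4(B_T_L, L_T_H)), B_T_E)

# Palm moves 50 mm along Leap x: the new TCP target follows by the same offset.
L_T_H2 = translation(50.0, 300.0, 0.0)
B_T_E2 = matmul4(matmul4(B_T_L, L_T_H2), H_T_E)
print([row[3] for row in B_T_E2[:3]])  # → [300.0, 0.0, 150.0]
```

With rotations included, the same composition also carries the palm's roll/pitch/yaw over to the TCP.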
The key to controlling the gripper is to map the data obtained from the ''sphere_radius'' attribute to the range of movement of the gripper. The full range is [210, 850], but since the gripper does not need to grasp an object in the tests, the range is set to [0, 800], where each unit represents 0.1 mm. Formula (9) illustrates the mapping relationship, with ''gripper'' indicating the gripper position. However, the Lite6 does not support direct control of the gripper via the tool I/O in the end-effector flange, so the gripper is connected to the control laptop via an RS-485 to USB module.
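A plausible reconstruction of formula (9), assuming a linear map (with clamping) from the observed ''sphere_radius'' range [30, 60] onto the gripper command range [0, 800]:

```python
def radius_to_gripper(sphere_radius):
    """Map sphere_radius in [30, 60] (fist..flat palm) linearly onto the
    gripper command range [0, 800] (units of 0.1 mm); a hedged sketch of
    formula (9), not the paper's exact expression."""
    r = min(max(sphere_radius, 30.0), 60.0)  # clamp to the observed range
    return 800.0 * (r - 30.0) / 30.0

print(radius_to_gripper(30.0))  # fist → 0.0 (closed)
print(radius_to_gripper(60.0))  # flat palm → 800.0 (open)
print(radius_to_gripper(45.0))  # halfway → 400.0
```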

Duplicate data processing

The position data recorded in several tests are shown in Figure 8. To make the data easy to distinguish in the figure, a data point is plotted every five steps over 10,000 steps, and several adjacent positions with the same data are treated as one position data point. As evidenced by Figure 8, there are many duplicated data points in each test. In Table 2, ''Max No.'' is the maximum number of identical position data points, and ''Percentage'' refers to the proportion of data whose number of repetitions exceeds a threshold relative to the total data of a test. A high proportion (more than 30%) of the data is duplicated within the dataset. Since the control signal of the gripper comes from the controller of the Lite6, it can be inferred that the gripper control also contains duplicated data. To avoid duplicate data and to make the Lite6 track the data detected by the Leap Motion Controller effectively, a constraint is designed: a motion instruction of the Lite6 is executed only when the target position differs from the previous one.
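The duplicate-suppression constraint can be sketched as a simple filter over the incoming target stream (the function name is ours): because the control loop can read the same Leap Motion sample more than once, consecutive targets often repeat, and only changes are forwarded to the manipulator.

```python
def deduplicate(targets):
    """Forward a motion target only when it differs from the previously sent
    one, mimicking the duplicate-data constraint."""
    sent, last = [], None
    for target in targets:
        if target != last:
            sent.append(target)
            last = target
    return sent

# Consecutive identical samples are collapsed; only the changes are executed.
stream = [(0, 0, 0), (0, 0, 0), (1, 0, 0), (1, 0, 0), (1, 0, 0), (1, 2, 0)]
print(deduplicate(stream))  # → [(0, 0, 0), (1, 0, 0), (1, 2, 0)]
```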

Data filtering
Due to the presence of noise in the system, the output signal is filtered using an improved algorithm based on Kalman filtering to reduce the control error.The system state space expression is shown in equation (10).
Traditional Kalman filtering algorithm can be expressed as equation (11).
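For reference, the state-space model of equation (10) and the traditional Kalman recursion of equation (11) take the familiar textbook form below; this is a standard statement of the algorithm, not a claim about the paper's exact notation.

```latex
% State-space model (cf. equation (10))
x_k = A x_{k-1} + w_{k-1}, \quad w \sim \mathcal{N}(0, Q), \qquad
z_k = H x_k + v_k, \quad v \sim \mathcal{N}(0, R)

% Traditional Kalman recursion (cf. equation (11))
\hat{x}_k^- = A \hat{x}_{k-1}, \qquad
P_k^- = A P_{k-1} A^{\top} + Q
K_k = P_k^- H^{\top} \left( H P_k^- H^{\top} + R \right)^{-1}
\hat{x}_k = \hat{x}_k^- + K_k \left( z_k - H \hat{x}_k^- \right), \qquad
P_k = \left( I - K_k H \right) P_k^-
```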
In equation (11), Q represents the system noise and R the measurement noise. They influence the value of the Kalman gain K_k, thereby affecting the estimation results. However, the parameters of the traditional Kalman filtering algorithm are set to constant values, so it cannot adapt to real-time changes. We therefore use a modified method to overcome this problem. The main idea of the algorithm is to construct suitable Q and R associated with the velocities of the Lite6's tool center point and gripper, because the real-time velocity is closely related to the noise. Theoretically, a faster velocity may cause larger noise, and the covariances of the system noise Q and measurement noise R should increase accordingly. We define two factors f_Q and f_R in the range [0, 1] to describe this relationship; the factors can be calculated by equation (12).
where v_t and v_g represent the real-time velocities of the tool center point and the gripper, respectively, which can be obtained using the official UFACTORY API functions, and v_tmax and v_gmax denote their maximum velocities. According to the user manual, the maximum velocity of the tool center point is 500 mm/s, while the maximum velocity of the gripper is 5000 rpm. Q and R are then rewritten in the velocity-dependent form expressed in equation (13).
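The velocity-dependent scheme of equations (12) and (13) can be sketched as a one-dimensional filter. The base covariances Q0 and R0, the constant-position motion model, and the exact scaling rule are assumptions of ours; the paper's constants are not specified above.

```python
class ModifiedKalman1D:
    """1-D Kalman filter whose noise covariances scale with the real-time
    velocity, a sketch of the modified scheme in equations (12)-(13)."""

    def __init__(self, q0=0.01, r0=1.0, v_max=500.0):
        self.q0, self.r0, self.v_max = q0, r0, v_max
        self.x, self.p = 0.0, 1.0  # initial estimate and covariance

    def update(self, z, velocity):
        f = min(abs(velocity) / self.v_max, 1.0)  # factor in [0, 1], cf. eq. (12)
        q = self.q0 * (1.0 + f)                   # velocity-dependent Q, cf. eq. (13)
        r = self.r0 * (1.0 + f)                   # velocity-dependent R, cf. eq. (13)
        # Predict with a constant-position model, then correct with z.
        p_pred = self.p + q
        k = p_pred / (p_pred + r)
        self.x = self.x + k * (z - self.x)
        self.p = (1.0 - k) * p_pred
        return self.x

kf = ModifiedKalman1D()
noisy = [10.2, 9.8, 10.1, 10.0, 9.9]
smoothed = [kf.update(z, velocity=100.0) for z in noisy]
print(smoothed)
```

The estimate converges toward the measured level while damping the sample-to-sample jitter; faster motion inflates Q and R, trading smoothness for responsiveness.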
Experimental results and discussion

In this section, we first present the setup and control flow of the system. Then, we apply the proposed method to control the Lite6 and demonstrate the experimental results. Finally, we discuss the filtering method used to obtain more precise results.

Experimental setup
The platform shown in Figure 9 is used for the experiments, and the position tracking error at each step of the test is calculated. When utilizing the ''set_servo_cartesian'' function of the xArm SDK to control the Lite6, the wait parameter of the function needs to be set to either True or False. Our previous research demonstrated better results when selecting False as the wait parameter,[51] so the same parameter settings were employed in this experiment. Dozens of tests were conducted, with each test consisting of 500 cycles. For convenience of judgment, the position tracking error is computed by formula (14) as the Euclidean distance between PO′_E (the position acquired by the ''get_position'' function of the SDK) and O′_E, ignoring the roll, pitch, and yaw data. The same approach was used for the gripper control tests.
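The error metric of formula (14) is a plain Euclidean distance over the positional components only; a minimal sketch (the function name and sample values are ours):

```python
import math

def position_error(p_actual, p_target):
    """Formula (14): Euclidean distance between the measured TCP position and
    the target O'_E, ignoring the roll/pitch/yaw components."""
    return math.sqrt(sum((a - t) ** 2
                         for a, t in zip(p_actual[:3], p_target[:3])))

# get_position-style tuples are (x, y, z, roll, pitch, yaw); only x, y, z count.
print(position_error((253.0, 4.0, 150.0, 180.0, 0.0, 0.0),
                     (250.0, 0.0, 150.0, 180.0, 0.0, 0.0)))  # → 5.0
```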
When the program runs, the Lite6's TCP point is first initialized to the origin O_E of the TCP coordinate system, and the Lite6's motion mode is set to 1 (servoj mode), in which the TCP moves to the given position with the maximum velocity (180°/s) and acceleration. Then the Leap Motion Controller is started to monitor palm gestures. When both palms enter the Leap Motion Controller's interaction area, the left palm position is set to O_H (this setting is done only once), and O′_E calculated by formula (8) is used to move the Lite6 to the target position and posture through ''servo_cartesian'' motion and the ''set_servo_cartesian'' function of the SDK. The gripper is controlled by the ''sphere_radius'' data of the right palm using the ''set_gripper_position'' function. The specific procedure flow is shown in Figure 10.

Experimental result and discussion
We moved the palm along a random trajectory within the interaction area of the Leap Motion Controller and recorded the position tracking error (mm) over the 500 steps of eight random tests. The factory default velocity and mvacc (acceleration) parameters were used: the velocity was set to 115 mm/s and mvacc to 2000 mm/s². The data include the minimum, maximum, and average error for each test. To track the gripper's position in real time, we set the gripper's velocity to the maximum value and likewise recorded the gripper's minimum, maximum, and average tracking error (mm) over eight random tests. Since the gripper moves horizontally in a straight line and its range of motion is relatively small compared to the working range of the TCP, each gripper test covers 200 steps of data. In these tests, the palm posture data and grip data acquired from the Leap Motion Controller are taken as the target position.
The statistical data show that the average position tracking error over the eight selected tests is 12.47 mm. Although the real-time performance is good and the error is within an acceptable range, the robot's motion trajectory and the gripper's position curve are not smooth. Therefore, mean filtering and Kalman filtering are added for comparison with the unfiltered data (position tracking data acquired by the robotic manipulator's built-in controller). The mean filter is defined as follows, where O′_E(j) represents the filtered value at step j.
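The mean filter referred to above can be sketched as a sliding-window average; the window size of 3 in the example is an assumption, since the paper's window length is not stated here.

```python
def mean_filter(samples, window=5):
    """Sliding-window mean filter: the filtered value at step j averages the
    last `window` samples (fewer near the start of the sequence)."""
    out = []
    for j in range(len(samples)):
        lo = max(0, j - window + 1)
        out.append(sum(samples[lo:j + 1]) / (j + 1 - lo))
    return out

print(mean_filter([10.0, 12.0, 11.0, 13.0, 9.0], window=3))
# → [10.0, 11.0, 11.0, 12.0, 11.0]
```

The window smooths sample-to-sample jitter at the cost of lag, which is why its tracking error stays larger than the Kalman variants discussed below.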
The three-dimensional Kalman filter used in this study is a standard one. The initial position estimate is set to 0, and the remaining filter parameters are as follows. Figure 11 shows the position tracking trajectories of the expected position, the unfiltered data, the mean filter (M-filter), the Kalman filter (K-filter), and the modified Kalman filter (MK-filter). When using the SDK function to control the Lite6 directly, its tracking position sometimes shows a noticeable offset from the expected position acquired from the Leap Motion Controller. Due to the high speed of the Lite6 and the high control accuracy of the embedded controller, the difference before and after filtering is not very obvious in the figure. Therefore, we analyzed the tracking error by plotting the error graphs in Figure 12, where ''Unfiltered data'' represents the position tracking error using the SDK function, M-filter the mean filter error, K-filter the Kalman filter error, and MK-filter the error of our filter algorithm. The mean filter can smooth the trajectory to some extent, but its tracking error is still large. In contrast, the Kalman filter is clearly better than the mean filter, and the tracking error of our modified algorithm is somewhat smaller than that of the standard Kalman filter.
To compare the minimum errors of the different algorithms effectively, we recorded the minimum error (min_error_d), maximum error (max_error_d), and average error (avg_error_d) of position tracking over eight experiments in Table 3. The minimum errors in the table are the non-zero minima in the experimental data. The data also show that the minimum, maximum, and average errors of the modified algorithm are the smallest, indicating that better tracking results can be obtained with the modified algorithm. Table 4 records the minimum time consumption (min_time_g) and maximum time consumption (max_time_g) of each cycle, and the average time consumption (avg_time_g) over 500 cycles, for the different algorithms in each of the eight tests. Whether mean filtering, Kalman filtering, or the modified filtering algorithm is used, more time is required than without filtering, and different algorithms consume different amounts of time due to their different computational complexities. The computational complexity of mean filtering is lower than that of the Kalman filtering and modified algorithms, so it consumes relatively little time, while the Kalman filtering and modified algorithms consume considerably more time, but still within the millisecond range. This means that the real-time nature of the system is not affected by the algorithms.
The trajectory of the gripper control also needs to be smoothed. The difference from TCP position tracking is that the gripper control trajectory is smoothed with a one-dimensional filter. Since the palm is generally open when it enters the interaction area, we set the initial position of the gripper to a value between 700 and 800. The movement distance of the gripper is converted from the angular displacement of the servo motor, and its maximum linear velocity is small relative to the maximum velocity of the TCP movement. Therefore, we avoid opening and closing the right palm rapidly within a short period of time. Figure 13 shows the results of eight tests of tracking the gripper position using the different algorithms. The figure shows that the results obtained by the proposed modified algorithm are closer to the expected values. Figure 14 shows the tracking error; the tracking error of the proposed modified algorithm is clearly smaller than that of the other algorithms. We also recorded the minimum error (min_error_g), maximum error (max_error_g), and average error (avg_error_g) over the eight tests in Table 5. The minimum, maximum, and average errors of the modified algorithm are the smallest among the different algorithms. Because the gripper moves more slowly than the TCP, the advantage of filtering is more obvious. Table 6 records the minimum time consumption (min_time_g) and maximum time consumption (max_time_g) of each cycle, and the average time consumption (avg_time_g) over 200 cycles, for the different algorithms. Although the filtering algorithms consume more time, the system still maintains real-time capability.
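A minimal sketch of the one-dimensional smoothing and the grip-level mapping is given below, under stated assumptions: the linear mapping from grip level to gripper position, the position range limits, and the noise variances are illustrative choices, not the paper's parameters.

```python
class GripperKalman1D:
    """One-dimensional Kalman filter (static state model) for smoothing
    the gripper set-point. Noise variances are illustrative."""

    def __init__(self, x0=750.0, q=1.0, r=25.0):
        # The palm is open on entering the interaction area, so the
        # initial estimate is set between 700 and 800 (here: 750).
        self.x, self.p = x0, 1.0
        self.q, self.r = q, r   # process / measurement noise variances

    def step(self, z):
        # Predict (static model), then update with measurement z.
        p_pred = self.p + self.q
        k = p_pred / (p_pred + self.r)
        self.x += k * (z - self.x)
        self.p = (1.0 - k) * p_pred
        return self.x

def grip_to_position(grab_strength, open_pos=800.0, closed_pos=0.0):
    """Map the right palm's grip level (0 = open palm, 1 = fist) linearly
    onto the gripper position range. The range is a hypothetical choice."""
    g = min(max(grab_strength, 0.0), 1.0)
    return open_pos + g * (closed_pos - open_pos)
```

Each cycle, the grip level read from Leap Motion Controller would be mapped with `grip_to_position` and the result smoothed by `step` before being sent to the gripper's servo motor.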

Conclusions
Gesture control plays an important role in human-machine interaction, as it enables operators to control equipment remotely, ensuring operator safety. It is often used to control robotic manipulators. This study presents an interactive gesture control system for collaborative manipulator control based on Leap Motion Controller. The system allows control of the posture of a collaborative manipulator with six degrees of freedom in Cartesian coordinates, as well as the gripper position. Unlike most previous studies, this system provides increased precision and flexibility in controlling the manipulator. To address the limitations on control accuracy and on the range of movement of the operator's palm caused by fixing the sensor position, an adaptive lift table control system was implemented. A kinematic analysis of the collaborative manipulator Lite6 was conducted to derive the transformation matrix between the palm position and Lite6's tool center point (TCP) within the interaction area of Leap Motion Controller, from which control data were obtained. A constraint condition was designed to reduce computation by removing the duplicate data generated by the frequency disparity between Leap Motion Controller and Lite6. To ensure smooth TCP trajectories and gripper operation, a modified Kalman filter algorithm is proposed. The experimental results indicate that the interactive gesture control system we designed can efficiently and effectively control Lite6 in real time, with the tracking error falling within an acceptable range.
The system designed above can be applied to field scenes that are unsuitable for human operation, such as assembly work in harsh environments with high temperatures. The use of gesture control can protect people from injury to the greatest extent. However, using two palms to control the collaborative manipulator and an end effector such as the gripper can lead to occlusion. In addition, 30 cm directly above the Leap Motion Controller is taken in this manuscript as a suitable starting position of the palm, but the most suitable position needs to be determined through more rigorous experiments. In future research, it is necessary to study how to resolve the interference between the two palms, and a more rigorous determination of the appropriate initial position needs to be conducted. Furthermore, to enable the collaborative manipulator to perform more tasks, studying a collaborative manipulator equipped with a dexterous hand would be both scientifically valuable and challenging.

Figure 3. Leap Motion Controller: (a) general view and size, (b) field of view (top view), and (c) field of view (left view).

Figure 4. Lifting table and its controller.

Figure 6. Stability testing of the three attributes: (a) attribute values of test one, (b) attribute values of test two, (c) attribute values of test three, (d) attribute values of test four, (e) attribute values of test five, (f) attribute values of test six, (g) attribute values of test seven and (h) attribute values of test eight.

T_E^H remains unchanged. The specific steps to solve this problem are as follows: (1) Obtain O_H^0; the parameters (x, y, z, R, P, Y) of O_H^0 can be obtained from Leap Motion Controller, so T_{H_0}^L is known. (2) The transformation matrix T_B^L between the Leap Motion coordinate system and the Lite6 base coordinate system is a fixed value. (3) By obtaining O_E^0, the position and posture of the collaborative manipulator can be obtained by the upper computer software.
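The steps above amount to composing homogeneous transforms: the TCP pose in the Lite6 base frame is obtained from the palm pose measured in the Leap frame, the fixed Leap-to-base transform, and the constant palm-to-TCP offset. The sketch below assumes standard 4x4 homogeneous matrices and a Z-Y-X Euler (yaw-pitch-roll) convention for (R, P, Y); the convention and all function names are illustrative assumptions, since the paper does not specify them.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and
    translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rpy_to_rotation(roll, pitch, yaw):
    """Z-Y-X Euler angles (R, P, Y) to a rotation matrix; the convention
    here is an assumption."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def tcp_pose_in_base(T_LH, T_LB, T_HE):
    """Compose the palm pose in the Leap frame (T_LH) with the fixed
    Leap-to-base transform (T_LB) and the constant palm-to-TCP offset
    (T_HE) to get the TCP pose in the Lite6 base frame:
    T_BE = inv(T_LB) @ T_LH @ T_HE."""
    return np.linalg.inv(T_LB) @ T_LH @ T_HE
```

With the palm pose converted via `rpy_to_rotation` and `make_transform`, `tcp_pose_in_base` yields the pose sent to the upper computer software each control cycle.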

Figure 7. Schematic diagram of the coordinates and transform method.

Figure 8. Duplicate data distribution: (a) duplicate data of test one, (b) duplicate data of test two, (c) duplicate data of test three, (d) duplicate data of test four, (e) duplicate data of test five, (f) duplicate data of test six, (g) duplicate data of test seven and (h) duplicate data of test eight.

Figure 11. Position tracking trajectory: (a) trajectory of test one, (b) trajectory of test two, (c) trajectory of test three, (d) trajectory of test four, (e) trajectory of test five, (f) trajectory of test six, (g) trajectory of test seven and (h) trajectory of test eight.

Figure 12. Position tracking error: (a) tracking error of test one, (b) tracking error of test two, (c) tracking error of test three, (d) tracking error of test four, (e) tracking error of test five, (f) tracking error of test six, (g) tracking error of test seven and (h) tracking error of test eight.

Table 1. Mapping data deviation of the three attributes.

Table 2. Percentage of duplicate data.

Table 4. Time consumption of filter algorithms in position tracking (ms).