Comprehensive evaluation of robotic global performance based on modified principal component analysis

Multivariate statistical methods are widely used for comprehensive evaluation: principal component analysis, based on linear dimension reduction, and kernel principal component analysis, a modified principal component analysis based on nonlinear dimension reduction. Because the robotic global performance indexes are diverse and correlated, these two methods can each be used to comprehensively evaluate the global performance of the PUMA560 robot with different dimensions. When the kernel principal component analysis method is used, the kernel function and its parameters directly affect the result of the comprehensive performance evaluation. Because kernel principal component analysis with a polynomial kernel function is time-consuming and inefficient, a new kernel function based on similarity degree is proposed for big sample data, and it is proved to be a valid kernel according to Mercer's theorem. Comparing the dimension reduction effects of the principal component analysis method, the kernel principal component analysis method with the polynomial kernel function, and the kernel principal component analysis method with the new kernel function shows that the method with the new kernel function deals more effectively with the nonlinear relationships among the indexes, and its result is more reasonable because the first principal component contains more comprehensive information. The simulation shows that the kernel principal component analysis method with the new kernel function has the advantages of low time consumption, good real-time performance, and good generalization ability.


Introduction
As robots develop toward high speed and high precision, proper evaluation of robotic performance has become an important topic in robotic research. Many scholars at home and abroad have proposed performance indexes to evaluate robotic kinematic and dynamic performance. The global performance index proposed by Gosselin 1 is one of the important evaluation indexes, and it includes the angle velocity global performance index, the force global bearing performance index, the linear speed global performance index, and so on. 2,3 Because practical projects are diverse and complex and the single robotic global performance indexes are interrelated, optimum motion design of a robot based on global performance should deal with each specific single global performance index while also solving for the comprehensive global performance.
Multivariate statistical methods, including principal component analysis (PCA) and kernel principal component analysis (KPCA), have been applied to study the correlation and comprehensive evaluation of variable indexes in agriculture, biology, and other research fields. 4 In mechanics research, PCA and KPCA have also been applied successfully to machine condition monitoring and fault identification for their feature extraction capabilities. 5-8 Therefore, because the robotic global performance indexes are diverse and correlated, the multivariate statistical methods PCA and KPCA are feasible for the comprehensive evaluation of robotic global performance. In previous research, the PCA and KPCA methods were used to process the single performance indexes in robotic performance comprehensive evaluation, which proved the effectiveness of both methods. 9 Compared with other multiobjective mathematical methods, PCA and KPCA place no limit on the data size of the single performance indexes, and the single indexes can be transformed into principal components. Redundant information is then discarded by a linear or nonlinear transformation, so the calculation of the comprehensive performance evaluation is easier and the weights of the single indexes are more objective than in other multiobjective mathematical methods.
When the KPCA method is used, the kernel function and its parameters directly affect the result of the comprehensive performance evaluation. In our previous study, the global performance of the PUMA560 robot with different dimensions was comprehensively evaluated with the KPCA method, and the polynomial kernel function was adopted in the KPCA calculation after comparison. 10 Because KPCA calculation with the polynomial kernel function is time-consuming and less efficient, a new kernel function based on similarity degree is proposed for big sample data in this article, and it is proved according to Mercer's theorem that it can be used as a kernel function.
With different dimensions, the performance of the PUMA560 robot is comprehensively evaluated with three methods: PCA, KPCA with the polynomial kernel function, and KPCA with the new kernel function; the dimension with the best comprehensive performance can then be selected. Comparing the dimension reduction of the three methods indicates that the nonlinear relationships among the indexes are handled effectively by KPCA with the new kernel function, whose first principal component provides more information, so its solution is more reasonable. Therefore, KPCA with the new kernel function is chosen as the more suitable method for evaluating the robot's comprehensive performance. Experimental verification shows that the KPCA method with the new kernel function has the advantages of low time consumption, good real-time performance, and good generalization ability.

Principle of PCA
On the premise that the loss of sample information is minimal, the main thought of PCA is that, based on a linear transformation, the original multidimensional variates are substituted by a few new independent principal components. The principal components have definite significance and show the nature of the original sample: the first principal component reflects the main variation of the original data, and the remaining principal components reflect the samples' other features. 11 The main steps 9 of PCA are as follows:

1. Standardize the primitive data to eliminate the adverse effects caused by different dimensions. For the original data x_ij (i = 1, 2, ..., n; j = 1, 2, ..., p), the standardized values are

z_ij = (x_ij − x̄_j) / s_j (1)

where x̄_j and s_j are the mean and standard deviation of the jth index, and Z denotes the standardized matrix.

2. Calculate the correlation coefficient matrix R = Z′Z / (n − 1), where R is a p × p symmetrical matrix whose diagonal elements are all 1 and Z′ is the transpose of Z.
3. Calculate the eigenvalues and eigenvectors of the correlation coefficient matrix. The eigenvalues λ_i are obtained from |λE − R| = 0 and sorted by size, λ_1 ≥ λ_2 ≥ ... ≥ λ_p ≥ 0. The corresponding eigenvectors X_i are obtained from (λE − R)X = 0.
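Steps 1-3 above can be sketched in code as follows (a minimal NumPy sketch; the function and variable names are illustrative, not from the original study):

```python
import numpy as np

def pca_correlation(X):
    """Steps 1-3 of PCA: standardize the data, build the
    correlation coefficient matrix R, and eigen-decompose it."""
    n, p = X.shape
    # Step 1: standardize each column (zero mean, unit variance)
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    # Step 2: correlation coefficient matrix R = Z'Z / (n - 1)
    R = Z.T @ Z / (n - 1)
    # Step 3: eigenvalues/eigenvectors of R, sorted descending
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]
    return Z, R, eigvals[order], eigvecs[:, order]
```

Because each standardized column has unit variance, the diagonal of R is exactly 1, and the eigenvalues of R sum to p, which is what makes the contribution rates in step 4 well defined.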

4. Determine the number of principal components. The contribution rate of the ith principal component is CV_i = λ_i / Σ λ_k, and the cumulative contribution rate of the first m principal components is the sum of their contribution rates. Generally, the first m principal components whose cumulative contribution rate reaches 85% are selected.

5. Determine the expression of each principal component.

6. Determine the comprehensive evaluation function.

Principle of KPCA
The KPCA method achieves dimension reduction of a nonlinear space through linear algebra, support vector machines, and other related theories, solving the problem of information redundancy while keeping the original data complete. Its main idea is that a chosen nonlinear mapping Φ maps the input vector x into a higher dimensional linear feature space F, where the PCA method is applied; the linear principal components obtained in F are in fact nonlinear principal components of the original input space. Let x_i ∈ R^d (i = 1, 2, ..., p) be the d-dimensional sample points of the input space, which the nonlinear mapping Φ maps into the feature space F.
The sample point in F is denoted Φ(x_i). Because the nonlinear mapping Φ is generally difficult to obtain explicitly, KPCA uses a kernel function to realize it.
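The kernel trick can be illustrated with the homogeneous quadratic kernel, whose explicit feature map is known (a didactic sketch, not the kernel used later in this article):

```python
import numpy as np

def phi(x):
    """Explicit feature map for k(x, z) = (x . z)^2 in 2-D:
    phi(x) = (x1^2, sqrt(2) x1 x2, x2^2)."""
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])
lhs = (x @ z) ** 2      # kernel evaluated in the input space
rhs = phi(x) @ phi(z)   # inner product in the feature space
# lhs equals rhs: the kernel computes the feature-space inner
# product without ever forming phi explicitly
```

This equality is exactly what lets KPCA run PCA in F while only ever evaluating k(x, z) on pairs of input-space samples.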

Kernel function
When the KPCA method is used, the kernel function and its parameters directly affect the result of the comprehensive performance evaluation. The basic function of a kernel is to take two input vectors x and z in the low-dimensional space and compute their inner product in the high-dimensional space without the explicit mapping from the low-dimensional space to the high-dimensional space. 12 The definition of a kernel function is as follows. Suppose x, z ∈ X, where X belongs to the space R^n, and the nonlinear function Φ realizes the mapping Φ: X → H from X to the feature space H (an inner product space or Hilbert space (H, ⟨·,·⟩)), where H belongs to R^m with n << m. Then

k(x, z) = ⟨Φ(x), Φ(z)⟩ (8)

In equation (8), ⟨·,·⟩ denotes the inner product and k(x, z) is the kernel function.
According to functional theory, if a kernel function satisfies Mercer's condition, the kernel function equals an inner product in the transformed space, and any symmetric function that satisfies Mercer's condition can be used as a kernel function. 13 For a specific nonlinear classification problem, the key is to select or construct a suitable kernel function.
The following three kernel functions are commonly used: the Gaussian kernel function k(x, z) = exp(−||x − z||²/(2σ²)); the polynomial kernel function k(x, z) = (x·z + c)^d; and the multilayer perception (sigmoid) kernel function k(x, z) = tanh(β x·z + θ).
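In code form, the three commonly used kernels read as follows (a sketch; the parameter names sigma, c, d, beta, and theta are illustrative):

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # k(x, z) = exp(-||x - z||^2 / (2 sigma^2)); localized
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def polynomial_kernel(x, z, c=1.0, d=2):
    # k(x, z) = (x . z + c)^d; good global/extrapolation behavior
    return (x @ z + c) ** d

def sigmoid_kernel(x, z, beta=1.0, theta=0.0):
    # multilayer perception kernel: tanh(beta x . z + theta)
    return np.tanh(beta * x @ z + theta)
```

Any of these can be plugged into the Gram-matrix computation of KPCA; their parameters are the quantities tuned by the optimization described next.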
In this article, the kernel function parameters are optimized based on the particle swarm algorithm. The particle swarm optimization model is established as follows.
Two kinds of feature samples of the feature space are taken. The mean vectors of the two types of samples in the feature space are computed, the sum of the squares of the distances between the classes is formed, and the square of the within-class dispersion is formed; from these a particle swarm optimization fitness function F(s) is built. The minimum point s of the fitness function F(s) is the optimal kernel width.
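The fitness function itself is only partially recoverable from the source, but the particle swarm search over the kernel width can be sketched generically; any scalar fitness F(s) can be plugged in (w, c1, c2 are standard PSO constants; a toy quadratic fitness stands in for the real one):

```python
import numpy as np

def pso_minimize(fitness, lo, hi, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimization over one scalar
    parameter (e.g. a kernel width s)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n_particles)   # particle positions
    v = np.zeros(n_particles)              # particle velocities
    pbest = x.copy()                       # personal bests
    pbest_f = np.array([fitness(xi) for xi in x])
    gbest = pbest[np.argmin(pbest_f)]      # global best
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(xi) for xi in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest

# illustrative fitness with its minimum at s = 3
s_opt = pso_minimize(lambda s: (s - 3.0) ** 2, 0.0, 10.0)
```

In the article's setting, the toy fitness would be replaced by the class-separation fitness F(s) built above.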

Constructing new kernel function
Many studies address the construction of kernel functions, 14-16 and constructing a new kernel function to solve a specific problem is important. Because KPCA calculation with the polynomial kernel function was time-consuming and inefficient for the big sample data in our previous study, a new kernel function based on similarity degree is proposed. The construction steps of the new kernel function are as follows:

1. The product of the corresponding dimensions of two row vectors, x_i^T x_j, measures their similarity degree and is taken as the numerator of the new kernel function.
2. The Manhattan distance of the two row vectors, ||x_i − x_j||, enters the denominator of the new kernel function.
3. If the two row vectors are identical, the similarity is 1; accordingly, x_i^T x_j + ||x_i − x_j|| is taken as the denominator of the kernel function.
4. The width parameter δ (δ > 0) controls the radial scope of the function.
Eventually the new kernel function is constructed. According to the Hilbert-Schmidt theory, as long as an operation satisfies Mercer's condition, it can be used as an inner product space conversion, and such a conversion can serve as a kernel function.
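The construction above can be sketched in code. The exact placement of the width parameter δ is not fully recoverable from the source; here it is assumed to enter as an exponent, which preserves the property k(x, x) = 1 stated in step 3:

```python
import numpy as np

def similarity_kernel(x, z, delta=1.0):
    """Similarity-degree kernel assembled from the construction
    steps: numerator x . z; denominator x . z + ||x - z||_1
    (the Manhattan distance). delta > 0 is assumed here to act
    as an exponent controlling the radial scope; inputs are
    assumed element-wise nonnegative, as in the proof below."""
    num = x @ z
    den = num + np.sum(np.abs(x - z))
    return (num / den) ** delta
```

With identical inputs the ratio is exactly 1 for any δ, and for distinct nonnegative inputs the value lies strictly between 0 and 1.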
Mercer's theorem. Let X be a compact set in R^n and k(x, z) a continuous real-valued symmetric function on X × X. If

∬ k(x, z) f(x) f(z) dx dz ≥ 0 for all f ∈ L₂(X) (19)

then k(x, z) is a kernel function, that is,

k(x, z) = ⟨Φ(x), Φ(z)⟩ (20)

Equation (19) is the Mercer condition; in equation (20), Φ is the mapping from X to the Hilbert space and ⟨·⟩ is the inner product in the Hilbert space L₂.
If the constructed function satisfies Mercer's condition, it can be used as a kernel function. The theoretical proof is sketched as follows. First, x_i^T x_j is the linear kernel function; when X is a compact set in R^n, k_1(x_i, x_j) is a continuous real-valued symmetric function on X × X, and if all the element values of the vectors x and z are nonnegative, it is nonnegative. Second, the distance-based term with δ > 0 is a homogeneous kernel depending only on the distance; when X is a compact set in R^n, k_2(x_i, x_j) is a continuous real-valued symmetric function on X × X, and because δ > 0, k_2(x_i, x_j) is nonnegative. Finally, when x − z is 0, that is, when x and z are identical, equation (18) equals 1; when x and z differ, equation (18) is not 0. In summary, when X is a compact set in R^n, equation (18) is a continuous real-valued symmetric nonnegative function on X × X, and by Mercer's theorem it can be used as a kernel function.
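Mercer's condition can also be spot-checked numerically: on any finite sample, a valid kernel's Gram matrix must be symmetric positive semi-definite. A sketch using the Gaussian kernel, for which this result is classical (the helper name is illustrative):

```python
import numpy as np

def gram_matrix(kernel, X):
    """Gram (kernel) matrix K_ij = k(x_i, x_j) on a finite sample."""
    n = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(n)]
                     for i in range(n)])

rng = np.random.default_rng(0)
X = rng.random((30, 6))
# Gaussian kernel: a classical Mercer kernel
K = gram_matrix(lambda x, z: np.exp(-np.sum((x - z) ** 2)), X)
# Mercer => K is symmetric positive semi-definite
eigvals = np.linalg.eigvalsh(K)
```

Such a check does not replace the proof above, but it quickly exposes a candidate function that fails Mercer's condition on real data.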

Application of PCA and KPCA
According to the single performance indexes in various dimensions, the PCA calculation completes the robotic performance comprehensive evaluation. The contribution rate of each principal component reflects the amount of information extracted from the original single indexes: the larger the contribution rate, the more original information the principal component contains. The cumulative contribution rate is generally required to reach 85% to ensure that the principal components carry enough information. Thus the first principal component, when its contribution rate is greater than 85%, can be taken as a comprehensive performance index with definite mechanism meaning. The relations among the single performance indexes, topological structures, and dimensions can then be revealed, and the topological structures and dimensions with the best comprehensive performance are selected at the same time.
When the contribution rate of the first principal component is smaller than 85%, the data compression of the single performance indexes is insufficient. In this situation, the nonlinear characteristics of the single performance indexes are difficult to extract, so the KPCA method is applied for the robotic performance comprehensive evaluation. The original data of the single indexes are mapped into a high-dimensional space by a nonlinear transformation and then processed by PCA for effective dimensionality reduction. When the KPCA method is used, it is important to select an appropriate kernel function and parameters, because both directly affect the result of the comprehensive performance evaluation. The calculation flowchart is shown in Figure 1.
The advantages of this method are as follows:

1. As many single performance indexes as necessary can be selected; all sample data can be processed by PCA or KPCA, and the mutual impacts between the single performance indexes are eliminated.
2. After the original sample data of the single performance indexes are transformed into principal components, the index weights needed to score the comprehensive performance are more objective than human decisions.
3. With the KPCA method, the nonlinear relationship between the single performance indexes can be captured, providing a valid evaluation and analysis method for the comprehensive performance.
4. With the KPCA method, by selecting or constructing an appropriate kernel function, the contribution rate of the first principal component can reach 85%, which is better than the PCA result and has better global superiority.
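The flow of Figure 1 can be summarized as a decision sketch (the names are illustrative): run PCA first, and fall back to KPCA when the first contribution rate is below the 85% threshold.

```python
import numpy as np

def contribution_rates(eigvals):
    """Contribution rate of each principal component."""
    return eigvals / eigvals.sum()

def evaluate(X, threshold=0.85):
    """PCA first; if the first contribution rate is below the
    threshold, a KPCA pass with a tuned kernel is indicated."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    R = Z.T @ Z / (len(X) - 1)
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
    cr = contribution_rates(eigvals)
    method = "PCA" if cr[0] >= threshold else "KPCA"
    return method, cr
```

Strongly correlated indexes concentrate the variance in one component and PCA suffices; weakly correlated or nonlinearly related indexes spread the variance, triggering the KPCA branch.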

Single performance indexes
The global performance indexes are related only to the kinematic dimensions. Therefore, the commonly used global performance indexes are selected: the linear speed global performance index (x1), the angle velocity global performance index (x2), the force global bearing performance index (x3), the torque global performance index (x4), the X-linear acceleration global performance index (x5), and the Y-linear acceleration global performance index (x6). The indexes are given in Table 1.
In addition to the above indexes, more robotic single performance indexes can be selected according to different tasks.

Application of kernel function
When the KPCA method is used, the kernel function and its parameters directly affect the result of the comprehensive performance evaluation. The three kernel functions in equations (7), (8), and (9) are commonly used. The Gaussian kernel function is localized, and as its width parameter increases, its learning ability weakens. The polynomial kernel function promotes extrapolation and has good overall properties; the lower the order, the stronger the extrapolation. The multilayer perception kernel function, which includes a hidden layer, 17 can realize multilayer perception.
For a specific nonlinear classification problem, the key is to construct a suitable kernel function. In our previous study, after comparing the three kernels, the polynomial kernel function was adopted in the KPCA calculation. Because KPCA calculation with the polynomial kernel function is time-consuming and inefficient, a new kernel function based on similarity degree is proposed for the big sample data. The contribution rates of the principal components have a nonlinear relation with the kernel function's type and parameters, so determining them is itself a nonlinear optimization process. The contribution rate of the first principal component is therefore taken as the optimization objective, with the kernel function's type and parameters as the variables, and an optimization function is established to select them. 18

PCA for robot
By the PCA method, the robotic single global performance indexes are analyzed. Comprehensive global performance can be evaluated, and the relationships between the dimensions and the single robotic global indexes can be revealed.

Typical robot
The PUMA560 robot is widely used as an industrial robot, 19 and hence the typical PUMA560 robot is selected. The model and the link coordinate systems of the PUMA560 robot are shown in Figures 2 and 3, respectively.
The dimension parameters of the PUMA560 robot are given in Table 2. Selecting d2 = 100-200 and a3 = 5-55 (with step sizes of 10 and 5, respectively), d2 can take 11 different values and a3 can take 11 different values, so the values of the 6 global performance indexes for 121 samples are obtained.
The evaluation result and computation time of PCA are directly affected by the samples.

Table 1. Typical global performance indexes (analytic expressions omitted here).
- h_Jv ∈ [0, 1]: measures the linear velocity dexterity of the mechanism in the whole workspace; the greater the h_Jv value, the better the linear velocity.
- h_Jω ∈ [0, 1]: measures the angular velocity dexterity of the mechanism in the whole workspace; the greater the h_Jω value, the better the angular velocity.
- h_F ∈ [0, 1]: measures the force dexterity of the mechanism in the whole workspace; the greater the h_F value, the better the force.
- h_M ∈ [0, 1]: measures the torque dexterity of the mechanism in the whole workspace; the greater the h_M value, the better the torque.
- h_H4 ∈ [0, 1]: measures the X-linear acceleration dexterity of the mechanism in the whole workspace; the greater the h_H4 value, the better the X-linear acceleration.
- h_H5 ∈ [0, 1]: measures the Y-linear acceleration dexterity of the mechanism in the whole workspace; the greater the h_H5 value, the better the Y-linear acceleration.

The samples in mechanism evaluation and analysis by the PCA method have no special requirements, but the sample sequences are supposed to be bounded, and the quantity of samples must be more than 30. 20 Because x1, x2, x3, x4, x5, and x6 are moderate indexes, they should first be standardized forward. The standardized results of all 121 samples are presented in Table 3.

PCA result
The correlation coefficients between the indexes are given in Table 4; every pair of indexes is positively correlated. Based on the correlation coefficient matrix, the PCA results are calculated and given in Table 5.
In Table 5, the first principal component is used instead of the six single indexes as a comprehensive global performance index. The first principal component evaluates the comprehensive global performance and reflects the balance of the original single global indexes. The comprehensive performance is calculated from equation (21), where y1 is the first principal component score. The comprehensive global performance scores of the 121 samples calculated by equation (21) are shown in Figure 5, from which the PUMA560 robot dimensions with the best comprehensive global performance can be obtained. The greater the score, the better the comprehensive global performance of the PUMA560 robot.
However, as seen in Table 5, the contribution rate of the first principal component of the PCA method is only 44.014%. Therefore, the first principal component calculated by equation (21) cannot reflect the comprehensive global performance, because it does not contain enough information from the original single global indexes. As a result, the KPCA method should be applied in the mechanism analysis and comprehensive global performance evaluation of the PUMA560 robot.

KPCA with polynomial kernel function result
When the KPCA method is used, an optimization model can be constructed to select the kernel function and its parameters: the type of kernel function and its parameters are the variables, and the contribution rate of the first principal component is the optimization objective. In this case, the polynomial kernel function was adopted in the KPCA calculation after comparison, and in equation (14) the polynomial kernel function with parameter s = 30 is selected. The results of KPCA with the polynomial kernel function are given in Table 5.
As given in Table 5, the contribution rate of the first principal component of the KPCA method with the polynomial kernel function reaches 86.840%. The dimension reduction effect of KPCA with the polynomial kernel function is markedly better than that of the PCA method, and most of the information of the original global performance indexes is contained in the first principal component calculated by KPCA with the polynomial kernel function. Therefore, KPCA with the polynomial kernel function can credibly evaluate the global performance of the PUMA560 robot.

Figure 3. Coordinate of PUMA560 robot.

Table 2. Parameters of the PUMA560 robot.
Then, by KPCA with the polynomial kernel function, the dimension of the PUMA560 robot with the best comprehensive global performance is obtained.

KPCA with the new kernel function result
In equation (18), the new kernel function with δ = 60 is selected in this case. The results of KPCA with the new kernel function are given in Table 5: the contribution rate of the first principal component of the KPCA method with the new kernel function is 90.405%. The dimension reduction effect of KPCA with the new kernel function is better than that of KPCA with the polynomial kernel function, and most of the information of the original global performance indexes is contained in the first principal component calculated by KPCA with the new kernel function. Therefore, KPCA with the new kernel function can credibly evaluate the global performance of the PUMA560 robot, and by it the dimension of the PUMA560 robot with the best comprehensive global performance is obtained.
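The contribution rates reported above come from the eigen-decomposition of the Gram matrix centered in feature space; a minimal KPCA sketch into which any kernel function can be plugged (the function name is illustrative):

```python
import numpy as np

def kpca_contributions(X, kernel):
    """Eigen-decompose the centered Gram matrix and return the
    contribution rates of the kernel principal components."""
    n = len(X)
    K = np.array([[kernel(X[i], X[j]) for j in range(n)]
                  for i in range(n)])
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # center in feature space
    eigvals = np.sort(np.linalg.eigvalsh(Kc))[::-1]
    eigvals = np.clip(eigvals, 0.0, None)        # discard numerical noise
    return eigvals / eigvals.sum()
```

Comparing the first entry of the returned vector across candidate kernels reproduces the selection criterion used in this article: the kernel whose first contribution rate is largest compresses the single indexes best.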

Comprehensive evaluation
By PCA, KPCA with the polynomial kernel function, and KPCA with the new kernel function, the original data of the single indexes are projected into the transformed space, and the comprehensive performance scores of the PUMA560 robot with different dimensions are obtained. The first principal component scores of the three methods are compared in Figure 4.
The practicality and effectiveness of PCA, KPCA with the polynomial kernel function, and KPCA with the new kernel function for the comprehensive evaluation of robotic global performance are thus demonstrated.
Moreover, KPCA with the new kernel function increases the gradient of the comprehensive performance scores, especially for the best sample, making the pros and cons of each sample more apparent, while the other samples' scores change continuously and remain relatively similar. The trends of the other samples' scores after removing the best sample, based on PCA, KPCA with the polynomial kernel function, and KPCA with the new kernel function, are shown in Figures 5, 6, and 7, respectively. The global performance information of the PUMA560 robot retained through the dimension reduction of KPCA with the polynomial kernel function exceeds that of PCA, and the information retained by KPCA with the new kernel function exceeds that of KPCA with the polynomial kernel function. Therefore, the PUMA560 robot dimension with the best comprehensive global performance is easily selected by KPCA with the new kernel function.

Time consumption
To demonstrate the high efficiency of the new kernel function, more samples are selected: d2 = 100-200 and a3 = 5-55 (with step sizes of 2 and 2.5, respectively), so the values of the 6 global performance indexes for 1071 samples are obtained. The time consumption of KPCA with the polynomial kernel function and KPCA with the new kernel function is compared in Figure 8.
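Such a comparison can be reproduced by timing the Gram-matrix construction, which dominates the KPCA cost (a sketch on synthetic data; the kernels shown are illustrative stand-ins and no winner is asserted here, since timings depend on the implementation and hardware):

```python
import time
import numpy as np

def gram(kernel, X):
    """Build the full Gram matrix; its cost dominates KPCA."""
    n = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(n)]
                     for i in range(n)])

def timed(kernel, X):
    """Wall-clock time of one Gram-matrix construction."""
    t0 = time.perf_counter()
    gram(kernel, X)
    return time.perf_counter() - t0

rng = np.random.default_rng(1)
X = rng.random((200, 6))                     # synthetic nonnegative samples
poly = lambda x, z: (x @ z + 1) ** 2         # illustrative polynomial kernel
new = lambda x, z: (x @ z) / (x @ z + np.sum(np.abs(x - z)))
t_poly, t_new = timed(poly, X), timed(new, X)
```

Repeating the measurement over growing sample counts reproduces the kind of curve shown in Figure 8.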
As shown in Figure 8, the time consumption of KPCA with the new kernel function is obviously lower than that of KPCA with the polynomial kernel function, and the time saving grows as the number of samples increases.

Ability of generalization
New samples are selected (the step size is 10). The best comprehensive performance scores of KPCA with the polynomial kernel function and of KPCA with the new kernel function are given in Table 6. As given in Table 6, the best comprehensive performance score of KPCA with the polynomial kernel function is relatively unstable, while that of KPCA with the new kernel function remains relatively stable; in general, the KPCA method with the new kernel function has good generalization ability.

Conclusion
Taking the PUMA560 robot as an example, the PCA method and the KPCA method, as a modified PCA method, have been introduced into the research on comprehensive evaluation of robotic global performance, and the PUMA560 robot dimension with the best comprehensive global performance can be selected. Because comprehensively evaluating robotic performance by KPCA with the commonly used kernel functions is time-consuming and inefficient, a new kernel function based on similarity degree is proposed for big sample data, and KPCA with the new kernel function is introduced into the research on robotic performance comprehensive evaluation.
1. The comprehensive performance scores of the PUMA560 robot with different dimensions are calculated by PCA and by KPCA with the polynomial kernel function and with the new kernel function, respectively, and the robotic comprehensive performance is thereby measured. As a result, the effectiveness of the PCA method and of the KPCA method with different kernel functions for robotic comprehensive performance evaluation is proved.

2. A new kernel function based on similarity degree is proposed in place of the polynomial kernel function for big sample data, to solve the time-consumption and inefficiency problem, and it is proved according to Mercer's theorem. The simulation shows that the KPCA method with the new kernel function has the advantages of low computational complexity, good real-time performance, and good generalization ability.
Comparing the dimension reduction of PCA and of KPCA with the polynomial kernel function and with the new kernel function indicates that KPCA with the new kernel function deals effectively with the nonlinear relationships between the robotic indexes; through its first principal component it provides more information, so its results are more reasonable.
The method can comprehensively evaluate not only the global performance indexes but also other performance indexes, such as fault-tolerant performance, obstacle avoidance, joint limit avoidance, joint torque minimization, system kinetic energy minimization, dynamic performance, and so on. A variety of performances can be combined for comprehensive evaluation, and the correlation among the various types of performance can be presented. In future work, PCA and KPCA will be used to evaluate the comprehensive performance of multiple robots in engineering projects and to verify the effectiveness of the PCA and KPCA methods.

Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Natural Science Foundation of China [Grant Number 51415016].