TY - JOUR
T1 - Machine Learning Potential Model Based on Ensemble Bispectrum Feature Selection and Its Applicability Analysis
AU - Jiang, Jiawei
AU - Xu, Li Chun
AU - Li, Fenglian
AU - Shao, Jianli
N1 - Publisher Copyright:
© 2023 by the authors.
PY - 2023/1
Y1 - 2023/1
N2 - With the continuous improvement of machine learning methods, building interatomic machine learning potentials (MLPs) from datasets obtained by quantum mechanics calculations has become an effective technical approach to improving the accuracy of classical molecular dynamics simulations. The Spectral Neighbor Analysis Potential (SNAP) is one of the most commonly used machine learning potentials. It uses the bispectrum to encode the local environment of each atom in the lattice. The hyperparameter jmax controls the complexity and precision of the mapping between the local environment and the bispectrum descriptor. As jmax increases, the description becomes more accurate, but the number of parameters in the bispectrum descriptor increases dramatically, raising the computational complexity. To reduce the computational complexity without losing accuracy, this paper proposes a two-level ensemble feature selection (EFS) method for the bispectrum descriptor that combines a perturbation method with a feature-selector ensemble strategy. With the proposed method, a feature subset is selected from the original bispectrum descriptor dataset to build a dimension-reduced MLP. As an application and validation of the method, data for the elements Fe, Ni, Cu, Li, Mo, Si, and Ge are used to train SNAP-based linear regression models that predict atomic energies and forces, in order to evaluate the performance of the feature subsets. The experimental results show that the reduction in training complexity achieved by the EFS method is greater for the qSNAP features than for the SNAP features. Compared with existing methods, when the feature subset size is 0.7 times that of the original feature set, the proposed EFS method based on the SSWRP ensemble strategy achieves the best stability, with an average stability of 0.94 across all datasets. The training complexity of the linear regression model is reduced by about half, and the prediction complexity by about 30%.
AB - With the continuous improvement of machine learning methods, building interatomic machine learning potentials (MLPs) from datasets obtained by quantum mechanics calculations has become an effective technical approach to improving the accuracy of classical molecular dynamics simulations. The Spectral Neighbor Analysis Potential (SNAP) is one of the most commonly used machine learning potentials. It uses the bispectrum to encode the local environment of each atom in the lattice. The hyperparameter jmax controls the complexity and precision of the mapping between the local environment and the bispectrum descriptor. As jmax increases, the description becomes more accurate, but the number of parameters in the bispectrum descriptor increases dramatically, raising the computational complexity. To reduce the computational complexity without losing accuracy, this paper proposes a two-level ensemble feature selection (EFS) method for the bispectrum descriptor that combines a perturbation method with a feature-selector ensemble strategy. With the proposed method, a feature subset is selected from the original bispectrum descriptor dataset to build a dimension-reduced MLP. As an application and validation of the method, data for the elements Fe, Ni, Cu, Li, Mo, Si, and Ge are used to train SNAP-based linear regression models that predict atomic energies and forces, in order to evaluate the performance of the feature subsets. The experimental results show that the reduction in training complexity achieved by the EFS method is greater for the qSNAP features than for the SNAP features. Compared with existing methods, when the feature subset size is 0.7 times that of the original feature set, the proposed EFS method based on the SSWRP ensemble strategy achieves the best stability, with an average stability of 0.94 across all datasets. The training complexity of the linear regression model is reduced by about half, and the prediction complexity by about 30%.
KW - descriptors
KW - ensemble learning
KW - feature selection
KW - machine learning potential
UR - http://www.scopus.com/inward/record.url?scp=85146785572&partnerID=8YFLogxK
U2 - 10.3390/met13010169
DO - 10.3390/met13010169
M3 - Article
AN - SCOPUS:85146785572
SN - 2075-4701
VL - 13
JO - Metals
JF - Metals
IS - 1
M1 - 169
ER -