TY - GEN
T1 - Evolving Molecular Graph Neural Networks with Hierarchical Evaluation Strategy
AU - Yuan, Yingfang
AU - Wang, Wenjun
AU - Li, Xin
AU - Chen, Kefan
AU - Zhang, Yonghan
AU - Pang, Wei
N1 - Publisher Copyright:
© 2024 Copyright is held by the owner/author(s). Publication rights licensed to ACM.
PY - 2024/7/14
Y1 - 2024/7/14
N2 - Graph representations of molecular data enable the extraction of stereoscopic features, and graph neural networks (GNNs) excel at molecular property prediction. However, selecting optimal hyperparameters for GNN construction is challenging due to the vast search space and high computational costs. To tackle this, we introduce a hierarchical evaluation strategy integrated with a genetic algorithm (HESGA). HESGA combines full and fast evaluations of GNNs. Full evaluation trains a GNN for a preset number of epochs and uses the root mean square error (RMSE) to measure hyperparameter quality. Fast evaluation interrupts training early and uses the drop in RMSE during early training as a score for a GNN's potential. HESGA integrates these two evaluations, with fast evaluation guiding the selection of candidates for full evaluation while maintaining elite individuals. We apply HESGA to optimise deep GNNs for molecular property prediction; experimental results on three datasets demonstrate its superiority over traditional Bayesian optimisation, the Tree-structured Parzen Estimator, and CMA-ES. HESGA efficiently navigates the complex GNN hyperparameter space, offering a promising approach for molecular property prediction.
AB - Graph representations of molecular data enable the extraction of stereoscopic features, and graph neural networks (GNNs) excel at molecular property prediction. However, selecting optimal hyperparameters for GNN construction is challenging due to the vast search space and high computational costs. To tackle this, we introduce a hierarchical evaluation strategy integrated with a genetic algorithm (HESGA). HESGA combines full and fast evaluations of GNNs. Full evaluation trains a GNN for a preset number of epochs and uses the root mean square error (RMSE) to measure hyperparameter quality. Fast evaluation interrupts training early and uses the drop in RMSE during early training as a score for a GNN's potential. HESGA integrates these two evaluations, with fast evaluation guiding the selection of candidates for full evaluation while maintaining elite individuals. We apply HESGA to optimise deep GNNs for molecular property prediction; experimental results on three datasets demonstrate its superiority over traditional Bayesian optimisation, the Tree-structured Parzen Estimator, and CMA-ES. HESGA efficiently navigates the complex GNN hyperparameter space, offering a promising approach for molecular property prediction.
KW - graph neural networks
KW - hierarchical evaluation strategy
KW - hyperparameter optimisation
KW - molecular property prediction
UR - http://www.scopus.com/inward/record.url?scp=85206933980&partnerID=8YFLogxK
U2 - 10.1145/3638529.3654055
DO - 10.1145/3638529.3654055
M3 - Conference contribution
AN - SCOPUS:85206933980
T3 - GECCO 2024 - Proceedings of the 2024 Genetic and Evolutionary Computation Conference
SP - 1417
EP - 1425
BT - GECCO 2024 - Proceedings of the 2024 Genetic and Evolutionary Computation Conference
PB - Association for Computing Machinery, Inc.
T2 - 2024 Genetic and Evolutionary Computation Conference, GECCO 2024
Y2 - 14 July 2024 through 18 July 2024
ER -