HOPE: High-Order Polynomial Expansion of Black-Box Neural Networks

Tingxiong Xiao, Weihang Zhang, Yuxiao Cheng, Jinli Suo*

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

1 citation (Scopus)
Metrics
  • Captures
    • Readers: 40
  • Mentions
    • News Mentions: 1

Abstract

Despite their remarkable performance, deep neural networks remain mostly 'black boxes', which implies a lack of interpretability and hinders their adoption in fields that require rational decision making. Here we introduce HOPE (High-order Polynomial Expansion), a method for expanding a network into a high-order Taylor polynomial around a reference input. Specifically, we derive the high-order derivative rule for composite functions and extend the rule to neural networks to obtain their high-order derivatives quickly and accurately. From these derivatives, we can then derive the Taylor polynomial of the neural network, which provides an explicit expression of the network's local interpretations. We combine the Taylor polynomials obtained under different reference inputs to obtain the global interpretation of the neural network. Numerical analysis confirms the high accuracy, low computational complexity, and good convergence of the proposed method. Moreover, we demonstrate HOPE's wide applications built on deep learning, including function discovery, fast inference, and feature selection. We compare HOPE with other XAI methods and demonstrate its advantages.
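The core idea — expanding a trained network into a Taylor polynomial around a reference input so its local behavior becomes an explicit formula — can be illustrated on a toy case. The sketch below is not the paper's HOPE algorithm (which handles general deep networks via a derivative rule for composite functions); it uses a hypothetical one-neuron 1-D network with hand-derived tanh derivatives, and all weights and the reference input are made up for illustration.

```python
import math

# Hypothetical toy "network": f(x) = w2 * tanh(w1*x + b1) + b2
w1, b1, w2, b2 = 1.5, 0.2, 0.8, -0.1

def f(x):
    return w2 * math.tanh(w1 * x + b1) + b2

def taylor_coeffs(x0):
    """Taylor coefficients of f at reference input x0, up to order 3.
    Uses closed-form tanh derivatives: t' = 1 - t^2, t'' = -2*t*t',
    t''' = -2*(t'^2 + t*t''), then the chain rule for the affine input."""
    t  = math.tanh(w1 * x0 + b1)
    t1 = 1 - t * t
    t2 = -2 * t * t1
    t3 = -2 * (t1 * t1 + t * t2)
    # d^k f / dx^k = w2 * t_k * w1^k; divide by k! for Taylor coefficients
    return [w2 * t + b2,
            w2 * t1 * w1,
            w2 * t2 * w1 ** 2 / 2.0,
            w2 * t3 * w1 ** 3 / 6.0]

def taylor_eval(coeffs, x0, x):
    """Evaluate the expansion sum_k c_k * (x - x0)^k."""
    return sum(c * (x - x0) ** k for k, c in enumerate(coeffs))

x0 = 0.3                      # reference input
coeffs = taylor_coeffs(x0)
# Near x0 the polynomial tracks the network; the residual is O((x - x0)^4)
approx_err = abs(f(x0 + 0.05) - taylor_eval(coeffs, x0, x0 + 0.05))
```

The coefficients themselves are the "explicit expression" the abstract refers to: their signs and magnitudes show how the network responds to perturbations of the reference input, which is what enables the function-discovery and feature-selection applications.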

Original language: English
Pages (from-to): 7924-7939
Number of pages: 16
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 46
Issue number: 12
DOI: 10.1109/TPAMI.2024.3399197
Publication status: Published - 2024
Externally published: Yes


Cite this

Xiao, T., Zhang, W., Cheng, Y., & Suo, J. (2024). HOPE: High-Order Polynomial Expansion of Black-Box Neural Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 46(12), 7924-7939. https://doi.org/10.1109/TPAMI.2024.3399197