Improve neural representations with general exponential activation function for high-speed flows

Ge Jin, Deyou Wang, Pengfei Si, Jiao Liu, Shipeng Li*, Ningfei Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Characterizing flow fields with neural networks has seen a considerable surge of interest in recent years. However, the efficacy of these techniques is typically limited when they are applied to high-speed compressible flows, owing to their susceptibility to nonphysical oscillations near shock waves. In this work, we focus on a crucial fundamental component of neural networks, the activation function, to improve physics-informed neural representations of high-speed compressible flows. We present a novel activation function, namely, the generalized exponential activation function, designed specifically around the intrinsic characteristics of high-speed compressible flows. We then analyze the performance of the proposed method comprehensively, covering training stability, the initialization strategy, and the influence of ancillary components. Finally, a series of representative experiments validates the efficacy of the proposed method, including the contact-discontinuity problem, the Sod shock-tube problem, and the converging-diverging nozzle flow problem.
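The abstract does not state the activation's exact functional form. As an illustrative sketch only, one plausible instance of an exponential-family activation is an elementwise map f(x) = a·exp(b·x) with tunable scalars a and b; the function name `general_exp`, the parameterization, and the small demo network below are all assumptions, not the paper's definition:

```python
import numpy as np

def general_exp(x, a=1.0, b=0.1):
    """Elementwise exponential-family activation f(x) = a * exp(b * x).

    Hypothetical parameterization: `a` scales the output and `b` scales the
    input; in a trainable setting both would be learned alongside the weights.
    A small `b` at initialization keeps exp() from overflowing early on.
    """
    return a * np.exp(b * x)

# Tiny two-layer network mapping (x, t) sample points to a scalar field value,
# as one might use to represent a 1D flow field in a physics-informed setting.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((2, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)

def mlp(xt):
    """Forward pass: linear -> general_exp activation -> linear."""
    h = general_exp(xt @ W1 + b1)
    return h @ W2 + b2

xt = rng.random((8, 2))   # batch of (x, t) collocation points
u = mlp(xt)               # predicted field values
print(u.shape)            # → (8, 1)
```

In an actual physics-informed training loop, the residuals of the governing compressible-flow equations would be evaluated on such collocation points via automatic differentiation; this numpy sketch shows only the activation and forward pass.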

Original language: English
Article number: 126117
Journal: Physics of Fluids
Volume: 36
Issue number: 12
DOIs
Publication status: Published - 1 Dec 2024

