Abstract: To improve the performance of multilayer perceptron (MLP) neural networks activated by conventional activation functions, this paper presents a new MLP, composed of more than one hidden layer, activated by univariate Gaussian radial basis functions (RBFs) with adaptive centers and widths. In each hidden layer of the RBF-activated MLP network (MLP-RBF), the outputs of the preceding layer are first linearly transformed and then fed into the univariate Gaussian RBF, which exploits the highly nonlinear property of the RBF. Adaptive RBFs may address the issues of saturated outputs, low sensitivity, and vanishing gradients that arise in MLPs activated by other prevailing nonlinear functions. We apply four MLP networks, activated by the rectified linear unit (ReLU), the sigmoid function, the hyperbolic tangent function (tanh), and the Gaussian RBF, respectively, to approximate the one-dimensional (1D) sinusoidal function, the analytical solution of the viscous Burgers' equation, and the two-dimensional (2D) steady lid-driven cavity flows. With the same network structure, MLP-RBF generally predicts more accurately and converges faster than the other three MLPs. Moreover, MLP-RBF with fewer hidden layers and/or fewer neurons per layer can yield comparable or even higher approximation accuracy than the other MLPs equipped with more layers or neurons.
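The hidden-layer construction described above (a linear transform of the preceding layer's outputs fed into a univariate Gaussian RBF with trainable centers and widths) can be sketched as follows. This is a minimal illustration under assumed shapes and variable names, not the authors' implementation; in practice the centers `c` and widths `s` would be trained alongside the weights.

```python
import numpy as np

def rbf_layer(x, W, b, c, s):
    """One hidden layer of an MLP-RBF (sketch).

    x: (batch, n_in) outputs of the preceding layer
    W: (n_in, n_out), b: (n_out,) -- linear transform parameters
    c: (n_out,) adaptive centers, s: (n_out,) adaptive widths
    """
    z = x @ W + b  # linear transform of the preceding layer's outputs
    # Univariate Gaussian RBF applied elementwise to each pre-activation
    return np.exp(-((z - c) ** 2) / (2.0 * s ** 2))

# Tiny usage example with hypothetical sizes
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 5))
b = np.zeros(5)
c = np.zeros(5)   # centers (adaptive, i.e. trained, in the paper)
s = np.ones(5)    # widths (adaptive, i.e. trained, in the paper)
h = rbf_layer(x, W, b, c, s)
print(h.shape)    # (4, 5)
```

Because the Gaussian RBF output lies in (0, 1] and peaks where the pre-activation matches the center, shifting the centers and widths during training moves each neuron's sensitive region, which is how the adaptive RBF may avoid the saturation and vanishing-gradient behavior of fixed sigmoidal activations.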
Corresponding Authors:
Chang Shu
E-mail: mpeshuc@nus.edu.sg
Cite this article:
Qinghua Jiang, Lailai Zhu, Chang Shu, et al. Multilayer perceptron neural network activated by adaptive Gaussian radial basis function and its application to predict lid-driven cavity flow[J]. Acta Mechanica Sinica, 2021, 37(12): 1759-1774.