Palina Salanevich (Utrecht University)

Universal approximation properties of random neural networks

29 September 2022, 11:00-12:00 in HG03.082

Abstract

Single layer feedforward neural networks (SLFN) have been widely used for solving problems such as classification and regression because of their universal approximation capability. At the same time, the iterative methods used for training SLFN suffer from slow convergence, get trapped in local minima, and are sensitive to the initial choice of parameters. To counter this, researchers and practitioners have introduced random neural networks, in which the number of parameters to be learned is reduced. Following the original construction of Igelnik and Pao, single layer neural networks with random input-to-hidden layer weights and biases have seen success in practice, but have lacked the necessary theoretical justification. In this talk, we begin to bridge this gap and provide a non-asymptotic bound on the approximation error of such random neural networks in terms of the number of nodes in the hidden layer. We also discuss how this randomized neural network architecture can be generalized to approximate functions on smooth compact lower-dimensional manifolds, and provide theoretical guarantees in both asymptotic and non-asymptotic form for this generalization.
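To make the architecture concrete, the following is a minimal sketch (not the construction analyzed in the talk) of a random single layer network in the spirit of Igelnik and Pao: the input-to-hidden weights and biases are drawn at random and frozen, and only the output weights are fit, here by least squares. The activation, weight distributions, and hyperparameters (`n_hidden`, `scale`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_random_network(X, y, n_hidden=500, scale=1.0):
    """Fit the output weights of a random-feature network.

    X : (n_samples, d) inputs; y : (n_samples,) targets.
    n_hidden and scale are illustrative choices, not the talk's.
    """
    d = X.shape[1]
    W = rng.normal(0.0, scale, size=(d, n_hidden))  # random, never trained
    b = rng.uniform(-1.0, 1.0, size=n_hidden)       # random biases, frozen
    H = np.tanh(X @ W + b)                          # hidden-layer activations
    # Output weights are the only learned parameters: a linear least
    # squares problem, so no iterative training and no local minima.
    c, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, c

def predict(X, W, b, c):
    return np.tanh(X @ W + b) @ c

# Toy usage: approximate f(x) = sin(2*pi*x) on [0, 1].
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0])
W, b, c = fit_random_network(X, y)
X_test = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
err = np.max(np.abs(predict(X_test, W, b, c) - np.sin(2 * np.pi * X_test[:, 0])))
print(f"max abs error on test grid: {err:.3e}")
```

The approximation error bounds discussed in the talk quantify how such schemes behave as the hidden-layer width `n_hidden` grows.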


