Numerical Functional Analysis and Optimization, vol. 38, no. 7, pp. 819-830, 2017 (SCI-Expanded)
In this article, we study approximation properties of single hidden layer neural networks whose weights vary in finitely many directions and whose thresholds are taken from an open interval. We obtain a measure-theoretic condition that is both necessary and sufficient for the density of such networks in the space of continuous functions. Further, we prove a density result for neural networks with a specifically constructed activation function and a fixed number of neurons.
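To make the network class concrete, here is a minimal sketch (not taken from the paper) of a single hidden layer network of the form \(\sum_i c_i\,\sigma(s_i (d_i \cdot x) - \theta_i)\), where each weight vector lies along one of finitely many fixed directions \(d_i\) and each threshold \(\theta_i\) is drawn from an open interval. The function names, the sigmoid activation, and all numeric values are illustrative assumptions.

```python
import numpy as np

def sigmoid(t):
    # Standard logistic sigmoid; the paper's results concern more
    # general (and specifically constructed) activation functions.
    return 1.0 / (1.0 + np.exp(-t))

def shlnn(x, directions, scales, thetas, coeffs, activation=sigmoid):
    """Single hidden layer network: sum_i c_i * sigma(s_i * (d_i . x) - theta_i),
    with every weight vector s_i * d_i constrained to a finite set of directions."""
    out = 0.0
    for d, s, theta, c in zip(directions, scales, thetas, coeffs):
        out += c * activation(s * np.dot(d, x) - theta)
    return out

# Two fixed directions in R^2 (illustrative choice)
directions = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
scales = [2.0, -1.0]
thetas = [0.5, 0.25]   # thresholds from an open interval, e.g. (0, 1)
coeffs = [1.0, -0.5]

x = np.array([0.3, 0.7])
print(shlnn(x, directions, scales, thetas, coeffs))
```

The density question studied in the article asks when linear combinations of this form can approximate any continuous function on a compact set arbitrarily well, despite the weight directions being restricted to a finite set.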