Approximation by perturbed neural network operators
Applicationes Mathematicae (2015)
- Volume: 42, Issue: 1, page 57-81
- ISSN: 1233-7234
How to cite
topGeorge A. Anastassiou. "Approximation by perturbed neural network operators." Applicationes Mathematicae 42.1 (2015): 57-81. <http://eudml.org/doc/279947>.
@article{GeorgeA2015,
abstract = {This article determines the rate of convergence to the unit operator for each of three newly introduced perturbed normalized neural network operators of one hidden layer. The estimates are given through the modulus of continuity of the function involved, or of its high-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can derive from any sigmoid or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich or quadrature type. We give applications for the first derivative of the function involved.},
author = {George A. Anastassiou},
journal = {Applicationes Mathematicae},
keywords = {neural network approximation; perturbation of operators; modulus of continuity; Jackson type inequality},
language = {eng},
number = {1},
pages = {57-81},
title = {Approximation by perturbed neural network operators},
url = {http://eudml.org/doc/279947},
volume = {42},
year = {2015},
}
TY - JOUR
AU - George A. Anastassiou
TI - Approximation by perturbed neural network operators
JO - Applicationes Mathematicae
PY - 2015
VL - 42
IS - 1
SP - 57
EP - 81
AB - This article determines the rate of convergence to the unit operator for each of three newly introduced perturbed normalized neural network operators of one hidden layer. The estimates are given through the modulus of continuity of the function involved, or of its high-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can derive from any sigmoid or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich or quadrature type. We give applications for the first derivative of the function involved.
LA - eng
KW - neural network approximation; perturbation of operators; modulus of continuity; Jackson type inequality
UR - http://eudml.org/doc/279947
ER -
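For orientation, the Jackson-type estimates described in the abstract typically have the following schematic shape. This is an illustrative template only, not the paper's exact statement; the operator $A_n$, constant $c$, and exponent $\alpha$ stand in for the paper's specific perturbed normalized neural network operators and constants.

```latex
% Schematic Jackson-type inequality (illustrative; constants and
% operators in the paper differ).
% A_n : a perturbed normalized one-hidden-layer neural network operator,
% \omega_1 : the first modulus of continuity, 0 < \alpha < 1.
\[
  \bigl\| A_n(f) - f \bigr\|_{\infty}
  \;\le\;
  c \,\omega_1\!\left( f, \frac{1}{n^{\alpha}} \right),
  \qquad f \in C([a,b]).
\]
% Since \omega_1(f, \delta) \to 0 as \delta \to 0^{+} for continuous f,
% this yields A_n(f) \to f uniformly, at a rate governed by the
% smoothness of f and, as the abstract notes, with a right-hand side
% independent of the activation function.
```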