Approximation by perturbed neural network operators

George A. Anastassiou

Applicationes Mathematicae (2015)

  • Volume: 42, Issue: 1, pages 57-81
  • ISSN: 1233-7234

Abstract

This article determines the rate of convergence to the unit operator of each of three newly introduced perturbed normalized neural network operators of one hidden layer. The rates are expressed through the modulus of continuity of the function involved, or of its higher-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can be derived from any sigmoidal or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich, or quadrature type. Applications are given for the first derivative of the function involved.
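The record does not reproduce the paper's definitions, so the sketch below is offered only for orientation: the typical shape of a one-hidden-layer normalized operator, the three perturbed sample functionals, and a Jackson-type bound, written in the notation common to the author's earlier work on neural network approximation. The specific forms, index ranges, and constants are assumptions, not quotations from this article.

% Hedged sketch (assumed forms, after Anastassiou's earlier papers):
% a normalized one-hidden-layer operator built from an activation-derived
% density b, with a perturbed sample functional \lambda_{n,k}(f).
\[
  G_n(f, x) =
  \frac{\sum_{k=-n^2}^{n^2} \lambda_{n,k}(f)\,
        b\bigl(n^{1-\alpha}(x - k/n)\bigr)}
       {\sum_{k=-n^2}^{n^2}
        b\bigl(n^{1-\alpha}(x - k/n)\bigr)},
  \qquad 0 < \alpha < 1.
\]
% Assumed forms of the three sample functionals named in the abstract:
%   Stancu type:      \lambda_{n,k}(f) = f((k + \mu)/(n + \nu))
%   Kantorovich type: \lambda_{n,k}(f) = n \int_{k/n}^{(k+1)/n} f(t) dt
%   quadrature type:  \lambda_{n,k}(f) = \sum_{r=1}^{\theta} w_r f(k/n + r/(n\theta))
% With the first modulus of continuity
% \omega_1(f, \delta) = \sup_{|x - y| \le \delta} |f(x) - f(y)|,
% a Jackson-type inequality bounds the rate of convergence to the unit
% (identity) operator, with a right-hand side independent of b:
\[
  \lVert G_n(f) - f \rVert_\infty
  \le C\, \omega_1\!\Bigl(f, \frac{1}{n^{1-\alpha}}\Bigr).
\]

For smoother f, the analogous bounds involve the modulus of continuity of a higher-order derivative of f and correspondingly faster rates; the abstract's applications for the first derivative sit in that regime.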

How to cite

George A. Anastassiou. "Approximation by perturbed neural network operators." Applicationes Mathematicae 42.1 (2015): 57-81. <http://eudml.org/doc/279947>.

@article{GeorgeA2015,
abstract = {This article determines the rate of convergence to the unit operator of each of three newly introduced perturbed normalized neural network operators of one hidden layer. The rates are expressed through the modulus of continuity of the function involved, or of its higher-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can be derived from any sigmoidal or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich, or quadrature type. Applications are given for the first derivative of the function involved.},
author = {George A. Anastassiou},
journal = {Applicationes Mathematicae},
keywords = {neural network approximation; perturbation of operators; modulus of continuity; Jackson type inequality},
language = {eng},
number = {1},
pages = {57-81},
title = {Approximation by perturbed neural network operators},
url = {http://eudml.org/doc/279947},
volume = {42},
year = {2015},
}

TY - JOUR
AU - George A. Anastassiou
TI - Approximation by perturbed neural network operators
JO - Applicationes Mathematicae
PY - 2015
VL - 42
IS - 1
SP - 57
EP - 81
AB - This article determines the rate of convergence to the unit operator of each of three newly introduced perturbed normalized neural network operators of one hidden layer. The rates are expressed through the modulus of continuity of the function involved, or of its higher-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can be derived from any sigmoidal or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich, or quadrature type. Applications are given for the first derivative of the function involved.
LA - eng
KW - neural network approximation; perturbation of operators; modulus of continuity; Jackson type inequality
UR - http://eudml.org/doc/279947
ER -
