
Employing different loss functions for the classification of images via supervised learning

Radu Boţ, André Heinrich, Gert Wanka (2014)

Open Mathematics

Supervised learning methods are powerful techniques for learning a function from a given set of labeled data, the so-called training data. In this paper, the support vector machine approach is applied to an image classification task. Starting from the corresponding Tikhonov regularization problem, reformulated as a convex optimization problem, we introduce its conjugate dual problem and prove that, whenever strong duality holds, the function to be learned can be expressed via the dual optimal solutions....
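The abstract concerns the conjugate-dual formulation of a Tikhonov-regularized SVM; as a minimal practical sketch of the general setting (an SVM learning an image classifier from labeled training data), one can use scikit-learn's standard dual-form solver. The dataset, kernel, and split below are illustrative assumptions, not the paper's setup or loss functions.

```python
# Minimal sketch: supervised image classification with an SVM.
# Generic scikit-learn dual-form SVC, not the paper's conjugate-dual method.
from sklearn import datasets, svm
from sklearn.model_selection import train_test_split

digits = datasets.load_digits()                    # 8x8 grayscale digit images
X = digits.images.reshape(len(digits.images), -1)  # flatten each image to a feature vector
y = digits.target                                  # labels of the training data

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# C controls the regularization trade-off, analogous to the Tikhonov parameter.
clf = svm.SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)

accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.3f}")
```

Internally, `SVC` solves the dual quadratic program, so the learned function is indeed expressed through dual optimal solutions (the support-vector coefficients).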

Event-triggered design for multi-agent optimal consensus of Euler-Lagrangian systems

Xue-Fang Wang, Zhenhua Deng, Song Ma, Xian Du (2017)

Kybernetika

In this paper, a distributed optimal consensus problem is investigated: minimizing the sum of the local cost functions of a group of agents with Euler-Lagrangian (EL) dynamics. Each agent's local cost function is known only to itself and cannot be shared with others, which makes this distributed optimization problem challenging. A novel gradient-based distributed continuous-time algorithm involving the parameters of the EL system is proposed, which takes the distributed...
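The core idea (each agent follows its private gradient plus consensus feedback) can be sketched numerically. This is a simplified stand-in, not the paper's algorithm: it assumes single-integrator agents instead of full Euler-Lagrange dynamics, hypothetical quadratic private costs f_i(x) = (x - a_i)^2, a complete communication graph, and a forward-Euler discretization of a standard continuous-time law with an integral (dual) term that enforces exact consensus.

```python
# Sketch of gradient-based distributed optimal consensus (simplified model).
import numpy as np

a = np.array([1.0, 3.0, 5.0, 7.0])   # hypothetical private cost parameters
n = len(a)
x = np.zeros(n)                      # each agent's decision variable
v = np.zeros(n)                      # integral (dual) state for exact consensus
L = n * np.eye(n) - np.ones((n, n))  # Laplacian of the complete graph
dt = 0.01                            # Euler step for the continuous-time law

for _ in range(20000):
    grad = 2.0 * (x - a)             # local gradient, computed by each agent alone
    x_dot = -grad - L @ x - L @ v    # gradient flow + consensus + integral feedback
    v_dot = L @ x                    # accumulates disagreement, driving L @ x -> 0
    x = x + dt * x_dot
    v = v + dt * v_dot

# The summed cost is minimized at the average of the a_i.
print(np.round(x, 3))   # every entry approaches 4.0 = mean(a)
```

At equilibrium, v_dot = 0 forces consensus (L @ x = 0), and summing the x-dynamics over agents forces the gradients of the local costs to cancel, so the common value minimizes the sum of the private costs.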

Extensions of convex functionals on convex cones

E. Ignaczak, A. Paszkiewicz (1998)

Applicationes Mathematicae

We prove that, under some topological assumptions (e.g. if M has nonempty interior in X), a convex cone M in a linear topological space X is a linear subspace if and only if each convex functional on M has a convex extension to the whole space X.
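The claim of the abstract can be transcribed formally (this is a restatement of the sentence above, not additional content from the paper):

```latex
Let $X$ be a linear topological space and $M \subseteq X$ a convex cone
satisfying suitable topological assumptions (e.g.\ $\operatorname{int} M \neq \emptyset$).
Then
\[
  M \text{ is a linear subspace of } X
  \iff
  \text{every convex } f : M \to \mathbb{R} \text{ admits a convex }
  \tilde{f} : X \to \mathbb{R} \text{ with } \tilde{f}\big|_{M} = f .
\]
```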
