Displaying 41 – 60 of 98


Employing different loss functions for the classification of images via supervised learning

Radu Boţ, André Heinrich, Gert Wanka (2014)

Open Mathematics

Supervised learning methods are powerful techniques for learning a function from a given set of labeled data, the so-called training data. In this paper, the support vector machines approach is applied to an image classification task. Starting with the corresponding Tikhonov regularization problem, reformulated as a convex optimization problem, we introduce a conjugate dual problem to it and prove that, whenever strong duality holds, the function to be learned can be expressed via the dual optimal solutions....

Minimax control of nonlinear evolution equations

Nikolaos S. Papageorgiou (1995)

Commentationes Mathematicae Universitatis Carolinae

In this paper we study the minimax control of systems governed by a nonlinear evolution inclusion of the subdifferential type. Using some continuity and lower semicontinuity results for the solution map and the cost functional respectively, we are able to establish the existence of an optimal control. The abstract results are then applied to obstacle problems, semilinear systems with weakly varying coefficients (e.g. oscillating coefficients) and differential variational inequalities.

Nonconvex Duality and Semicontinuous Proximal Solutions of HJB Equation in Optimal Control

Mustapha Serhani, Nadia Raïssi (2009)

RAIRO - Operations Research

In this work, we study an optimal control problem dealing with differential inclusions. Without requiring a Lipschitz condition on the set-valued map, it is very hard to find a solution of the control problem. Our aim is to obtain estimates of the minimal value, (α), of the cost function of the control problem. For this, we construct an intermediary dual problem leading to a weak duality result, and then, thanks to additional assumptions of monotonicity of the proximal subdifferential, we give a more...
