The present paper is a continuation of [2], in which we deal with duality for a multiobjective fractional optimization problem. The basic idea in [2] consists in attaching an intermediate multiobjective convex optimization problem to the primal fractional problem, using an approach due to Dinkelbach ([6]), for which we then construct a dual problem expressed in terms of the conjugates of the functions involved. The weak, strong and converse duality statements for the intermediate problems allow...
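As a rough illustration of the Dinkelbach-type construction mentioned above (a sketch only; the precise intermediate problem and its assumptions are those of [2] and [6]), the fractional objectives are traded for parametric differences of numerators and denominators,
\[
\min_{x\in\mathcal{A}} \Bigl(\tfrac{f_1(x)}{g_1(x)},\dots,\tfrac{f_m(x)}{g_m(x)}\Bigr)
\quad\rightsquigarrow\quad
\min_{x\in\mathcal{A}} \bigl(f_1(x)-\lambda_1 g_1(x),\dots,f_m(x)-\lambda_m g_m(x)\bigr),
\]
with the parameters \(\lambda_i\) taken as the ratio values of a candidate efficient solution. If, say, the \(f_i\) are convex and nonnegative and the \(g_i\) are concave and positive, the parameters are nonnegative and each \(f_i-\lambda_i g_i\) is convex, so the intermediate problem is a convex one and conjugate duality becomes available.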
Supervised learning methods are powerful techniques for learning a function from a given set of labeled data, the so-called training data. In this paper, the support vector machines approach is applied to an image classification task. Starting with the corresponding Tikhonov regularization problem, reformulated as a convex optimization problem, we introduce a conjugate dual problem to it and prove that, whenever strong duality holds, the function to be learned can be expressed via the dual optimal solutions...
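A minimal sketch of the sort of Tikhonov regularization problem meant here, stated in a generic reproducing kernel Hilbert space \(\mathcal{H}_K\) with a convex loss \(v\) (the specific loss, kernel and constants of the paper are not reproduced):
\[
\inf_{f\in\mathcal{H}_K}\;\Bigl\{\, C\sum_{i=1}^{n} v\bigl(f(x_i),y_i\bigr) + \tfrac{1}{2}\,\|f\|_{\mathcal{H}_K}^{2} \,\Bigr\}.
\]
When strong duality holds for a conjugate dual of such a problem, the learned function can be written in the kernel-expansion form \(f(\cdot)=\sum_{i=1}^{n}\bar{c}_i\,K(x_i,\cdot)\), with coefficients \(\bar{c}_i\) read off from an optimal dual solution, in line with the representer theorem.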
We consider a convex optimization problem with a vector-valued objective function and convex cone inequality constraints. We suppose that each entry of the objective function is a composition of convex functions. Our aim is to provide necessary and sufficient optimality conditions for the weakly efficient solutions of this vector problem. Moreover, a multiobjective dual treatment is given and weak and strong duality assertions are proved.
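For concreteness, a sketch of the kind of composed convex vector problem described (the exact convexity and monotonicity assumptions are those of the paper; the notation below is only illustrative):
\[
\min_{\substack{x\in\mathbb{R}^n\\ g(x)\in -K}} \bigl(f_1(F_1(x)),\dots,f_m(F_m(x))\bigr),
\]
where \(K\) is a nonempty convex cone, \(g\) is \(K\)-convex and each entry \(f_i\circ F_i\) is a composition of convex functions, with \(f_i\) additionally increasing in a suitable sense so that the composition remains convex. A feasible point is weakly efficient if no other feasible point makes all entries of the objective strictly smaller.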
Since the already huge database of patent documents keeps growing, classifying, updating and retrieving patent documents has become an acute necessity. Therefore, we investigate the efficiency of applying Latent Semantic Indexing, an automatic indexing method from information retrieval, to some classes of patent documents from the United States Patent Classification System. We present some experiments that provide the optimal number of dimensions for the Latent Semantic Space and...
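For orientation, a minimal sketch of how Latent Semantic Indexing is typically realised, namely as a truncated singular value decomposition of a weighted term-document matrix (scikit-learn and the toy corpus below are assumptions made purely for illustration; the paper's corpus, preprocessing and experimentally determined number of dimensions are not reproduced):

# Minimal LSI sketch: TF-IDF term-document matrix + truncated SVD (illustrative toy corpus)
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

documents = [                                   # hypothetical stand-ins for patent texts
    "method for coating an optical fibre with a polymer layer",
    "apparatus for classifying digital images using support vectors",
    "process for retrieving documents from a large patent database",
    "semiconductor device with reduced power consumption",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(documents)              # sparse (n_documents, n_terms) matrix

k = 2                                           # number of LSI dimensions; the paper's experiments seek the optimal value
lsa = TruncatedSVD(n_components=k, random_state=0)
X_lsa = lsa.fit_transform(X)                    # documents projected into the latent semantic space

# Retrieval: rank documents by cosine similarity to a query in the latent space
query = tfidf.transform(["retrieving patent documents"])
scores = cosine_similarity(lsa.transform(query), X_lsa).ravel()
print(scores.argsort()[::-1])                   # document indices, most similar first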
In this note we provide closedness-type regularity conditions which, by using representative functions, guarantee some surjectivity results concerning the sum of two maximal monotone operators. The first regularity condition we give guarantees the surjectivity of the monotone operator S(· + p) + T(·), where p ∈ X and S and T are maximal monotone operators on the reflexive Banach space X. Then, this is used to obtain sufficient conditions for the surjectivity of S + T and for the situation when...
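For context, the canonical representative function of a maximal monotone operator \(S : X \rightrightarrows X^*\) is its Fitzpatrick function (given here only as background; the precise closedness-type conditions of the note are formulated in terms of such representative functions but are not reproduced):
\[
\varphi_S(x,x^*) \;=\; \sup_{(y,y^*)\in G(S)} \bigl(\langle y, x^*\rangle + \langle x, y^*\rangle - \langle y, y^*\rangle\bigr),
\]
a convex and lower semicontinuous function satisfying \(\varphi_S(x,x^*) \ge \langle x, x^*\rangle\) on \(X \times X^*\), with equality precisely on the graph \(G(S)\).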