Displaying 41 – 60 of 84


New bounds for the minimum eigenvalue of 𝓜-tensors

Jianxing Zhao, Caili Sang (2017)

Open Mathematics

A new lower bound and a new upper bound for the minimum eigenvalue of an 𝓜-tensor are obtained. It is proved that the new lower and upper bounds improve the corresponding bounds provided by He and Huang (J. Inequal. Appl., 2014, 2014, 114) and Zhao and Sang (J. Inequal. Appl., 2016, 2016, 268). Finally, two numerical examples are given to verify the theoretical results.

New criteria for H-tensors and an application

Feng Wang, Deshu Sun (2015)

Open Mathematics

Some new criteria for identifying H-tensors are obtained. As an application, some sufficient conditions for the positive definiteness of an even-order real symmetric tensor are given. The advantages of the obtained results are illustrated by numerical examples.

New iterative codes for 𝓗-tensors and an application

Feng Wang, Deshu Sun (2016)

Open Mathematics

New iterative codes for identifying 𝓗-tensors are obtained. As an application, some sufficient conditions for the positive definiteness of an even-order real symmetric tensor, i.e., an even-degree homogeneous polynomial form, are given. The advantages of the obtained results are illustrated by numerical examples.

Norm estimates for solutions of matrix equations AX-XB=C and X-AXB=C

Michael I. Gil' (2014)

Discussiones Mathematicae, Differential Inclusions, Control and Optimization

Let A, B and C be matrices. We consider the matrix equations Y - AYB = C and AX - XB = C. Sharp norm estimates for solutions of these equations are derived. From these estimates, a bound on the distance between invariant subspaces of matrices is obtained.
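The second equation type, X - AXB = C, admits a simple numerical sketch: when the map X ↦ AXB is a contraction (e.g. ‖A‖·‖B‖ < 1), the fixed-point iteration X ← AXB + C converges to the solution. The matrices and tolerance below are hypothetical illustrations, not taken from the article, and the article's sharp norm estimates are not reproduced here.

```python
# Illustrative sketch: solve X - A X B = C by the fixed-point iteration
# X_{k+1} = A X_k B + C, which converges when ||A|| * ||B|| < 1.
# A, B, C below are small hypothetical examples, not from the article.

def matmul(P, Q):
    """Plain dense matrix product of nested lists."""
    m = len(Q)
    return [[sum(P[i][k] * Q[k][j] for k in range(m))
             for j in range(len(Q[0]))]
            for i in range(len(P))]

def solve_x_axb(A, B, C, iters=200):
    """Approximate the solution of X - A X B = C by iteration."""
    X = [row[:] for row in C]  # start from X0 = C
    for _ in range(iters):
        AXB = matmul(matmul(A, X), B)
        X = [[a + c for a, c in zip(r1, r2)] for r1, r2 in zip(AXB, C)]
    return X

A = [[0.2, 0.1], [0.0, 0.3]]
B = [[0.1, 0.0], [0.2, 0.1]]
C = [[1.0, 2.0], [3.0, 4.0]]
X = solve_x_axb(A, B, C)

# Residual of X - A X B - C should be numerically zero.
AXB = matmul(matmul(A, X), B)
residual = max(abs(X[i][j] - AXB[i][j] - C[i][j])
               for i in range(2) for j in range(2))
```

The same contraction argument underlies the Neumann-series representation X = Σₖ Aᵏ C Bᵏ; the iteration simply accumulates its partial sums.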

On two matrix derivatives by Kollo and von Rosen.

Heinz Neudecker (2003)

SORT

The article establishes relationships between the matrix derivatives of F with respect to X introduced by von Rosen (1988) and by Kollo and von Rosen (2000), and the Magnus-Neudecker (1999) matrix derivative. The usual transformations apply, and the Moore-Penrose inverse of the duplication matrix is used. X and F have the same dimension.

Rank of tensors of ℓ-out-of-k functions: An application in probabilistic inference

Jiří Vomlel (2011)

Kybernetika

Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference with these models when some of the conditional probability tables represent deterministic or noisy ℓ-out-of-k functions. These tables appear naturally in real-world applications when we observe a state of a variable that depends on its parents via an addition or noisy addition relation. We provide a lower bound on the rank and an upper bound for the symmetric border rank...
