
On the ψ₂-behaviour of linear functionals on isotropic convex bodies

G. Paouris (2005)

Studia Mathematica

The slicing problem can be reduced to the study of isotropic convex bodies K with diam(K) ≤ c√n L_K, where L_K is the isotropic constant. We study the ψ₂-behaviour of linear functionals on this class of bodies. It is proved that ‖⟨·,θ⟩‖_{ψ₂} ≤ C L_K for all θ in a subset U of Sⁿ⁻¹ with measure σ(U) ≥ 1 − exp(−c√n). However, there exist isotropic convex bodies K with uniformly bounded geometric distance from the Euclidean ball, such that max_{θ∈Sⁿ⁻¹} ‖⟨·,θ⟩‖_{ψ₂} ≥ c n^{1/4} L_K. In a different direction, we show that good average ψ₂-behaviour of linear functionals on an isotropic...
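
For context, recalled here rather than taken from the abstract: a convex body K ⊂ ℝⁿ of volume 1 with centroid at the origin is isotropic when its inertia is the same in every direction, and the ψ₂ norm is the usual Orlicz norm over K; under the standard conventions these read
\[
\int_K \langle x,\theta\rangle^2\,dx \;=\; L_K^2 \quad\text{for every }\theta\in S^{n-1},
\qquad
\|\langle\cdot,\theta\rangle\|_{\psi_2}
\;=\; \inf\Bigl\{\, t>0 \;:\; \int_K \exp\bigl(\langle x,\theta\rangle^2/t^2\bigr)\,dx \le 2 \,\Bigr\}.
\]
With this normalization one always has ‖⟨·,θ⟩‖_{ψ₂} ≥ ‖⟨·,θ⟩‖₂ = L_K, so the first result says that for most directions the ψ₂ norm is as small as possible, up to a constant.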

Operations between sets in geometry

Richard J. Gardner, Daniel Hug, Wolfgang Weil (2013)

Journal of the European Mathematical Society

An investigation is launched into the fundamental characteristics of operations on and between sets, with a focus on compact convex sets and star sets (compact sets star-shaped with respect to the origin) in n-dimensional Euclidean space ℝⁿ. It is proved that if n ≥ 2, with three trivial exceptions, an operation between origin-symmetric compact convex sets is continuous in the Hausdorff metric, GL(n) covariant, and associative if and only if it is Lₚ addition for some 1 ≤ p ≤ ∞. It is also demonstrated that if n ≥ 2,...
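
For reference, and not part of the abstract: for origin-symmetric compact convex sets, Lₚ addition (Firey addition) is commonly defined via support functions, with the case p = ∞ obtained as the limit; under that convention
\[
h_{K +_p L}(x)^p \;=\; h_K(x)^p + h_L(x)^p \quad (1 \le p < \infty),
\qquad
h_{K +_\infty L}(x) \;=\; \max\bigl\{h_K(x),\, h_L(x)\bigr\},
\]
so p = 1 recovers Minkowski addition, while the p = ∞ case gives the convex hull of the union.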

Optimal isometries for a pair of compact convex subsets of ℝⁿ

Irmina Herburt, Maria Moszyńska (2009)

Banach Center Publications

In 1989 R. Arnold proved that for every pair (A,B) of compact convex subsets of ℝⁿ there is a Euclidean isometry optimal with respect to the L₂ metric, and that if f₀ is such an isometry, then the Steiner points of f₀(A) and B coincide. In the present paper we solve related problems for metrics topologically equivalent to the Hausdorff metric, in particular for Lₚ metrics for all p ≥ 2 and for the symmetric difference metric.
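
As a reminder of the objects involved (standard definitions, with normalizations that may differ slightly from the paper's): the L₂ metric on convex bodies and the Steiner point are defined through support functions on the unit sphere,
\[
\delta_2(A,B) \;=\; \Bigl(\int_{S^{n-1}} \bigl(h_A(u)-h_B(u)\bigr)^2\,d\sigma(u)\Bigr)^{1/2},
\qquad
s(A) \;=\; n\int_{S^{n-1}} h_A(u)\,u\,d\sigma(u),
\]
where σ is the rotation-invariant probability measure on Sⁿ⁻¹. In this language Arnold's theorem states that any isometry f₀ minimizing δ₂(f(A),B) satisfies s(f₀(A)) = s(B).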

Optimality conditions for maximizers of the information divergence from an exponential family

František Matúš (2007)

Kybernetika

The information divergence of a probability measure P from an exponential family ℰ over a finite set is defined as the infimum of the divergences of P from Q subject to Q ∈ ℰ. All directional derivatives of the divergence from ℰ are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first-order conditions for P to be a maximizer of the divergence from ℰ are presented, including new ones when P is not projectable to ℰ.
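
In symbols, under the usual conventions (recalled here, not quoted from the paper, and writing Z for the finite base set): for probability measures P and Q on Z,
\[
D(P\,\|\,Q) \;=\; \sum_{z\in Z} P(z)\,\log\frac{P(z)}{Q(z)},
\qquad
D(P\,\|\,\mathcal{E}) \;=\; \inf_{Q\in\mathcal{E}} D(P\,\|\,Q),
\]
and the paper derives first-order optimality conditions for measures P that maximize D(P‖ℰ).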
