### Quantification of prior knowledge about global characteristics of linear normal model

The paper presents a stopping rule for random search in Bayesian model-structure estimation via maximisation of the likelihood function. The inspected maximisation uses random restarts to cope with local maxima in a discrete space. The stopping rule, suitable for any maximisation of this type, exploits the probability of having found the global maximum implied by the number of local maxima found so far. It stops the search when this probability crosses a given threshold. The inspected case represents an important...
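The restart-and-stop idea above can be sketched as follows. The stopping criterion here uses a simple Good–Turing estimate of the probability mass of not-yet-visited basins of attraction, which is only an illustrative stand-in for the paper's rule; the toy objective and all function names are hypothetical.

```python
import random

def hill_climb(f, x, n_bits):
    # Greedy single-bit-flip ascent to a local maximum of f over bitstrings.
    improved = True
    while improved:
        improved = False
        for i in range(n_bits):
            y = x ^ (1 << i)
            if f(y) > f(x):
                x, improved = y, True
    return x

def restart_search(f, n_bits, threshold=0.05, max_restarts=1000, seed=0):
    rng = random.Random(seed)
    counts = {}  # local maximum -> number of restarts that reached it
    for n in range(1, max_restarts + 1):
        m = hill_climb(f, rng.getrandbits(n_bits), n_bits)
        counts[m] = counts.get(m, 0) + 1
        # Good-Turing estimate of the probability mass of unseen basins:
        # the fraction of basins visited exactly once so far.
        singletons = sum(1 for c in counts.values() if c == 1)
        if n >= 10 and singletons / n < threshold:
            break
    return max(counts, key=f), n

# Toy objective with many local maxima: best at popcount 5, plus a bonus point.
f = lambda x: -(bin(x).count("1") - 5) ** 2 + (10 if x == 0b1111100000 else 0)
best, restarts = restart_search(f, 10)
```

The rule stops once newly discovered local maxima become rare enough that the estimated chance of an unseen basin falls below the threshold.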

The paper presents an alternative approach to the design of a hybrid adaptive controller of Linear Quadratic Gaussian (LQG) type for a linear stochastic controlled system. The approach is based on combining standard building blocks of a discrete LQG adaptive controller with a non-standard technique for modelling the controlled system and a spline approximation of the involved signals. The method could be of interest for the control of systems with complex models, in particular distributed-parameter systems....
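As background for the standard LQG building blocks mentioned above, here is a minimal sketch of the discrete-time LQ regulator for a scalar system, obtained by iterating the Riccati recursion to its steady state; the plant parameters and cost weights are illustrative assumptions, not values from the paper.

```python
def lqr_gain(a, b, q, r, iters=200):
    # Scalar discrete-time Riccati recursion for x[t+1] = a*x[t] + b*u[t] + noise
    # with stage cost q*x**2 + r*u**2; iterated to the steady-state gain k,
    # so that u[t] = -k * x[t] is the LQ-optimal feedback.
    s = q
    for _ in range(iters):
        k = a * b * s / (r + b * b * s)
        s = q + a * a * s - a * b * s * k  # Riccati update
    return k

a, b = 1.2, 0.5            # unstable open loop (|a| > 1)
k = lqr_gain(a, b, q=1.0, r=0.1)
closed_loop = a - b * k    # stable if |a - b*k| < 1
```

In an adaptive LQG controller this gain computation is re-run as the parameter estimates of `a` and `b` are updated from data.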

Reconstruction of underlying physiological structures from a sequence of images is a long-standing problem which has been solved by factor analysis with some success. This paper returns to the roots of the problem, exploits the available findings, and proposes an improved paradigm.

The paper solves the problem of minimization of the Kullback divergence between a partially known and a completely known probability distribution. It considers two probability distributions of a random vector $({u}_{1},{x}_{1},...,{u}_{T},{x}_{T})$ on a sample space of $2T$ dimensions. One of the distributions is known, the other is known only partially. Namely, only the conditional probability distributions of ${x}_{\tau}$ given ${u}_{1},{x}_{1},...,{u}_{\tau -1},{x}_{\tau -1},{u}_{\tau}$ are known for $\tau =1,...,T$. Our objective is to determine the remaining conditional probability distributions of ${u}_{\tau}$ given ${u}_{1},{x}_{1},...,{u}_{\tau -1},{x}_{\tau -1}$ such...
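For the simplest case $T=1$ with finite-valued $u_1, x_1$, a minimisation of this kind has a well-known closed form: writing $\omega(u)$ for the conditional Kullback divergence of the known model of $x$ given $u$ from its counterpart in the fully known distribution $g$, the optimal $f(u)$ is proportional to $g(u)\exp(-\omega(u))$. A numerical sketch follows; all distributions and numbers are illustrative assumptions, and the result is checked against a brute-force grid search.

```python
import numpy as np

# Fully known distribution g(u, x) and the known conditional f(x|u); u, x binary.
g_u = np.array([0.5, 0.5])
g_x_given_u = np.array([[0.9, 0.1], [0.2, 0.8]])  # rows: u, cols: x
f_x_given_u = np.array([[0.7, 0.3], [0.4, 0.6]])

# Per-u divergence omega(u) = KL(f(.|u) || g(.|u)).
omega = (f_x_given_u * np.log(f_x_given_u / g_x_given_u)).sum(axis=1)

# Closed-form minimiser of KL(f || g) over f(u): f(u) proportional to
# g(u) * exp(-omega(u)), normalised to sum to one.
f_u = g_u * np.exp(-omega)
f_u /= f_u.sum()

def kl_joint(p_u):
    # Joint Kullback divergence for a candidate marginal p_u of u.
    fj = p_u[:, None] * f_x_given_u
    gj = g_u[:, None] * g_x_given_u
    return (fj * np.log(fj / gj)).sum()

# Brute-force check: minimise over a fine grid of p = f(u=0).
grid = np.linspace(1e-6, 1 - 1e-6, 10001)
vals = [kl_joint(np.array([p, 1 - p])) for p in grid]
p_star = grid[int(np.argmin(vals))]
```

The exponential down-weighting by $\omega(u)$ penalises those inputs $u$ under which the known $x$-model deviates most from the target distribution.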

Probabilistic mixtures provide flexible “universal” approximation of probability density functions. Their wide use is enabled by the availability of a range of efficient estimation algorithms. Among them, quasi-Bayesian estimation plays a prominent role as it runs “naturally” in one-pass mode. This is important in on-line applications and/or for extensive databases. It even copes with the dynamic nature of the components forming the mixture. However, quasi-Bayesian estimation relies on mixing via constant...
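A schematic one-pass update in the quasi-Bayesian spirit, for a one-dimensional Gaussian mixture with fixed component variances: each observation updates every component's statistics in proportion to its current posterior responsibility, so the data are visited exactly once. This is an illustrative sketch under simplifying assumptions, not the estimator analysed in the paper.

```python
import numpy as np

def quasi_bayes_gmm(data, means, variances, weights):
    # One-pass recursive mixture update: responsibilities from the current
    # point estimates weight the sufficient-statistic updates. Component
    # variances are held fixed here for simplicity.
    means = np.array(means, dtype=float)
    variances = np.array(variances, dtype=float)
    weights = np.array(weights, dtype=float)
    counts = np.ones_like(means)  # pseudo-counts acting as a flat prior
    for x in data:
        lik = np.exp(-(x - means) ** 2 / (2 * variances)) \
              / np.sqrt(2 * np.pi * variances)
        resp = weights * lik
        resp /= resp.sum()            # posterior component responsibilities
        counts += resp
        means += resp * (x - means) / counts   # recursive weighted mean
        weights = counts / counts.sum()
    return means, weights

# Two well-separated clusters, processed in a single randomised pass.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-5, 0.7, 200), rng.normal(5, 0.7, 200)])
rng.shuffle(data)
means, weights = quasi_bayes_gmm(data, [-1.0, 1.0], [1.0, 1.0], [0.5, 0.5])
```

Because each point is absorbed immediately into the pseudo-counts, the pass needs no storage of past data, which is what makes the quasi-Bayesian mode attractive on-line.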
