On a class of forward-backward stochastic differential systems in infinite dimensions.
Assuming that a Markov process satisfies the minorization property, we study the existence and properties of solutions to the additive and multiplicative Poisson equations using splitting techniques. The problem is then extended to risk-sensitive and risk-neutral control problems and the corresponding Bellman equations.
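For orientation, a standard form of the objects named here is sketched below; this is an illustrative assumption, with the kernel P, potential f, invariant measure pi, and constants beta, Lambda as generic symbols rather than the paper's notation. The minorization (Doeblin-type) condition on a set C reads
\[ P(x, \cdot) \ \ge\ \beta\, \nu(\cdot) \quad \text{for all } x \in C, \qquad \beta \in (0,1],\ \nu \text{ a probability measure}, \]
the additive Poisson equation for a potential f with stationary mean \pi(f) is
\[ h(x) - (P h)(x) \ =\ f(x) - \pi(f), \]
and the multiplicative Poisson equation, whose constant \Lambda plays the role of the risk-sensitive ergodic value, is
\[ e^{\Lambda + h(x)} \ =\ \int P(x, dy)\, e^{f(x) + h(y)}. \]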
The research on a class of asymptotic exit-time problems with a vanishing Lagrangian, begun in [M. Motta and C. Sartori, Nonlinear Differ. Equ. Appl. (2014)] for the compact control case, is extended here to the case of unbounded controls and data, including both coercive and non-coercive problems. We give sufficient conditions to have a well-posed notion of generalized control problem and obtain regularity, characterization and approximation results for the value function of the problem...
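A typical exit-time value function of the kind studied here can be written as follows; the symbols l, g, f, y_x and the target set are generic placeholders, not taken from the paper:
\[ V(x) \ =\ \inf_{\alpha(\cdot)} \left\{ \int_0^{\tau_x(\alpha)} l\big(y_x(t), \alpha(t)\big)\, dt \ +\ g\big(y_x(\tau_x(\alpha))\big) \right\}, \]
where y_x solves \(\dot y = f(y,\alpha)\), \(y(0)=x\), and \(\tau_x(\alpha)\) is the first exit time from a given open set; "vanishing Lagrangian" refers to l being allowed to vanish (and, in the non-coercive case, to lack growth in the control), which is the source of the degeneracy.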
We study the asymptotic behavior, as a small parameter tends to zero, of the viscosity solution of a Hamilton-Jacobi-Isaacs equation (infinite horizon case). We discuss the cases in which the state of the system is required to stay in an n-dimensional torus (periodic boundary conditions) or in the closure of a bounded connected domain with sufficiently smooth boundary. As far as the latter is concerned, we treat both the case of the Neumann boundary conditions (reflection on the boundary)...
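The formulas of this abstract were lost in extraction; a setting consistent with the description (an assumption, given for illustration only) is the vanishing-discount limit for the lower-value Isaacs equation
\[ \lambda\, u_\lambda(x) \ +\ \min_{b \in B}\, \max_{a \in A} \big\{ -f(x,a,b) \cdot D u_\lambda(x) \ -\ \ell(x,a,b) \big\} \ =\ 0, \]
in which one studies the limit of \(\lambda u_\lambda\) as \(\lambda \to 0^+\), either on the torus \(\mathbb{T}^n\) (periodic case) or with a homogeneous Neumann condition \(\partial u_\lambda / \partial n = 0\) on the boundary in the reflection case.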
The paper considers the problem of active fault diagnosis for discrete-time stochastic systems over an infinite time horizon. It is assumed that the switching between a fault-free and finitely many faulty conditions can be modelled by a finite-state Markov chain and the continuous dynamics of the observed system can be described for the fault-free and each faulty condition by non-linear non-Gaussian models with a fully observed continuous state. The design of an optimal active fault detector that...
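A minimal sketch of the model class described, with generic placeholder notation (not the paper's), is: a mode process \(\mu_k\) evolving as a finite-state Markov chain, \(\mu_k = 0\) fault-free and \(\mu_k \in \{1,\dots,N\}\) faulty, driving fully observed non-linear non-Gaussian continuous dynamics
\[ x_{k+1} \ =\ f_{\mu_k}(x_k, u_k, w_k), \qquad \mathbb{P}\big(\mu_{k+1} = j \,\big|\, \mu_k = i\big) \ =\ p_{ij}, \]
where the active detector chooses excitation inputs \(u_k\) and fault decisions \(d_k\) so as to minimize an infinite-horizon (discounted or long-run average) detection-error criterion.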
Systematically using a tricky idea of N. V. Krylov, we obtain general results on the rate of convergence of a certain class of monotone approximation schemes for stationary Hamilton-Jacobi-Bellman equations with variable coefficients. This result applies in particular to control schemes based on the dynamic programming principle and to finite difference schemes, although here we are not able to treat the most general case. General results have been obtained earlier by Krylov for finite difference...
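A standard setting for rate results of this type, sketched here with generic notation as an assumption rather than the paper's precise hypotheses, is the stationary HJB equation
\[ \sup_{\alpha \in \mathcal{A}} \big\{ -\mathrm{tr}\big[a^\alpha(x)\, D^2 u(x)\big] \ -\ b^\alpha(x) \cdot D u(x) \ +\ c^\alpha(x)\, u(x) \ -\ f^\alpha(x) \big\} \ =\ 0 \quad \text{in } \mathbb{R}^n, \]
approximated by a monotone, consistent scheme written abstractly as \(S\big(h, x, u_h(x), [u_h]\big) = 0\); the conclusion is then an algebraic rate \(\|u - u_h\|_\infty \le C\, h^{\gamma}\) for some \(\gamma > 0\) depending on the regularity of the coefficients and the consistency error of the scheme.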
We are concerned with the optimal control of a nonlinear stochastic heat equation on a bounded real interval with Neumann boundary conditions. The specificity here is that both the control and the noise act on the boundary. We start by reformulating the state equation as an infinite dimensional stochastic evolution equation. The first main result of the paper is the proof of existence and uniqueness of a mild solution for the corresponding Hamilton-Jacobi-Bellman (HJB) equation. The C1 regularity...
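A typical form of such a boundary-control, boundary-noise state equation on the interval (0,1), written here with illustrative notation only (the nonlinearity F, controls u_i and noises W_i are generic symbols, not the paper's), is
\[ \partial_t y(t,\xi) \ =\ \partial_{\xi}^2 y(t,\xi) \ +\ F\big(\xi, y(t,\xi)\big), \qquad t > 0,\ \xi \in (0,1), \]
\[ \partial_\xi y(t,0) \ =\ u_0(t) + \dot W_0(t), \qquad \partial_\xi y(t,1) \ =\ u_1(t) + \dot W_1(t), \]
so that both the control and the white noise enter only through the Neumann boundary conditions; the reformulation mentioned in the abstract recasts this as an abstract stochastic evolution equation for the state in a suitable Hilbert space, with unbounded control and noise operators.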