Filter stability is a classical problem for partially observed Markov processes (POMPs). For a POMP, an incorrectly initialized non-linear filter is said to be stable if the filter eventually corrects itself as new measurement information arrives. In this paper, we first introduce a functional characterization of observability for a POMP and show that this characterization is sufficient to guarantee stability of the non-linear filter in a weak sense. Under further regularity conditions, we establish stability under the notions of weak convergence, total variation, and relative entropy, thus complementing and also unifying some existing results in the literature. In addition, we study controlled partially observed Markov decision processes (POMDPs) and arrive at analogous stability results once control, and hence non-Markovian dependence between random variables, is introduced into the system. This brings together results in non-linear filtering theory and stochastic control theory which had previously remained isolated.