When compared to alternative approaches, such as Gaussian Mixture Models (GMMs), particle clouds more faithfully represent uncertainty. A drawback of particle clouds, however, is that they do not provide the analyst with closed-form expressions for many standard information-theoretic quantities such as entropy and divergence. Recent advances in information theory have produced techniques for estimating such quantities approximately. One approach in the literature uses the k-th nearest neighbor (k-NN) algorithm to estimate the probability density function of the particle cloud; given this density estimate, one can then compute various information-theoretic quantities. In this paper, we review the k-NN algorithm and then discuss two applications. The first application is estimating the entropy of a particle cloud. Specifically, we show that the entropy of a nonlinear Hamiltonian system is conserved if canonical coordinates are used as the coordinate frame. The second application is estimating the divergence between two particle clouds. Specifically, we use the estimated Bhattacharyya divergence to solve an uncorrelated track (UCT) correlation problem.
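To make the k-NN entropy idea concrete, here is a minimal sketch of a Kozachenko-Leonenko-style k-NN entropy estimator for a particle cloud. This is an illustration under assumed details, not the paper's exact estimator: the function name `knn_entropy`, the choice `k=4`, and the brute-force distance computation are all illustrative assumptions, and NumPy is assumed available.

```python
import math
import numpy as np

EULER_GAMMA = 0.5772156649015329

def digamma_int(n):
    # Digamma at a positive integer: psi(n) = -gamma + sum_{j=1}^{n-1} 1/j.
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, n))

def knn_entropy(samples, k=4):
    """Kozachenko-Leonenko k-NN entropy estimate (in nats) for an (N, d) cloud.

    H ~= psi(N) - psi(k) + log(c_d) + (d/N) * sum_i log(eps_i),
    where eps_i is the distance from particle i to its k-th nearest
    neighbor and c_d is the volume of the d-dimensional unit ball.
    """
    n, d = samples.shape
    # Brute-force pairwise Euclidean distances (fine for small clouds;
    # a k-d tree would be used in practice).
    diff = samples[:, None, :] - samples[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)           # exclude self-distance
    eps = np.sort(dist, axis=1)[:, k - 1]    # k-th nearest-neighbor distance
    log_c_d = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    return digamma_int(n) - digamma_int(k) + log_c_d + d * np.mean(np.log(eps))

# Sanity check: a standard 2-D Gaussian cloud should give an estimate
# close to the true differential entropy log(2*pi*e) ~ 2.8379 nats.
rng = np.random.default_rng(0)
cloud = rng.standard_normal((1000, 2))
print(knn_entropy(cloud, k=4))
```

Given the density-free form of the estimator, no explicit density fit is ever constructed; the k-th neighbor distances stand in for local density, which is what makes the approach attractive for particle clouds.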