Probabilistic belief contraction is an operation that takes a probability distribution P representing a belief state, together with an input sentence a representing information to be removed from that belief state, and outputs a new probability distribution P_a^-. The contracted belief state P_a^- can be represented as a mixture of two states: the original belief state P, and the state P_{¬a}^* that results from revising P by ¬a. Crucial to this mixture is the mixing factor ε, which determines, in a uniform manner, the proportions of P and P_{¬a}^* used in the process. Ideas from information theory, such as the principle of minimum cross-entropy, have previously been used to motivate the choice of the probabilistic contraction operation. Central to this principle is the Kullback-Leibler (KL) divergence. In earlier work we showed that the KL divergence of P_a^- from P is fully determined by a function whose only argument is the mixing factor ε. In this paper we provide a way of interpreting ε, in consonance with this result, in terms of a belief ranking mechanism such as epistemic entrenchment. We also provide a much-needed justification for using the mixing factor ε in a uniform fashion, by showing that the minimal divergence of P_a^- from P is achieved only when uniformity is respected.
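To make the mixture construction concrete, here is a minimal numerical sketch in Python with NumPy. The set of worlds, the particular distributions P and P_{¬a}^*, and the weighting convention (weight ε on P, 1 − ε on P_{¬a}^*) are illustrative assumptions of this sketch, not details taken from the paper; the final line simply illustrates, for this toy setup, a divergence from P that depends on ε alone.

```python
import numpy as np

# Toy model: four possible worlds, the first two satisfying a and the
# last two satisfying ¬a.  The specific numbers are illustrative only.
P = np.array([0.6, 0.4, 0.0, 0.0])      # original state: a is fully believed
P_rev = np.array([0.0, 0.0, 0.7, 0.3])  # hypothetical result of revising P by ¬a

def contract(P, P_rev, epsilon):
    """Contraction as a uniform epsilon-mixture of P and the revision by ¬a.
    The weighting convention (epsilon on P) is an assumption of this sketch."""
    return epsilon * P + (1.0 - epsilon) * P_rev

def kl(Q, R):
    """KL divergence D(Q || R), summed over the worlds where Q > 0."""
    m = Q > 0
    return float(np.sum(Q[m] * np.log(Q[m] / R[m])))

eps = 0.3
P_minus = contract(P, P_rev, eps)
print(P_minus.sum())   # a mixture of distributions is again a distribution
print(kl(P, P_minus))  # in this toy setup this equals -log(eps), i.e. a function of eps only
```

In this setup P_{¬a}^* places no mass on a-worlds, so on those worlds P_a^- is just ε·P; the divergence D(P || P_a^-) therefore collapses to −log ε regardless of the particular numbers chosen, which is one way to see how such a divergence can be a function of the mixing factor alone.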