Ambiguity is a notorious problem for Natural Language Processing. Following results obtained by Schmitz and Quantz, I view disambiguation as a process in which contextual defaults are used to derive the most preferred interpretation of an expression. I show how contextual information comprising grammatical as well as conceptual knowledge can be modeled in a homogeneous manner using Terminological Logics (TL). I slightly modify the default extension to TL presented by Quantz and Royer to allow a relevance ordering between multisets of defaults. The preferred interpretation is the one containing the fewest exceptions with respect to such an ordering; interpretation is thus achieved by exception minimization. I combine this idea with deductive and abductive approaches to interpretation and show how they can be formalized in terms of TL entailment. Furthermore, I obtain a variable depth of analysis by controlling the granularity of interpretation via a set of relevant features.
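The core preference mechanism, choosing the interpretation with the fewest exceptions under a relevance ordering on defaults, can be illustrated with a minimal sketch. This is not the paper's formalism; the ranks, readings, and function names below are illustrative assumptions, using a lexicographic comparison of per-rank violation counts as one plausible rendering of the relevance ordering.

```python
# Hypothetical sketch of exception minimization over ranked defaults.
# Each default carries a relevance rank (0 = most relevant); a candidate
# interpretation is scored by how many defaults it violates at each rank,
# and candidates are compared lexicographically over those counts.

def exception_vector(violated_ranks, num_ranks):
    """Count default violations per relevance rank."""
    vec = [0] * num_ranks
    for r in violated_ranks:
        vec[r] += 1
    return tuple(vec)

def preferred(candidates, num_ranks):
    """candidates: list of (interpretation, violated_ranks) pairs.
    Returns the interpretation whose exception vector is smallest,
    i.e. the one with the fewest high-relevance exceptions."""
    return min(candidates,
               key=lambda c: exception_vector(c[1], num_ranks))[0]

# Illustrative example: two readings of an ambiguous expression.
readings = [
    ("reading-A", [0, 2]),  # violates one rank-0 and one rank-2 default
    ("reading-B", [1, 1]),  # violates two rank-1 defaults
]
print(preferred(readings, 3))  # reading-B: it has no rank-0 exceptions
```

Note that under this lexicographic reading, a single exception to a highly relevant default outweighs any number of exceptions to less relevant ones.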