This paper presents a Japanese zero anaphora resolution model that handles both intra- and inter-sentential zero anaphora. Resolving inter-sentential anaphora requires considering a large number of antecedent candidates beyond the sentence boundary, which poses a major obstacle both to training the model and to resolving the anaphora. To cope with this problem, we propose an effective candidate pruning method that uses case frame information. We also introduce a local single-attention RNN for inter-sentential anaphora resolution, which allows the model to consider context distant from the target predicate. We evaluated the proposed models on a Japanese balanced corpus and confirmed the effectiveness of the candidate pruning, which yielded a 0.056-point increase in accuracy.
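The candidate pruning idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: the case-frame representation, predicate, nouns, and threshold below are all assumptions made for the example. The intuition is that a zero pronoun's antecedent must fit the selectional preferences of the target predicate's case slot, so candidates whose head noun never (or rarely) fills that slot in the case frame can be discarded before resolution.

```python
# Hypothetical sketch of case-frame-based candidate pruning.
# A case frame is modeled here as: {case_slot: {noun: relative frequency}}.

def prune_candidates(candidates, case_frame, case_slot, threshold=0.0):
    """Keep only antecedent candidates whose head noun fills the given
    case slot of the predicate's case frame above the threshold."""
    slot = case_frame.get(case_slot, {})
    return [c for c in candidates if slot.get(c, 0.0) > threshold]

# Toy case frame for the predicate "taberu" (to eat); the nouns and
# frequencies are invented for illustration.
taberu_frame = {
    "ga": {"hito": 0.6, "inu": 0.3},   # nominative slot: person, dog
    "wo": {"ringo": 0.5, "pan": 0.4},  # accusative slot: apple, bread
}

# Candidates gathered from preceding sentences: person, apple, desk.
candidates = ["hito", "ringo", "tsukue"]
print(prune_candidates(candidates, taberu_frame, "ga"))  # ['hito']
```

Pruning in this way shrinks the candidate set before the resolution model scores it, which is what makes searching beyond the sentence boundary tractable.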