ABCNN: Attention-Based Convolutional Neural Network for Modeling Sentence Pairs

Wenpeng Yin, Hinrich Schütze, Bing Xiang, Bowen Zhou


How to model a pair of sentences is a critical issue in many NLP tasks such as answer selection (AS), paraphrase identification (PI) and textual entailment (TE). Most prior work (i) deals with one individual task by fine-tuning a specific system; (ii) models each sentence's representation separately, rarely considering the impact of the other sentence; or (iii) relies fully on manually designed, task-specific linguistic features. This work presents a general Attention-Based Convolutional Neural Network (ABCNN) for modeling a pair of sentences. We make three contributions. (i) The ABCNN can be applied to a wide variety of tasks that require modeling of sentence pairs. (ii) We propose three attention schemes that integrate mutual influence between sentences into CNNs; thus, the representation of each sentence takes its counterpart into consideration. These interdependent sentence pair representations are more powerful than isolated sentence representations. (iii) ABCNNs achieve state-of-the-art performance on AS, PI and TE tasks. We release code at:
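To make the attention idea concrete, here is a minimal sketch of the attention matrix that couples the two sentences' feature maps. It assumes each sentence is a matrix with one d-dimensional vector per token and uses the match score 1/(1 + Euclidean distance) described in the paper; the function name and shapes are illustrative, not the released implementation.

```python
import numpy as np

def attention_matrix(s0, s1):
    """Pairwise attention between two sentence feature maps.

    s0: array of shape (len0, d); s1: array of shape (len1, d),
    one d-dimensional representation per token.
    A[i, j] = 1 / (1 + ||s0[i] - s1[j]||), so the score is higher
    when the two token representations are closer, and equals 1
    when they are identical.
    """
    diff = s0[:, None, :] - s1[None, :, :]      # (len0, len1, d)
    dist = np.sqrt((diff ** 2).sum(axis=-1))    # Euclidean distances
    return 1.0 / (1.0 + dist)

# Toy example: a 3-token and a 2-token "sentence" with d = 4 features.
rng = np.random.default_rng(0)
s0 = rng.standard_normal((3, 4))
s1 = rng.standard_normal((2, 4))
A = attention_matrix(s0, s1)
print(A.shape)  # one attention score per token pair: (3, 2)
```

In the paper, rows and columns of such a matrix are then used to reweight the convolution input or the convolution output, so each sentence's representation is conditioned on the other.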



Copyright (c) 2016 Association for Computational Linguistics

This work is licensed under a Creative Commons Attribution 4.0 International License.