Ruminating Reader: Reasoning with Gated Multi-Hop Attention

Published in arXiv, 2017

Recommended citation: Gong, Y., & Bowman, S. R. (2017). Ruminating Reader: Reasoning with Gated Multi-Hop Attention. arXiv.org. https://arxiv.org/abs/1704.07415

To answer a question in a machine comprehension (MC) task, a model needs to establish interactions between the question and the context. To address the problem that a single-pass model cannot reflect on and correct its answer, we present Ruminating Reader. Ruminating Reader adds a second pass of attention and a novel information fusion component to the Bi-Directional Attention Flow model (BiDAF). We propose novel layer structures that construct a query-aware context vector representation and fuse the encoding representation with this intermediate representation on top of the BiDAF model. We show that a multi-hop attention mechanism can be applied to a bi-directional attention structure. In experiments on SQuAD, we find that Ruminating Reader outperforms the BiDAF baseline by a substantial margin and matches or surpasses the performance of all other published systems.
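
The gated fusion idea from the abstract can be made concrete with a short sketch. Below is a minimal PyTorch illustration of a ruminate-style gate, not the paper's exact parameterization: the module name `GatedFusion`, the tensor shapes, and the single-linear-layer gate are assumptions made for clarity. The core idea is that a learned sigmoid gate mixes a fused candidate with the original encoding, so the second attention pass can revise the first-pass representation without discarding it.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Sketch of a ruminate-style gate (illustrative, not the paper's exact layer).

    Fuses the first-pass encoding with a pooled second-pass summary vector:
    a per-position, per-dimension gate decides how much of the fused
    candidate to keep versus falling back to the original encoding.
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Candidate representation built from [encoding; summary].
        self.candidate = nn.Linear(2 * hidden_dim, hidden_dim)
        # Gate deciding how much of the candidate replaces the encoding.
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, encoding: torch.Tensor, summary: torch.Tensor) -> torch.Tensor:
        # encoding: (batch, seq_len, hidden_dim) first-pass representation
        # summary:  (batch, hidden_dim) pooled intermediate representation
        expanded = summary.unsqueeze(1).expand_as(encoding)
        both = torch.cat([encoding, expanded], dim=-1)
        z = torch.tanh(self.candidate(both))   # fused candidate
        f = torch.sigmoid(self.gate(both))     # per-dimension gate in (0, 1)
        return f * z + (1.0 - f) * encoding    # gated residual mix
```

The gated residual form lets the layer default to the first-pass encoding when the gate saturates toward zero, which is what makes stacking a second attention hop safe in this kind of architecture.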

Paper link: https://arxiv.org/abs/1704.07415

BibTeX:

@article{Gong:2017wo,
	author = {Gong, Yichen and Bowman, Samuel R.},
	title = {Ruminating Reader: Reasoning with Gated Multi-Hop Attention},
	journal = {arXiv.org},
	year = {2017},
	eprint = {1704.07415v1},
	eprinttype = {arxiv},
	eprintclass = {cs.CL},
	month = apr,
	annote = {10 pages, 6 figures}
}