Long Text Analysis Using Sliced Recurrent Neural Networks with Breaking Point Information Enrichment

Bo Li, Zehua Cheng, Zhenghua Xu, Wei Ye, Thomas Lukasiewicz and Shikun Zhang


Sliced recurrent neural networks (SRNNs) are the state-of-the-art efficient solution for long text analysis tasks; however, their slicing operations inevitably cause long-term dependency loss in the lower-level networks and thus limit their accuracy. We therefore propose a breaking point information enrichment (BPIE) mechanism to strengthen the dependencies between sliced subsequences without hindering parallelization. The resulting BPIE-SRNN model is further extended to a bidirectional model, BPIE-BiSRNN, which exploits dependency information not only from the preceding but also from the following contexts. Experiments on four large public real-world datasets demonstrate that BPIE-SRNN and BPIE-BiSRNN consistently achieve much higher accuracy than SRNNs and BiSRNNs, while maintaining superior training efficiency.
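The core idea of slicing with breaking-point enrichment can be illustrated with a toy sketch. The snippet below is a minimal, hypothetical illustration only: it assumes BPIE works by prepending to each slice the last step of the preceding slice (its "breaking point") before the lower-level RNN runs, so that cross-slice dependencies are partially restored while slices remain independently processable. The function names (`simple_rnn`, `srnn_layer`) and all details are illustrative, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simple_rnn(x, h0, W, U):
    # Minimal tanh RNN over one slice; returns the final hidden state.
    h = h0
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W + h @ U)
    return h

def srnn_layer(x, n_slices, W, U, bpie=False):
    # Split the sequence into equal slices; each slice could be processed
    # in parallel by the lower-level RNN. With bpie=True, every slice
    # except the first is prepended with the last step of the previous
    # slice (the assumed "breaking point" enrichment).
    d = W.shape[1]
    slices = np.split(x, n_slices)
    outs = []
    for i, s in enumerate(slices):
        if bpie and i > 0:
            s = np.concatenate([slices[i - 1][-1:], s])
        outs.append(simple_rnn(s, np.zeros(d), W, U))
    return np.stack(outs)  # slice-level states, fed to the next level

d = 4
W = rng.normal(size=(d, d)) * 0.1
U = rng.normal(size=(d, d)) * 0.1
x = rng.normal(size=(16, d))           # toy "long text" of 16 steps
top = srnn_layer(x, 4, W, U, bpie=True)
```

In this sketch the enrichment changes the states of every slice after the first, while the first slice is unaffected, matching the intuition that only subsequences cut off from earlier context need the extra breaking-point information.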

Book Title
Proceedings of the 2019 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2019, Brighton, UK, May 12-17, 2019
IEEE Computer Society