
Gated Convolutional Neural Networks for Text Classification

The popular approach for several natural language processing tasks involves deep neural networks, in particular recurrent neural networks (RNNs) and convolutional neural networks (CNNs). While RNNs can capture dependencies in a sequence of arbitrary length, CNNs are suitable for extracting position-invariant features. In this study, a state-of-the-art CNN model incorporating a gate mechanism, which is typically used in RNNs, is adapted to text classification tasks. The incorporated gate mechanism allows the CNN to better select which features or words are relevant for predicting the corresponding class. Through experiments on various large datasets, it was found that introducing a gate mechanism into CNNs can improve accuracy on text classification tasks such as sentiment classification, topic classification, and news categorization.
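To illustrate the idea described in the abstract, the sketch below shows one common way to add a gate to a convolutional text classifier, in the spirit of gated linear units: a second convolution produces a sigmoid gate that modulates the feature maps before pooling. This is a minimal, assumed formulation for illustration only (the class names, hyperparameters, and the exact gating form are not taken from the paper).

```python
# Minimal sketch of a gated convolutional text classifier (GLU-style gating).
# Illustrative assumption; the paper's exact architecture may differ.
import torch
import torch.nn as nn


class GatedConvBlock(nn.Module):
    """1-D convolution whose output is modulated by a learned sigmoid gate."""

    def __init__(self, embed_dim: int, num_filters: int, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2
        # Content path: extracts n-gram features from the embedded text.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=padding)
        # Gate path: decides, per position and filter, how much of the
        # content to pass through (values in (0, 1)).
        self.gate = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=padding)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, embed_dim, seq_len)
        return self.conv(x) * torch.sigmoid(self.gate(x))


class GatedCNNClassifier(nn.Module):
    """Embedding -> gated convolution -> max-pool over time -> linear classifier."""

    def __init__(self, vocab_size: int, embed_dim: int, num_filters: int, num_classes: int):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.gated_conv = GatedConvBlock(embed_dim, num_filters)
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        h = self.gated_conv(x)                         # (batch, num_filters, seq_len)
        pooled = h.max(dim=2).values                   # position-invariant features
        return self.fc(pooled)                         # class logits


if __name__ == "__main__":
    model = GatedCNNClassifier(vocab_size=10000, embed_dim=128,
                               num_filters=100, num_classes=5)
    dummy = torch.randint(0, 10000, (8, 50))           # batch of 8 sequences, length 50
    print(model(dummy).shape)                          # torch.Size([8, 5])
```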

Gate mechanism; Convolutional neural networks; Text classification

Jin Sun, Rize Jin, Xiaohan Ma, Joon-young Park, Kyung-ah Sohn, Tae-sun Chung


Computer Engineering, Ajou University, Suwon-si, Gyeonggi-do 16499, Korea

School of Computer Science and Software Engineering, Tianjin Polytechnic University, Tianjin 300160, China

International Conference on Computer Science and its Applications; International Conference on Ubiquitous Information Technologies and Applications

Macao (CN)

Advances in Computer Science and Ubiquitous Computing

309-316

2019