Abstract: Convolutional neural networks (CNNs) have recently achieved remarkable performance in a wide range of applications. In this work, we equip a convolutional sequence-to-sequence (seq2seq) model with an efficient graph linearization technique for abstract meaning representation (AMR) parsing. Our linearization method signals the turning points of the graph traversal more clearly than the prior method. In addition, the convolutional seq2seq model is better suited to this task and considerably faster than recurrent neural network models. Our method outperforms previous methods by a large margin on the standard LDC2014T12 dataset. Our results indicate that there is still room to improve parsing models through better graph linearization approaches.