Applying a Hybrid Sequential Model to Chinese Sentence Correction

by Jun Wei Chen, Xanno K. Sigalingging, Jenq-Shiou Leu, Jun-ichi Takada

Published in Symmetry by MDPI AG.

2020, Volume 12, Issue 12, p. 1939

Abstract

In recent years, Chinese has become one of the most popular languages worldwide, and the demand for automatic Chinese sentence correction has grown accordingly. Such research can be applied to Chinese language learning to reduce learning costs and feedback time, and can help writers check for incorrect words. The traditional approach to Chinese sentence correction checks whether each word exists in a predefined dictionary; however, this method cannot handle semantic errors. With the rise of deep learning, an artificial neural network can be applied to understand a sentence's context and correct semantic errors. Several issues nevertheless remain open: the accuracy and the computation time required to correct a sentence are still unsatisfactory, so deep-learning-based Chinese sentence correction systems may not yet be ready for large-scale commercial applications. Our goal is to obtain a model with better accuracy and computation time. By combining a recurrent neural network with Bidirectional Encoder Representations from Transformers (BERT), a recently popular model known for its high performance but slow inference speed, we introduce a hybrid model for Chinese sentence correction that improves both accuracy and inference speed. Among the results, BERT-GRU obtained the highest BLEU score in all experiments. Compared with the original transformer-based model, inference speed improved by 1131% with beam search decoding in the 128-word experiment and by 452% with greedy decoding; the longer the sequence, the larger the improvement.
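The abstract contrasts greedy decoding with beam search decoding, the two strategies whose speedups are reported above. As a minimal sketch of that distinction (not the paper's implementation), the toy example below uses an entirely hypothetical next-token distribution in place of the decoder's softmax: greedy decoding commits to the locally best token at each step, while beam search keeps the top-k partial sequences and can recover a globally higher-probability output.

```python
import math

# Hypothetical next-token distribution P(token | prefix) over a toy
# vocabulary. This stands in for a decoder's softmax output; the numbers
# are invented so the example is deterministic and reproducible.
def next_probs(prefix):
    if not prefix:
        return {"A": 0.55, "B": 0.45}
    if prefix[-1] == "A":
        return {"A": 0.3, "B": 0.3, "<eos>": 0.4}
    return {"A": 0.05, "B": 0.05, "<eos>": 0.9}

def greedy_decode(max_len=5):
    """Pick the single most probable token at every step."""
    seq, score = [], 0.0
    for _ in range(max_len):
        probs = next_probs(seq)
        tok = max(probs, key=probs.get)
        score += math.log(probs[tok])
        seq.append(tok)
        if tok == "<eos>":
            break
    return seq, score

def beam_decode(beam_width=2, max_len=5):
    """Keep the beam_width best partial sequences by total log-probability."""
    beams = [([], 0.0)]  # list of (sequence, log-prob) pairs
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq and seq[-1] == "<eos>":   # finished hypotheses carry over
                candidates.append((seq, score))
                continue
            for tok, p in next_probs(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        if all(s and s[-1] == "<eos>" for s, _ in beams):
            break
    return beams[0]

g_seq, g_score = greedy_decode()
b_seq, b_score = beam_decode()
# Greedy locks onto "A" (p=0.55) and ends with total probability 0.22,
# while the beam keeps "B" alive and finds the better path (p=0.405).
print(g_seq, b_seq)  # → ['A', '<eos>'] ['B', '<eos>']
```

Because beam search expands `beam_width` hypotheses per step, it does proportionally more decoder calls than greedy decoding, which is why speeding up each decoding step (as the hybrid model does) yields the larger relative gain in the beam search setting.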

Archived Files and Locations

application/pdf  631.4 kB
file_cdddkpgok5amriid47x7muui2m
res.mdpi.com (publisher)
web.archive.org (webarchive)
Type  article-journal
Stage   published
Date   2020-11-25
Language   en
Container Metadata
Open Access Publication
In DOAJ
In ISSN ROAD
In Keepers Registry
ISSN-L:  2073-8994
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: e6b9577a-ab5e-4450-985a-4665ab5bb374