In this study, we investigated how accurately the BERT model can predict Chinese directional complements, and we analyzed which words the model relies on as clues when inferring them. The results show that, through transfer learning, the BERT model captures distributional features and grammatical relationships well: across experiments with five Chinese directional complements, prediction accuracy was consistently high. Furthermore, analysis with the masked language model revealed that BERT draws on appropriate contextual clues when selecting a directional complement.
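The masked-language-model analysis described above can be illustrated with a minimal sketch of the scoring step. The example sentence, the candidate complements, and the logit values below are all hypothetical placeholders, not output from the model used in this study; in practice these logits would come from the masked position of a Chinese BERT checkpoint (e.g., via a fill-mask query).

```python
import math

def rank_candidates(logits):
    """Softmax the masked-position logits and rank candidates by probability."""
    m = max(logits.values())                                  # for numerical stability
    exp = {tok: math.exp(v - m) for tok, v in logits.items()}
    z = sum(exp.values())
    probs = {tok: e / z for tok, e in exp.items()}
    return sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical logits for the masked complement slot in
# "他从书包里拿[MASK]一本书。" (illustrative values, not real model output).
logits = {"出": 7.2, "来": 5.1, "去": 3.8, "上": 2.4, "下": 2.9}
ranking = rank_candidates(logits)
print(ranking[0][0])  # → 出
```

The complement with the highest softmax probability is taken as the model's prediction; comparing the full ranking against the gold complement is what yields the accuracy figures reported above.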
We believe that this study is meaningful not only for NLP but also offers insight for Chinese grammar research and language education. If this methodology is properly applied, practical systems supporting both could be built on it. Neural network models trained on sufficient language data can predict which expressions are more natural in a given context; used well, this capability can shed light on how Chinese grammatical functions operate. A Chinese grammar prediction system of this kind could also help learners improve by showing them which expressions are grammatically natural.
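A learner-facing tool of the kind envisioned above could compare alternative expressions by their model scores. The sketch below uses a pseudo-log-likelihood (the sum of per-token log-probabilities under a masked language model) to pick the more natural of two candidate sentences; the sentences and all numeric scores are hypothetical, chosen only to illustrate the comparison.

```python
# Hypothetical per-token log-probabilities under a masked language model
# (illustrative numbers, not real model output). The two sentences differ
# only in the directional complement after 拿.
sentence_scores = {
    "他从书包里拿出一本书。": [-0.2, -0.1, -0.3, -0.4, -0.1, -0.2, -0.3, -0.1],
    "他从书包里拿去一本书。": [-0.2, -0.1, -0.3, -2.9, -0.1, -0.2, -0.3, -0.1],
}

def pseudo_log_likelihood(token_log_probs):
    """Score a sentence as the sum of its per-token log-probabilities."""
    return sum(token_log_probs)

best = max(sentence_scores, key=lambda s: pseudo_log_likelihood(sentence_scores[s]))
print(best)  # the sentence with 出 scores higher in this toy example
```

In a deployed system, a learner's sentence and its grammatical alternatives would be scored this way, and the tool would flag the variant the model finds most natural.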