Persian Ezafeh Recognition using Transformer-Based Models

Published in 2023 9th International Conference on Web Research (ICWR), 2023

Authors

Ali Ansari, Zahra Ebrahimian, Ramin Toosi, Mohammad Ali Akhaee

Abstract

In Persian, the grammatical particle ezafe connects two words. Ezafe is one of the salient factors in Persian phonology and morphology for understanding the meaning of a sentence completely and correctly, yet it is usually not written, which leads to mistakes when reading complex sentences and to errors in natural language processing tasks. Recognizing the words that require an ezafe at their end is therefore a major factor in improving the performance of a variety of NLP-based systems, such as text-to-speech (TTS): a Persian TTS system without an ezafe recognition module cannot form ezafe constructions to read the text correctly and does not recognize the relations between words. Since Transformer-based methods show state-of-the-art results in many NLP tasks, in this paper we apply ParsBERT to the task of ezafe recognition, obtaining the best results to date with an F1-score 2.68% higher than the prior state of the art.
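
The sketch below illustrates one way ezafe recognition can be framed with ParsBERT: as token classification, where each token is tagged with whether it takes an ezafe. The checkpoint name, binary label set, and overall setup are illustrative assumptions for the sketch, not the paper's confirmed configuration.

```python
# Minimal sketch: ezafe recognition as token classification with ParsBERT.
# Checkpoint name and label scheme are assumptions, not the paper's exact setup.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "HooshvareLab/bert-base-parsbert-uncased"  # assumed ParsBERT checkpoint
labels = ["NO_EZAFE", "EZAFE"]  # hypothetical binary tag set per token

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME, num_labels=len(labels)
)  # classification head is randomly initialized; fine-tuning on labeled data is needed

sentence = "کتاب دوست من"  # "my friend's book": ezafe links the first two words
encoding = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, seq_len, num_labels)

predictions = logits.argmax(dim=-1).squeeze(0)
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze(0))
for token, pred in zip(tokens, predictions):
    print(token, labels[pred])
```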