thesis.xmpdata
% Metadata to be stored in the PDF; see the documentation of the pdfx package for more details.
\Author{Petr Kašpárek}
\Title{Cross-lingual transfer for the annotation of the SynSemClass ontology}
\Keywords{annotation projection\sep zero-shot cross-lingual transfer\sep ontology\sep multilingual natural language processing\sep lexical semantics}
\Subject{This work compares two approaches to automatically preannotating verbs in a sentence with semantic classes for the purpose of adding a new language to the SynSemClass ontology. Both approaches rely on a multilingual deep learning classification model fine-tuned on the already annotated English, Czech and German data of the ontology. The first, more classical, approach is annotation projection: it uses a parallel corpus and the aforementioned model to make predictions on a source language already present in the ontology and projects these predictions onto the target language using automated word alignment. The second approach, zero-shot cross-lingual transfer, assumes that the multilingual properties of the underlying model are sufficient to make reasonable predictions directly on the target language, even though the model was never trained for this task on that language. For evaluation, we manually build and annotate a small Korean dataset to test performance on a language significantly different from English, Czech and German. We conclude that the zero-shot approach performs notably better than the alignment approach (p $<$ 0.005), reaching 0.54 in both recall and precision, compared to the alignment approach's recall of 0.37 and precision of 0.41. We analyze the errors and find that the extra steps of annotation projection introduce cascading errors and that loose translation poses a problem in itself.}
\Publisher{Charles University}