Author: Fruth, Leon
ORCID: 0009-0001-2128-3025
Date issued: 2026-02-26
Year: 2026
URI: https://fis.uni-bamberg.de/handle/uniba/112994
Citation: Masterarbeit, Otto-Friedrich-Universität Bamberg, 2022

Abstract: Text simplification aims to make texts easier to read and comprehend. Recent approaches tackle this task by training neural models on large-scale parallel datasets. However, most languages have only limited simplification data available. Laban et al. [2021b] presented an unsupervised simplification approach that avoids the need for parallel data. They introduced the training algorithm k-SCST, which optimizes a reward by generating and scoring multiple candidate simplifications and encouraging the candidates that outperform the mean reward. This thesis adapts this approach to simplify short paragraphs from German Wikipedia articles. The individual components of the reward regarding simplicity, fluency, and meaning preservation are modified for the new domain and language. In addition, other aspects of the training method are explored. The results show some lexical and syntactic simplification phenomena, but also problems regarding fluency and faithfulness. The findings are assessed, and suggestions for future improvements are presented.

Language: eng
Keywords: Text Simplification; Natural Language Processing; Unsupervised Machine Learning; German Text Processing; Language Models
DDC: 004
Title: An Approach Towards Unsupervised Text Simplification on Paragraph-Level for German Texts
Type: masterthesis
URN: urn:nbn:de:bvb:473-irb-112994
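The core idea of k-SCST described in the abstract (sample k candidate simplifications, score each with a reward, and reinforce those above the mean) can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the function name `k_scst_loss` and the use of summed log-probabilities per candidate are assumptions for the sketch.

```python
def k_scst_loss(log_probs, rewards):
    """Mean-baseline policy-gradient loss in the spirit of k-SCST.

    log_probs: list of summed token log-probabilities, one per candidate
               (hypothetical inputs; in practice these come from the model)
    rewards:   list of scalar rewards, one per candidate
    """
    # The mean reward over the k candidates serves as the baseline.
    baseline = sum(rewards) / len(rewards)
    # Candidates scoring above the mean get a positive advantage and are
    # encouraged (their log-probability is pushed up); below-mean
    # candidates are discouraged.
    return -sum((r - baseline) * lp
                for r, lp in zip(rewards, log_probs)) / len(rewards)


# Toy usage: two candidates, the first with the higher reward.
loss = k_scst_loss(log_probs=[-1.0, -2.0], rewards=[1.0, 0.0])
```

In an actual training loop, the rewards would combine the simplicity, fluency, and meaning-preservation scores mentioned in the abstract, and the loss would be backpropagated through the log-probabilities.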