Prevention or Promotion?: Predicting Author's Regulatory Focus
Velutharambath, Aswathy; Sassenberg, Kai; Klinger, Roman (2023): Prevention or Promotion?: Predicting Author's Regulatory Focus, in: Northern European Journal of Language Technology, Linköping: Linköping University Electronic Press, Vol. 9, No. 1, doi: 10.3384/nejlt.2000-1533.2023.4561.
Faculty/Chair:
Author:
Velutharambath, Aswathy; Sassenberg, Kai; Klinger, Roman
Title of the Journal:
Northern European Journal of Language Technology
ISSN:
2000-1533
Publisher Information:
Linköping: Linköping University Electronic Press
Year of publication:
2023
Volume:
9
Issue:
1
Pages:
Language:
English
Abstract:
People differ fundamentally in what motivates them to pursue a goal and how they approach it. For instance, some people seek growth and show eagerness, whereas others prefer security and are vigilant. The concept of regulatory focus is employed in psychology to explain and predict this goal-directed behavior of humans, underpinned by two distinct motivational systems – the promotion and the prevention system. Traditionally, text analysis methods using closed vocabularies are employed to assess the distinctive linguistic patterns associated with the two systems. From an NLP perspective, automatically detecting the regulatory focus of individuals from text provides valuable insights into the behavioral inclinations of the author, with applications in areas like marketing or health communication. However, the concept has never made an impactful debut in computational linguistics research. To bridge this gap, we introduce the novel task of regulatory focus classification from text and present two complementary German datasets – (1) experimentally generated event descriptions and (2) manually annotated short social media texts used for evaluating the generalizability of models on real-world data. First, we conduct a correlation analysis to verify whether, and to what extent, the linguistic footprints of regulatory focus reported in psychology studies are observable in our datasets. For automatic classification, we compare closed-vocabulary-based analyses with a state-of-the-art BERT-based text classification model and observe that the latter outperforms lexicon-based approaches on experimental data and is notably better on out-of-domain Twitter data.
GND Keywords: Textanalyse; Mustererkennung; Neurolinguistisches Programmieren; Selbstregulation; Autor
Keywords:
Author's Regulatory Focus
DDC Classification:
RVK Classification:
Peer Reviewed:
Yes
International Distribution:
Yes
Open Access Journal:
Yes
Type:
Article
Activation date:
March 7, 2024
Permalink
https://fis.uni-bamberg.de/handle/uniba/93872