Emotion-Conditioned Text Generation through Automatic Prompt Optimization
Menchaca Resendiz, Yarik; Klinger, Roman (2023): Emotion-Conditioned Text Generation through Automatic Prompt Optimization, in: Devamanyu Hazarika, Xiangru Robert Tang, Di Jin, et al. (eds.), Proceedings of the 1st Workshop on Taming Large Language Models: Controllability in the era of Interactive Assistants!, Prague: Association for Computational Linguistics, pp. 24–30.
Faculty/Chair:
Author:
Menchaca Resendiz, Yarik
Klinger, Roman
Title of the compilation:
Proceedings of the 1st Workshop on Taming Large Language Models: Controllability in the era of Interactive Assistants!
Editors:
Hazarika, Devamanyu
Tang, Xiangru Robert
Jin, Di
Conference:
1st Workshop on Taming Large Language Models: Controllability in the era of Interactive Assistants!, September 2023; Prague
Publisher Information:
Association for Computational Linguistics
Year of publication:
2023
Pages:
24–30
Language:
English
Abstract:
Conditional natural language generation methods often require either expensive fine-tuning or training a large language model from scratch. Both are unlikely to lead to good results without a substantial amount of data and computational resources. Prompt learning without changing the parameters of a large language model presents a promising alternative. It is a cost-effective approach, while still achieving competitive results. While this procedure is now established for zero- and few-shot text classification and structured prediction, it has received limited attention in conditional text generation. We present the first automatic prompt optimization approach for emotion-conditioned text generation with instruction-fine-tuned models. Our method uses an iterative optimization procedure that changes the prompt by adding, removing, or replacing tokens. As the objective function, we require only a text classifier that measures the realization of the conditional variable in the generated text. We evaluate the method on emotion-conditioned text generation with a focus on event reports and compare it to manually designed prompts that also serve as the seed for the optimization procedure. The optimized prompts achieve 0.75 macro-average F1 in fulfilling the emotion condition, in contrast to the manually designed seed prompts with only 0.22 macro-average F1.
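The iterative procedure summarized in the abstract can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: generate_fn (standing in for the instruction-fine-tuned generator), classifier_fn (standing in for the emotion classifier used as the objective), the candidate vocabulary, and the simple hill-climbing loop are all hypothetical placeholders for the paper's actual components.

import random

# Minimal sketch of token-level prompt search as described in the abstract:
# each round proposes edited prompts (add / remove / replace one token) and
# keeps the candidate whose generations best satisfy the emotion condition.
# generate_fn and classifier_fn are hypothetical stand-ins.

def edit_candidates(prompt_tokens, vocab, n=10):
    """Propose prompts that differ by one added, removed, or replaced token."""
    candidates = []
    for _ in range(n):
        tokens = list(prompt_tokens)
        op = random.choice(["add", "remove", "replace"])
        pos = random.randrange(len(tokens) + (op == "add"))
        if op == "add":
            tokens.insert(pos, random.choice(vocab))
        elif op == "remove" and len(tokens) > 1:
            tokens.pop(pos)
        else:
            tokens[pos] = random.choice(vocab)
        candidates.append(tokens)
    return candidates

def score(prompt_tokens, emotion, generate_fn, classifier_fn, samples=5):
    """Fraction of generations the emotion classifier labels with the target emotion."""
    prompt = " ".join(prompt_tokens)
    texts = [generate_fn(prompt) for _ in range(samples)]
    return sum(classifier_fn(t) == emotion for t in texts) / samples

def optimize_prompt(seed_prompt, emotion, vocab, generate_fn, classifier_fn, rounds=20):
    """Hill-climb from a manually designed seed prompt toward a higher classifier score."""
    best = seed_prompt.split()
    best_score = score(best, emotion, generate_fn, classifier_fn)
    for _ in range(rounds):
        for cand in edit_candidates(best, vocab):
            s = score(cand, emotion, generate_fn, classifier_fn)
            if s > best_score:
                best, best_score = cand, s
    return " ".join(best), best_score

The reported improvement (0.22 to 0.75 macro-average F1) is measured with exactly this kind of classifier-based objective: the quality of a prompt is how reliably its generations realize the target emotion according to the text classifier.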
GND Keywords:
Maschinelles Lernen (machine learning)
Generierung <Sprache> (generation <language>)
Gefühl (emotion)
Prompt Engineering
Keywords:
Emotion-Conditioned Text Generation
DDC Classification:
RVK Classification:
Peer Reviewed:
Yes
International Distribution:
Yes
Open Access Journal:
Yes
Type:
Conference object
Activation date:
March 7, 2024
Permalink
https://fis.uni-bamberg.de/handle/uniba/93873