Real-time Hand Motion Synthesis for Playing a Virtual Guitar
Canales, Ryan; Jörg, Sophie (2026): Real-time Hand Motion Synthesis for Playing a Virtual Guitar, in: Bamberg: Otto-Friedrich-Universität, pp. 1–11.
Faculty/Chair:
Author:
Canales, Ryan; Jörg, Sophie
Publisher Information:
Year of publication:
2026
Pages:
1–11
Source/Other editions:
Robert W. Sumner, Fabio Zünd, Sophie Jörg, et al. (eds.), Proceedings of the 2025 18th ACM SIGGRAPH Conference on Motion, Interaction, and Games, New York: ACM, 2025, pp. 1–11, ISBN: 979-8-4007-2236-3
Year of first publication:
2025
Language:
English
Abstract:
Concerts and performances in virtual reality are becoming increasingly popular. Accurately visualizing the detailed hand motions of musicians playing instruments is challenging. In this work, we present a real-time motion synthesis method that generates the detailed fretting-hand motion for playing guitar. Our approach first involves capturing and post-processing hand motion data from guitar performances to create a data set for training. The post-processing ensures that the fingertip positions are placed as accurately as possible with respect to their distance from the fretboard and their position between the frets. We then train a neural network that learns to predict hand and finger poses based on guitar tabs and previous poses. We find that our method produces reasonably stable motion, and we evaluate our results using accuracy measures and a visual evaluation.
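The abstract describes an autoregressive setup: a network predicts the next hand pose from the current tab frame and the previous pose. The sketch below illustrates that roll-out structure only; the dimensions, the single linear layer standing in for the trained network, and all function names are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Assumed dimensions: 20 joint parameters for the fretting hand,
# one fret number per string (6 strings) as the tab input.
POSE_DIM = 20
TAB_DIM = 6

rng = np.random.default_rng(0)

# Stand-in for the trained network: a single random linear layer.
W = rng.standard_normal((POSE_DIM, POSE_DIM + TAB_DIM)) * 0.01
b = np.zeros(POSE_DIM)

def predict_next_pose(prev_pose, tab_frame):
    """Predict the next hand pose from the previous pose and the
    current tab frame (fret number per string, 0 = open string)."""
    x = np.concatenate([prev_pose, tab_frame])
    return np.tanh(W @ x + b)

def synthesize(tab_sequence, init_pose):
    """Autoregressive roll-out: feed each predicted pose back in
    as the previous pose for the next tab frame."""
    poses, pose = [], init_pose
    for tab_frame in tab_sequence:
        pose = predict_next_pose(pose, tab_frame)
        poses.append(pose)
    return np.stack(poses)

# Example: an A-minor chord shape held for 4 frames.
tabs = np.tile(np.array([0.0, 0.0, 2.0, 2.0, 1.0, 0.0]), (4, 1))
motion = synthesize(tabs, init_pose=np.zeros(POSE_DIM))
print(motion.shape)  # (4, 20)
```

Feeding predictions back in this way is what makes the synthesis real-time capable but also what can cause drift, which is presumably why the paper emphasizes the stability of the generated motion.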
Keywords: character animation; motion synthesis; virtual humans; hand motions
Type:
Conference object
Activation date:
January 30, 2026
Project(s):
Permalink
https://fis.uni-bamberg.de/handle/uniba/112832