Authors: Canales, Ryan; Jörg, Sophie (ORCID: 0000-0002-7910-8553)
Date: 2026-01-30
Year: 2026
Handle: https://fis.uni-bamberg.de/handle/uniba/112832

Abstract: Concerts and performances in virtual reality are becoming increasingly popular. Accurately visualizing the detailed hand motions of musicians playing instruments is challenging. In this work, we present a real-time motion synthesis method that generates the detailed fretting-hand motion for playing guitar. Our approach first involves capturing and post-processing hand motion data from guitar performances to create a data set for training. The post-processing ensures that the fingertip positions are placed as accurately as possible with respect to their distance from the fretboard and their location between the frets. We then train a neural network that learns to predict hand and finger poses based on guitar tabs and previous poses. We find that our method produces reasonably stable motion, and we evaluate our results using accuracy measures and visual inspection.

Language: English
Keywords: character animation; motion synthesis; virtual humans; hand motions
Title: Real-time Hand Motion Synthesis for Playing a Virtual Guitar
Type: conference object
URN: urn:nbn:de:bvb:473-irb-112832
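
The abstract describes an autoregressive setup: a network predicts the next hand and finger pose from the current guitar-tab frame and the previous pose. The sketch below illustrates only that input/output structure, not the authors' actual model; the tab encoding, pose dimensionality, and the untrained two-layer MLP are all illustrative assumptions.

```python
# Illustrative sketch of an autoregressive pose predictor: next fretting-hand
# pose from (tab frame, previous pose). Sizes and weights are assumptions,
# not the authors' trained network.
import numpy as np

N_STRINGS, N_FRETS = 6, 20          # tab frame: one fret index per string (assumed)
POSE_DIM = 16 * 3                   # e.g. 16 hand joints, 3 angles each (assumed)
HIDDEN = 128

rng = np.random.default_rng(0)

def encode_tab(tab_frame):
    """One-hot encode the fret played on each string (0 = open string)."""
    one_hot = np.zeros((N_STRINGS, N_FRETS + 1))
    one_hot[np.arange(N_STRINGS), tab_frame] = 1.0
    return one_hot.ravel()

# Untrained placeholder weights for a two-layer MLP.
IN_DIM = N_STRINGS * (N_FRETS + 1) + POSE_DIM
W1 = rng.standard_normal((IN_DIM, HIDDEN)) * 0.01
W2 = rng.standard_normal((HIDDEN, POSE_DIM)) * 0.01

def predict_next_pose(tab_frame, prev_pose):
    """One autoregressive step: tab frame + previous pose -> next pose."""
    x = np.concatenate([encode_tab(tab_frame), prev_pose])
    h = np.tanh(x @ W1)
    return h @ W2

# Roll forward over a short tab sequence, feeding each predicted pose
# back in as the previous pose.
tabs = np.array([[0, 2, 2, 1, 0, 0],   # E major chord as fret numbers
                 [3, 2, 0, 0, 0, 3]])  # G major chord
pose = np.zeros(POSE_DIM)
for frame in tabs:
    pose = predict_next_pose(frame, pose)
print(pose.shape)  # (48,)
```

In a trained system the random weights would be replaced by parameters learned from the captured and post-processed motion data, and the pose vector would drive the virtual guitarist's hand rig in real time.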