This study examines the use of enhanced trackable markings in semi-automatic animation rotoscoping with EbSynth, a freely available tool, to uncover new efficiencies and streamline the animation workflow. Unlike generative artificial intelligence (AI) applications such as Runway, Stable Diffusion and Sora, which rely on machine learning (ML) and neural network (NN) techniques, EbSynth uses example-based synthesis (EBS) algorithms to generate the images of an animation sequence from one or more keyframe inputs. This research investigates how distinctive facial markings – dotted markers, contour lines and character-specific markings – affect the manual labour involved in frame-by-frame editing during semi-automatic rotoscoping, particularly where the animated character’s design deviates significantly from the live-action actor’s facial features. Through a series of controlled experiments with an actress performing various facial movements and expressions, we analysed how effectively these markings improve accuracy and reduce the need for subsequent adjustments in rotoscoped sequences. This research addresses a gap in empirical studies on the practical application of semi-automatic tools in animation, particularly in optimizing the rotoscoping process. By detailing the impact of character-specific trackable markings, this study contributes to the academic field while also providing practical guidelines for animators seeking to integrate semi-automation technologies such as EbSynth into their creative processes.
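To make the contrast with ML-based generators concrete, the core idea behind example-based synthesis can be sketched in a few lines: for each patch of a new live-action frame, find the most similar patch in the live-action keyframe and copy the co-located pixel from the stylized keyframe. This is only a brute-force, grayscale illustration of the principle, not EbSynth's actual algorithm (which uses a far more sophisticated multi-channel patch-based optimizer); the function name and toy images are illustrative assumptions.

```python
import numpy as np

def stylize_frame(guide_key, styled_key, guide_frame, patch=3):
    """Toy example-based synthesis (image-analogies style), NOT EbSynth's
    real algorithm: match each patch of the new guide frame against the
    guide keyframe and transfer the co-located stylized pixel."""
    h, w = guide_frame.shape
    r = patch // 2
    out = np.zeros_like(styled_key)
    # Pad with edge values so every pixel has a full neighbourhood.
    gk = np.pad(guide_key, r, mode="edge")
    gf = np.pad(guide_frame, r, mode="edge")
    # Pre-extract every keyframe patch (brute force; fine for tiny images).
    key_patches = np.array([
        gk[y:y + patch, x:x + patch].ravel()
        for y in range(h) for x in range(w)
    ])
    for y in range(h):
        for x in range(w):
            p = gf[y:y + patch, x:x + patch].ravel()
            # Nearest keyframe patch by sum-of-squared-differences.
            idx = np.argmin(((key_patches - p) ** 2).sum(axis=1))
            out[y, x] = styled_key[idx // w, idx % w]
    return out
```

The sketch also shows why trackable markings matter for this class of methods: patch matching is driven purely by local appearance, so distinctive dots or contour lines on the actor's face give the matcher unambiguous correspondences where smooth skin would not.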