Dynamic Motion Matching: Context-Aware Character Animation with Subspaces Ensembling
In Proceedings of the 23rd IEEE International Symposium on Multimedia (ISM 2021)

Abstract
Game developers have widely adopted motion matching, which retrieves the most appropriate short motion clip from a database in real time, as a core animation system. However, existing motion matching systems scale poorly to a wide range of scene contexts: their complexity grows with the number of contexts because the whole feature vector must change whenever the context changes. This paper proposes an improved motion matching algorithm that enables geometry- and object-aware motion synthesis by dynamically selecting the features used in data retrieval. Our method, dynamic motion matching (DyMM), combines an offline subspace decomposition of raw motion clips with subspace ensemble matching, which compares only the sub-features selected according to the needs of a character controller at runtime. The offline subspace decomposition extracts relevant sub-features from motion capture clips during database construction to create a set of retrieval spaces, each tailored to a specific scene context. The subspace ensemble matching method applies a nearest-neighbor search to score motion clips for each sub-feature, then normalizes and combines the scores to determine the most appropriate motion clip. We verified the effectiveness of our method by applying it to industry-level motion data captured by professional game artists, across multiple configurations and character controllers. The results show that, by finding motion clips that comply well with the scene context, one can leverage large motion capture datasets to build practical systems that generate believable and controllable animations for games.
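The subspace ensemble matching step described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the feature layout, the min-max score normalization, and all names (`ensemble_match`, `subspaces`, `active`) are assumptions made for the example; the paper specifies only that each active sub-feature is scored by nearest-neighbor search and the normalized scores are combined.

```python
import numpy as np

def ensemble_match(features, subspaces, query, active, weights=None):
    """Pick the best candidate by combining normalized per-subspace scores.

    features : (N, D) array, one feature vector per candidate motion clip/frame.
    subspaces: dict mapping sub-feature name -> column indices
               (the result of the offline subspace decomposition).
    query    : (D,) desired feature vector from the character controller.
    active   : names of the sub-features relevant to the current scene context.
    weights  : optional dict of per-subspace weights (hypothetical extension).
    """
    if weights is None:
        weights = {name: 1.0 for name in active}
    total = np.zeros(len(features))
    for name in active:
        cols = subspaces[name]
        # Nearest-neighbor distance restricted to this subspace's columns.
        d = np.linalg.norm(features[:, cols] - query[cols], axis=1)
        # Min-max normalize so sub-features with different units and
        # magnitudes contribute comparably to the ensemble score.
        span = d.max() - d.min()
        score = (d - d.min()) / span if span > 0 else np.zeros_like(d)
        total += weights[name] * score
    return int(np.argmin(total))

# Toy database: 3 candidates, features = [trajectory (2 dims), pose (2 dims)].
features = np.array([[0.0, 0.0, 5.0, 5.0],
                     [1.0, 1.0, 0.0, 0.0],
                     [9.0, 9.0, 9.0, 9.0]])
subspaces = {"trajectory": [0, 1], "pose": [2, 3]}
query = np.array([1.0, 1.0, 0.0, 0.0])

best = ensemble_match(features, subspaces, query, active=["trajectory", "pose"])
```

Because each subspace is scored and normalized independently, the controller can drop or add sub-features per context (e.g. ignore pose terms near scene geometry) without rebuilding the full feature vector, which is the scalability argument the abstract makes.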
Info
URL: IEEE ISM 2021
DOI: 10.1109/ISM52913.2021.00028
Index Terms—Motion Matching, Character Animation, Animation, Data Retrieval, Motion Capture.