Research Abstracts - 2007
Partial-Matching and Query-by-Example for Motion Capture Data

Bennett Rogers & Jovan Popović

Abstract

Motion capture datasets are widely used in animation research and industry; however, there is currently no efficient way to index and search this data for diversified use. Motion clips are generally retrieved by filename or keywords, neither of which reflects the sub-motions contained within a clip. We propose to investigate a method for indexing and searching a large database of motion capture clips that allows for fast insertion and query-by-example. Queries on this database will be able to find matching subsequences within the stored motions, substantially increasing the amount of usable data without additional capture sessions. The result of the project will be a tool that reduces the time spent gathering new data for many motion applications and increases the utility of existing motion clips.

Overview

Motion capture data has become a critical tool in many areas of industry and academia, including movies, computer graphics research, video games, and medicine. Every application of motion data requires a dataset tailored to that specific task, so even as the global collection of data grows, no individual project benefits from the overall accumulation. Our project aims to develop a motion capture database system capable of supporting queries that would lift this data limitation.

Suppose, for example, that you are developing an animation sequence that requires a character to perform a specific jump. If you do not already have a clip that matches what you need, you must either search existing motions or capture a new sequence. If you choose to search for a relevant clip, there is currently little chance of finding what you are looking for. Most motion capture databases are organized by the filenames of the motions, so your best option is to look for clips with the word "jump" in the title. Another approach is to manually inspect longer motion clips for the action you need: clips depicting more complex actions, such as a dance or an athletic sequence, may contain a jump, but it is infeasible to look through the entire database for one particular subsequence. You are therefore forced to spend time and money in a motion capture lab capturing the required action. This new motion may be added to the database, yet it is unlikely ever to be used again unless its filename happens to describe exactly what someone else searches for in the future.

In this project, we propose to help solve this data recycling problem. We would like to make it possible for the animator in the above scenario to find all instances of a jumping motion in any clip in the database simply by providing an example of a jump. This requires the system to maintain knowledge about the contents of the stored clips. Additionally, partial matching on queries is necessary so that subsequences that fit the query can be returned. The search also needs to be fast, which requires an efficient indexing scheme on the motions. Given such a system, many applications would become possible, and we would like to explore that space of options as well.
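The abstract does not commit to a particular matching algorithm, but a minimal baseline for the partial-matching step described above is subsequence dynamic time warping (DTW), which aligns a short query against a longer stored clip and reports where the best-matching subsequence begins and ends. The sketch below is our own illustration under that assumption, not the project's method; the feature representation (per-frame joint-angle vectors) and all function names are hypothetical.

```python
import numpy as np

def subsequence_dtw(query, clip):
    """Find the best-matching subsequence of `clip` for `query` using
    subsequence dynamic time warping (an illustrative baseline, not
    the indexing scheme proposed in the abstract).

    query: (m, d) array of per-frame pose features (e.g., joint angles)
    clip:  (n, d) array for a longer stored motion
    Returns (cost, start, end): alignment cost and the matched frame
    range in `clip`.
    """
    m, n = len(query), len(clip)
    # Pairwise Euclidean distances between query frames and clip frames.
    dist = np.linalg.norm(query[:, None, :] - clip[None, :, :], axis=2)

    # Accumulated-cost matrix. The query may begin at any clip frame,
    # so the first row carries only the local distances (free start).
    D = np.full((m, n), np.inf)
    D[0, :] = dist[0, :]
    for i in range(1, m):
        D[i, 0] = D[i - 1, 0] + dist[i, 0]
        for j in range(1, n):
            D[i, j] = dist[i, j] + min(D[i - 1, j],      # clip frame repeats
                                       D[i, j - 1],      # query frame repeats
                                       D[i - 1, j - 1])  # both advance

    # Free end: the match may stop at any clip frame.
    end = int(np.argmin(D[-1, :]))
    cost = float(D[-1, end])

    # Backtrack from (m-1, end) to recover where the match starts.
    i, j = m - 1, end
    while i > 0:
        if j == 0:
            i -= 1
        else:
            step = int(np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]]))
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
    return cost, j, end

if __name__ == "__main__":
    # Synthetic check: plant a noisy copy of frames 80-109 as the query
    # and confirm the matcher locates that region.
    rng = np.random.default_rng(0)
    clip = rng.normal(size=(200, 6))
    query = clip[80:110] + rng.normal(scale=0.01, size=(30, 6))
    cost, start, end = subsequence_dtw(query, clip)
    print(f"best match: frames {start}-{end}, cost {cost:.3f}")
```

A query-by-example search built on this baseline would run the matcher against every clip in the database and rank the returned (cost, start, end) triples. That exhaustive scan is exactly what the efficient indexing scheme proposed in the abstract would need to avoid on a large collection.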