
Google Summer Of Code 2011 Proposal

Improving Motion Capture Import & Workflow - Benjy Cook


A PDF copy of this proposal, which is formatted better and therefore easier to read, is available here: http://sites.google.com/site/thehadrian/files/GSOCProposal.pdf


Synopsis

The largest portion of work in a CG-related project is animation. Be it video game development, film or TV, animation takes up the lion’s share of time and resources. To reduce these costs, motion capture is available as an alternative to keyframe animation. Although it is both faster and at times more realistic than keyframing by a human animator, motion capture data is hard to work with, for a variety of reasons. While Blender has had BVH import (a popular format for mocap data) for a while, it lacks the tools to deal with this type of data properly. My proposal is to provide tools that streamline mocap data into a project’s workflow and deal with the issues mocap raises.

Benefits for Blender

My goal at the end of this project is to allow artists to easily import and work with BVH files in such a way that the end product looks and feels no different than if the animation had been keyframed by an artist. This will be accomplished via the following enhancements, the largest of which is easy retargeting of the original motion onto a user-created rig.

Project Details

I plan on accomplishing this goal through a series of smaller steps, resulting in the finished project:

  • Import as F-curves:

Currently, Blender imports a BVH file as-is, keyframing every frame of the animation. This makes mocap data difficult to work with: it is bulkier, takes up more resources and is considerably harder to adapt and change. My first step will be to add a feature that converts this heavily keyframed motion into a series of more natural keyframes, simulating how a human animator operates. This is a simple job; resampling is done in many fields of computing and Blender’s API is well developed in this area. This step is necessary for the following ones, and it is a good way to start the project.
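To illustrate the kind of keyframe reduction I have in mind, here is a minimal, purely illustrative sketch. It works on a channel read as plain (frame, value) pairs; mapping the result back onto Blender's fcurve keyframes, as well as the tolerance value, are assumptions made for the example, not the final design.

    def simplify(points, tolerance=0.01):
        """Reduce a list of (frame, value) samples so that linear interpolation
        between the kept keyframes stays within `tolerance` of the originals
        (a Ramer-Douglas-Peucker style split, using vertical error)."""
        if len(points) <= 2:
            return list(points)
        (f0, v0), (f1, v1) = points[0], points[-1]
        # find the sample that the straight line between the endpoints misses the most
        worst_i, worst_err = 1, -1.0
        for i in range(1, len(points) - 1):
            f, v = points[i]
            predicted = v0 + (v1 - v0) * (f - f0) / (f1 - f0)
            err = abs(v - predicted)
            if err > worst_err:
                worst_i, worst_err = i, err
        if worst_err <= tolerance:
            return [points[0], points[-1]]       # the whole span is close enough to a line
        left = simplify(points[:worst_i + 1], tolerance)
        right = simplify(points[worst_i:], tolerance)
        return left[:-1] + right                 # drop the duplicated split point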

  • Conversion to cycles and transfer of root displacement to stride bone:

Many animations have a cyclical nature to them, such as walk cycles, jumping in place, and so on. A system is therefore needed to take mocap files that contain such cycles and modify the original animation so that it cycles well. In addition, the backbone of any animated character, its walk, run and other cycles, depends on a “stride bone” that can be moved freely, even along a curve. In video games, the locomotion system often controls this bone, making the animation interactive. Mocap files obviously do not come equipped with a stride bone; they displace the root bone instead. A feature is needed that transfers this motion to a stride bone or object so mocap can be used easily for walk and other cycles.
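As a rough illustration of what transferring root displacement to a stride channel could look like, the sketch below splits sampled root motion into a horizontal travel part and an in-place residual. It assumes the root channel has already been read as (frame, mathutils.Vector) pairs and uses Blender's Z-up convention; the function name and data layout are illustrative only.

    from mathutils import Vector

    def split_stride(root_locations):
        """Split sampled root motion into (stride, root_in_place): horizontal travel
        goes to the stride channel, everything else (including height) stays on the
        root so the character walks in place."""
        start = root_locations[0][1]
        stride, in_place = [], []
        for frame, loc in root_locations:
            travel = Vector((loc.x - start.x, loc.y - start.y, 0.0))
            stride.append((frame, travel))
            in_place.append((frame, loc - travel))
        return stride, in_place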

  • Retargeting animations:

The main goal of the project. Once the above goals are complete or in advanced stages, work can begin on retargeting. Retargeting is the process of copying an animation from one rig (or character) to another. Motion capture subjects rarely fit the size and other requirements of whatever character we want to animate. In addition, motion capture skeletons are not true rigs, just basic FK rigs. Therefore, to make motion capture truly viable in a workflow, a system is needed to retarget the animation to an end-user rig. There is much literature on this subject and some commercial software does this very well (Autodesk MotionBuilder), proving that it is definitely possible. Also, Thomas Larsson at MakeHuman has shown some promising work in the area, even implementing it within Blender. After transferring the animation automatically, tools must be provided to the animator to polish up the transferred animation. Post-retarget features, such as IK’ing limbs and solving constraints, provide tools to deal with issues such as foot-skate, keeping contact with a floor or object, and so on. Ideally, a non-destructive system can be set up to allow artists to “layer” such fixes, and even manual keyframing, on top of the motion capture. Keyframing constraint influence, baking mid-workflow or using delta channels are all possible avenues of investigation to implement such a system fully or partially.
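To make the rotation-transfer part of retargeting concrete, here is a deliberately simplified sketch: the performer's world-space rotation delta (pose relative to rest) is re-applied on the target rig's rest pose, per mapped bone. The bone map, the dictionaries of rest and pose rotations, and the assumption that both skeletons have roughly compatible rest orientations are all illustrative; the actual algorithm will follow the literature cited below.

    from mathutils import Quaternion

    # hypothetical mapping: mocap bone name -> end-user rig bone name
    BONE_MAP = {"Hips": "hips", "LeftUpLeg": "thigh.L", "LeftLeg": "shin.L"}

    def retarget_rotation(src_pose, src_rest, tgt_rest):
        """Re-apply the performer's world-space delta on the target's rest pose."""
        delta = src_pose * src_rest.inverted()
        return delta * tgt_rest

    def retarget_frame(src_poses, src_rests, tgt_rests):
        """World-space rotations (Quaternions) for every mapped bone at one frame."""
        return {tgt: retarget_rotation(src_poses[src], src_rests[src], tgt_rests[tgt])
                for src, tgt in BONE_MAP.items()}

    # tiny usage example with identity rest poses (purely illustrative)
    ident = Quaternion((1.0, 0.0, 0.0, 0.0))
    src_rests = {name: ident.copy() for name in BONE_MAP}
    tgt_rests = {BONE_MAP[name]: ident.copy() for name in BONE_MAP}
    src_poses = {name: Quaternion((0.0, 0.0, 1.0), 0.3) for name in BONE_MAP}
    print(retarget_frame(src_poses, src_rests, tgt_rests))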

Deliverables

A Blender add-on with UI, documentation and at least a few video tutorials showcasing the system. It is possible that the entire system will be contained within the existing import_bvh add-on; separating the more advanced functionality into an additional add-on is also a possibility.

Literature and guidance

I have been inspired by the article “Using an Intermediate Skeleton and Inverse Kinematics for Motion Retargeting” (Jean-Sebastien Monzani et al.) [1], which explains in detail a robust methodology and algorithm for retargeting. In addition, I plan to rely on these secondary sources:

  • Rig Retargeting for 3D Animation - Martin Poirier, Eric Paquette [2]
  • Retargetting Motion to New Characters - Michael Gleicher [3]
  • Footskate Cleanup for Motion Capture Editing - Lucas Kovar, Michael Gleicher, John Schreiner [4]

Also, I believe that many in the Blender development community can assist me if needed with the practical and theoretical work, such as Campbell Barton, Nathan Letwory, Martin Poirier and Thomas Larsson.

Contact Information

Name: Benjamin (Benjy) Cook
D.O.B: 01.12.1987
Email / IRC: benjytcook@gmail.com.
I frequent the #blendercoders IRC channel as benjycook, and subscribe to the bf-committers mailing list.
Israeli Cell Phone: 972-54-7256429
Address: 14 Washington Avenue, Apt. 7, Tel Aviv, Israel.

Project Schedule

I am fairly familiar with Blender’s codebase and particularly its Python API, and am confident I could start actual work during the “Community Bonding Period”, especially on the first item (F-curve import), which is theoretically simple and a good place to start. By the official “start coding!” date, I plan to have finished the F-curve import and some of the smaller enhancements, and can begin the main work of retargeting. Next follows implementation of the retargeting algorithm, which will take about a month; by the end of June I therefore hope to be working on the various artist tools that will accompany the project. Work on retargeting will continue side by side with this and with UI design. By the mid-term evaluation, entering the last month of coding, the project will shift its attention to getting Blender community feedback, bug fixing, documentation and tutorial writing. I have budgeted a fair amount of time for this phase, because I assume some major open issues and bugs will need to be dealt with.

Availability

While I am a full-time CS and Film student at Tel Aviv University, I am in my second year and 90% of the school work consists of smaller programming projects which do not require a large amount of my time. My exams are in the fall, so they will not be an issue, and there is a two-week “spring break” in Israel during this semester, providing me with even more time. I will be moving to the United States (Boston area) in mid-August, towards the end of the project, which will probably mean taking a few days off before I am set up again.

Bio and motivation

I am 23 years old and a second-year student at Tel Aviv University, doing a double degree in Computer Science and Film & Television Studies. I have been programming since I was six (http://en.wikipedia.org/wiki/ZZT-oop) and have never looked back. After high school, I served for three years in the Israeli Army, in the IT unit of the Prison Service, and rose to be the NCO in charge of all of that organization’s computing systems in the northern part of the country. I was honorably discharged as a 1st Sergeant in April 2009, and began school the following October. At school, I found myself between the two worlds that interest me - Art (and film in particular) and Computer Science. I have advised and technically aided many film students in various stages of production, in addition to working as a special effects artist, editor and colorist. I enjoy working closely with artists and users who need software to work, and being able to assist them technically. In addition to programming and art, I enjoy video games, cycling and drinking coffee with friends around Tel Aviv.

Being in my second year of Computer Science, my math skills are adequate, having had an assortment of classes in Linear Algebra, Calculus, Discrete Mathematics and Statistics. I am most experienced with Python (the main and possibly sole language for this project), but also know C, Java and Lisp. Having been a Blender user since about version 2.35 (2003?), I am familiar with the program itself. I have programmed extensively within the Blender Game Engine, and thus am familiar with some of Blender’s Python API (the old one, and standard libs such as mathutils). Regarding its codebase, I have spent the last few weeks reading up on the code, especially the new Python API, the UI system and 2.5 in general. In addition, I have worked with Campbell Barton (ideasman_42) on a minor improvement to the existing BVH import script, as a way to demonstrate that I can work with the project - submitting patches, working with SVN, frequenting the IRC channel and the bf-committers mailing list, etc. (http://projects.blender.org/tracker/index.php?func=detail&aid=26599&group_id=9&atid=127)

One example of my programming in other areas is a school project in which I improved the artificial intelligence of Unreal bots (via better pathfinding). This was done in Java and Python. The following site contains documentation, slides from my presentation and the source code; I believe it shows my ability to work and program independently with good documentation. http://code.google.com/p/smart-path-unreal-2004/

My vimeo account, http://www.vimeo.com/user1418655, contains screencasts and clips from various projects, both in programming and film. Unfortunately, my backup hard-drive failed a few months ago and as such I am unable to provide code for some of the AI tests and other projects seen there.

My motivation in choosing this project is that, as a more technical person rather than an animator, I am frustrated by the lack of real support and a streamlined workflow for mocap data in Blender. I also strongly feel that suitability for game development is a selling point for Blender, and mocap data can be used for game production as well.

Response to comments received

More implementation and schedule details, in response to Martin Poirier's questions on the GSoC site.

  • IK solution

I plan on using a full-body IK solution that will encompass all constraints. The algorithm I intend to use is detailed in Monzani (2000), reference 1 in my proposal. It calculates rotations that satisfy the constraints while using the retargeted mocap pose as an "attraction pose". The IK solution is computed over the entire timeline of the animation to prevent inconsistencies between frames.
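As a toy illustration of the "attraction pose" idea only (not the Monzani et al. algorithm itself, which is considerably more involved), the sketch below solves a 2D chain by numerically minimizing a cost that combines end-effector error with a penalty for straying from the retargeted pose. The crude solver and all parameters are assumptions made for the example.

    import math

    def end_effector(angles, lengths):
        """Forward kinematics of a planar chain with cumulative joint angles."""
        x = y = total = 0.0
        for a, l in zip(angles, lengths):
            total += a
            x += l * math.cos(total)
            y += l * math.sin(total)
        return x, y

    def solve(attraction, lengths, target, weight=0.1, step=0.05, iters=500):
        """Pull the end effector toward `target` while penalizing deviation
        from the attraction (retargeted) pose, via numeric gradient descent."""
        angles = list(attraction)
        for _ in range(iters):
            for i in range(len(angles)):
                def cost(a_i):
                    trial = angles[:i] + [a_i] + angles[i + 1:]
                    ex, ey = end_effector(trial, lengths)
                    err = (ex - target[0]) ** 2 + (ey - target[1]) ** 2
                    return err + weight * (a_i - attraction[i]) ** 2
                eps = 1e-4
                grad = (cost(angles[i] + eps) - cost(angles[i] - eps)) / (2 * eps)
                angles[i] -= step * grad
        return angles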

  • Non destructive system

I realize the importance of a non-destructive solution as a tool for users, and how such a system fits the general Blender philosophy (non-destructive mesh modifiers, NLA animation, etc.). I envision a layer/modifier system where the original data sits at the bottom, and added on top of it are the IK solution, smoothing between frames affected by IK changes and those that are not, and user-added keyframes and tweaks. I completely agree that such a system will provide users with the most flexibility while working.
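A bare-bones sketch of how such layered evaluation could work (names and structure purely illustrative): the original mocap channel is the base, and each layer contributes a delta scaled by its own animatable influence.

    def evaluate(frame, base_curve, layers):
        """base_curve and each layer's 'delta' map a frame to a value;
        'influence' maps a frame to a 0..1 blend factor."""
        value = base_curve(frame)
        for layer in layers:
            value += layer["influence"](frame) * layer["delta"](frame)
        return value

    # e.g. a constant IK correction plus a tweak faded in over the first 20 frames
    layers = [
        {"delta": lambda f: 0.1, "influence": lambda f: 1.0},
        {"delta": lambda f: 0.5, "influence": lambda f: min(1.0, f / 20.0)},
    ]
    print(evaluate(10, lambda f: 0.0, layers))  # 0.0 + 0.1 + 0.5 * 0.5 = 0.35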

  • Implementation schedule:

As stated in the proposal, work on the retargeting solution will occur between mid-May and mid-July (8 weeks). I will begin by implementing the rotation transfer of bones from the original to the end-user skeleton, which I estimate will take two weeks. Following this, two weeks to implement the IK solution. The last month will be devoted to the creation of the UI for these features, concurrently with work on the code behind them - adding different types of constraints, the smoothing methods mentioned above, etc. I hope to have an alpha version by the end of this period to show to users and get their feedback, and in the last month of the project to shift attention to user feedback, bugs, documentation and tutorials.