Saturday, November 7, 2015

Maya Python: Get the Hierarchy Root Joint

  I am taking a break from the Rigging System Case Study series; Part 3 may take some time to write out everything.  I've recently begun exploring a personal project that required me to take a look at rewriting some really simple rigging utility functions.  I decided to post a few of them, and here is the first...


import maya.cmds as cmds


def _getHierarchyRootJoint( joint="" ):
    """
    Function to find the top parent joint node from the given 
    'joint' maya node

    Args:
        joint (string) : name of the maya joint to traverse from

    Returns:
        A string name of the top parent joint traversed from 'joint'

    Example:
        topParentJoint = _getHierarchyRootJoint( joint="LShoulder" )

    """
    
    # Walk up the hierarchy until a joint with no joint parent is found
    rootJoint = joint

    while True:
        parent = cmds.listRelatives( rootJoint,
                                     parent=True,
                                     type='joint' )
        if not parent:
            break

        rootJoint = parent[0]

    return rootJoint
  

  I've used this particular function as part of the traversal from mesh->skinCluster->influence->top parent joint.  I've used it mostly for exporters, animation tools, and rigging purposes - building an export skeleton layer on a game rig.
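
  As a rough illustration of that mesh->skinCluster->influence->root traversal, here is a minimal sketch; the helper name _getRootJointFromMesh is mine for this post, and it leans on the _getHierarchyRootJoint function above:

import maya.cmds as cmds

def _getRootJointFromMesh( mesh ):
    """
    Sketch of the mesh->skinCluster->influence->root joint traversal.
    Returns None if the mesh has no skinCluster or influences.
    """
    # Find the skinCluster in the mesh's construction history
    history = cmds.listHistory( mesh ) or []
    skinClusters = cmds.ls( history, type='skinCluster' )
    if not skinClusters:
        return None

    # Grab the influence joints driving the skinCluster
    influences = cmds.skinCluster( skinClusters[0], query=True, influence=True ) or []
    if not influences:
        return None

    # Walk up from any influence to the top parent joint
    return _getHierarchyRootJoint( joint=influences[0] )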

  For the purposes of my personal project, the utility functions need to be as fast as possible.  I like to stay away from plugin dependencies for Maya tools where possible, so I am writing my utility code against the Maya commands engine - it's not as Pythonic as PyMel, but it is faster, and that trade-off is worth the extra consideration if you are worried about the speed of the tool.  For instance, the Rigging System that I've been blogging about was written with PyMel, whereas most of the animation tools I've worked on use the Maya commands engine.  With my timing decorator on, I am averaging about 0.0075-0.008s for this function traversing about 250 joints up the chain.

  Speaking of the timing decorator, here is the one I created to track and debug my utility code.  I would suggest using logging instead of print - print carries more overhead and will give you less accurate timing data to analyze.


from functools import wraps
import time
import logging
import maya.utils 
 
# Create a dedicated debug logger - a few Maya versions block the basic logger
logger = logging.getLogger( "MyDebugLogger" )
logger.propagate = False
handler = maya.utils.MayaGuiLogHandler()
handler.setLevel( logging.INFO )
formatter = logging.Formatter( "%(message)s" )
handler.setFormatter( formatter )
logger.addHandler( handler )
 
def timeDecorator( f ):
    """
    Decorator function that applies a timing process to the given function

    Args:
        f (object) : Python function passed through the decorator tag

    Returns:
        return the value from the function wrapped with the decorator
        function process

    Examples:
        @timeDecorator
        def myFunc( arg1, arg2 ):

    """

    @wraps(f)
    def wrapped( *args, **kwargs ):
        """ 
        Wrapping the timing calculation around the function call 
        
        Returns:
            Result of the called wrapped function
            
        """
        

        
        # log the process time 
        t0 = time.clock()
        r = f( *args, **kwargs )
        logger.warning( "{funcName} processing took : {processTime}".format( funcName=f.__name__, processTime= + time.clock() - t0 ) )
        
        return r

    return wrapped
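
  And a quick usage sketch, assuming both snippets above are loaded - the decorator can be applied with the @ syntax at definition time (as in the docstring example), or wrapped around an existing function on the fly:

# Wrap the already-defined traversal function without re-declaring it
timedGetRoot = timeDecorator( _getHierarchyRootJoint )
rootJoint = timedGetRoot( joint="LShoulder" )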

Wednesday, November 4, 2015

Case Study: Building a Rigging system Part 2

The rigging tool kit v2.0 ui and using the Armature system to place and orient joints from a Biped Template

  As I mentioned at the end of Part 1, accuracy of joint placement with "Volume Guides" and empowerment of the Rigging team were the areas that needed improvement.  In this entry, I will walk through version 2.0 of the rigging system, the improvements made, and how those improvements impacted the artists.  And again, I will try to explain the holes I could see and my thought process in fixing them for the future release.


Big Takeaways


  User experience is EXTREMELY important - even though all the tools and functionality exist, if they are presented in a confusing layout the user can't work at full capacity because they are fighting a bad experience.  Thinking about UX (user experience) in everything from UI layout down to how a user edits a custom meta data node eventually led to the most current release, which I will cover in a future post.


Empowering the Artist

  • Improving Templates:
  The new release of the rigging system would do away with the "Volume Guide" step and start using only "Templates" - Maya files containing a skeleton with meta data attached, which Artists save out for future use through the "Template Library" feature.
  This decision freed the artists from relying on new anatomies from the "Volume Guide" created by the TA team and allowed them to draw their own Skeletons and save them as needed.  "Templates" have ranged from full skeletons like a Biped or Quadruped down to single components like wings, capes, arms, etc.  Seeing how the artists have branched out the "Template Library" in this way has reassured me that giving them this ability was definitely the correct decision.
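
  The post doesn't show the Template Library code, so here is only a rough sketch of the idea under my own assumptions - the helper name and the example path are hypothetical, and saving is essentially exporting the selected skeleton (plus any meta data nodes grabbed with it) to a Maya file:

import maya.cmds as cmds

def saveTemplate( rootJoint, templatePath ):
    """
    Sketch: export a skeleton hierarchy (and any meta data nodes
    selected along with it) as a Template Maya file.
    """
    cmds.select( rootJoint, hierarchy=True )
    cmds.file( templatePath, exportSelected=True, type="mayaAscii", force=True )

# Hypothetical usage:
# saveTemplate( "Biped_root", "D:/rigTemplates/Biped.ma" )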

  • Exposing and Expanding Rig Meta Data:
The v2.0 Meta Data Editor.  It is very painful to look at nowadays :(
  The "Volume Guides" in v1.0 already had some representation of meta data.  They were custom attributes added to transforms that were stored in the scene, the attributes instructed the rig builder process on how to construct the rig.  Mixing different anatomies would result in different rig setups based on the hidden meta data. 
  In v2.0 the decision was made to expose the editing of these nodes to artists and expand the use of meta data to rigging "Modules".   Thinking of the rigging system as rig modules rather than specific anatomy types was a HUGE step in the foundation of the rig builder.  The meta data was still an empty transform with extra attributes. For example the original FK Module meta data node had these attributes....
ModuleType (string) - This would store the name of the module type (i.e. "FK") 
Joints (string) - This stored a list of string names for the joints 
MetaScale (float) - This value was used to set an initial build scale for controls
MetaParent (string) - This would store the name of the DAG parent for this module to parent to.
Side (string) - This string value would determine the side of the component, left/right/center 
  To further customize a rigging "Module", the artist could create a sub-module rigging component named a rigging "Attribute".  These rigging "Attributes" would apply a modification to a rig "Module".  Examples are things like SingleParent (the Module follows some transform with translations and rotations), DynamicParent (the Module can dynamically switch which transform it follows), etc.
  A Meta Data Editor was also added to the rigging system, which allowed the Artist to create or edit meta data nodes more easily than working in Maya's Attribute Editor.  The build process could figure out what to do and how to do it based on the meta data information: (1) build the Module and run the Module's Python code post build, then (2) loop through the Module's Attributes and run each one's Python code post build.  A small sketch of building one of these meta data nodes follows below.
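
  Here is that sketch - a minimal example of creating such a node with plain cmds; the function name and default values are mine for this post, not the production code:

import maya.cmds as cmds

def createFKModuleMetaNode( name="FKModule_meta", joints=None,
                            metaScale=1.0, metaParent="", side="Center" ):
    """
    Sketch: build an empty transform carrying the FK Module meta data
    attributes described above.
    """
    node = cmds.createNode( "transform", name=name )

    # String attributes
    for attr, value in ( ("ModuleType", "FK"),
                         ("Joints", ",".join( joints or [] )),
                         ("MetaParent", metaParent),
                         ("Side", side) ):
        cmds.addAttr( node, longName=attr, dataType="string" )
        cmds.setAttr( "{0}.{1}".format( node, attr ), value, type="string" )

    # Float attribute used as the initial control build scale
    cmds.addAttr( node, longName="MetaScale", attributeType="float" )
    cmds.setAttr( node + ".MetaScale", metaScale )

    return node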

  • Custom Python Code Customization:
    Each Meta Data node also had a custom string attribute that would hold Python code.  That code would execute after the build process for that specific module - which allowed a lot of flexibility for the artist.  The Meta Data Editor also had a custom Python code editor, which at this time was just a simple PyQt QLineEdit.
  This was a big deal for the extensibility of the system, but it also motivated our artists to learn more scripting - which has been a tremendous win for the overall rigging and tech art departments.  A motivating reason to learn! :)
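
  As a hedged sketch of that mechanism (the attribute name "postBuildCode" below is my placeholder, not the production name), the builder can simply read the string attribute off the meta data node and exec it once the module has been built:

import maya.cmds as cmds

def runPostBuildCode( metaNode ):
    """
    Sketch: execute the Python snippet stored on a module's meta data
    node after that module has been built.
    """
    if not cmds.attributeQuery( "postBuildCode", node=metaNode, exists=True ):
        return

    code = cmds.getAttr( metaNode + ".postBuildCode" ) or ""
    if code.strip():
        # Give the snippet a small, explicit namespace to work with
        exec( code, { "cmds": cmds, "metaNode": metaNode } )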

  • Seeking a Better Control:
  The original release of the rigging tools used a very traditional design for rig controls - NURBS curves.  NURBS were easy for the artist to customize, but it was not as easy for the builder to restore those edits across a rig delete/build.
  This led to an exploration of a custom C++ Maya node (MPxLocator class) that used custom attributes to dictate which shape is drawn.
  The custom control node allowed the artist to edit the "look" of the rig, and it gave us an easy way to save the control settings when the rig is deleted - so that when it's recreated it will restore the last control settings.  The build process would temporarily save the settings to custom attributes on the joints, then restore those settings when the rig builds - and later delete those temporary attributes.


The available attributes for the custom Maya Locator node, and the
Beauty Pass tool, which allowed copying/mirroring control settings for a faster workflow
  Since the custom control drew with the OpenGL library, we were able to manipulate things like shape type, line thickness, fill shading of a shape, opacity of lines and fills, and clearing the depth buffer (drawing on top), among many other cool features.
* Thinking back on using a custom plugin for controls, I would look more into wrapping a custom PyMel node around NURBS curves and using custom attributes to save out the data for each CV, similar to how I saved the custom attributes for the plugin control.  I would lose the coolness of controlling the OpenGL drawing in the viewport, but would gain a lot of flexibility in the shape library and in maintaining the plugin across Maya updates.
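
  As a sketch of that idea (the names here are hypothetical), the CV data could be stashed on the joint as a string attribute and reapplied after a rebuild:

import json
import maya.cmds as cmds

def saveCurveShapeToJoint( curve, joint, attrName="controlShapeData" ):
    """ Sketch: store a NURBS control's local CV positions on its joint """
    cvs = cmds.ls( curve + ".cv[*]", flatten=True )
    positions = [ cmds.pointPosition( cv, local=True ) for cv in cvs ]

    if not cmds.attributeQuery( attrName, node=joint, exists=True ):
        cmds.addAttr( joint, longName=attrName, dataType="string" )
    cmds.setAttr( "{0}.{1}".format( joint, attrName ), json.dumps( positions ), type="string" )

def restoreCurveShapeFromJoint( curve, joint, attrName="controlShapeData" ):
    """ Sketch: reapply the stored CV positions to a freshly built control """
    positions = json.loads( cmds.getAttr( "{0}.{1}".format( joint, attrName ) ) )
    for index, position in enumerate( positions ):
        cmds.xform( "{0}.cv[{1}]".format( curve, index ),
                    objectSpace=True, translation=position )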


Speed with Accuracy
The Armature system, the spheres are color coded based on the primary joint axis.
The bones are colored with the secondary and tertiary axis.

  • Interactive and Non-Destructive Rigging with Armature:
  A lot of this update was about empowering the artist to control the system; with the removal of the "Volume Guide" system, we needed a similar work process that would assist the artist in positioning and orienting joints.  We introduced the Armature system, a temporary rig that allowed the artist to position and orient joints with precision and speed.
  I won't go into the details of the Armature rig system, but the high level description is: it would build a temporary rig based on the connected meta data, the artist would manipulate the rig into position with a familiar "puppet control system", then remove the Armature and be left with the updated Skeleton.  This skeleton update had NO detrimental effects on existing skinClusters - which was a HUGE win for the artists, as they make small joint placement iterations while they are skinning the character.
  Using a rig to manipulate joints made a lot of sense to our artists, and the rig could toggle certain features on, like symmetry movement, which would mirror adjustments across the body.  The Artist also had a toggle for hierarchy movement, which controlled whether children follow the parent or not.
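
  The post doesn't describe how the Armature kept skinClusters intact, so the snippet below is only an illustration of the general idea in plain cmds - repositioning a bound joint without distorting the mesh by toggling the skinCluster's moveJointsMode - and not necessarily what the Armature does internally:

import maya.cmds as cmds

def repositionBoundJoint( skinCluster, joint, worldPosition ):
    """
    Sketch: move a joint that is already bound to a skinCluster without
    deforming the skinned mesh, by letting the bind pose follow the joint.
    """
    cmds.skinCluster( skinCluster, edit=True, moveJointsMode=True )
    cmds.move( worldPosition[0], worldPosition[1], worldPosition[2],
               joint, worldSpace=True, absolute=True )
    cmds.skinCluster( skinCluster, edit=True, moveJointsMode=False )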


Thoughts

  Throughout the development of the v2.0 update I was already formulating plans for the v3.0 update.  Version 2.0 was huge for laying the groundwork for how I personally think of rig construction - and even how I approach teaching it to students or more novice co-workers.

   Thinking of rigging on a component or module level instead of a specific anatomy type gave me a perspective of feature needs rather than general anatomy needs.  Don't get me wrong, the anatomy is still a high priority when figuring out the way something moves, but thinking of the UX for the animator or for the rigger can have a huge impact on how you build a rigging system.


A post-mortem study of v2.0's UI layout readability, and then a really early mock-up of the UI that was eventually used for v3.0.

  At the time of v2.0 the buzzword for our Tech Art department was UX - that is probably the big takeaway from v2.0.  I took that as my main driving force for the most current update (v3.0).  At the time of this release I was still learning best practices for UX - a lot of time was spent, through the iterations from v1.0 to v3.0, shadowing artists, doing walk-through tutorials and just chatting about what a good workflow is and what the theoretical "perfect workflow" would be.  Some of the things that popped up, which I will cover in the v3.0 post:

  • The Meta Data Editor required too many steps (it still relied on the user using the Attribute Editor, Connection Editor, etc.)
  • A string based meta data attribute is easy to mess up (I discovered message attributes as a key solution to this issue; see the sketch after this list)
  • It's hard to acclimate folks who are used to rigging their own way (this can be helped a bit by providing structure with flexibility)
  • There were too many set instructions for the rig builder - not enough flexibility.  Even with a full Python post-build script available, artists wanted more nodes to work with rather than scripting it.
  • The layout of the UI needed optimization: more templates visible, template search filters, and reworking of specific tools.
  • Debugging a module was difficult for the artist - this required a lot of shadow time for me to find out how the artist was working and thinking, but it also provided very valuable information and solutions that we would implement in v3.0.
  • The more we went into our own territory of "what is the method of rigging" with our own tool set, the more important high level terms, tutorials and documentation became.  This became a big hurdle - we had to make sure we trained people on rigging theory instead of just the default process of rigging in Maya.  We managed to lessen this hurdle by really pushing the UX of the tool in v3.0.
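
  As a small sketch of why message attributes help there (the node names below are just examples from this post), a message connection survives renames where a stored name string would go stale:

import maya.cmds as cmds

# A message attribute stores a live connection instead of a name string,
# so renaming the joint cannot silently break the link the way a stored string would.
metaNode = cmds.createNode( "transform", name="FKModule_meta" )
cmds.addAttr( metaNode, longName="startJoint", attributeType="message" )
cmds.connectAttr( "LShoulder.message", metaNode + ".startJoint" )

# Resolve the joint later by following the connection, not a stored name
startJoint = cmds.listConnections( metaNode + ".startJoint",
                                   source=True, destination=False )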