
Wednesday, November 4, 2015

Case Study: Building a Rigging system Part 2

The Rigging Tool Kit v2.0 UI, and using the Armature system to place and orient joints from a Biped Template

  As I mentioned at the end of Part 1, accuracy of joint placement with "Volume Guides" and empowerment of the Rigging team were the areas that needed improvement.  In this entry, I will walk through version 2.0 of the rigging system, the improvements made, and how those improvements impacted the artists.  And again, I'll try to explain the holes I could see and my thought process in fixing them for the future release.


Big Takeaways


  User experience is EXTREMELY important.  Even when all the tools and functionality exist, if they are presented in a confusing layout the user can't work at full capacity because they are fighting a bad experience.  Thinking about UX (user experience) in everything from UI layout down to how a user edits a custom meta data node eventually led to the most current release, which I will cover in a future post.


Empowering the Artist

  • Improving Templates:
  The new release for the rigging system would do away with the "Volume Guide" step and would start using only "Templates" - which are Maya files that have a skeleton and meta data attached to them that Artists have saved out for future use through the "Template Library" feature. 
 This decision freed the artists from relying on new anatomies from the "Volume Guide" created by the TA team and allowed them to draw their own Skeletons and save them as needed.  "Templates" have ranged from full skeletons like a Biped or Quadruped down to single components like wings, cape, arms, etc.  Seeing how the artists have branched out the "Template Library" in this way has reassured me that giving them the ability to do this was definitely the correct decision.

  • Exposing and Expanding Rig Meta Data:
The v2.0 Meta Data Editor.  It is very painful to look at nowadays :(
  The "Volume Guides" in v1.0 already had some representation of meta data: custom attributes added to transforms stored in the scene, which instructed the rig builder on how to construct the rig.  Mixing different anatomies would result in different rig setups based on this hidden meta data.
  In v2.0 the decision was made to expose the editing of these nodes to artists and to expand the use of meta data to rigging "Modules".  Thinking of the rigging system as rig modules rather than specific anatomy types was a HUGE step in the foundation of the rig builder.  The meta data was still an empty transform with extra attributes.  For example, the original FK Module meta data node had these attributes:
ModuleType (string) - Stores the name of the module type (i.e. "FK")
Joints (string) - Stores a list of string names for the joints
MetaScale (float) - Sets an initial build scale for controls
MetaParent (string) - Stores the name of the DAG parent this module parents to
Side (string) - Determines the side of the component: left/right/center
  To further customize a rigging "Module", the artist could create a sub-module rigging component called a rigging "Attribute".  These rigging "Attributes" would apply a modification to a rig "Module".  Examples include SingleParent (the Module follows some transform in translation and rotation), DynamicParent (the Module can dynamically switch which transform it follows), etc.
  A Meta Data Editor was also added to the rigging system, which let the artist create or edit meta data nodes more easily than working in Maya's Attribute Editor.  The build process could figure out what to do and how to do it from the meta data alone.  It ran in two steps: (1) build the Module and run the Module's Python code on post build, then (2) loop through the Module's Attributes, running each one's Python code on post build.
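The module metadata and the two-step build loop can be sketched Maya-free in plain Python.  This is only an illustration of the idea, not the actual tool kit code: a dict stands in for the empty transform with custom attributes, and every name here (fk_meta, build_fk, build_rig) is hypothetical.

```python
# Illustrative sketch only - in the real tool this data lived as custom
# attributes on an empty transform node in the Maya scene.
fk_meta = {
    "ModuleType": "FK",                              # which module builder to run
    "Joints": ["shoulder_L", "elbow_L", "wrist_L"],  # joints the module drives
    "MetaScale": 1.0,                                # initial control build scale
    "MetaParent": "spine_03",                        # DAG parent for this module
    "Side": "left",                                  # left/right/center
    "Attributes": ["SingleParent"],                  # sub-module modifications
}

def build_fk(meta):
    """Stand-in for an FK module builder: returns the control names it 'made'."""
    return ["%s_%s_ctrl" % (meta["Side"], joint) for joint in meta["Joints"]]

MODULE_BUILDERS = {"FK": build_fk}

def build_rig(meta_nodes):
    """(1) run each module's builder, then (2) loop the module's rigging
    'Attributes' and apply each modification to the result."""
    built = []
    for meta in meta_nodes:
        controls = MODULE_BUILDERS[meta["ModuleType"]](meta)
        for attribute in meta.get("Attributes", []):
            # a real Attribute (e.g. SingleParent) would constrain the module;
            # here we just tag the control names to show the second pass
            controls = ["%s@%s" % (ctrl, attribute) for ctrl in controls]
        built.extend(controls)
    return built

print(build_rig([fk_meta]))
# → ['left_shoulder_L_ctrl@SingleParent', 'left_elbow_L_ctrl@SingleParent',
#    'left_wrist_L_ctrl@SingleParent']
```

The point of the sketch is the dispatch: the builder never hard-codes an anatomy, it just looks up the module type and applies whatever Attributes the artist attached.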

  • Custom Python Code Customization:
    Each Meta Data node also had a custom string attribute that would hold Python code.  The stored Python would execute after the build process for that specific module - which allowed the artist a lot of flexibility.  The Meta Data Editor also had a custom Python code editor - which at this time was just a simple PyQt QLineEdit.
  This was a big deal for the extensibility of the system, but it also motivated our artists to learn more scripting - which has been a tremendous win for the overall rigging and tech art departments.  A motivating reason to learn! :)
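As a rough sketch of how that stored code could be run, assuming the string lives on the metadata node and is executed with the module's freshly built controls in scope (the names post_build_code and run_post_build are made up for illustration, not the tool kit's real API):

```python
# Artist-authored code, as it might be typed into the Meta Data Editor's
# code field.  It sees the module's controls and can react to them.
post_build_code = """
for ctrl in controls:
    locked.append(ctrl + '.scale')
"""

def run_post_build(code, controls):
    """Execute the stored Python after the module builds, exposing the
    freshly built controls (and a result list) in the code's namespace."""
    scope = {"controls": controls, "locked": []}
    exec(code, scope)
    return scope["locked"]

print(run_post_build(post_build_code, ["arm_L_ctrl", "arm_R_ctrl"]))
# → ['arm_L_ctrl.scale', 'arm_R_ctrl.scale']
```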

  • Seeking a Better Control:
  The original release of the rigging tools used a very traditional design for rig controls - NURBS curves.  NURBS were easily customizable, but it was not as easy for the builder to restore those edits across a rig delete/build.
  This led to an exploration of a custom C++ Maya node (the MPxLocator class) with custom attributes that dictated what shape is drawn.
  The custom control node allowed the artist to edit the "look" of the rig, and it created an easy way to save the control settings when the rig is deleted - so that when it is recreated, the last control settings are restored.  The build process would temporarily save the settings to custom attributes on the joints, restore those settings when the rig builds, and later delete the temporary attributes.
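The stash/restore round trip might look something like this outside of Maya, with a dict standing in for the joint and its temporary custom attributes (the `ctrl_` prefix and both function names are assumptions for illustration):

```python
def stash_control_settings(joint, control):
    """Before the rig is deleted, copy the control's look settings onto the
    joint as temporary 'attributes' (prefixed keys on a plain dict here)."""
    for key, value in control.items():
        joint["ctrl_" + key] = value

def restore_control_settings(joint):
    """On the next build, rebuild the control settings from the stash and
    delete the temporary attributes."""
    control = {}
    for key in [k for k in joint if k.startswith("ctrl_")]:
        control[key[len("ctrl_"):]] = joint.pop(key)
    return control

joint = {"name": "wrist_L"}
settings = {"shape": "circle", "lineWidth": 2.0, "opacity": 0.8}
stash_control_settings(joint, settings)      # ...rig deleted here...
restored = restore_control_settings(joint)   # ...rig rebuilt here...
print(restored == settings)                  # → True
print(any(k.startswith("ctrl_") for k in joint))  # → False: temp attrs gone
```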


The available attributes for the custom Maya Locator node, also the
Beauty Pass tool which allowed copy/mirror of control settings for a faster workflow
  Since the custom control drew with the OpenGL library, we were able to manipulate things like shape type, line thickness, fill shading of a shape, opacity of lines and fills, and clearing the depth buffer (drawing on top), among many other cool features.
* Thinking back on using a custom plugin for controls, I would look more into wrapping a custom PyMel node around NURBS and using custom attributes to save out the data for each CV, similar to how I saved the custom attributes for the plugin control.  I would lose the coolness of controlling the OpenGL drawing in the viewport, but would gain a lot of flexibility in the shape library and in the overall maintenance of the plugin across Maya updates.


Speed with Accuracy
The Armature system, the spheres are color coded based on the primary joint axis.
The bones are colored with the secondary and tertiary axis.

  • Interactive and Non-Destructive Rigging with Armature:
  This update addressed a lot of empowering the artist to control the system.  With the removal of the "Volume Guide" system, we needed a similar work process that would assist the artist in positioning and orienting joints.  We introduced the Armature system: a temporary rig that allowed the artist to position and orient joints with precision and speed.
  I won't go into the details of the rig system for Armature, but the high-level description is that it would build a temporary rig based on the connected meta data; the artist would manipulate the rig into position with a familiar "puppet control system", then remove the Armature and have the updated skeleton.  This skeleton update had NO detrimental effects on existing skinClusters - which was a HUGE win for the artists, as they make many small joint placement iterations while skinning the character.
  Using a rig to manipulate joints made a lot of sense to our artists, and as a tool the rig could toggle certain features on, like symmetry movement, which would mirror adjustments across the body.  The artist also had a toggle for hierarchy movement, which controlled whether children followed the parent.
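The symmetry toggle boils down to mirroring a translation across the character's center line to the matching joint on the other side.  A minimal sketch, assuming an "_L"/"_R" naming convention and mirroring across the X axis (both are my assumptions for illustration, not necessarily what the Armature system used):

```python
def mirror_name(name):
    """Find the twin joint on the opposite side; center joints map to themselves."""
    if name.endswith("_L"):
        return name[:-2] + "_R"
    if name.endswith("_R"):
        return name[:-2] + "_L"
    return name

def apply_symmetric_move(positions, joint, new_pos):
    """Move a joint and, with symmetry on, mirror the move across X to its twin."""
    positions[joint] = new_pos
    twin = mirror_name(joint)
    if twin != joint:
        x, y, z = new_pos
        positions[twin] = (-x, y, z)  # flip X, keep Y and Z
    return positions

positions = {"elbow_L": (10.0, 150.0, 2.0), "elbow_R": (-10.0, 150.0, 2.0)}
apply_symmetric_move(positions, "elbow_L", (12.0, 149.0, 3.0))
print(positions["elbow_R"])  # → (-12.0, 149.0, 3.0)
```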


Thoughts

  Throughout the development of the v2.0 update I was already formulating plans for the v3.0 update.  Version 2.0 was huge for laying the groundwork for how I personally think of rig construction - and even how I approach teaching it to students or more novice co-workers.

   Thinking of rigging on a component or module level, instead of as a specific anatomy type, gave me a perspective of feature needs rather than general anatomy needs.  Don't get me wrong, the anatomy is still a high priority when figuring out the way something moves - but thinking of the UX for the animator or for the rigger can have a huge impact on how you build a rigging system.


A post-mortem study of v2.0's UI layout readability, and then a really early mock-up of the UI that was eventually used for v3.0.

  At the time of v2.0 the buzzword for our Tech Art department was UX - that is probably the big takeaway from v2.0.  I took that as my main driving force for the most current update (v3.0).  At the time of this release I was still learning best practices for UX - a lot of time was spent through the iterations between v1.0 and v3.0 shadowing artists, doing walk-through tutorials and just chatting about what is a good workflow and what would be the theoretical "perfect workflow".  Some of the things that popped up, which I will cover in v3.0:

  • The Meta Data editor required too many steps (This still relied on the user using the Attribute Editor, Connection Editor, etc)
  • A string based meta data attribute is easy to mess up (I discovered message attributes as a key solution to this issue)
  • It's hard to acclimate folks who are used to rigging their own way (This can be helped a bit by providing structure with flexibility)
  • There were too many set instructions for the rig builder - not enough flexibility.  Even with a full Python post-build script available, artists wanted more nodes to work with rather than scripting it.
  • The layout of the UI needed optimization: more templates visible, template search filters, and reworking of specific tools.
  • Debugging a module was difficult for the artist - this required a lot of shadow time for me to find out how the artist was working and thinking, but it also provided very valuable information and solutions that we would implement in v3.0.
  • The more we went into our own territory of "what is the method of rigging" with our own tool set, the more important high-level terms, tutorials and documentation became.  This became a big hurdle - we had to make sure we trained people on rigging theory instead of just the default process of rigging in Maya.  We managed to lessen this hurdle by really pushing the UX of the tool in v3.0.


Tuesday, June 9, 2015

Case Study: Building a Rigging system Part 1

  I have decided to do a few blog series posts based on a few case studies that have popped up in my experiences as a tech artist.  Not necessarily in any particular order but I will start with the task of building a rigging system...


Big Takeaways:  


  Empower the Rigging team to OWN the tool kit as their own.  It's not the 100 controls they personally named that make them a Rigger, but the overall artistic design and functionality of the creature they just designed for kinematic movement.  Accuracy of joint placement is so important that it must be a higher priority than speed in any system where you are laying out a skeleton.



Starting Out


  Early on, my first task at Turbine was to work on our rigging tools.  This was great for me - my background in tech art began with learning rigging, and I really wanted to bring the pipeline up to a modern standard.  The final product went through three major "versions", but it all started with trying to wrap up a bunch of helper functions into a build procedure.  This was still so early that we were exploring exactly what the animators' preferences would be for basic feature styles, orientations, etc.  We knew we (the tech art team) were making a shift from MEL to Python, and we were starting to explore PyMel - so I started coding that build procedure in PyMel.

  It was fortunate that most of the characters at the time were bipeds, as the build process only handled our basic biped.  I realize most auto riggers start out with a biped design and some branch out to quadrupeds.  I knew my vision for our tool kit was a system that would let the artist dictate the rig, not the tool kit itself - but it took some time to imagine how best to enable that control.


Inspiration, Incubation and Implementation


  The skeleton layout tools for the first release of the rigging tool kit were heavily influenced by a SIGGRAPH paper from years ago by ILM on their Block Party system.  A lot of auto riggers use locators to set up a layout for a skeleton.  Although the locator technique gives a lot of accuracy, I didn't feel it was particularly fast.  I adopted the Block Party term "Volume Guide", loosely applied to my proxy-resolution biped mesh guide.  The rigger would move the body parts to fit within the mesh, the skeleton would be built to replace the guide, and then they could proceed with the rig build procedure.

  The biped guide was styled like an art maquette; the rigger would use translate/rotate/scale to manipulate the proxy mesh into place.  The shapes of the proxy geo were actually shape-parented under their respective joints, which gave the user the speed of rotation mirroring across the body during the layout process, because they were actually manipulating joints and not polygonal objects.  Later on, this iteration of the tool kit grew to support more "Volume Guides": horse, lion, bird and spider were added.  The ability to "kit-bash" body parts was also added.  For example, combining the upper body of a human with the base of the horse allowed you to create a centaur, adding wings onto a human would give you an angel, and another example is a werewolf - top half human, bottom half the lion/cat guide.


The basic Human Biped Guide
"Kit Bashing" together a Human and Horse Guide

  The actual building tools, and how the Guides "knew" how they were to be rigged, were inspired by a PowerPoint presentation at GDC from several years ago by Bungie; you can view the PowerPoint here.  The main thing I learned from the Bungie talk was to add mark-up data (meta data) onto the Maya objects, which the build procedure could later find and use to determine what to do.

  This was the start of the "modularity" of what the system would become.  It allowed artists to load in parts of other guides to have multiple fingers, arms, spines, legs, tails, wings, etc. and plug them in with a simple press of the P button (Maya parent).  This allowed a lot of quick development for creature rigs.  The speed of the Guide rig in creating a skeleton layout that could be rigged with a click of a button was great - but I started to notice that it was somewhat difficult to get our Rigging team to adapt to a "volumetric" way of building a skeleton - so there needed to be some changes to improve and adapt to how we as a team wanted to work.


Volume Guide - The term used for a master anatomical creature Maya file that I built to serve as a building block for a character.  The Guide could be "Kit-Bashed" together with other Guides to create something unique.


Template - A Guide that a Rigger has edited and saved for re-use.  This could be as simple as an A-pose Human Biped or as complex as a Centaur with 4 Human arms, 2 Human Heads, Bird Wings and a Lion's Tail.


  Below is the Version 1.0 UI layout.  It was awkward and very "tabby", but it served its purpose of having a way to list the available Guides and Templates.  Along the way, other tools were added to other tabs: a Skinning tab, Control tools, etc.  The actual meta data editing tools were rather limited at this point, to just string- and float-based UI widgets.  At the time, meta data was only thought of as simple hooks to help the rig-build Python code find the right objects - later on it would become the core foundation of thought for the Rigger in designing their vision for the character's movement.

Rigging Tool Kit Version 1.0


Looking Back


  Overall, this first release served best as a way to get our rigging team to start thinking less about "how do I make this IK setup?" and more about "how does the anatomy of this creature function?".  This was a really key growth point for most of our Riggers, as we were hand-building most rigs for each project.  We already saw speed gains: from originally 5 days for a rigged character down to less than a day.

  There was a ton of personal growth in becoming a better Tech Artist.  I learned a lot about supporting a team and good coding practices, and it was a good way to build my understanding of games - Turbine was my first game job.  The expectations for game rigs are different from what I had run into in film.

  Version 1.0 did have some downfalls that ultimately led to developing version 2.0.  The pseudo geometry/joint-based manipulation of the layout was fast but not accurate enough - it needed some way to be more precise without extra effort from the rigging artist.  The other main issue was that the guides were created by me, which meant that if a new anatomy was needed, I would have to have the forethought to make the volume guide.  These two issues - accuracy sacrificed for speed, and rigging team empowerment - were enough to justify spending more time on the rigging system, to build something that would really be the core of the tool kit in version 2.0 and eventually version 3.0.