Tuesday, June 9, 2015

Case Study: Building a Rigging system Part 1

  I have decided to do a series of blog posts based on a few case studies that have come up in my experience as a tech artist.  They are not in any particular order, but I will start with the task of building a rigging system...


Big Takeaways:  


  Empower the Rigging team to OWN the tool kit as their own:  it's not the 100 controls they personally named that make someone a Rigger, but the overall artistic design and kinematic functionality of the creature they just built.  Accuracy of joint placement is so important that it must take priority over speed in any system where you are laying out a skeleton.



Starting Out


  Early on, my first task at Turbine was to work on our rigging tools.  This was great for me - my background in tech art began with learning rigging, and I really wanted to bring the pipeline up to a modern standard.  The final product went through three major "versions", but it all started with trying to wrap a bunch of helper functions into a build procedure.  This was early enough that we were still exploring what the animators' preferences would be for basic feature styles, orientations, etc.  We knew we (the tech art team) were making a shift from MEL to Python, and we were starting to explore PyMEL - so I started coding that build procedure in PyMEL.

  It was fortunate that most of the characters at the time were bipeds, because the build process only handled our basic biped.  I realize most auto-riggers start out with a biped design and some branch out to quadrupeds...  I knew my vision for our tool kit was a system that would let the artist dictate the rig rather than the tool kit itself - but it took some time to imagine how best to enable that control.


Inspiration, Incubation and Implementation


  The skeleton layout tools for the first release of the rigging tool kit were heavily influenced by a SIGGRAPH paper from years ago by ILM on their Block Party system.  A lot of auto-riggers use locators to set up a layout for a skeleton.  Although the locator technique gives a lot of accuracy, I didn't feel it was particularly fast.  I adopted the Block Party term "Volume Guide" and applied it loosely to my proxy-resolution biped mesh guide.  The rigger would move the body parts to fit within the mesh, the skeleton would be built to replace the guide, and then they could proceed with the rig build procedure.

  The biped guide was styled like an art maquette: the rigger would use translate/rotate/scale to manipulate the proxy mesh into place.  The shapes of the proxy geo were actually shape-parented under their respective joints, which gave the user the speed of rotation mirroring across the body during the layout process, because they were actually manipulating joints and not polygonal objects.  Later on, this iteration of the tool kit grew to support more "Volume Guides": horse, lion, bird and spider were added.  The ability to "kit-bash" body parts was also added.  For example, combining the upper body of a human with the base of a horse created a centaur, adding wings onto a human gave you an angel, and a werewolf could be made from a human top half and a lion/cat bottom half.
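
  To illustrate the shape-parenting trick, here is a minimal sketch of attaching proxy geometry to a joint with maya.cmds.  The node names (proxy_upperArm, joint_upperArm) and the helper name are hypothetical, not the actual names from the tool kit.

import maya.cmds as cmds

def attachProxyToJoint(proxyTransform, joint):
    """Shape-parent a proxy mesh under a joint so moving the joint moves the geo."""
    # Move the proxy transform under the joint and zero it out there,
    # so its shape ends up expressed in the joint's local space
    cmds.parent(proxyTransform, joint)
    cmds.makeIdentity(proxyTransform, apply=True, t=True, r=True, s=True)

    shapes = cmds.listRelatives(proxyTransform, shapes=True, fullPath=True) or []
    for shape in shapes:
        # -shape -relative reparents the shape node itself under the joint,
        # so the rigger manipulates the joint directly, not a polygon object
        cmds.parent(shape, joint, shape=True, relative=True)

    # the now-empty proxy transform is no longer needed
    cmds.delete(proxyTransform)

# e.g. attachProxyToJoint("proxy_upperArm", "joint_upperArm")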


The basic Human Biped Guide

"Kit Bashing" together a Human and Horse Guide

  The actual building tools, and how the Guides "knew how they were to be rigged", were inspired as well by a PowerPoint presentation Bungie gave at GDC several years ago; you can view the slides here.  The main thing I learned from the Bungie talk was to add markup data (metadata) onto the Maya objects, which the build procedure could later find and use to decide what to do with each node.
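
  As a minimal sketch of that kind of markup, here is one way to stamp metadata onto Maya nodes with maya.cmds.  The attribute and value names (rigModule, rigSide, ikChain) are hypothetical examples, not the attributes the tool kit actually used.

import maya.cmds as cmds

def tagRigModule(node, moduleType, side="center"):
    """Stamp metadata onto a Maya node so a build procedure can find it later."""
    for attr, value in (("rigModule", moduleType), ("rigSide", side)):
        # only add the attribute the first time the node is tagged
        if not cmds.attributeQuery(attr, node=node, exists=True):
            cmds.addAttr(node, longName=attr, dataType="string")
        cmds.setAttr(node + "." + attr, value, type="string")

# e.g. tagRigModule("guide_leftArm_root", "ikChain", side="left")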

  This was the start of the "modularity" of what the system would become.  It allowed artists to load in parts of other guides to get multiple fingers, arms, spines, legs, tails, wings, etc. and plug them in with a simple press of the P key (Maya parent).  This allowed a lot of quick development for creature rigs.  The speed of the Guide rig - a skeleton layout that could be rigged with the click of a button - was great, but I started to notice that it was somewhat difficult to get our Rigging team to adapt to a "volumetric" way of building a skeleton, so there needed to be some changes to improve and adapt to how we as a team wanted to work.


Volume Guide - The term used for a master anatomical creature Maya file that I built to serve as a building block for a character.  A Guide could be "kit-bashed" together with other Guides to create something unique.


Template - A Guide that the Rigger has edited and saved for re-use.  This could be as simple as an A-pose Human Biped or as complex as a Centaur with 4 Human arms, 2 Human Heads, Bird Wings and a Lion's Tail.


  Below is the Version 1.0 UI layout.  It was awkward and very "tabby", but it served its purpose of listing the available Guides and Templates.  Along the way, other tools were added to other tabs: a Skinning tab, Control tools, etc.  The metadata editing tools were rather limited at this point, offering just string and float based UI widgets.  At the time, metadata was only thought of as simple hooks that let the Python build code find the right objects - later on it would become the core foundation of thought for the Rigger in designing their vision for the character's movement.

Rigging Tool Kit Version 1.0
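
  To show what those "simple hooks" look like on the build side, here is a minimal sketch that gathers tagged nodes, using the same hypothetical attribute names as the tagging example above; the real build procedure was considerably more involved.

import maya.cmds as cmds

def collectRigModules(tagAttr="rigModule"):
    """Walk the scene and group tagged nodes by module type for the build step."""
    modules = {}
    for node in cmds.ls(type="transform") or []:
        if cmds.attributeQuery(tagAttr, node=node, exists=True):
            moduleType = cmds.getAttr(node + "." + tagAttr)
            modules.setdefault(moduleType, []).append(node)
    return modules

# A build procedure could then dispatch on the module type, e.g.
# for node in collectRigModules().get("ikChain", []):
#     buildIkChain(node)   # hypothetical builder function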


Looking Back


  Overall, this first release served best as a way to get our rigging team to start thinking less about "how do I make this IK setup?" and more about "how does the anatomy of this creature function?".  This was a really key growth point for most of our Riggers, as we were hand-building most rigs for each project.  We already saw speed gains, from the original 5 days for a rigged character down to less than a day.

  There was a ton of personal growth as far as becoming a better Tech Artist: I learned a lot about supporting a team and good coding practices, and it was a good way to build my understanding of games - Turbine was my first game job.  The expectations for game rigs are different from what I had run into in film.

  Version 1.0 did have some downfalls that ultimately led to developing version 2.0.  The pseudo geometry/joint based manipulation of the layout was fast but not accurate enough - it needed some way to be more precise without extra effort from the rigging artist.  The other main issue was that the guides were created by me, which meant that if a new anatomy was needed, I had to have the forethought to make that Volume Guide ahead of time.  These two issues - accuracy traded away for speed, and rigging team empowerment - were enough to justify spending more time on the rigging system to build something that would really become the core of the tool kit in version 2.0 and eventually version 3.0.



Wednesday, June 3, 2015

No more Crisis :(

  Yesterday, WB/Turbine announced the closing of Infinite Crisis, the MOBA game that I had the opportunity to work on for 2.5 years.  It is sad to see a game you worked on for so long come to an end, but it has been encouraging to reminisce on how fun it was to work on a super hero game with such a creative team.  IC was the project that allowed the Tech Art team to revolutionize Turbine's art pipeline.  It was a blast to make great artwork with such great people.





Monday, June 1, 2015

The P4 Handshake

  I recently ran into an issue at work where we had to make sure Perforce and Maya were communicating when a new file was exported to the tree.  This was only an issue on newer projects, due to a new pipeline with a new engine choice.  We were already using the P4 Python library in house, but we just needed to make sure the right functions were called, users were logged in, etc.  I just wanted to post a few findings, because I felt the documentation was not particularly clear on what I needed.

Example Scenario:
  On export of a new asset to the P4 tree, I wanted to add the new file to the P4 depot.  If a user has already logged into P4, then a ticket exists (tickets last for 12 hours by default).  If a user has not logged into Perforce, then prompt the user for their info and create a ticket using the Perforce Python API run_login() call.

Now the important piece of info that the documentation was not clear about: you can make "add" or "edit" calls from P4Python as long as a ticket exists and hasn't expired for the matching user/port/client info.  The documentation seems to imply you can pass the ticket hash ID as an argument into run_login(), but according to Perforce support it is worded incorrectly.  So we set out to check whether the user had an existing, non-expired ticket by using this call..


import os

from P4 import P4 as p4api
from P4 import P4Exception

mP4 = p4api()

# sP4Server, sP4User and sP4Workspace are assumed to be supplied by the pipeline
mP4.port = sP4Server
mP4.user = sP4User
mP4.client = sP4Workspace
mP4.host = os.environ["COMPUTERNAME"].upper()

bConnection = mP4.connect()

try:
    # fetch_change() is a cheap command that fails if there is no valid ticket
    mP4.fetch_change()

    mP4.disconnect()
except P4Exception:
    # prompt user for password, because fetch_change() failed
    pass



The try/except around fetch_change() gave me a quick way to check whether the user's current info corresponded to an active, existing ticket without actually doing any work in P4 (such as edit/add).  If fetch_change() failed, I proceeded with some user prompting to supply the correct information.
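
For completeness, here is a minimal sketch of that fallback path, assuming a hypothetical promptUserForPassword() helper for the UI prompt; the actual in-house prompting code was more involved.

from P4 import P4Exception

def ensureTicket(mP4):
    """Return True once the connected user has a valid, non-expired ticket."""
    try:
        mP4.fetch_change()      # cheap command; fails without a valid ticket
        return True
    except P4Exception:
        # hypothetical helper that asks the user for their Perforce password
        mP4.password = promptUserForPassword()
        try:
            mP4.run_login()     # creates a new ticket (12 hours by default)
            return True
        except P4Exception:
            return False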

If it was successful, I would later just call the "edit" - or in this case the "add" - P4Python call..


mP4.run( "add", sPathToFile )




  These are just a few snippets of the fix we had to implement to make sure we are correctly adding to changelists and interacting with P4.  Maybe not knowing you can just call edit/add if you have a ticket isn't an issue for other TAs, but it was one of those things that really bothered me when I ran into it, and I felt the need to post about it.