Stumbling Toward 'Awesomeness'

A Technical Art Blog

Thursday, July 31, 2008

MGS4 Character Pipeline

Character Creation Pipeline

Thanks to my brother, Mike, for translating this from the original Japanese [here].

The hero, Snake, and nearly all of the other characters that are animated on the PS3 and appear in the game are restricted to a range of roughly 5,000 to 10,000 polygons (including the face). Also, the same resolution of polygon character is used in both gameplay and "cutscenes". This allows for seamless transitions between gameplay and cutscenes and makes it easier for the player to get emotionally involved in the reality of the game.

Furthermore, for all other characters except crowds, the same resolution of polygon character is likewise used both in game and in cutscenes. Separate from the game-resolution models used on the PS3, high-res data is modeled at the same time in order to generate a normal map. Wrinkles in clothing and other details are expressed through this normal map, created from the high-res model.

Of all the bones within a character's body, roughly 21 contain and are driven by animation data. In reality, though, a number of helper (auxiliary) bones are used to supplement motions like twisting in the knees, elbows, arms, and legs. These, however, are not driven by animation data; instead, they reference the values of the basic animation-driven joints and move in like manner.


The same method is employed on the PS3, not just in XSI; all you have to do is extract the helper bones' definition files from the XSI data and you can achieve the same kind of control on the PS3 as well. (Awesome! Rig syncing constraints and driven bones between the DCC app and the game engine.)

Since no actual motion data is stored inside the driven bones, you not only limit the data volume, but even if you need to add or delete helper bones there is no need to reconvert the motion data; you can just adjust the model data instead.
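
To make this concrete, here is a minimal sketch (in Python, purely for illustration) of a twist-style helper bone evaluated at runtime. The joint names and the 0.5 twist weight are assumptions for the example, not the actual MGS4 setup:

class Joint:
    def __init__(self, name, twist=0.0):
        self.name = name    # bone name
        self.twist = twist  # rotation about the bone's long axis, in degrees

def evaluate_helper(driver, weight=0.5):
    # The helper only references the driver's current value, so no motion
    # data is ever stored for it, and none needs to be reconverted when
    # helpers are added or removed.
    return Joint(driver.name + '_twist', driver.twist * weight)

forearm = Joint('forearm', twist=80.0)  # this value comes from animation data
helper = evaluate_helper(forearm)       # this one is derived every frame
print(helper.name, helper.twist)        # forearm_twist 40.0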

posted by Chris at 8:27 AM  

Thursday, July 31, 2008

MGS4 Facial Animation

Shockingly Realistic Facial Animation

Thanks to my brother, Mike, for translating this from the original Japanese [here].

One of the most notable things about MGS4 is its cutting-edge, world-leading facial animation. Exactly how were these true-to-life facial expressions created?

Since the Metal Gear Solid series is lip-synced for localization, voice-analysis software is employed to keep the workload manageable.

In MGS4, for example, lip-syncing for Japanese and English was done separately with different voice-analysis software. Emotions and expressions other than lip-syncing were animated by hand. In nearly all cases, the expression and phoneme elements were worked on simultaneously, reducing interference and allowing MGS4 to achieve its simultaneous worldwide release.

When doing voice analysis, it's necessary to set parameters for both expression components (i.e., anger, smile, etc.) and phoneme components (for all languages) separately. After setting this up, we need to see how it behaves as a rig. It's possible to use parameters for the rotation and movement of bones; however, the rig can become more complicated, and it can also become more difficult to predict how the bones will transform once enveloped. In other words, when facial animation is done by controlling only the bones, the designer's job becomes less intuitive, and he runs into the following two problems: 1) expressing the behavior of bones, and 2) setting parameters for phonemes.

However, with shape animation (even though it has the drawback of linear interpolation) it's extremely easy to set up parameters for all your phonemes and expressions. Most of all, it's advantageous in that the designer is able to intuitively predict the result.

For these reasons, this time our rig uses bone-driven animation based on the results of the various parameter shapes.

With this setup, using voice analysis for automated animation (not just the mouth, but automatic animation of the tongue and throat phonemes as well) and hand animation for emotions, we are able to achieve an abundance of realistic expressions.
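
To illustrate the hybrid described above, here is a minimal sketch of shapes driving bones. The bone names, shape names, and offset values are invented for illustration, and a real rig would drive full bone transforms rather than single floats:

shape_targets = {
    'phoneme_ah': {'jaw': 1.0, 'lip_corner_l': 0.1, 'lip_corner_r': 0.1},
    'smile':      {'jaw': 0.1, 'lip_corner_l': 1.0, 'lip_corner_r': 1.0},
}

def evaluate_rig(weights):
    # Blend the parameter shapes linearly (the drawback noted above), then
    # hand the result to the bones; the enveloped mesh only ever sees the
    # resulting bone transforms.
    pose = {}
    for shape, weight in weights.items():
        for bone, offset in shape_targets[shape].items():
            pose[bone] = pose.get(bone, 0.0) + weight * offset
    return pose

# Voice analysis supplies the phoneme weights; emotions are keyed by hand.
print(evaluate_rig({'phoneme_ah': 0.8, 'smile': 0.25}))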

In the following flash movie you can see how smooth, muscular expressions are achieved through superb rig setup:

Flash movie: Facial rig setup pipeline

————————————————
1. Lo-poly model driven by shape animation
2. Above that, the constrained bones
3. Polygonal mesh enveloped to the bones
4. Tangent color
5. OpenGL display (wrinkles expressed also with normal map)

————————————————

Expressions, phonemes, eyes (and eyebrows), and shader-driven wrinkle animation are all tab-selectable.
Through the combination of various parameters we can create life-like expressions like those shown above.

The most surprising thing is that we developed a tool that automatically sets up this facial rig that allows such sophisticated control. In other words, if you feed in the facial model data and run the tool, it will automatically identify the optimal positions for bones; the tool will also create controls that include the preset parameters for emotions (a smiley face, an angry face, etc.). To perform the automated facial rigging, the facial data's topology information needs to be standardized ahead of time. If you adhere to this one rule, your setup can be done automatically; all that's left is for the designer to fine-tune the controls, and you have an environment where you can get right into your facial animation.
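
As a sketch of why the standardized topology matters: if every face mesh shares the same vertex order, landmark vertices sit at known indices, and bone placement becomes a simple lookup. The indices and bone names below are invented:

LANDMARKS = {
    124: 'jaw',
    301: 'lip_corner_l',
    302: 'lip_corner_r',
}

def auto_place_bones(vertex_positions):
    # vertex_positions: list of (x, y, z) tuples in the standardized order
    return {bone: vertex_positions[index] for index, bone in LANDMARKS.items()}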

Next, a rig that controls the movement of the eyeball and surrounding muscles can also be generated automatically using this tool. Since the area around the eye, like the area surrounding the mouth, is controlled by the simultaneous use of shapes and bones, when you move the eyeball locator you get smooth muscular movement. What's more, even if you edit the shape, or redefine the configuration of the outline of the eye, it doesn't disrupt the expression of brow wrinkles or the blinking of the eye in any way.

Behind all the characters that appear in this game and appeal to the player's emotions, we have implemented this setup and animation system; through it we are able to achieve and maintain a high-quality user experience.

posted by Chris at 1:14 AM  

Tuesday, July 29, 2008

3dsMax 2008 Node Event System: Does Not Exist

After writing a bit of code to leverage the new Node Event System and then not being able to get it to work properly, I found a post by someone from Autodesk saying that it is not present in Max 2008, even though it is in the documentation. This is somewhat frustrating; I hope this post saves you time and frustration.

posted by Chris at 2:30 PM  

Monday, July 28, 2008

Gleaning Data from the 3dsMax ‘Reaction Manager’

This is something we had been discussing over at CGTalk: we couldn't find a way to figure out Reaction Manager links through MAXScript. It just is not exposed. Reaction Manager is like Set Driven Key in Maya or a Relation Constraint in MotionBuilder. In order to sync rigging components between the packages, you need to be able to query these driven relationships.

I set about doing this by checking dependencies, and it turns out it is possible. It’s a headache, but it is possible!

The problem is that even though slave nodes have controllers with names like "Float_Reactor", the master nodes have nothing that distinguishes them. I saw that if I got the dependents of a master node (its controllers, specifically the one that drives the slave), there was something called 'ReferenceTarget:Reaction_Master':

refs.dependents $.position.controller
#(Controller:Position_Rotation_Scale, ReferenceTarget:Reaction_Master, Controller:Position_Reaction, ReferenceTarget:Reaction_Set, ReferenceTarget:Reaction_Manager, ReferenceTarget:ReferenceTarget, ReferenceTarget:Scene, Controller:Position_Rotation_Scale, $Box:box02 @ [58.426544,76.195091,0.000000], $Box:box01 @ [-42.007244,70.495964,0.000000], ReferenceTarget:NodeSelection, ReferenceTarget:ReferenceTarget, ReferenceTarget:ReferenceTarget)

This is actually a class, as you can see below:

exprForMAXObject (refs.dependents $.position.controller)[2]
"<<Reaction Master instance>>"
 
getclassname (refs.dependents $.position.controller)[2]
"Reaction Master"

So now we get the dependents of this ‘Reaction Master’, and it gives us the node that it is driving:

refs.dependentNodes (refs.dependents $.position.controller)[2]
#($Box:box02 @ [58.426544,76.195091,0.000000])

So here is a fn that gets Master information from a node:

fn getAllReactionMasterRefs obj =
(
	local nodeRef
	local ctrlRef
	-- walk every subAnim on the node, checking each controller
	for n = 1 to obj.numSubs do
	(
		local ctrl = obj[n].controller
		if (ctrl != undefined) then
		(
			-- a master's driving controller shows 'Reaction_Master' in its dependents
			for item in (refs.dependents ctrl) do
			(
				if item as string == "ReferenceTarget:Reaction_Master" then
				(
					nodeRef = (refs.dependentNodes item)  -- the node(s) being driven
					ctrlRef = ctrl                        -- the controller doing the driving
				)
			)
			-- recurse into nested subAnims (e.g. position inside a PRS controller)
			getAllReactionMasterRefs obj[n]
		)
	)
	return #(nodeRef, ctrlRef)
)

Running this on the node above returns:

getAllReactionMasterRefs $
#(#($Box:box02 @ [58.426544,76.195091,0.000000]), Controller:Position_Rotation_Scale)

The first item is an array of the referenced node, and the second is the controller that is driving *some* aspect of that node.

You now loop through this node looking for 'Float_Reactor', 'Point3_Reactor', etc., and then query them as stated in the manual ('getReactionInfluence', 'getReactionFalloff', etc.) to figure out the relationship.

Here is an example that prints out all reaction data for a slave node; the reactorDumper helper is a sketch of the dump step, built on the query functions from the manual:

-- a sketch of the dump step, using the documented reaction query functions
fn reactorDumper ctrl =
(
	print ("[" + (ctrl as string) + "]")
	local count = getReactionCount ctrl
	print ("ReactionCount - " + (count as string))
	for i = 1 to count do
	(
		print ("ReactionName - " + (getReactionName ctrl i))
		print ("    ReactionFalloff - " + ((getReactionFalloff ctrl i) as string))
		print ("    ReactionInfluence - " + ((getReactionInfluence ctrl i) as string))
		print ("    ReactionStrength - " + ((getReactionStrength ctrl i) as string))
		print ("    ReactionState - " + ((getReactionState ctrl i) as string))
		print ("    ReactionValue - " + ((getReactionValue ctrl i) as string))
	)
)

fn getAllReactionControllers obj =
(
	for n = 1 to obj.numSubs do
	(
		local ctrl = obj[n].controller
		if (ctrl != undefined) then
		(
			-- dump any reaction controller found on this subAnim
			if (classof ctrl) == Float_Reactor \
			or (classof ctrl) == Point3_Reactor \
			or (classof ctrl) == Position_Reactor \
			or (classof ctrl) == Rotation_Reactor \
			or (classof ctrl) == Scale_Reactor then
			(
				reactorDumper ctrl
			)
		)
		getAllReactionControllers obj[n]
	)
)

Here is the output from ‘getAllReactionControllers $Box2‘:

[Controller:Position_Reaction]
ReactionCount - 2
ReactionName - My Reaction
    ReactionFalloff - 1.0
    ReactionInfluence - 100.0
    ReactionStrength - 1.2
    ReactionState - [51.3844,-17.2801,0]
    ReactionValue - [-40.5492,-20,0]
ReactionName - State02
    ReactionFalloff - 2.0
    ReactionInfluence - 108.665
    ReactionStrength - 1.0
    ReactionState - [65.8385,174.579,0]
    ReactionValue - [-48.2522,167.132,0]

Conclusion
So, once again, no free lunch here. You can loop through the scene looking for Masters, then derive the slave nodes, then dump their info. It shouldn't be too difficult, as you can only have one Master; but if you have multiple reaction controllers in each node affecting the other, it could be a mess. I threw this together in a few minutes just to see if it was possible, not to hand out a polished, working implementation.

posted by Chris at 4:42 PM  

Monday, July 28, 2008

Fixing Clipboard Problems in Photoshop

Over the past few years I have noticed that Photoshop often, usually after it has been left idling for a few hours or days, no longer imports the Windows clipboard.

Here is a fix if you don’t mind getting your hands dirty in the registry:

[HKEY_CURRENT_USER\Software\Adobe\Photoshop\9.0]
"AlwaysImportClipboard"=dword:00000001

The above is for Photoshop CS2; depending on your version you will have to look in a different registry location. There is also a problem when you hit a 'size limit' for an incoming clipboard image, and Photoshop dumps it. This can also be circumvented by editing the registry:

[HKEY_CURRENT_USER\Software\Adobe\Photoshop\9.0]
"MaxClipSize"=dword:00000000
posted by Chris at 10:24 AM  

Friday, July 11, 2008

Simple Perforce Animation Browser/Loader for MotionBuilder

This is a simple proof of concept showing how to implement a Perforce animation browser for MotionBuilder via Python. Clicking an FBX animation syncs it and loads it.

The script can be found here: [p4ui.py], it requires the [wx] and [p4] libraries.

Clicking directories goes down into them; clicking FBX files syncs them and loads them in MotionBuilder. This is just a test; the '[..]' entry doesn't even go up a directory. Opening an animation does not check it out. There is good documentation for the p4 Python library, so you can start there; it's pretty straightforward and easy, and it sure beats screen-scraping p4 terminal output.

You will see the following; you should replace this with the Perforce location of your animations, as it will act as the starting directory.

	path1 = 'PUT YOUR PERFORCE ANIMATION PATH HERE (EXAMPLE: //DEPOT/ANIMATION)'
	info = p4i.run("info")
	print info[0]['clientRoot']
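
For reference, here is a minimal sketch of the core sync-then-load step, assuming the p4 Python API and MotionBuilder's pyfbsdk; the depot path is a placeholder:

import P4
from pyfbsdk import FBApplication

p4i = P4.P4()  # instance it as something other than 'p4' (see below)
p4i.connect()

def sync_and_load(depot_path):
    # 'sync' returns a list of dicts; 'clientFile' is the local path on disk.
    result = p4i.run("sync", depot_path)
    FBApplication().FileOpen(str(result[0]['clientFile']))

sync_and_load('//depot/animation/walk.fbx')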

That should about do it; there are plenty of P4 tutorials out there, and my code is pretty straightforward. The only problem I ran into was what I named the instance: be sure to instance it as something other than 'p4'. When I used 'p4' it did not work; using 'p4i' it worked without incident:

p4i = P4.P4()
p4i.connect()
posted by Chris at 6:45 PM  

Powered by WordPress