Stumbling Toward 'Awesomeness'

A Technical Art Blog

Saturday, December 1, 2018

The Asset Registry: Finding and Iterating Through Assets with UE4 Python

One thing that you will want to do right away is iterate through a bank of existing assets or find assets in a build. In UE4, your main window into the ‘content browser’ is the ‘asset registry’. You can use it to find all kinds of assets, iterate through assets in a folder, etc.

Let’s go ahead and instance it to take a look. Now would be a good time to open the unreal.AssetRegistryHelpers UE4 Python API docs in another tab! Also, I am running this in the UE4 4.21 release, with the free Paragon Marketplace assets.

Walking Assets In A Directory

Let’s ask it for all the assets in a certain path.

asset_reg = unreal.AssetRegistryHelpers.get_asset_registry()
assets = asset_reg.get_assets_by_path('/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes')

The method to get the asset registry returned an unreal.AssetRegistry class. If you look at this class, you can see some really useful calls, like get_assets_by_path, which I used on the next line.

Let’s take a look at the assets:

for asset in assets: print asset

This yields:

LogPython: <Struct 'AssetData' (0x000001ADF8564560) {object_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh_Skeleton.Morigesh_Skeleton", package_name: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh_Skeleton", package_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes", asset_name: "Morigesh_Skeleton", asset_class: "Skeleton"}>
LogPython: <Struct 'AssetData' (0x000001ADF8566A90) {object_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Orion_Proto_Retarget.Orion_Proto_Retarget", package_name: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Orion_Proto_Retarget", package_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes", asset_name: "Orion_Proto_Retarget", asset_class: "Rig"}>
LogPython: <Struct 'AssetData' (0x000001ADF8564560) {object_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh_Cyl_Shadows.Morigesh_Cyl_Shadows", package_name: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh_Cyl_Shadows", package_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes", asset_name: "Morigesh_Cyl_Shadows", asset_class: "PhysicsAsset"}>
LogPython: <Struct 'AssetData' (0x000001ADF8566A90) {object_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh_Physics.Morigesh_Physics", package_name: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh_Physics", package_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes", asset_name: "Morigesh_Physics", asset_class: "PhysicsAsset"}>
LogPython: <Struct 'AssetData' (0x000001ADF85654B0) {object_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh.Morigesh", package_name: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh", package_path: "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes", asset_name: "Morigesh", asset_class: "SkeletalMesh"}>

It has returned Python objects of the unreal.AssetData type. This class has a lot of things we can query, like class type, name, full path, etc. Let’s print the class name for each:

for asset in assets:
    print asset.asset_class

Let’s only look at skeletal meshes, then do something to them. In order to manipulate them, we need to load them. First, look at what the get_full_name method returns:

for asset in assets:
    #you could check isinstance against unreal.SkeletalMesh, but let's build on what we learned
    if asset.asset_class == 'SkeletalMesh':
        print asset.get_full_name()
#>SkeletalMesh'/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh.Morigesh'

We need to split that output, then load the asset:

for asset in assets:
    if asset.asset_class == 'SkeletalMesh':
        full_name = asset.get_full_name()
        path = full_name.split(' ')[-1]
        skelmesh = unreal.load_asset(path)

Now this returned an unreal.SkeletalMesh class, and we can ask it for its skeleton:

skeleton = skelmesh.skeleton
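As an aside, the AssetData fields printed in the log output above are all derivable from the object path. A pure-Python sketch of that relationship (no editor required), using one of the paths from the output:

```python
# Split an Unreal object path into the same fields the AssetData struct
# reports: package_name, asset_name, and package_path.
object_path = "/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh.Morigesh"

package_name, asset_name = object_path.rsplit(".", 1)
package_path = package_name.rsplit("/", 1)[0]

print(package_name)  # /Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh
print(asset_name)    # Morigesh
print(package_path)  # /Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes
```

This is why object_path is often the most convenient field to store: the others can be recovered from it.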

Finding Assets

Let’s say someone gives you a list of problematic assets, but they’re not long paths, just asset names! You want to be able to find the long path for all assets in the list so that you can do something with them. The AssetRegistry can help!

Let’s build a dictionary of all assets in the build; the keys will be the short names, and the values will be the long paths:

def get_asset_dict(asset_type=None):
    asset_list = None
    if asset_type:
        asset_list = unreal.AssetRegistryHelpers.get_asset_registry().get_assets_by_class(asset_type)
    else:
        asset_list = unreal.AssetRegistryHelpers.get_asset_registry().get_all_assets()
    asset_dict = {}
    for asset in asset_list:
        asset_name = str(asset.asset_name)
        obj_path = asset.object_path
        if asset_name not in asset_dict:
            asset_dict[asset_name] = [str(obj_path)]
        else:
            asset_dict[asset_name].append(str(obj_path))
 
    return asset_dict

This takes a second or two to build, but you now have an index of all assets by short name that you can use to query their full paths. It’s a bit faster if you query only assets of a type you know you’re looking for. You will also know when there is more than one asset with a given name, because its list will have multiple entries. (That’s why we store what we find in a list: there could be multiple assets with the same name.)
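With that dictionary in hand, resolving a list of short names is a simple lookup. A sketch with hypothetical data (the asset names and paths below are made up, but the dict is shaped like what get_asset_dict returns):

```python
# Hypothetical asset_dict, shaped like the output of get_asset_dict above:
# short name -> list of full object paths (a list, since names can collide).
asset_dict = {
    "Morigesh": ["/Game/ParagonMorigesh/Characters/Heroes/Morigesh/Meshes/Morigesh.Morigesh"],
    "wind_noise": ["/Game/Sounds/wind_noise.wind_noise",
                   "/Game/Levels/Forest/wind_noise.wind_noise"],
}

problem_assets = ["Morigesh", "wind_noise", "not_in_build"]
for name in problem_assets:
    paths = asset_dict.get(name, [])
    if not paths:
        print(name + ": not found in build!")
    elif len(paths) > 1:
        print(name + ": ambiguous, " + str(len(paths)) + " assets share this name")
    else:
        print(name + ": " + paths[0])
```

The ambiguous case is exactly why the values are lists: you can then decide which of the colliding assets the report actually meant.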

posted by Chris at 10:12 PM  

Friday, March 16, 2018

GitHub and converting tools to Maya 2017

Hey guys, I have multiple tools on github and have never converted them to Maya 2017 / Pyside.  I see many of you have forked them to add that support.

What’s the best way to deal with this? Should I just update the tools to only work with PySide2? Is there an option through github to branch my own tool, or let people know that for older Maya support they need to grab up to a certain changelist?

posted by Chris at 6:29 PM  

Saturday, August 24, 2013

Ryse at the Anaheim Autodesk User Event

I have been working on Ryse for almost two years now; it’s one of the most amazing projects I have had the chance to work on. The team we have assembled is just amazing, and it’s great to be in the position to show people what games can look like on next-gen hardware. Autodesk asked us to come out to Anaheim and talk about some of the pipeline work we have been doing, and it’s great to finally be able to share some of this stuff.

A lot of people have been asking about the fidelity, like ‘where are all those polygons?’ If you look at the video, you will see that the regular Romans actually have leather ties modeled that deform with the movement of the plates, and something that might never be noticed: deforming leather straps underneath the plates, modeled/rigged to hold together every piece of Lorica Segmentata armor, and underneath that: a red tunic! Ryse is a labor of love!

We’re all working pretty hard right now, but it’s the kind of ‘pixel fucking’ that makes great art; we’re really polishing, and having a blast. We hope the characters and world we have created knock your socks off in November.

posted by Chris at 11:16 PM  

Monday, February 11, 2013

Object Oriented Python in Maya Pt. 1

I have written many tools at different companies; I taught myself and don’t have a CS degree. I enjoy shot-sculpting, skinning, and have been known to tweak parameters of on-screen visuals for hours. I don’t consider myself a ‘coder’; I still can’t allocate my own memory. I feel I haven’t really used OOP from an architecture standpoint, so I bought an OOP book and set out on a journey of self improvement.

‘OOP’ In Maya

In Maya, you often use DG nodes as ‘objects’. At Crytek we have our own modular nodes that create meta-frameworks encapsulating the character pipeline at multiple levels (characters, characterParts, and rigParts). Without knowing it, we were using Object Oriented Analysis when designing our frameworks, and even had some charts that look quite a bit like UML. DG node networks are often connected with message attributes; this is akin to a pointer to the object in memory, whereas a Python ‘object’, I felt, could always easily lose its mapping to the scene.

It is now possible with the OpenMaya API to store a pointer to the DG node in memory and just request the full DAG path any time you want it; also, PyMEL objects are Python classes that stay linked to the DG node even when its string name changes.

“John is 47 Years Old and 6 Feet Tall”

Classes always seemed great for times when I had a bunch of data objects; the classic uses are books in a library, or customers: John is 47 years old and likes the color purple. Awesome. However, in Maya, all our data is in nodes already, and those nodes have attributes that serialize into a Maya file when I save: so I never really felt the need to use classes.

Although, all this ‘getting’, ‘setting’ and ‘listing’ really grows tiresome, even when you have custom methods to do it fairly easily.

It was difficult to find any really useful examples of OOP with classes in Maya. Most of our code is for ‘constructing’: building a rig, building a window, etc. Code that runs in a linear fashion and does ‘stuff’. There’s no huge architecture, the architecture is Maya itself.

Class Warfare

I wanted to package my information in classes and pass that back and forth in a more elegant way, at all times, not just while constructing things. So for classes to be useful to me, I needed them to exist synchronously with DG nodes.

I also didn’t want to have to get and set the information when syncing the classes with DG nodes, that kind of defeats the purpose of Python classes IMO.

Any time I opened a tool I would ‘wrap’ DG nodes in classes that harnessed the power of Python and OOP. To do this meant diving into more of the deep end, but since that was what was useful to me, that’s what I want to talk about here.

To demonstrate, let’s construct this example:

#the setup
import maya.cmds as cmds

loc = cmds.spaceLocator()[0]
cons = [cmds.circle()[0], cmds.circle()[0]]
meshes = [cmds.sphere()[0], cmds.sphere()[0], cmds.sphere()[0]]
cmds.addAttr(loc, sn='controllers', at='message')
for con in cons:
    cmds.addAttr(con, sn='rigging', at='message')
    cmds.connectAttr(loc + '.controllers', con + '.rigging')
cmds.addAttr(loc, sn='rendermeshes', at='message')
for mesh in meshes:
    cmds.addAttr(mesh, sn='rendermesh', at='message')
    cmds.connectAttr(loc + '.rendermeshes', mesh + '.rendermesh')

So now we have this little node network:


Now, let’s wrap this network in a class. We are going to use @property to give us the functionality of an attribute, but it’s really a method that runs to return a value (from the DG node) when the ‘attribute’ is queried. I believe using properties is key to harnessing the power of classes in Maya.

class GameThing(object):
	def __init__(self, node):
		self.node = node
 
	#controllers
	@property
	def controllers(self):
		return cmds.listConnections(self.node + ".controllers")

So now we can query the ‘controllers’ attribute/property, and it returns our controllers:

test = GameThing(loc)
print test.controllers
##>>[u'nurbsCircle2', u'nurbsCircle1']

Next up, we add a setter, which runs code when you set a property ‘attribute’:

class GameThing(object):
	def __init__(self, node):
		self.node = node
 
	#controllers
	@property
	def controllers(self):
		return cmds.listConnections(self.node + ".controllers")
 
	@controllers.setter
	def controllers(self, cons):
		#disconnect existing controller connections (listConnections returns None when there are none)
		for con in cmds.listConnections(self.node + '.controllers') or []:
			cmds.disconnectAttr(self.node + '.controllers', con + '.rigging')
 
		for con in cons:
			if cmds.objExists(con):
				if not cmds.attributeQuery('rigging', n=con, ex=1):
					cmds.addAttr(con, longName='rigging', attributeType='message', s=1)
				cmds.connectAttr((self.node + '.controllers'), (con + '.rigging'), f=1)
			else:
				cmds.error(con + ' does not exist!')

So now when we set the ‘controllers’ attribute/property, it runs a method that blows away all current message connections and adds new ones connecting your cons:

test = GameThing(loc)
print test.controllers
##>>[u'nurbsCircle2', u'nurbsCircle1']
test.controllers = [cmds.nurbsSquare()[0]]
print test.controllers
##>>[u'nurbsSquare1']

To me, something like properties makes classes infinitely more useful in Maya. For a short time at Crytek we tried to engineer a DG node that, when an attr changed, could eval a string with a similar name on the node. This is essentially what a property can do, and it’s pretty powerful. Take a moment to look through the code of some of the real ‘heavy lifters’ in the field, like zooToolBox, and you’ll see @property all over the place.
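Outside of Maya, that run-code-on-query/run-code-on-set behavior looks like this minimal pure-Python sketch (a hypothetical class, with a plain list standing in for the DG node’s message connections):

```python
class GameThing(object):
    """Sketch of the property pattern without Maya: a plain list stands in
    for the DG node's message connections."""
    def __init__(self):
        self._connections = []

    @property
    def controllers(self):
        # runs every time the 'attribute' is queried
        return list(self._connections)

    @controllers.setter
    def controllers(self, cons):
        # runs every time the 'attribute' is assigned: validate, then replace
        for con in cons:
            if not isinstance(con, str):
                raise ValueError(str(con) + ' is not a node name!')
        self._connections = list(cons)

thing = GameThing()
thing.controllers = ['nurbsCircle1', 'nurbsCircle2']
print(thing.controllers)   # ['nurbsCircle1', 'nurbsCircle2']
```

The caller just reads and assigns an attribute; all the validation and bookkeeping happens behind the property, which is exactly what the Maya version does with listConnections and connectAttr.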

I hope you found this as useful as I would have.

posted by Chris at 1:03 AM  

Thursday, July 12, 2012

Not Dead Yet


I have been really busy on Ryse; this past weekend I found some time to wrap the XNA import methods I had written in a UI. I will post it soon in an un-padded form for the people asking for it.

For those who don’t know what I am referring to, a while back I wrote some python to import XNA character files (from retail discs) into Maya as textured characters with original joint names, skinning, etc. I hit some snags on the UV, texturing, and then viewport 2.0 stuff. It’s really great to see topology, bind pose, weighting, joint layout, etc.. of your favorite characters. Great reference!

I would also like to make a post about viewport 2.0 in the next week or so, that whole system is such a complete piece of frustrating garbage, hopefully you can benefit from my aimless bumping into walls in the darkness.  Anyway, gotta start ramping up for SIGGRAPH, so that might have to wait.

posted by admin at 1:36 AM  

Saturday, September 25, 2010

Perforce Triggers in Python (Pt 2)

Last time I introduced you to the idea of triggers; here’s a more complex example. This worked on my db, but if you have branching you would need to check each returned file against the branch you are submitting to.

Check if The File is Already in the Depot

This is a trigger that checks the hash digest of the incoming file against that of the server. This way you can see if the user is checking in a file that already exists.

import sys
from P4 import P4, P4Exception

p4 = P4()
try:
	p4.user = "admin"
	p4.password = "admin"
	p4.port = "1666"
	p4.connect()
	changelist = sys.argv[2]
	stat = p4.run('fstat', ('-Ol', '//depot/...@' + str(changelist)))
	digest = stat[0]['digest']
	fname = stat[0]['depotFile']
	m = p4.run('fstat', ('-Ol', '-F', ('digest = ' + str(digest)), '//depot/...'))
	existing = []
	for f in m:
		if f['depotFile'] != fname:
			existing.append(f)

	if existing:
		print '\n\nFILE EXISTS IN DEPOT!!'
		print 'YOUR FILE:  ' + (fname.split('/')[-1])
		for f in existing:
			print 'EXACTLY MATCHES:  ' + f['depotFile']
		print 'P4 DIGEST:  ' + digest
		print 'SOLUTION: Contact your lead if you believe this message was generated in error.'
		sys.exit(1)

except Exception, e:
	print "Error: %s" % (e)
	sys.exit(1)

p4.disconnect()
sys.exit(0)

Then your trigger line looks like this:

Triggers:
	dupeCheck change-submit //depot/... "python X:/projects/2010/p4/dupe_trigger.py %user% %changelist%"
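Under the hood, Perforce’s digest is just an MD5 of the file contents, so the core duplicate check can be sketched in pure Python with hashlib, no server required (the depot paths and file contents below are hypothetical):

```python
import hashlib

# Hypothetical depot state: depot path -> MD5 digest of the file contents,
# uppercase hex, the same form P4 fstat reports.
depot_digests = {
    '//depot/textures/rock_diffuse.tga': hashlib.md5(b'rock pixels').hexdigest().upper(),
    '//depot/textures/rock_normal.tga': hashlib.md5(b'normal pixels').hexdigest().upper(),
}

def find_duplicates(incoming_path, incoming_bytes):
    """Return depot files whose digest matches the incoming file's digest."""
    digest = hashlib.md5(incoming_bytes).hexdigest().upper()
    return [path for path, d in depot_digests.items()
            if d == digest and path != incoming_path]

# A user submits the same rock texture under a new name:
dupes = find_duplicates('//depot/textures/rock_copy.tga', b'rock pixels')
print(dupes)   # ['//depot/textures/rock_diffuse.tga']
```

The real trigger does the same comparison, but lets the server compute the digests via fstat’s -F digest filter instead of hashing anything itself.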

This is what the user will see when they try to check in:

posted by admin at 7:39 PM  

Sunday, December 27, 2009

Update

I haven’t posted in a while, lots of changes going on, I left ILM after Avatar, and have moved back to Germany where my girlfriend is finishing medical school. I promise a good tech art post soon (my pick for tech art game of the year!) I look to be rejoining Crytek next year, working on Crysis 2.

posted by admin at 2:34 AM  

Sunday, December 27, 2009

Decode the Hype: HP DreamScreen 130 Review

FAIL.

Decode the Hype

To digital artists, photo frames might look like attractive ways to showcase art and content, and these devices are being pushed more and more. I got HP’s ‘flagship’ model as a present; it retails for $300! I was so excited, but not for long. Unable to find any info online, especially reviews, I thought I would post this here.

Let’s first get some things out of the way. Before I talk about the quality of what the device DOES do, let’s talk about what it does not do, yet claims to.

Downright Lies

The following quotes are from the HP site itself:

The HP DreamScreen is a gateway to the Internet using your wireless network to access
weather info, Snapfish and your favorite web destinations.

This is just untrue. There is no integrated web browser. It has three web ‘apps’ on it: SnapFish, Pandora, Facebook. That’s it. It does not read RSS feeds, or do much of anything you probably want it to do, simple things like display news or recipes.

Stay current with social network sites like Facebook

‘Like’ Facebook? There is only Facebook: nothing else.

Be organized with a built-in alarm clock & calendar.

This is laughable. Wondering how to sync the calendar with Outlook or Google, or maybe even just add appointments, I finally consulted their online documentation. Here, seriously, is the entire feature list for the calendar ‘app’:

View the current month, press right or left to view the next or previous month.

BWAH HA HA HA… *sigh*

Easy wireless access to your digital entertainment

It shows an icon for video, but it doesn’t actually stream video; it plays some videos, only at specific resolutions from specific codecs, off physical media.

Touch-enabled controls—Get fast, easy access to information and entertainment with simple touch controls embedded in the display

This is referring to some buttons around the bezel of the screen, and is just so untrue that they would have to change the marketing campaign in Europe or get sued. It does, however, remind me of the old In Living Color sketch where the handicapped superhero always says he is ‘not handicapped, but HANDY-CAPABLE!’.

Videos—Watch home movies and video clips in full screen – Its simple!

It’s as simple as taking your video, recompressing it to a supported video codec, resizing it to a specific resolution, and then physically transferring it to the device – so simple grandma could do it! (with Gordian Knot, VirtualDub, CCCP, and all those other video tools she has)

Decode the Hype: The Screen

Resolution

The thing is a frickin’ 300 dollar photo frame, but its resolution is 800×480, which equates to 0.38 megapixels. At the time the frame came out, the average cheap point-n-shoot shot 9 to 10 megapixels: well over twenty times the resolution of the screen!

Because of this, it can take 10 full seconds to load a photo and downsample it to 800 pixels from its original resolution. This makes browsing photos a pain, and loading photos from your camera cards nearly useless. Power users will use Photoshop or XnView to batch all their content to 800 pixels.

There is aliasing galore, as 800 pixels is the resolution of many phones and handheld devices, not 13″ photo frames!

UPDATE: I have talked to HP and done some hunting, uncovering something that is just ridiculous: the DreamScreen 130 has a 1280×800 resolution. However, HP’s software only works at 800×480, the resolution of the cheapest screen (the 100). To get around this, they upscale to 1280 pixels. This means they downsample your image to 800 pixels, then upscale it with a software upscaler, so your pictures will ALWAYS LOOK JAGGY AND SOFT: NO MATTER WHAT YOU DO. This is a joke; HP should be ashamed of themselves.

Notice the terrible artifacts from the 1280 image, which was downscaled by the frame software, then upscaled to fit the panel.

Color Reproduction

It is a cheap TN panel; the gamma of your images fluctuates widely depending on the angle at which they are viewed. I would be OK if it had low resolution but used a nice IPS, S-IPS, or OLED panel, but this is just unremarkable. The black point is a dark shade of grey; in all seriousness, the panel quality seems on par with the panels used in the dashboard of a Prius, or other industrial UI readouts.

Pretty terrible banding

Pretty bad black point

Pretty bad white point

Pretty mediocre contrast

Decode the Hype: Misc Tech Tidbits

Streaming / Network

Streaming requires lots of Microsoft Windows Media software and services running on a PC server in your house that is always on, they relied on this instead of doing the footwork themselves. If you were under the impression from their marketing that it could read files off samba shares or work with Macintosh, you would be wrong.

Software / User Interface

The software is pretty terrible. It is very clunky and unresponsive. Many times it does not recognize that physical media has been inserted and must be rebooted. The UI graphics themselves show terrible compression artifacts.


When you bring up the on-screen keyboard to type in, say, the name of the device, it clearly shows buttons like [HTTP://], [www.], and [.com] to make it easier to browse the web; however, there is no web browser! There are other places in the print ads and the UI itself that refer to features the device just does not have!

“Touch Screen”

The device claims to have a ‘touch sensitive screen’, and IT DOES! A small area around the bezel of the screen has buttons that can be pressed/touched! This product is in NO WAY a touch-screen device, and has no touch sensitivity other than the buttons on the bezel; the marketing is a lie.

Open Source?

On the CD that ships with it, they have a ton of readme files showing they used a lot of GPL’d code; however, the source installer did not work on my Windows 7 x64.

Conclusion

Pros:

  • They used Linux and GPL’d code so they will have to release theirs soon, hopefully it will be taken under the wing of the open source community and all these issues can be fixed by hard working college students and kids in their spare time.
  • The packaging/box is very high quality with a great look and feel

Cons:

  • The screen is low res and low quality
  • The device is way overpriced for the quality of its screen and software
  • The docs and UI refer to features that just do not exist
  • No battery, it must always remain plugged into the wall
  • Super-glossy, all you may be seeing is windows!
  • Software-wise, the average cellphone is vastly superior in extensibility and quality (browsing photos, playing mp3s, videos…)
  • The UI looks like a rip of cell phone UIs, but only in pictures… There are no smooth animated transitions, nothing in common with the user interfaces they seemed to want to copy. To an experienced person, the UI feels like something HP outsourced to Asia and sent them a poor art-bible of the end product they were expecting…
  • The device seems unfinished
posted by admin at 2:33 AM  

Monday, February 2, 2009

First Transformers 2 Teaser Trailer!

It’s exciting that you can see some of our work already! Check out the teaser trailer, be sure to click [watch in high quality]

posted by Chris at 10:11 AM  

Friday, October 24, 2008

Autodesk Acquires Softimage for 35 Million

Really? Wow, I mean this isn’t as surprising as when they bought Alias 3 years ago (182 Million), but still. And 35 million? That’s the price of a single movie or three year videogame production these days. I thought the ‘desk bought Maya to kill it, but it’s still around… Wonder how long XSI will be around now.

http://usa.autodesk.com/adsk/servlet/item?id=12022457&siteID=123112

Maybe they will merge all three teams into one highly experienced, ‘all-star’ development team to make a new 3d app to end all 3d apps.

Aren’t there laws about these kinds of monopolies? Looks like the Lightwave and C4D guys are your only alternatives..

One less booth at SIGGRAPH..

posted by Chris at 12:50 PM  

Friday, August 1, 2008

MGS4 Cluster Constraint Setup

From Ideas to Reality with XSI’s Cluster Constraints

Thanks to my brother, Mike, for translating this from the original Japanese [here]

When asked about which features of XSI helped the most on this project, Hideki Sasaki (Facial Animation Set Up Lead) came back with the rather surprising answer, “There were many but in regards to facial animation, cluster constraints really saved us.” In our facial rig setups, every point-cluster of your target shape is tied to bones using cluster constraints. Cluster constraints also were extremely useful in the following situations:

Since in MGS4 we were really trying to lighten the processing load, on the PS3 we employed a method where tangent colors change only with the rotation of bones. In other words, if simply constrained to coordinates, it will behave correctly in animation, but the tangent color will not change. Basically, you run into a dilemma where shading goes from its default state to one where it will no longer change. However, by using cluster constraints to constrain both normals and tangents, the correct rotation values will be input, and that’s how we accomplished the shading.

(this sounds really interesting; I guess they are talking about smoothing-angle tangents? In many engines, like CryEngine, the smoothing angle is based on the character’s default pose at export and never changes. This makes it sound like they exported cluster data to ‘drive’ the smoothing angle in realtime)

Furthermore, nearly all fluctuating objects attached to the character’s clothing, in cutscenes and gameplay, are done by the PS3’s simulation engine. That being said, there are some cases in cutscenes with intense action where it’s difficult to simulate. In those cases we use animation simulated in XSI’s Syflex. The basic workflow in those situations is as follows:

1. To express fluctuations in the clothing, make a simulation in Syflex

2. Convert the cached simulation results into shape targets

3. Constrain bones to the points on your shape controlled object with cluster constraints

4. Bone envelope the final model to be used on the PS3 (Basically the same idea as a facial rig)

(Baking arbitrary data to bones ftw!)

The advantage of using this type of control is, even if you temporarily get a little caving in or some kind of flaw in the simulation result, you are able to apply corrections with “Secondary Shape Mode” at stage 2 of the workflow.

It’s possible to edit the shape’s geometry using vertex shift; you can also use smooth and push to fix little imperfections if needed. It goes without saying that the results of these intuitive adjustments will be reflected in the envelope control’s PS3 data as well.

Sasaki explains, “You can set cluster constraints for all components: vertex, polygon and edge. I believe XSI is the only one that comes standard with support for constraining both normals and tangents. Without the help of these cluster constraint functions we could never have accomplished techniques like cloth-simulation transfer to bones, or our ideas concerning facial rig setup.”

(they export/sync cluster rig element data to engine)

posted by Chris at 12:54 PM  

Sunday, June 22, 2008

3D Models not Subject to Copyright

I saw this over at slashdot:

“The US Court of Appeals for the Tenth Circuit has affirmed (PDF) a ruling that a plain, unadorned wireframe model of a Toyota vehicle is not a creative expression protected under copyright law. The court analogized the wire-frame models to photographs: the owner of an object does not have a copyright in all images of the object, but a photographer may have a limited copyright over a particular image based on artistic choices such as costumery, lighting, posing, etc. Thus, the modelers could only copyright any ‘incremental contribution’ they made to Toyota’s vehicles; in the case of plain models, there was nothing new to protect. This could be a two-edged sword — companies that produce goods may not be able to stop modelers from imaging those products, but modelers may not be able to prevent others from copying their work.”

This will have some interesting ramifications. And I don’t just mean for the Limbo of the Lost guys. (j/k)

posted by Chris at 11:09 PM  

Tuesday, June 17, 2008

RIP Stan Winston

One of my heroes passed away today. I never knew the guy but it made me very sad and hollow to hear he had passed. He was responsible for many of the creatures in films that made me eventually want to be a Technical Director.

posted by Chris at 12:00 AM  

Sunday, April 27, 2008

First Post!

I am carving out a little space for my thoughts and musings related to the field I am in. Nothing here will be related to the work I am doing at Crytek, so if you are here for industry secrets; you’ve come to the wrong place. I will however talk about industry trends, art, games, graphics, characters, rigging, sex, math, etc.

I work in an industry where everyone wants everything to look amazing, play amazing, sound amazing, etc… To be on the cutting edge in an industry like this, you are often breaking new ground, so in a way: ‘stumbling toward awesomeness’.

posted by Chris at 1:13 PM  

Powered by WordPress