
We should clone game tools to make Interaction Design tools.

Previously, I wrote about the state of Interaction Design tools. It’s pretty piss poor if you ask me. I’ve been penning down endless pages of tool ideas over the years, and I’d like to share some with you.

DISCLAIMER: I’m publishing this in a hurry. I want to start helping people make these tools, instead of just writing and talking about something theoretical. Sorry if this is somewhat scrappy.


My friend Wilson Miner’s tweet reminded me of some ideas I’d been logging.

We really should be borrowing concepts from game & animation tools, to make interfaces. In many 2d & 3d packages, we can do things like simulate a whole fucking waterfall, but in the few design tools we have, we struggle to move primitive shapes around in space. It’s a joke!

Here’s an excerpt from an email I sent around to a few bright engineer friends that encapsulates my thoughts on future tools.


The big breakthrough in thinking that I had for animation/interaction tools is: It exists already, but applied in a different context. 3D animation / simulation for film & games. In a typical game/animation studio, there’s usually a core package being used by the whole team. In most environments I was involved in, the package was Autodesk Maya.

The production flow looked something like this:

Modeling Tools

Most of these tools exist in the master viewport. You can pan, zoom & orbit around an object and perform operations on the polygon & NURBS objects. The tools exist in a toolbar and let you do things like extrude faces, chamfer edges, scale things, split poly faces, mirror, etc. All tools around virtual sculpting really.
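To make that concrete, here’s a toy sketch (plain Python, nothing like Maya’s actual internals, just the idea) of what one of those operations, extruding a face, does to the underlying polygon data:

```python
# A toy polygon mesh: a list of vertices and a list of faces
# (each face is a list of indices into the vertex list).

def extrude_face(vertices, faces, face_index, offset):
    """Extrude one face along a fixed offset vector.

    Duplicates the face's vertices, shifts the copies by `offset`,
    rebuilds the cap face from the copies, and stitches side walls
    between the old and new vertices.
    """
    face = faces[face_index]
    new_indices = []
    for vi in face:
        x, y, z = vertices[vi]
        vertices.append((x + offset[0], y + offset[1], z + offset[2]))
        new_indices.append(len(vertices) - 1)

    faces[face_index] = new_indices  # the extruded cap replaces the face

    # Side walls: one quad per edge of the original face.
    n = len(face)
    for i in range(n):
        a, b = face[i], face[(i + 1) % n]
        faces.append([a, b, new_indices[(i + 1) % n], new_indices[i]])

# A single unit quad, extruded upward.
verts = [(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)]
quads = [[0, 1, 2, 3]]
extrude_face(verts, quads, 0, (0, 1, 0))
print(len(verts), len(quads))  # 8 vertices, 5 faces
```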

Rigging Tools

Rigging tools, some native, some made just for this puppet. The Hypergraph editor.

As well as using the master viewport, riggers (who take the models and put the skeletons & handles into the puppets) use a tool called the Hypergraph, which is really just a node editor with connections.
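If a node editor sounds exotic, it isn’t. The whole idea fits in a few lines: nodes with named attributes, and connections that pipe one node’s output into another’s input. A minimal sketch (made up for illustration, not Maya’s API):

```python
class Node:
    """A node with named attributes and incoming connections."""

    def __init__(self, name, **attrs):
        self.name = name
        self.attrs = attrs   # attribute name -> static value
        self.inputs = {}     # attribute name -> (source node, source attr)

    def connect(self, attr, source, source_attr):
        self.inputs[attr] = (source, source_attr)

    def evaluate(self, attr):
        # A connection overrides the stored value, exactly like
        # wiring one box into another in the graph view.
        if attr in self.inputs:
            source, source_attr = self.inputs[attr]
            return source.evaluate(source_attr)
        return self.attrs[attr]

# A shoulder joint's rotation drives the elbow, Hypergraph-style.
shoulder = Node("shoulder", rotateZ=45.0)
elbow = Node("elbow", rotateZ=0.0)
elbow.connect("rotateZ", shoulder, "rotateZ")
print(elbow.evaluate("rotateZ"))  # 45.0, driven by the connection
```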

Animator Tools

A GUI for observing direct feedback (posing, watching playback). The Graph Editor. A regular timeline hybrid view (key ‘ticks’). The Dope Sheet (more like classic cel animation timing). Dope as hell, right?

The animators would take the puppets from the riggers & manipulate them with a number of tools. Most of the time, the work was timeline-based.
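All of those views sit on top of one idea: properties keyed over time, interpolated between the keys. Here’s a minimal sketch using linear interpolation (real packages use editable tangent curves, but the shape of the thing is the same):

```python
def evaluate_curve(keys, t):
    """Evaluate a keyframed property at time t.

    `keys` is a sorted list of (time, value) pairs. The value is held
    before the first key and after the last, and linearly interpolated
    in between. A graph editor is a UI for editing exactly this data.
    """
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            blend = (t - t0) / (t1 - t0)
            return v0 + (v1 - v0) * blend

# An arm swinging from 0 to 90 degrees, then settling back to 60.
rotation_keys = [(0, 0.0), (12, 90.0), (24, 60.0)]
print(evaluate_curve(rotation_keys, 6))   # 45.0, halfway up
print(evaluate_curve(rotation_keys, 18))  # 75.0, halfway back down
```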

Dynamics/simulation/FX

Particle simulation

This pass is done by artists with more of an engineering+art background. They use a combo of the GUI, prebuilt physics behaviors & code to make stuff like fire, water, cloth, and springs work. In some editions of Maya, you can interact with the simulation live as it plays back. This is common for blocking out simulations without having to dial in numbers, and for testing simulated animation which might need to always be dynamic (game animation, for example!). Some of this is driven by keyframe animation, some of it modifies keyframe animation, and some of it is completely dynamic!
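Here’s a tiny sketch of that last idea: a simulated value chasing an authored target, so the motion is part keyframed, part dynamic. The spring constants are made up, but the loop is the whole trick:

```python
def simulate_spring(target_at, frames, stiffness=0.2, damping=0.75):
    """A damped spring chasing a target value over time.

    The target can come from keyframes (the authored part); the spring
    adds the overshoot and settle nobody had to hand-animate. Fully
    dynamic motion is this same loop with no authored target at all.
    """
    position, velocity = 0.0, 0.0
    samples = []
    for frame in range(frames):
        velocity += (target_at(frame) - position) * stiffness
        velocity *= damping  # bleed off energy so it settles
        position += velocity
        samples.append(position)
    return samples

# A keyframe-ish target: jump to 90 at frame 5. The spring overshoots
# past 90 and then settles, giving follow-through for free.
values = simulate_spring(lambda f: 90.0 if f >= 5 else 0.0, 25)
for frame, value in enumerate(values):
    print(frame, round(value, 1))
```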

Lighting/Texturing

There are usually more steps, but for simplicity’s sake: we also have a node editor JUST for lighting & creating textures. It’s ‘procedural’.

MEL script. On a Windows machine. Weird.

So the interesting thing about all of this is that all of these operations write to an ASCII file. In Maya, it’s called a Maya ASCII file. Inside the file is a bunch of MEL commands. If you’re familiar with MEL, you can write a whole scene with it.

All of the tools I mentioned above create meaningful lines of code, which can be operated on because the file format is very open. At the end of the day, a rigger, a lighter, an animator, and an engineer are all using tools built for their specific tasks, but they share a common format. This is extremely helpful, because a file will be pushed up and down the pipeline between departments. Sometimes animators will adjust rig properties. Sometimes engineers will need to tune animation curves. Changing tools is a simple matter of picking which ‘workspace’ you’d like to use. You can even create your own specialized workspace, and most people do!
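To see why an open text format matters so much, here’s a sketch of the idea in Python. The command format below is made up (loosely shaped like a Maya ASCII file, not the real thing), but the point stands: every tool, whatever its UI, is just reading and writing lines like these.

```python
# A made-up plain-text scene format: the scene is an ordered list of
# commands, so any tool (or text editor, or diff tool) can work on it.
scene = """
createNode transform -n "button";
setAttr "button.translateY" 0;
setKeyframe "button.translateY" -t 0  -v 0;
setKeyframe "button.translateY" -t 12 -v 90;
connectAttr "button.translateY" "label.translateY";
""".strip()

def parse(scene_text):
    """Split the scene into (command, argument-list) pairs."""
    commands = []
    for line in scene_text.splitlines():
        line = line.strip().rstrip(";")
        if line:
            parts = line.split()
            commands.append((parts[0], parts[1:]))
    return commands

# A rigging tool, an animation tool, and a hand-written script would
# all round-trip through the same parse/serialize step.
for command, args in parse(scene):
    print(command, args)
```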

So how could this work for interface animation / interaction design?

A good tool for interfaces would borrow a lot of this thinking, but have a much narrower set of tools. I can imagine simple tools for working with layout, tools for putting down animation curves, dynamic connections between objects, simulation, etc. It seems like a lot, because the thing that inspired the idea is huge. I think it could be stripped back to 5% of what something like Maya is. After all, the animation problems being solved in interfaces are much wimpier than, say, simulating a waterfall splashing on rocks, underwater hair, or even a fist deforming a face’s vertices as a result of a dynamic punch.
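As a guess at how small that could be, here’s a sketch of a core data model for such a tool: elements with properties, where each property can be set directly, keyed over time, connected to another property, or handed to a simulation pass. Everything above hangs off a structure about this simple:

```python
from dataclasses import dataclass, field

@dataclass
class Property:
    value: float = 0.0                        # set directly (layout tools)
    keys: list = field(default_factory=list)  # authored (time, value) keys
    driver: object = None                     # connection to another Property
    simulated: bool = False                   # handed to a physics pass

@dataclass
class Element:
    name: str
    properties: dict = field(default_factory=dict)

# A button whose y-position is keyframed, and a shadow connected to
# follow it: the whole Maya pipeline idea, at 5% scale.
button = Element("button", {"y": Property(keys=[(0, 0.0), (12, 90.0)])})
shadow = Element("shadow", {"y": Property(driver=button.properties["y"])})
```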

Probably one of the most raw ideas I have, but I wanted to share it with you in detail first. I want to more formally collect it all in an article. I’d love to hear your thoughts.


I didn’t really formalize it like I would have liked to, because it was too large an endeavor for me to wrap my head around… but that’s not the point. The point is that we have so many clues & so many things we can borrow. Traditional software design tools need to catch up to game & animation tools.

More thoughts to come.
