
[Closed] Detecting nodes with motion?

Hello,
MaxScript has the isAnimated property for nodes, which tells whether the node (or any of its transform tracks) holds keyframes.

However, this property does not account for every kind of motion a node can have. For example, if you place a LookAt constraint on a static node and aim it at an animated node, the static node's rotation will be animated indirectly, yet its node.rotation.isAnimated property will still hold 'false'.
The same happens when you apply a Noise controller to any of the node's transforms (position, rotation or scale): the transform moves, but its 'isAnimated' property still holds 'false'.
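A quick way to reproduce the LookAt case in the listener; this is a minimal sketch, and the object names are just for illustration:

```maxscript
-- a moving sphere and a static box constrained to look at it
tgt = sphere name:"flyingSphere"
animate on (at time 50 tgt.pos = [100, 0, 0])

watcher = box name:"watcher" pos:[0, 50, 0]
watcher.rotation.controller = LookAt_Constraint()
watcher.rotation.controller.appendTarget tgt 100.0

-- both report false, yet the box turns as the sphere moves
format "node: %  rotation: %\n" watcher.isAnimated watcher.rotation.isAnimated
```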

Keyframed animation is not the problem: for that I can just use the isAnimated property.
But is there an elegant way to detect non-keyframed animation/motion?
I want to avoid a huge case statement that walks the node's controllers and checks, per controller kind, whether each one is actually configured to generate motion, e.g. a LookAt controller having a target (and that target being animated in its position), an Audio controller having an audio file assigned, and so on.

Thanks.

10 Replies
 lo1

I guess the brute-force way would be to iterate over all frames and check whether the node's .transform property differs in any frame from what it was at the first frame.

 lo1

and in code:

fn isTransformAnimated obj =
(
	local motionDetected = false
	-- inverse of the transform at the first frame of the range
	local origTM = at time animationRange.start.frame (inverse obj.transform)
	-- stop looping as soon as any frame differs from the first
	for i = animationRange.start.frame to animationRange.end.frame while not motionDetected do
	(
		at time i if not isIdentity (obj.transform * origTM) do motionDetected = true
	)
	motionDetected
)

you’ll notice you can’t just compare the transform at frame i with the original transform, because floating-point inaccuracy in matrix3 calculations makes that comparison unreliable.
Instead, you must multiply the current transform by the inverse of the original transform and check whether the result is the identity matrix ([1,0,0] [0,1,0] [0,0,1] [0,0,0]).

Surprisingly, this is not such a slow function: for 600 objects over 100 frames it took me about 100 ms.

Hello lo,
thanks a lot for your input. I was afraid to use a method like this in case someone is dealing with large game cutscenes with lots of frames (say, 7k), but since the slightest change already qualifies the node as ‘motionDetected’ and exits the loop, it can be quite fast in many cases.
Even if the transformation only happens on the last frame (the worst case), the cost is worth it because this method provides fail-proof detection: it compares the actual end result of the transformations, so even custom controllers (or scripted movements) would still be detected.

Thanks again

 lo1

glad to help
yeah, you’re right; the function will be slowest when no animation takes place at all.

As I understand it, you need to know which nodes are really animated for a later export. It’s a very common task. Usually the problem is solved by tagging (marking) animated nodes. Searching all animated objects (transform changed over the time range) at export time is slow, but you can do it off-line and mark all animated nodes (add a custom attribute, a user-defined property, an extra name prefix/suffix, etc.). After that, the exporter searches for marked nodes only. Usually the animator/fx-artist/tech-artist is responsible for marking animated nodes.

There is another benefit to marking. A game exporter usually has to compress the animation and needs compression (key reduction) settings for every node. Some nodes’ animations have to be reduced synchronously (for example constrained nodes, nodes with wired controllers, etc.). With marking, the developer can record the fact of animation, the animation range, the compression settings, the coordinate system, and so on.
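A minimal sketch of the off-line tagging pass using user-defined properties; the property name "exp_animated" is just an example:

```maxscript
-- off-line pass: mark the nodes found to be animated
fn markAnimated animNodes =
(
	for n in animNodes do setUserProp n "exp_animated" true
)

-- export time: collect only the pre-marked nodes
fn getMarkedNodes =
(
	for n in objects where ((getUserProp n "exp_animated") == true) collect n
)
```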

 JHN

Another way could be to use getClassInstances to collect animation-controller instances, check whether each controller holds keys, and then use refs.dependentNodes to collect the nodes the controller depends on.
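A sketch of that approach, using Noise_float as an example of a controller class to look for:

```maxscript
-- find every Noise float controller in the scene,
-- then the nodes that depend on each instance
for ctrl in (getClassInstances Noise_float) do
(
	local driven = refs.dependentNodes ctrl
	format "controller % drives: %\n" ctrl driven
)
```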

-Johan

2 Replies
 lo1

Not all animation controllers are keyed, e.g. the noise controller or the audio controller.

 JHN

So when you find those, you know you need them, right? I would make the search tool handle exceptions per class instance, something like:

case classof controller of
(
	bezier_float: -- process further
	noise_float: -- process further
)

This is all hypothetical of course, YMMV

-Johan

 lo1

@denisT:
What is the advantage of the pre-marking method? If I understand correctly, the marking process has to be done anyway before each export, so why not make it part of the export process and reduce the risk of human error?

As I understand it, you need to know which nodes are really animated for a later export. It’s a very common task. Usually the problem is solved by tagging (marking) animated nodes. Searching all animated objects (transform changed over the time range) at export time is slow, but you can do it off-line and mark all animated nodes (add a custom attribute, a user-defined property, an extra name prefix/suffix, etc.). After that, the exporter searches for marked nodes only. Usually the animator/fx-artist/tech-artist is responsible for marking animated nodes.

This is a good suggestion. I will keep the frame-range detection algorithm previously discussed (I think it’s too good to leave out), but will add a “static node” tagging tool so that objects the user knows won’t animate or shouldn’t be animated (like level geometry) are left out of the detection.
I will also warn the user to enable the “Export Animation” setting only when there is at least one animated node in the scene (otherwise they pay the detection overhead without any benefit).
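Roughly, the combination could look like this (isTransformAnimated is lo1's function from above; the "exp_static" property name is made up):

```maxscript
-- tag objects the user knows are static so detection can skip them
fn tagStatic staticNodes =
(
	for n in staticNodes do setUserProp n "exp_static" true
)

-- detection pass: skip tagged nodes, brute-force check the rest
fn collectAnimatedNodes =
(
	for n in objects where ((getUserProp n "exp_static") != true and isTransformAnimated n) collect n
)
```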

I did some testing: 1024 objects (512 animated, 512 static) over a 10k-frame range took about 19 seconds. I’m sure those are extreme conditions.
Even if it takes a long time, giving the user a progress bar to show that “this will eventually end” makes the process more digestible.
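MaxScript’s built-in progress bar calls fit this directly; a sketch wrapping the detection loop (isTransformAnimated as defined above):

```maxscript
progressStart "Detecting animated nodes..."
local animated = #()
local objs = objects as array
-- progressUpdate returns false if the user presses Esc, which stops the loop
for i = 1 to objs.count while (progressUpdate (100.0 * i / objs.count)) do
(
	if isTransformAnimated objs[i] do append animated objs[i]
)
progressEnd()
```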