
[Closed] Many meshop.attach calls gradually slowing down

Hi everybody, I'm new here, so I'll start with some background on this question. I'm writing a script to import a custom file format (written by a C++ application) into Max. The file format defines simple planes and the rotations of those planes, and also specifies a texture file. The problem I'm having is with meshop.attach and materials. When I use materials (loaded into an array before any planes are created) and attachMat:#IDToMat (to apply the materials to the mesh), processing slows down the further the import script gets into the file.

The test file contains 8000 planes, and importing eventually slows down to the point where a single meshop.attach call takes more than 1 second (after half of the file has been imported). Importing 1000 planes averages 70 ms per plane (including the attach), but with these 8000 planes the average climbs to 750 ms. An hour may not be a long time, but I need to import 25 of these files… and I suppose it can be a lot faster!

Does anybody know how to give meshop.attach a constant (and preferably fast) speed, instead of having it slow down the further it gets into the file?

Thanks in advance,

Bas

7 Replies

check out garbage collection
gc()
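
For example, run it periodically inside the import loop. A minimal sketch, where the Plane creation just stands in for the real per-plane import work:

-- rough sketch: collect garbage every so often while importing
planeCount = 100
for i = 1 to planeCount do
(
    p = convertToMesh (Plane width:4.5 length:4.5)  -- stand-in for the real per-plane work
    if (i mod 20) == 0 then gc()  -- free dead MAXScript values roughly every 20 planes
)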

Read the “How To Make It Faster” topic in the MAXScript Reference.
Pay attention specifically to the part where it discusses attaching and disabling the UNDO around the attachment code…
Might help.
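
Something along these lines, as a sketch (baseMesh and newPlaneMesh are just placeholders for the accumulating mesh and the freshly built plane):

-- two throwaway meshes so the snippet runs on its own
baseMesh = convertToMesh (Plane width:4.5 length:4.5)
newPlaneMesh = convertToMesh (Plane width:4.5 length:4.5 pos:[5, 0, 0])

with redraw off
(
    undo off  -- keep the attach out of the undo buffer
    (
        meshop.attach baseMesh newPlaneMesh attachMat:#IDToMat
    )
)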

Hi, sorry for not mentioning it – I did read the 'How To Make It Faster' page and used all of the applicable hints. I was already running gc() approximately every 20 planes loaded, and almost the entire script is wrapped in an 'undo off' block. I also used suspendEditing() and disableSceneRedraw().

Does anybody know what else could be the problem?

Yep, you need to break the attach process down into chunks.
This thread should help: http://forums.cgsociety.org/showthread.php?f=98&t=635477&highlight=speed
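
Roughly like this, as a sketch: assuming the plane meshes already exist in an array, each pass attaches them in pairs, so every attach joins two meshes of comparable size instead of adding one plane to an ever-growing mesh (attachInChunks is just an illustrative name):

fn attachInChunks planeMeshes =
(
    -- keep merging the array pairwise until a single combined mesh is left
    while planeMeshes.count > 1 do
    (
        local merged = #()
        for i = 1 to planeMeshes.count by 2 do
        (
            if (i + 1) <= planeMeshes.count then
            (
                meshop.attach planeMeshes[i] planeMeshes[i + 1] attachMat:#IDToMat
                append merged planeMeshes[i]
            )
            else append merged planeMeshes[i]  -- odd leftover, carry it forward
        )
        planeMeshes = merged
    )
    planeMeshes[1]
)

-- usage with a few throwaway planes:
testPlanes = for i = 1 to 8 collect (convertToMesh (Plane width:4.5 length:4.5 pos:[i * 5, 0, 0]))
combined = attachInChunks testPlanes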

My code already splits the import into multiple meshes, because the application I plan to use the model in requires that – and splitting it up even further only makes it slower. The script is a bit oddly structured, but the problem really does seem to be in the material handling: when I remove the material-loading lines (which leaves the material property set to 'undefined') or drop the attachMat:#IDToMat parameter, the script keeps the same speed for the entire file.

As an example, it takes about 110 seconds to import around 1000 planes (split into meshes of 4.5 units in height and width each), but importing a full file of 8000 planes takes 120 minutes. With material support removed, the 8000 planes take less than 10 minutes. I'm probably handling materials a bit inefficiently: my file format lists a material number and filename for each material at the top, which I load into an array, and each plane line references one of those numbers, which is then assigned with a 'plane.material = materials[matNum]' line.
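
Roughly, the material part looks like this (the filenames and numbers here are just placeholders):

-- materials read from the top of the file into an array, one entry per material
materials = #()
materials[1] = standardMaterial name:"mat_1" diffuseMap:(bitmapTexture filename:"C:\\textures\\brick.jpg")
materials[2] = standardMaterial name:"mat_2" diffuseMap:(bitmapTexture filename:"C:\\textures\\wood.jpg")

-- later, for each plane line in the file (matNum comes from that line)
baseMesh = convertToMesh (Plane width:4.5 length:4.5)
matNum = 1
p = convertToMesh (Plane width:4.5 length:4.5 pos:[5, 0, 0])
p.material = materials[matNum]
meshop.attach baseMesh p attachMat:#IDToMat  -- the per-node material gets merged here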

I hope all of this makes sense and that someone knows a solution.

Growing multimaterials will slow down your attaches. I don't really think you can get around this if a large number of material IDs is the correct end result you're aiming for. If you can identify some redundancy in the materials (many similar or identical materials), then I guess you could find a way to optimize it.
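
For example, something like this could reuse a single material per texture file, so attachMat:#IDToMat has fewer unique materials to merge into the multimaterial (all names here are just illustrative):

texFiles = #()    -- unique texture filenames seen so far
uniqueMats = #()  -- one material per unique filename

fn getMaterialFor texFile =
(
    local idx = findItem texFiles texFile
    if idx == 0 then
    (
        local m = standardMaterial name:(getFilenameFile texFile)
        m.diffuseMap = bitmapTexture filename:texFile
        append texFiles texFile
        append uniqueMats m
        idx = texFiles.count
    )
    uniqueMats[idx]
)

-- then, per plane: p.material = getMaterialFor "C:\\textures\\brick.jpg"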

In my experience, trying to attach multiple editable mesh objects crashed my Max. My solution was to always convert to poly and use polyop. I hope this helps.
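
Something along these lines, as a sketch (the two planes are just placeholders):

basePoly = convertToPoly (Plane width:4.5 length:4.5)
newPoly = convertToPoly (Plane width:4.5 length:4.5 pos:[5, 0, 0])
polyop.attach basePoly newPoly  -- Editable Poly attach instead of meshop.attach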