
[Closed] Memory issue (?) with script after rendering large scene

Hi guys,

Once again I've got a bit of a problem with MaxScript that I can't seem to solve on my own, so I'm asking for some help here.
I've made a script that makes rendering large images much easier by using regions. It basically lets you set an output size and a number of regions; the rest is automated. When the rendering is done, the script tries to open (and then close) the created bitmaps to make sure they exist. After that, it stitches the rendered regions together into one image.
It’s all working fine, except for one particularly heavy scene. The real bugger is that I had actually created the script for that scene in the first place.

In that scene, the script crashes at the start of the stitching process. It throws an 'unknown system exception' at the setPixels line. The odd thing is that if I let the script skip the rendering and move straight on to stitching, it works just fine.
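For reference, the stitching pass looks roughly like this (a minimal sketch with made-up file names and tile layout, not the actual script):

```maxscript
-- Minimal sketch of the stitching pass (file names and layout are
-- illustrative, not the real script).
fullW = 3000; fullH = 3000; tileSize = 500
final = bitmap fullW fullH
for row = 0 to (fullH / tileSize) - 1 do
    for col = 0 to (fullW / tileSize) - 1 do
    (
        tileFile = "C:\\regions\\tile_" + (row as string) + "_" + (col as string) + ".bmp"
        tile = openBitmap tileFile
        for y = 0 to tileSize - 1 do
        (
            rowPixels = getPixels tile [0, y] tileSize
            setPixels final [col * tileSize, row * tileSize + y] rowPixels  -- the call that throws
        )
        close tile
        gc()  -- release the tile's pixel data before opening the next one
    )
final.filename = "C:\\regions\\full.bmp"
save final
```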
I suspected a memory issue, so I monitored the memory usage while the script was running:


[screenshot: Windows Task Manager (in Dutch) showing memory usage at the time of the crash]
As you can see, at the time of the crash there is about 500 MB of free memory, and I can't believe that filling a 3000×3000-pixel bitmap in 500-pixel chunks per loop consumes all of that.
I have tried some things to free extra memory, however, like increasing the heapSize and running garbage collection in a couple of places (especially before stitching). This didn't yield any result though.

I hope that you can help me solve this problem.
Thanks in advance.

PS: If you'd like to see the script I can post it, but it's rather long.

6 Replies

Is there anybody who can help me? 🙁

1) The rendering of a scene takes place in main 3ds max memory, not in the Maxscript memory area designated by the heapSize variable. Bitmaps that you create/open/save/modify via Maxscript exist in the Maxscript memory area.

As the main memory needs of 3ds max change during a session, Windows can allocate more or less memory as needed. The Maxscript heapSize, on the other hand, is NOT managed and is simply set in stone. If you have your heapSize set at 50MB, then 3ds max will allocate that 50MB and not give it up…it is designated for Maxscript alone and will not be released back to either Windows or main 3ds max memory.

It's not a surprise that renders can complete with no problems while your large bitmap operations in Maxscript are causing memory crashes. 3ds max just grabs more memory from Windows when needed for a process, but it's up to you as a scripter to adjust the Maxscript heapSize as needed. If you try to do more in Maxscript than your current heapSize allows, you WILL run out of memory and cause a fatal error for Maxscript/3ds max.

2) It's been a long time since I dealt with this, but I worked on a script last year that required a heapSize well in excess of 350MB, and I encountered a problem that might be related to yours.

Specifically, I would get memory-caused crashes because the heapFree value would eventually start misreporting the actual free memory once the heapSize became very large. This prevented my script from increasing the overall heapSize as needed, and I ended up having to simply estimate the needed heapSize and set it to that amount.

I never had any problems with the heapFree value when the heapSize was less than about 150MB, but I never tested this to find the exact cutoff for the problem.

I see. How would you estimate the needed heapSize?
When I do a quick calculation, a bitmap of say 3000×3000 pixels at 32 bits per pixel would require about 36,000,000 bytes, roughly 34 MB. Add the opened tile bitmap to that, say 500×500 pixels (about 1 MB), for a total of roughly 35 MB.
Would this mean that I'll have to set the heapSize to at least 35 MB?
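To make that arithmetic concrete (assuming 4 bytes per pixel, i.e. 32-bit RGBA):

```maxscript
-- Rough heap estimate at 4 bytes (32 bits) per pixel:
fullBytes = 3000 * 3000 * 4   -- 36,000,000 bytes, about 34.3 MiB
tileBytes = 500 * 500 * 4     -- 1,000,000 bytes, about 1 MiB
format "full: % MiB, tile: % MiB\n" (fullBytes / 1048576.0) (tileBytes / 1048576.0)
-- The heap also holds all of MaxScript's other runtime data, so the
-- real requirement will be higher than the raw pixel totals.
```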

What I found odd about this, by the way, is that the same bitmap operation does not crash without rendering, but does after rendering, with the same heapSize.
Even after rendering a small scene the bitmap operation goes fine, at the same output size as the large scene that does crash.

Thanks a lot for your input!

Ok, first there is the method you use to set the heapSize:

Option 1:
You can set the default Maxscript heapSize to accommodate this script, which will set it to that value every time 3ds max starts. (This is in the Customize > Preferences > Maxscript tab.)

Option 2:
You can set the heapSize when this particular script runs, which will allow the heapSize to be smaller the rest of the time. (if (heapSize<102400000) do (heapSize=102400000))

Second, there is the question of what heapSize to use. I would just set it to at least 100MB in this particular script, and run the process to see if it completes. If your final image uses 36MB, there is the potential to have twice that much in use when both the final image and the tiles are residing in Maxscript memory. I'd also recommend putting some "print (heapFree as string)" commands in at certain points to see how much Maxscript memory is being used.
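Put together, that advice looks something like this (the 100 MB figure is just a starting point):

```maxscript
-- Make sure the heap is large enough before the heavy bitmap work.
if heapSize < 104857600 do heapSize = 104857600  -- ~100 MB

-- ...rendering / stitching code goes here...

-- Sprinkle traces like this around the bitmap operations
-- to watch how much Maxscript memory is being consumed:
print (heapFree as string)
```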

Most scripters never go near heapSize values like this, but I’ve worked with hundreds of megs of data in persistent global variables with no problems. That heapFree bug I mentioned was the only problem I’ve encountered.

Somehow I keep getting runtime errors when I try to increase the heap size:
– Runtime error: Unable to increase heap size to: 101760000
Any idea what could be causing this?

edit: adding something in the range of 10mb does work. I added some traces in the stitch loop, and it turned out that there's about 21mb free out of 32mb (without rendering first)… I'm getting confused…

edit2: I've managed to get the heapSize increased to about 100mb using a loop that adds 10mb with each iteration. But it's still crashing right at the start of the stitching process, the first time it tries setPixels… 🙁
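The loop I used to grow the heap in steps looks roughly like this:

```maxscript
-- Grow the heap 10 MB at a time, since a single jump to ~100 MB
-- throws "Unable to increase heap size" on this machine.
targetHeap = 104857600  -- ~100 MB
stepBytes = 10485760    -- 10 MB
while heapSize < targetHeap do
    heapSize = amin #(heapSize + stepBytes, targetHeap)
```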

In that case, I have no idea what is going on…particularly because you’ve had it run successfully, which means your code isn’t bugged. It might be that 3ds max 7 has a bug with large image handling after a large render task. Hard to tell…

You might have to just accept doing the stitching separately, or start trying workarounds such as a resetMaxFile #noPrompt command before starting the stitching process and then loading the image files.
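In script form, that workaround would be something like this (an untested sketch):

```maxscript
-- After rendering, the region files are already saved to disk, so the
-- heavy scene itself is no longer needed for the stitching pass.
resetMaxFile #noPrompt  -- release the scene from main 3ds max memory
-- ...then openBitmap the saved region files and stitch as before...
```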

Any serious scripting project eventually requires bizarre workarounds, and this might be a case of that. (I just spent half of yesterday avoiding a bug in max 3-8 that would crash max when I tried to delete the group head of a custom group object that had a complex internal structure. 3ds max 9 handled it fine, but to be backward compatible I had to find some way to avoid the instability…it took 5 hours.)