[Closed] another backburner issue
I’m not getting my maps included when I submit by script, and I’m wondering if I’m the only one seeing this.
I checked the zip from the job and they're not there; when I submit the same file by hand, they're included…
Besides job.includeMaps = true, I can't think of another way to make it work… has anyone else had trouble with this? I'm so close to my automated render pipeline tool… so close…
-Johan
Why include your maps at all?
Just put them on a network path and then you don't need to include them; submitting jobs to Backburner will be much quicker without them.
DW
Hi Dave,
Unfortunately our network is not up to par yet, and we're all on XP, where there's a 10-connection limit; referencing textures would throw lots of errors. So we need to include them for now…
Has anyone else seen this problem before?
-Johan
Tested in Max 9 SP2 and Max 2009:
m = netrender.getManager()
m.connect #manual "compmanagers26" port:3264  -- connect to the manager by name and port
nj = m.newJob()        -- new job from the current scene
nj.includeMaps = true  -- pack maps into the job archive
nj.name = "Test Include Maps"
nj.submit()            -- submit to all servers
It included all maps, and even VRay proxies, in the zip package on the manager.
Johan, you cast my mind back to the times before our NAS server. That 10-connections business is a pain! Have you seen this? – http://www.freenas.org/
I must admit I've been following these backburner threads with interest; I tried writing something similar a while ago and gave up in extreme frustration!
Pete, yeah, it is extremely frustrating, but when it works the payoff is tremendous… that is, when it works
Keith, I hoped you'd reply, thanks; then I know it's probably me doing something wrong and I should spend more time on it… Btw, do XRef scenes and objects get picked up in the zip as well, or should they be referenced…
Btw, I was really surprised that you have to set all the render parameters by hand: if you just call submit(), it renders in fields and some other settings are ignored too, which is really weird… I figured it would take all the scene's current settings and let the extra job settings override specific ones, but that's obviously not the case…
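For now I just set everything on the job explicitly before submitting. A rough sketch of what I mean; nj.renderFields and the frame-range properties are how I read the NetJob docs, so double-check those names in the MAXScript reference:
m = netrender.getManager()
m.connect #manual "compmanagers26" port:3264
nj = m.newJob()
nj.includeMaps = true
nj.name = "Shot_010"  -- example job name
-- copy the settings that matter onto the job itself; if I read the
-- NetJob docs right these properties exist -- verify in your version
nj.fromFrame = animationRange.start.frame as integer
nj.toFrame = animationRange.end.frame as integer
nj.renderFields = false  -- stop the field-rendering default
nj.submit()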
I should mention we have one 2003 server, but the way our projects are set up, textures are always stored locally (on the XP machines); that's why we don't reference them… And I'd be happier when everything is just one big zip that is rendered locally, because there's a lot more network traffic from file editing and compositing; we're trying to free as much bandwidth as we can, really…
Thank you both. And Pete, if you can get all that .NET stuff working, surely a backburner script should be within reach
-Johan
Yes, it collects XRef scenes and objects, just tested it…
I would make a generic render preset file (.rps) and load that, then change the few things that need changing for that scene before submitting, instead of chasing down every render setting; there are just too many now…
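Something like this; the preset path is just an example, and renderPresets.LoadAll with 0 for the production renderer is the call as I remember it:
-- load a house-standard preset, then tweak per shot
renderPresets.LoadAll 0 "\\\\server\\presets\\standard.rps"  -- 0 = production; example path
rendTimeType = 3  -- render a range
rendStart = 0
rendEnd = 100
m = netrender.getManager()
m.connect #manual "compmanagers26" port:3264
nj = m.newJob()  -- new job from the current scene
nj.includeMaps = true
nj.submit()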
Making huge zips is not the best low-bandwidth solution. I've been pushing to have all "render files" be just XRefs, so the files moving around the network are tiny and only load the XRef'd objects they need at render time. I don't know what percentage of an average shot uses assets that are "library items" versus custom to the job… but anything you use across multiple jobs you should look to keep on your server and XRef in.
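On the scene side, the idea is something like this (the paths are just examples):
-- build a lightweight render scene that pulls shared assets
-- from the server instead of carrying the geometry itself
resetMaxFile #noPrompt
x = xrefs.addNewXRefFile "\\\\server\\library\\set_dressing.max"  -- example library file
x.disabled = false  -- make sure the XRef scene is live
saveMaxFile "\\\\server\\jobs\\shot_010_render.max"  -- tiny file to submit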
You may want to consider doing a "save selected" of just the items needed for a shot and then sending that to the queue. You can save a lot that way for some shots.
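Scripted, that might look like the following; saveNodes writes just the given nodes, and newJob's file: parameter, as I remember it, submits an existing max file instead of the open scene:
-- save only the objects the shot needs, then queue that file
shotFile = "\\\\server\\jobs\\shot_010_select.max"  -- example path
saveNodes selection shotFile  -- writes the selected nodes only
m = netrender.getManager()
m.connect #manual "compmanagers26" port:3264
nj = m.newJob file:shotFile  -- job from the saved file, not the open scene
nj.includeMaps = true
nj.submit()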
Good luck