
[Closed] Tips for working on large MaxScript projects

there might be another reason besides size to put something in a separate file… usage.

do you know how many times i wrote the function “findItemByName”? hundreds!
and every time i was about to write this function again i said “No! not again. i’m sick of it”
but … i did it anyway. why?
because it needs to be a bit different from case to case.
check:
the basic version:

-- returns the index of the first exact match, or 0 if not found
fn getItemByName list name =
(
    n = 0
    for k=1 to list.count while n == 0 where list[k] == name do n = k
    n
)

now, if you want to ignore case:

fn getItemByName list name =
(
    n = 0
    -- stricmp returns 0 when two strings match, ignoring case
    for k=1 to list.count while n == 0 where stricmp list[k] name == 0 do n = k
    n
)

but what if we need to check that every item is valid? what if we want to find by pattern? what if we want to find all matches instead of the first? what if we need the item itself instead of its index?

the function that provides all these ‘ifs’ as options is a monster! big and heavy
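as an illustration only, here is one way such a ‘monster’ could fold all the variations into keyword options (every parameter name here is made up for the example, not taken from any real tool):

```maxscript
-- hypothetical all-in-one finder: matches by exact name, case-insensitive
-- name, or wildcard pattern; can return the first index, all indices,
-- or the matching items themselves
fn findItemByName list name ignoreCase:false asPattern:false findAll:false returnItems:false =
(
    local hits = #()
    for k = 1 to list.count do
    (
        local hit =
            if asPattern then matchPattern list[k] pattern:name
            else if ignoreCase then stricmp list[k] name == 0
            else list[k] == name
        if hit do append hits (if returnItems then list[k] else k)
    )
    if findAll then hits
    else if hits.count > 0 then hits[1]
    else if returnItems then undefined
    else 0
)
```

note that every extra option costs a branch per item, and the “find all” path gives up the early exit the simple while-loop versions have. that overhead is exactly why the big general function feels heavy next to the small specialized ones.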

 MZ1

And what if I say I don’t have ANY file except the launcher in the max (app data) folders? So how is max able to find my scripts and load them?

see several posts above… the user has to manually specify the path to your tools as one of the 3rd-party paths.
i could do it automatically with an installation process but prefer to let the user do it himself. the user should know where all my tool’s components are.

 MZ1
(@mz1)

I agree with you, the user must know the tools folder, but I prefer installation because this way the user also knows where the files are. And as important as installation is uninstallation.

(@denist)

it’s good… if user trusts you

never liked (and never used) the “include” and “filein” functionality in max; found they cause more issues than they solve, especially when there are more elegant solutions out there. Large project management has been mxs’s Achilles heel since year one and has never been addressed; in fact a lot of the interesting additions over the years have made it worse in some ways. That said, the end user doesn’t give a **** about the nuts and bolts of how it’s put together, just as long as it works and is easy to install.

Interesting discussion. I suppose I’m guilty of abusing filein then.
Once I discovered the proper use of struct based tools I began using my structs like C# class files. One file per struct, loading them with filein as needed. Just like “using” in C#.
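just as a sketch of what i mean by that (the folder, file, and struct names are examples, and the skinOps call assumes a node with a Skin modifier is selected and the modifier panel is open):

```maxscript
-- file: <toolsDir>\SkinUtils.ms -- one struct per file, like a C# class file
struct SkinUtils
(
    -- thin example wrapper around a built-in skin call
    fn vertexCount sk = skinOps.GetNumberVertices sk
)
SkinUtils() -- fileIn returns the file's last expression, so callers get an instance

-- in any tool that needs it, like "using" in C#:
-- global skinUtils = fileIn (toolsDir + "\\SkinUtils.ms")
```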

Had no idea this was frowned upon by the more experienced.

If separating your code into functional chunks and loading them via filein is not a good idea, then what is the ‘more elegant solution’?

(@denist)

it was answered above… let your modules be loaded by max itself. just an easy example: put them in a scripts startup directory.

sure, you can say that is not a good solution in general. but i don’t say it is good. there is another place for auto-load – stdscripts. or another – 3rd party plugins.

an elegant solution is to let max load your stuff.
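a minimal example of that idea (the file and struct names are placeholders): anything dropped into the user scripts’ startup directory is evaluated once at launch, so every tool can reach the global without a single filein:

```maxscript
-- file: <user scripts>\startup\myTools_core.ms
-- max evaluates everything in the startup directory when it launches
struct MyToolsCore
(
    -- shared helpers live here; findItem is a built-in that returns
    -- the 1-based index of an item in a collection, or 0 if absent
    fn getItemIndex list item = findItem list item
)
global myToolsCore = MyToolsCore()
```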

here is a question… which would you prefer, given two options?
#1 push a button to easily install the whole tool package into your max system. push another button to uninstall it
#2 copy one directory (the tool package) to any place, just point to it by specifying a path, and delete this directory any time you don’t need or like it

(@wallworm)

Denis is always a wise master!

I used #2 for years, but after so much time dealing with 1) people who want an installer and 2) people who prefer an automatic “Update Me” function, I added both to the tools I work on. While I think most of us who actually create scripts might prefer scripts that just plop into a folder, my experience is that many users do not like unzipping files into specific folders and prefer automation.

I choose to use fileIn, however, and manually choose what to run. The reason I chose this is that my package has far more tools than any one user will utilize in most sessions. As such, I have the required classes load when called instead of loading them all into memory automatically. So many structs remain unknown until a function or macro needs them… then they get initialized with a global filein.

My solution is basically #2, although on the previous project I built an installer.

Because of windows user account permission annoyances, and the pain of also maintaining and building an installer, I developed a preference to keep my max script tools in a folder I can control completely, far away from Max’s defaults. I have a “Tools” folder in our source control that our art team all utilizes. All my max scripts live therein. I have a single script an artist must run to install our custom menus, which creates macroscripts that all point to .ms files in our Tools folder. This includes an “update me” function for when the menu system itself changes. A tool change is as simple as having the artist sync to perforce, at worst also running “update me”. No max restarts needed.
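A rough sketch of that wiring (the macro name, category, and the `someTool` / `toolsDir` globals are placeholders, not the actual tool’s names):

```maxscript
-- written once by the menu installer; the tool's struct is not in memory
-- until the artist actually runs the macro from the menu
macroScript WW_SomeTool category:"MyTools" buttonText:"Some Tool"
(
    on execute do
    (
        -- lazy load: filein the struct file only on first use
        if someTool == undefined do fileIn (toolsDir + "\\SomeTool.ms")
        someTool.open()
    )
)
```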

 lo1

For inhouse tools I use a system similar to Javascript’s ‘Require’, which lazily loads .ms files, and caches the module in a central location.

Thus, in any file I can use something like

m1 = Require("module1")
m2 = Require("module2")
m3 = Require("module3")

If any of these have been previously loaded, it will fetch the cached version. Otherwise, it will fileIn and cache the result.

The caching system is loaded at max startup.

I can also set a flag on the caching system to use development mode so that it never caches files.

This promotes code reuse and modular programming, and doesn’t create a large delay at startup, because libraries are only loaded when needed.

interesting… such an approach might be useful for me.
is testing for a cached module something like

 if module1 == undefined

or do you have a more explicit set of properties defined?

 lo1

The following is the function I’m using, part of a larger framework struct which is loaded at startup.

The libraries are cached by name.

libraryCache = #()

fn LoadLibrary libName loadAlways:off &wasCached: path: =
(		
	if loadAlways == undefined do loadAlways = off
	local existingCache = (for c in libraryCache where matchPattern c.v1 pattern:libName collect c)[1]
	if (not loadAlways) and (existingCache != undefined) then
	(
		if (wasCached != unsupplied) do wasCached = true
		existingCache.v2
	)
	else
	(
		if (wasCached != unsupplied) do wasCached = false
		local commonPath = if path == unsupplied then (/* my default path for library modules */) else path
		local libPath = commonPath + "\\" + libName + ".ms"
		if doesFileExist libPath then
		(
			local newLib = filein libPath
			if existingCache == undefined then
			(
				append libraryCache (dataPair libName newLib)
			)
			else
			(
				existingCache.v2 = newLib
			)
			newLib
		)
		else
		(
			undefined
		)
	)
)

The caller can pass a byref argument (wasCached) if it’s important for them to know if the library was just loaded or if it was already cached.

This approach allows me to reuse functions across tools with little need to think about their location.
For example I might have a “SkinTools.ms” library, which returns a singleton instance of a struct. In any tool I can then access it by using

local skinTools = MyFramework.LoadLibrary "SkinTools"

If some other tool has already loaded it, this would not trigger another filein.
