
[Closed] search an array for multiple results

Good morning team at scriptSpot!

I had a quick MAXScript "approach" question regarding searches.

Basically, I want to be able to search an array for an item that occurs multiple times, e.g.

#(1,10,1,20,10,1)

If I search for 1 using findItem (or, more elaborately, bsearch), it only returns the first hit, not the whole array of hits.

I was wondering if there is any way of searching an array for all the hits, or occurrences.

Thank you so much in advance!!

5 Replies

easy:


for item in list where item == something collect item -- returns list of equal items
-- or
for index = 1 to list.count where list[index] == something collect index -- returns list of indexes of equal items

Thanks for your reply!

However, I was hoping to use "search" functions to speed things up.

As stated in the MAXScript help, looping through a collection is a lot slower than searching.

The big advantage of looping is that I can really condition the results and finds to my liking. However, I am trying to get into the habit of writing efficient scripts, and was wondering if there would be a faster way of doing it, incorporating the likes of findItem or bsearch to speed up the search.

Hope you can shine some further light on the topic!

Thank you in advance

One linear search through a large array to collect all indices, as DenitT demonstrated, is actually very fast. bsearch and qsort make sense when you intend to perform thousands or millions of searches for single values where the search key is different every time. I would not waste time trying to optimize the collect loop…

For example, if you build an array with 1 million elements where each value is a random integer between 1 and 10 and then want to know the indices of those elements that are equal to 10, the direct linear loop requires only 1 million iterations to collect them all. I just tested it and it took 985 ms. to collect the approx. 100K indices.
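For reference, the test described above can be reproduced with a short snippet along these lines (the variable names are mine, and the exact timing will of course vary per machine):

```maxscript
(
    -- build 1 million random integers between 1 and 10
    local bigArray = for i = 1 to 1000000 collect (random 1 10)

    -- time a single linear pass collecting all indices with value 10
    local st = timeStamp()
    local hits = for i = 1 to bigArray.count where bigArray[i] == 10 collect i
    format "collected % indices in % ms\n" hits.count (timeStamp() - st)
)
```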

Any other approach that involves sorting 1 million items would require a lot more iterations before you can even start collecting data. For example, one could do an indexed qsort to bring all items with a value of 10 to the beginning of the index array and then loop only about 100,000 times to collect those that ended up in front. But the indexed qsort will eat a huge amount of time to prepare the array, and in fact even creating the indexed array to sort requires a million-iteration loop!

Even in the benchmarks of the bsearch() in the Reference, the time to perform the sorting is mentioned as a factor. In fact, if you intend to perform very few searches into the unsorted array, a linear search is usually faster than sorting the array and bsearching through it.
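To illustrate the trade-off: once the array has been sorted with qsort, a single bsearch hit can be widened left and right to collect all equal neighbours, so each repeated lookup costs far less than a full linear pass. This is only a sketch (the helper names are mine, and it assumes a 3ds Max version whose bsearch supports the index: keyword); note that the collected indices refer to the sorted array, not the original one:

```maxscript
fn cmpVal a b = a - b -- integer compare function for qsort/bsearch

-- return the indices (into sortedArr) of all elements equal to key
fn findAllSorted sortedArr key =
(
    local idx = 0
    if (bsearch sortedArr key cmpVal index:&idx) == undefined then #()
    else
    (
        -- bsearch lands on *some* matching element; widen to cover all equals
        local lo = idx
        local hi = idx
        while lo > 1 and sortedArr[lo-1] == key do lo -= 1
        while hi < sortedArr.count and sortedArr[hi+1] == key do hi += 1
        for i = lo to hi collect i
    )
)

data = #(1,10,1,20,10,1)
qsort data cmpVal     -- pay the sorting cost once up front
findAllSorted data 10 -- then each lookup for a new key is cheap
```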

I hope you get the picture.

Thank you for your input, BOBO.
MAXScript was the first scripting language I got into seriously, and I'm having a lot of fun learning it.
I really appreciate what you have done with MXS, and your contribution to the MXS community : )

Apologies, as I think my example was misleading. I was just trying to simplify the problem.

Basically, what I am trying to do is…

"perform thousands or millions of searches for multiple hits on a fixed multi-dimensional array where the search key is different every time."

I am planning on writing a script that actively seeks out overlapping segments from splines, which is a common occurrence when importing DWG drawings into 3ds Max.

My approach was…

  1. explode all selected splines into individual segments

  2. create a multi-dimensional array (or even a structure?) containing

a. spline no.
b. spline length
c. spline knot position A
d. spline knot position B
e. vector from knot A to B

  3. the script will then loop through the segments, attempting to find other segments that have the same vector

  4. from that base, I was thinking of expanding the script to find segments that are not an exact vector match, but to be smart enough to seek out those smaller segments overlapping bigger segments, etc.

So basically, I was looking for the most optimised way of searching a fixed (and sorted, if I have to) array with a fixed key (or even a ranged key, such as ±10mm for tolerance), where the search does not just stop after the first hit, but searches to the end of the array, returning an array of hits.
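As a starting point for step 2 of the plan above, the per-segment record and the ranged-key search could be sketched like this (the struct and function names are hypothetical, and the ±tolerance test is simply a length comparison on the vector difference):

```maxscript
-- hypothetical record holding the fields listed in the plan
struct SegmentInfo (splineNo, segLength, knotA, knotB, vec)

-- collect indices of all segments whose vector matches `key` within `tol`
fn findMatchingSegments segs key tol:0.0 =
(
    for i = 1 to segs.count
        where (length (segs[i].vec - key)) <= tol
        collect i
)
```

Each call is still one linear pass over the array, which is exactly the collect loop discussed earlier in the thread.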

Hmm, now that I have written it all out, I'm having some idea…

Is it that searching through the whole array takes as long as looping through the whole array collecting matching results?

Again, thanks to everyone for their help!!
and thanks in advance for any further input.

I had to do something like this before to get around speed issues when collecting using nested arrays.

Try something like…

  - Make a copy of the array, and an empty array called indexArray.
  - Use a while() do loop.
  - Inside the while(), perform the findItem again to see if there are still matching items in the array.
  - Inside the do(), append the item index to indexArray, then set that index in the array copy to undefined.

When it's done, you will have an array of the found indices, as well as the original array still intact.

fn findAllItems ar val =
(
    local arrayCopy = copy ar -- shallow copy so the original array is left intact
    local foundIndexes = #()

    while ((foundIndex = findItem arrayCopy val) != 0) do
    (
        append foundIndexes foundIndex
        arrayCopy[foundIndex] = undefined -- knock out the hit so findItem moves past it
    )

    foundIndexes
)
findAllItems #(10,2,4,10,23,56) 10 -- returns #(1,4)
The code in the actual script it was used in was slightly different from this. I didn't have to create a copy of the array because of how everything else was working; making the copy could cause a bit of a speed hit here, but I still saw huge speed increases over what I was doing before.