I have built a very large program that populates a whole set of predetermined parameters for specific elements throughout the entire model. There are roughly 30 different categories that get calculated and populated. It works really well, but for very large models it can take more than an hour to complete…
My question is this… Is there a way to reduce the run time of this task? At the moment I am using Dynamo nodes almost exclusively, with very little DesignScript or Python… If I went through the process of converting the program to Python, would that cut down the run time? If it would, does anyone have any figures on how much it reduces the average run time?
Alternatively, are there any suggestions about how to solve something like this?
For very large scripts I introduce toggle breaks, so I can control which sections of the script actually run.
I too chose to write big scripts to do it “all in one go”, but quickly found that this crashes Revit and Dynamo, as well as locking up your system for a long time.
Breaking the script into category chunks and making it Dynamo Player friendly greatly increases speed of use and efficiency, and gives you much greater control over data flow.
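The toggle-break idea can be sketched in plain Python. This is a hypothetical stand-in, not Dynamo code: in a real graph the booleans would arrive as `IN[0]`, `IN[1]`, … from Boolean nodes (or Dynamo Player inputs), and the "sections" would be your category chunks. The `run_sections`, `toggles`, and `sections` names are mine for illustration.

```python
# Sketch of the "toggle break" pattern: each section of the graph is
# gated by a boolean, so only the categories you switch on actually run.

def run_sections(toggles, sections):
    """Run only the sections whose toggle is True.

    toggles  -- dict mapping section name -> bool (the toggle breaks)
    sections -- dict mapping section name -> zero-argument callable
    """
    results = {}
    for name, func in sections.items():
        if toggles.get(name, False):
            results[name] = func()   # this chunk runs
        else:
            results[name] = None     # this chunk is skipped entirely
    return results

# Example: only the "walls" chunk runs; "doors" is toggled off.
results = run_sections(
    {"walls": True, "doors": False},
    {"walls": lambda: "walls populated",
     "doors": lambda: "doors populated"},
)
```

Because a skipped section never executes, it costs no memory or time, which is what makes the graph usable on large models.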
As mentioned, large scripts on large files can quickly eat up all your memory.
As for speeding up your graph, Python is great, yes, but DesignScript can go a long way if you start combining things. The fewer nodes you have, the fewer things Dynamo has to keep track of. Memory is almost always the issue in these cases.
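One reason combining nodes helps can be sketched like this. With 30 separate node chains, the element list is effectively traversed 30 times and every intermediate list is kept alive; a single Python (or DesignScript) block can visit each element once and compute all the values in that pass. This is a hypothetical illustration with numeric stand-ins, not the Revit API: `populate_parameters`, `elements`, and `calculators` are names I made up.

```python
# Sketch: one traversal computing every category, instead of one
# traversal (and one set of intermediate lists) per category.

def populate_parameters(elements, calculators):
    """Single pass over the elements, computing every category at once.

    elements    -- iterable of model elements (numeric stand-ins here)
    calculators -- dict mapping parameter name -> function(element)
    """
    all_values = []
    for elem in elements:  # the model is walked exactly once
        values = {name: calc(elem) for name, calc in calculators.items()}
        all_values.append(values)
    return all_values

# Toy example with two "categories" computed per element.
out = populate_parameters(
    [2, 3],
    {"Area": lambda e: e * e, "Doubled": lambda e: 2 * e},
)
# out[0] == {"Area": 4, "Doubled": 4}
```

In Dynamo terms, each `calc` corresponds to what one node chain used to compute, but the intermediate lists never pile up between nodes, which is where the memory goes.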
Thank you very much @gazcoigne and @Nick_Boyts for the info and the suggestions. I will look into both of these, as I can definitely see the value of breaking the script down into category runs; I like that idea.
Combining that with Python and DS also sounds like it would be a win.
Thank you again!