1.3 Speed

Performance has always been an issue with Dynamo when working with large data sets, mainly due to the functional way in which a Dynamo graph operates. You have to remember that data is never mutated; instead, it is always copied between nodes.

When you input something into a node, Dynamo’s backend creates a duplicate of it, performs the node’s function on the duplicate and finally returns the changed copy of your original input. That means that if you have ten nodes in your graph, your memory footprint could easily have increased tenfold by the end of the execution. To top it all off, all of that extra data must be managed at all times, which is no small task either and can quickly add overhead to the graph’s overall performance.
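You can see the cost of this copy-per-node style in plain Python - a rough analogy only, not Dynamo’s actual engine. Each "node" below returns a new list instead of mutating its input, so every step keeps another full copy of the data alive:

```python
import sys

# Analogy for Dynamo's functional dataflow: each step returns a new
# list rather than changing the input in place.
data = list(range(1_000_000))      # the original input

step1 = [x + 1 for x in data]      # "node" 1: a full new copy
step2 = [x * 2 for x in step1]     # "node" 2: another copy
step3 = [x - 3 for x in step2]     # "node" 3: yet another copy

# Four complete lists are now held in memory at once. getsizeof only
# counts the list objects themselves, not the integers inside them,
# so the real footprint is considerably larger still.
total = sum(sys.getsizeof(lst) for lst in (data, step1, step2, step3))
print(f"~{total / 1024**2:.0f} MB in list objects alone")
```

With in-place mutation you would hold one list; with the functional style you hold one per node, which is exactly the tenfold growth described above.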

An ideal, perfectly functioning backend would track the dataflow for changes, invalidate the old copies and free up their resources at a convenient time. However, that’s easier said than done, and if you observe Dynamo’s memory usage during normal operation, you’ll see that this happens very rarely, if at all.

Simply put, Dynamo is a bit of a memory hog, and to make things even worse, it has to run on top of Revit - which, I think we would all agree, is not an application known for its efficiency. That means Dynamo has to share the memory pool and AppDomain of the currently running Revit instance.

You can test all of the above fairly easily: create a simple number range with 1 million values, apply some basic arithmetic operations to it and observe the memory spike after each new node is connected.
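The same experiment can be sketched outside Dynamo with Python’s standard `tracemalloc` module - again an analogy, with list comprehensions standing in for arithmetic nodes. Memory allocation grows after every step because each "node" produces a fresh copy:

```python
import tracemalloc

# Build a million-value range, apply a couple of arithmetic "nodes",
# and sample total allocated memory after each step.
tracemalloc.start()
readings = []

values = list(range(1_000_000))      # the number-range "node"
readings.append(tracemalloc.get_traced_memory()[0])

doubled = [v * 2 for v in values]    # arithmetic "node" 1
readings.append(tracemalloc.get_traced_memory()[0])

shifted = [v + 10 for v in doubled]  # arithmetic "node" 2
readings.append(tracemalloc.get_traced_memory()[0])

tracemalloc.stop()

for label, bytes_used in zip(("range", "* 2", "+ 10"), readings):
    print(f"after {label:>5}: {bytes_used / 1024**2:.1f} MB")
```

Each reading is strictly larger than the last - the Python equivalent of the spikes you see in Task Manager as you wire up nodes.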

Where things go awry is when you start changing inputs and deleting nodes. You’d expect the memory to eventually go back down, but unfortunately that doesn’t seem to happen often enough.

I’ve found that avoiding preview bubbles and Watch nodes when you have really long lists helps a lot. Their current implementations seem to get bogged down when asked to display a lot of information, resulting in huge memory spikes.

In the end, if your data set is so large that it causes Dynamo and Revit to grind to a halt, this might just not be the right platform for the task, because it simply wasn’t designed with such a use case in mind. You should either try an alternative approach or limit the scope of your actions.
