Revit slows down after running a large Dynamo script

Hi All,

I know that this topic has been discussed before, but I haven’t found a solution to the problem and I think it is a really important topic. I’m experiencing the same problem described here:

When I run some particularly large scripts, I have to close and reopen Revit because afterwards I can't even change a simple parameter. (For the end user this is not a good solution.)

I have also ruled out a hardware issue, because I tested several computers and the problem persists.

Here are some other threads about this:

I’m using Revit 2017.1 and Dynamo 1.3.1.

Is there a way to clean the VRAM or RAM memory at the end of the script through the Revit API?

Thanks for your time. Any commentary will be very appreciated.



Hi. I have experienced the same slowdown in Revit when repeating complex scripts using the Dynamo Player. Yes, performance improved upon restarting the Revit session. As these complex scripts can interact with the Revit model through thousands of iterations, perhaps the simple fact that these interactions need to be remembered to allow Ctrl-Z across multiple runs is slowing things down? Saving the Revit file clears the undo list, and this does help slightly with performance. Has anyone else found this to be the case?

If saving the file fixes it then that’ll work for me. It’s not ideal but I understand the issue.

Perhaps there could be a warning prior to running large scripts about not being able to undo past that point. Something to make it more transparent to the end user.

Are you sure saving clears the cache? It doesn’t seem to be doing that on my end. Maybe in a workshared file?


I’m guessing it at least partially has to do with borrowing elements - so a workshared model would see some increased performance after a save/sync. A non-workshared model might only benefit from a full close.

That would be consistent with what I’ve experienced. Still… it’d be nice to have a different workaround. Asking staff to close Revit every time they want to make use of a large graph (or a series of large graphs) is a recipe for low Dynamo usage.


I agree but also think it’s a good idea anyway.
It’s annoying to have to close just because you ran Dynamo, but how often will you be running these “really big graphs”? Really, if you’re running a graph that’s going to be “touching” a lot of elements in your model you’d probably want everyone else out while you run the script anyway. And after waiting 5/10/15 minutes for a script to run it’s probably a good idea to save and close and let everyone else back in the model. So yes, it is a bit of an extra step, but in reality it’s probably good practice too.

What is a “big” script? I have run revision cloud exporting graphs (checking every view and sheet to see if there is a cloud visible) with no slowdown (run on a project with over 400 sheets).

I have also run other scripts which export every view name/number and some other data to Excel on more than one project (run script > close Dynamo > activate other open project > open Dynamo > repeat).

The biggest one I have run collected every element in an elevation and overrode its graphics based on distance from the elevation (it selected about 15,000 elements). I no longer use this one, and it was one of the first I ever wrote, but I do not recall Revit acting strangely afterwards. I have not used it in over a year, so I could be mistaken.

Not saying there is not a problem, just that I have not noticed one. I also know that 15,000 elements is quite small when talking about data processing, so that’s why I asked what counts as big.

How about calculating shortest path from every corner on a given level… around obstacles?

Actually, I don’t know if it’s the number of elements being managed by the graph but rather the actual size of the graph (number of nodes, etc.) that’s causing the slowdown. Some of my graphs are YUGE and that makes them slow to work with. It’s these graphs that cause the largest slowdown for me.

My graph was of similar node population, maybe more… We were using a work shared model on our server @Greg_McDowell. It was one that created a whole lot of complex facade support members in the form of a dual-axis curved truss system. It referenced a linked file from the architect to extract base geometry for our structural truss supports to align to.

Being a player script, it also has user prompts for inputs on parameter values that varied on a truss by truss basis. The graph also populates truss node points with real world coordinate data transformed from Dynamo units, that was then exported into Excel for analysis purposes. So it was rather heavy (big @Steven) on internal calculations as well as the number of elements it was creating and storing information for.

At the time we had only 1 drafter in a separate facade support model doing this process, so we were able to negate the issue of closing Revit on multiple machines often through planning our workflow.

This was the first instance of slowdown I have encountered using Dynamo, so for the 1% of the time, closing Revit seems acceptable to me. (Until the back-end code gets fixed somewhere, anyway.) :face_with_raised_eyebrow:


Sounds like yours is bigger than mine! LOL

All I know is that there are graphs that I run frequently, and that I’d want others to run frequently as well, and that Revit is slow afterwards until I restart. Boo


Not ideal @Greg_McDowell
I’ll have a play around and see if I can offer some alternative solutions when I am back in the office on Monday.

I can’t give you any guarantee that it will help, but you could try the “RevitProcess.EmptyWorkingSet” node from the Spring Nodes package. It will flush any non-critical memory from the active process to the swap file and hopefully spare you from a restart.
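For reference, here is a minimal sketch of what a node like that likely does under the hood, assuming it wraps the Win32 `EmptyWorkingSet` call from `psapi.dll` (the `flush_memory` name is mine, not the package's API; the Windows branch only runs on Windows):

```python
import ctypes
import gc
import sys

def flush_memory():
    """Release non-critical memory from the current process."""
    gc.collect()  # reclaim unreferenced Python objects first
    if sys.platform == "win32":
        # EmptyWorkingSet pages the process working set out to the swap file
        handle = ctypes.windll.kernel32.GetCurrentProcess()
        ctypes.windll.psapi.EmptyWorkingSet(handle)
```

Note that this frees working-set memory, not objects Revit itself still holds, so it may only reduce apparent memory pressure rather than fix the underlying slowdown.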


Has anyone who has experienced this issue taken a look at Task Manager after the slowdown begins?

The only time I’ve seen such a slowdown is when the graph in question has created tons of data, or has touched so many elements in the Revit model, that the RAM is completely used up (and then some).

The biggest cause of slowdown is the way the graph itself is built. The best way to troubleshoot this is to track the time taken for each stage by adding time.lapse and selecting 1 or 2 elements, then trying other workarounds to reduce the stages where time.lapse reports the highest values. Hope it helps!


@Kulkul if composition is the main issue, has anyone compiled anything (custom node, external script) that can perform an audit on a .dyn file?

Things like:

  1. Number of nodes
  2. Number of laces
  3. Number of functions upon completion of run
  4. Run time
  5. Node package dependencies

Having this kind of information available would mean you could assess where the “big script” issue starts to occur, ultimately letting you decide how best to optimise your script (more coding, less lacing…). I’d be keen for something like that!
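As a starting point, here is a small sketch of such an audit for Dynamo 1.x files, assuming the 1.x XML schema where nodes are stored under an `Elements` section (the `audit_dyn` name is hypothetical; run time and function counts would still have to come from a live session, and Dynamo 2.x `.dyn` files are JSON instead):

```python
import xml.etree.ElementTree as ET
from collections import Counter

def audit_dyn(path):
    """Count the nodes in a Dynamo 1.x .dyn file, grouped by node type."""
    root = ET.parse(path).getroot()
    elements = root.find("Elements")  # 1.x stores nodes under <Elements>
    nodes = list(elements) if elements is not None else []
    return {
        "node_count": len(nodes),
        "by_type": Counter(node.tag for node in nodes),
    }
```

The per-type counts are useful on their own: a graph dominated by heavy geometry nodes points to a different optimisation than one dominated by list-management nodes.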

@Ewan_Opie I mostly use Python for my workflows. I will add timing inside the Python code so that it helps show where the cause is :slight_smile:


:+1:t2: top tip, thanks! @Kulkul

Hi All, sorry for late reply.

First of all, thanks to all of you for the discussion here. Very pleased to know that the issue is more common than I thought. I agree with @Greg_McDowell that closing Revit is not the best solution for a non-workshared session.

@Kulkul and @Dimitar_Venkov, thanks for the suggestions. I’ll try both of your recommendations.

Last but not least, I managed to restructure the script using the speed preference (from fastest to slowest) recommended in some past thread:

  • Python
  • Design Script
  • Custom Nodes
  • OOTB and Package Nodes

For my scripts, I try to do as much as possible in Python. I’ll also check the non-Python parts to see if I can speed them up somehow.

Again, thanks to all of you for the discussion!


Hi @Kulkul,

I’m trying to replicate your recommendation in a very simple Python node:

The measurements come out in some huge units which I presume are not seconds. How do you translate them into seconds?

Thanks in advance!
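(One possible culprit, for reference: if the Python node is reading .NET `System.DateTime.Ticks`, each tick is 100 nanoseconds, so dividing by 10,000,000 gives seconds. This is only an assumption about where the numbers come from, not a confirmed diagnosis:)

```python
# .NET DateTime/Stopwatch ticks are 100 ns each
TICKS_PER_SECOND = 10000000.0

def ticks_to_seconds(ticks):
    """Convert a .NET tick count (100 ns units) to seconds."""
    return ticks / TICKS_PER_SECOND
```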