Revit Slows down after large Dynamo script

@Dimitar_Venkov and @Kulkul,

Thanks for some nice ideas to work with. I am struggling a lot with Dynamo memory management; I run graphs on very large datasets, which break down too often. I will look into implementing both of your ideas, flushing and giving it time.

Hi All, sorry for late reply.

First of all, thanks to all of you for your discussion here. Very pleased to know that the issue is more common than I thought. I agree with @Greg_McDowell that closing Revit is not the best solution for a non-workshared session.

@Kulkul and @Dimitar_Venkov, thanks for the suggestions. I’ll try both of your recommendations.

Last but not least, I structured the script according to the speed ranking (from fastest to slowest) recommended in a past thread:

  • Python
  • Design Script
  • Custom Nodes
  • OOTB and Package Nodes

For my scripts, I try to do as much as possible in Python. I’ll also check the non-Python parts to see if I can speed them up somehow.
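To illustrate the idea of keeping the heavy lifting in one Python node instead of a chain of graph nodes, here is a minimal sketch. The `IN`/`OUT` variables are what Dynamo’s Python Script node provides; they are stubbed with sample data here so the snippet runs on its own.

```python
# Minimal sketch of a Dynamo Python Script node body.
# Assumption: running outside Dynamo, so IN is stubbed with sample data;
# inside Dynamo, IN is populated from the node's input ports.
IN = [[1, 2, 3, 4]]

values = IN[0]

# One list comprehension replaces a chain of graph nodes,
# avoiding per-node execution and data-marshalling overhead.
doubled = [v * 2 for v in values]

# OUT is what the node returns to the graph.
OUT = doubled
```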

Again, thanks to all of you for the discussion!

Cheers

Hi @Kulkul,

I’m trying to replicate your recommendation for a very simple python node:

I get some huge numbers, which I presume are not seconds. How do you translate them into seconds?

Thanks in advance!

Time :slight_smile:
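For context: `time.time()` returns seconds since the Unix epoch, a float around 1.7 billion these days, which is why the raw values look huge. Subtracting two readings already gives a duration in seconds, so no unit conversion is needed. A minimal sketch:

```python
import time

start = time.time()            # seconds since the Unix epoch: a huge float
time.sleep(0.05)               # stand-in for the node's actual work
elapsed = time.time() - start  # the difference is already in seconds

print(round(elapsed, 3))
```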


Thanks!

I also managed to do this:

import time

tiempo = []  # list of timestamps ("tiempo" is Spanish for "time")

tiempo.append(time.time())  # call this after each step you want to time

tiempo_final = []  # elapsed seconds between consecutive timestamps
for t in range(1, len(tiempo)):
    tiempo_final.append(round(tiempo[t] - tiempo[t - 1], 4))
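A small variation on that pattern: for measuring intervals, `time.perf_counter()` is generally preferable to `time.time()`, since it is monotonic and has higher resolution. A sketch, with dummy work standing in for the graph’s stages:

```python
import time

marks = [time.perf_counter()]   # monotonic clock, well suited to intervals

sum(range(100_000))             # stand-in for stage 1 of the graph
marks.append(time.perf_counter())

"".join(str(i) for i in range(1_000))  # stand-in for stage 2
marks.append(time.perf_counter())

# Elapsed seconds per stage, mirroring the tiempo_final pattern
stage_times = [round(marks[i] - marks[i - 1], 4)
               for i in range(1, len(marks))]
```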

I had a couple of minutes spare (I know, when does that happen) so I put together an Audit node for .dyn files.
I intend to have this as part of my first package publish (a few weeks away yet), hence the node naming, so expect to see it there too (with some further optimization).

Audit DYN

Inputs/Options:

  • File Path (DYN)
  • Create Report (CSV file in the same folder as the script)

Outputs:

  • File Info (Name, Size, Workspace version)
  • Total Number of Connectors
  • Unique node names, types, and counts
  • Script composition (node types, counts, and overall total)

I am thinking that, used in conjunction with the time tracking shown by @Kulkul and in How to measure execution time for nodes?, this may form a starting point for thoughts on how to optimize a script.
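For anyone curious how such an audit can work under the hood: Dynamo 2.x saves .dyn files as JSON, so counting node types is largely a matter of parsing. A hedged sketch follows; the `"Nodes"`, `"Connectors"`, and `"NodeType"` keys reflect my reading of the 2.x format, and a small sample dictionary stands in for a real file.

```python
import json
from collections import Counter

# Assumption: a Dynamo 2.x .dyn is JSON with a top-level "Nodes" list
# (each entry carrying a "NodeType") and a "Connectors" list. A sample
# string stands in for reading an actual file from disk.
sample_dyn = json.dumps({
    "Name": "Example",
    "Nodes": [
        {"NodeType": "PythonScriptNode"},
        {"NodeType": "FunctionNode"},
        {"NodeType": "FunctionNode"},
    ],
    "Connectors": [{}, {}],
})

data = json.loads(sample_dyn)

# Script composition: node types, counts, and totals
composition = Counter(node["NodeType"] for node in data["Nodes"])
total_nodes = sum(composition.values())
total_connectors = len(data["Connectors"])
```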

In case you are wondering "Sastrugi=parallel wave-like ridges caused by winds on the surface of hard snow, especially in polar regions."

EDIT/UPDATE: A revised version of this node can be found here, until I am able to publish my package.


OMG! Nice work @Ewan_Opie

@Ewan_Opie, I tried your node, and the output file is almost the same as yours, but the column structure is different: all the information ends up in column A of the CSV file.

Hmm not ideal, can you post a screenshot?


I managed to get it organized as you have in your image, but I needed to import it from text, not just open the CSV file.

How did you open the CSV? Did you drag the file into Excel, or open it with a double-click?
I am trying, and failing, to replicate this error… :thinking:

The first time, double-click. The second time, I imported it from the Data ribbon.

Can you post your dyn?

Sorry, I can’t due to my office’s policies. This is the actual dyn that we’re developing. I tried with another script and had the same result. For me, it only works if I import the CSV file from the ribbon in a blank sheet. Maybe it is my Excel configuration.

All good, I understand.
It very well could be Excel… at least you have a workaround.

I may include the option of exporting to different formats (TXT, CSV, and XLS) in the final node configuration.
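On the everything-in-column-A issue: that usually happens when Excel’s regional list separator doesn’t match the file’s delimiter (some locales expect `;` rather than `,` on double-click). A minimal sketch, assuming the export is done with Python’s `csv` module, where the delimiter can be chosen explicitly:

```python
import csv
import io

# Hypothetical report rows, standing in for the audit node's output.
rows = [["Node", "Count"], ["PythonScriptNode", 3], ["FunctionNode", 12]]

# Write with an explicit delimiter; switch to ";" for locales where
# Excel expects a semicolon as the list separator on double-click.
buf = io.StringIO()
csv.writer(buf, delimiter=",").writerows(rows)
text = buf.getvalue()
```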
If anyone else encounters errors please let me know, would be nice if my release goes smoothly :wink:


The biggest reason for slowing down is the way the graph is created.

I can understand that causing an issue during the execution of the graph, but not after it’s over and after Dynamo has been closed.


I believe this is because the data associated with running the Dynamo graph is still held in RAM so that it can be undone. Hence the save/sync clearing things up.

Then you’d expect RAM usage to be quite high, yes? That’s not been my experience.


I believe that much data would go directly to the scratch disk.

I think you can easily test this by opening Worksharing Monitor, then the “System Performance” popup, and watching the various components adjust as you run a script, close Dynamo, and hold off on saving.