Document.SaveFamilyToFolder node not releasing memory

Hi, I’m trying to make a Dynamo graph to save out families from a project using the Document.SaveFamilyToFolder node, and I’m running into a memory issue.

I have over 900 families I’m saving out, and after a while I get an error message saying I ran out of memory. I check Task Manager, and sure enough, I’m at 99% memory usage. It looks like the node doesn’t release each file from memory after it’s done saving it.

Would anyone know of a custom node or Python script that would work better and release each file from memory after saving it? Is this even possible, or is it a Revit limitation? I know Revit has a built-in function for exporting the family library - currently that’s what I do. I have to export the library from Revit into a folder, wait 20-30 minutes for that to finish, then run a graph to put the files away. I want to transition away from that and have a Dynamo graph that I can just hit play on, and it exports the families and puts them away in one go.

Any help would be appreciated.
Thanks.

Hard to see where your memory issue is without your code. Post the DYN showing all nodes in use so we can see what you’re up to and the community can help. Otherwise we are kinda guessing in the dark at what you are up against.

One thought is that you’re building a list of 900 families, opening all 900 families as documents (with your RAM needing roughly 20x the file size per open document, so 900 files at 5 MB each works out to about 90 GB of RAM), then iterating over each document to save it as a new family document at location X.

Most DYN-based methods for bulk family editing do just this - in fact, this is one of the primary reasons why I don’t think using Dynamo to bulk-process families is a valid approach (even if you discount the transaction issues, the concurrently open versions, and the like). It just doesn’t scale the way people want.

A better solution is to open the document, save the document, close the document, and move on to the next family. That way the memory is freed after each family. But it will not be any faster than the ‘save as library’ function; in fact, it’ll be either slower or less stable, as you will be adding Dynamo and/or Python overhead to the process or circumventing the checks and balances that the Revit development team has built into the function they provide. I would save myself the hassle and automate the filing to the correct folder after the fact, or just provide the correct folder in the first place.
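For illustration, the loop shape I mean is something like this (an untested sketch, not a drop-in node - it assumes doc is the current document and families/paths are matched lists you’ve already built):

import clr
clr.AddReference("RevitAPI")
from Autodesk.Revit.DB import SaveAsOptions

opts = SaveAsOptions()
opts.OverwriteExistingFile = True

for family, path in zip(families, paths):
    famDoc = doc.EditFamily(family)  # open the family as its own document
    famDoc.SaveAs(path, opts)        # write the .rfa to disk
    famDoc.Close(False)              # close without re-saving, freeing the memory before the next family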

Maybe look into pyRevit, which can run a command as an event after the journal is updated. Check that event for the line which indicates the export has completed, then script your re-foldering process in the hook script.
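As a rough illustration of the journal idea (a hypothetical sketch in plain Python - the journal path and the marker text are assumptions, so inspect a real journal to find the line your export actually writes):

import time

JOURNAL = r"C:\Users\you\AppData\Local\Autodesk\Revit\Autodesk Revit 2022\Journals\journal.0001.txt"  # assumed path
MARKER = "SaveAsLibrary"  # placeholder - confirm against your own journal

def wait_for_export(journal_path, marker, poll_secs=10):
    seen = 0  # number of lines already scanned
    while True:
        with open(journal_path) as f:
            lines = f.readlines()
        for line in lines[seen:]:
            if marker in line:
                return  # export finished - kick off the re-foldering graph/script here
        seen = len(lines)
        time.sleep(poll_secs)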

Attached below are my file and a screenshot of the area in question. The graph was made in Dynamo 2.12 and uses the Data-Shapes 2022.2.100 and Orchid 212.1.0.8047 packages.

Family Exporter.dyn (197.9 KB)

I don’t know exactly how the OOTB Document.SaveFamilyToFolder node works internally, but it seems to be doing something similar to what you describe, which surprises me, since I would have thought it would work the exact same way as the library export option within Revit. It appears to open and save each file but then retain it in memory, because memory usage increases with each file saved rather than all at once before saving begins.

I agree with the process you describe, but I’m not sure how to do that programmatically rather than manually. Also, I’m not sure what you mean by “…or just provide the correct folder in the first place”. I’m assuming you mean working on files saved in a directory instead of families in a project, but it’s necessary that the families live in a project and get saved out.

Let me know if you need more info.
Thanks.

That’s interesting. Coincidentally, I just downloaded PyRevit the other day, so I’ll have to look into this. Thanks!

1 Like

Reviewing the code on GitHub and testing myself, this is basically doing what the save-to-library feature does, with similar memory usage in both tests.

You’re doing the same thing in as effective a way as can be done, but with the added overhead of having to load the Revit API into a Dynamo and/or Python environment (a hit at time of use) instead of when you launch Revit (both Revit add-ins and Revit features have their libraries loaded at start).

Best overall performance will likely be via the internal tool. This means we are back at the ‘where to save’ step I was getting at before, as well as the trigger.

Some questions:

  1. How many models are you doing this for?
  2. How frequently are you doing this?
  3. In a perfect world, what would you want the trigger to be?
  4. Will all families go into one folder, or is it a ‘one folder per condition/property combination’ situation?
  5. What type of storage media will you be saving to?
  6. What type of worksharing methods are you using?

Interesting. What you say makes sense, but I’ve used the internal tool just fine before and my RAM never gets anywhere near full, while the Dynamo graph keeps topping out.

To answer your questions:

  1. How many models are you doing this for?
    Currently ~900. Some are simple models with simple parameters; the others nest several of the simple models and have somewhat more complex parameters.

  2. How frequently are you doing this?
    Whenever there are updates to any of the families. I try to group updates together to do it as little as possible. About every other week or so.

  3. In a perfect world, what would you want the trigger to be?
    Every time there is an update to a family, all affected families would be exported.

  4. Will all families go into one folder, or is it a ‘one folder per condition/property combination’ situation?
    Not one folder; I have a directory tree of subfolders where the families are organized by product name, then model type.

  5. What type of storage media will you be saving to?
    A local network NAS (I don’t manage it, probably HDDs based on speed)

  6. What type of worksharing methods are you using?
    I’m not exactly sure what you’re looking for with this question, but all the family files are stored on a local NAS, and users load whichever families they need into their projects through the file browser/Load button in Revit.

Thanks!

The worksharing question was asking whether you use cloud worksharing, Revit Server, or a local network for the central file.

On those 900 projects you could have 9,000 families with similar overlap in product, scaling out to 90,000 types which will conflict between project 1 and project 899… how do you deal with the two?

By the sounds of it, a Revit add-in would be best, but knowing where to path the content to ensure you aren’t overwriting or otherwise creating conflicting data is the hard part. I suppose appending the family’s GUID would work as a consistently unique ID, so that when part A is loaded into project 1 and project 899, changes could be kept separate…
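For example, something like this would make the saved file names collision-proof (a sketch; Element.UniqueId is the Revit API’s stable per-element identifier, built from a GUID, and unique_family_path is just an illustrative helper name):

import os

def unique_family_path(directory, family):
    # append the family's UniqueId to its name so 'part A' from two
    # different sources can't overwrite each other on disk
    return os.path.join(directory, "{}_{}.rfa".format(family.Name, family.UniqueId))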

A lot of this assumes you’re operating as a building design firm, though… any chance you’re an equipment manufacturer (I’d guess railings, curtain walls, or wall panels)? If so, that would change things significantly.

Another big one: are the families ever shared and nested? If they are, then the nested library instances would fall out of sync unless any parent families are detected and written out as well.

2 Likes

@GavinCrump …thx, great information :wink: and it makes sense

1 Like

Ah, we don’t use worksharing, as usually only one person is working on a project at a time.

To start with your last sentence and @GavinCrump’s comment, since it might put things in perspective better: I’m at a railings manufacturer. All the families are shared and are either individual parts and fasteners that make up the railings, or sub-assemblies containing those families with parameters to control visibility and geometry. These families are only used internally, not by anyone outside the company.

I think I got confused about what you were asking in the first question - it’s only one project file with ~900 families. The project file acts like a library: when I update one family, any other families that have it nested are also updated. Then I can export all the families out of that project into individual files for people to use in their projects. Usually people won’t have conflicting families when working on a project, and if they do, it’s most likely because I updated one or a few families, so they would just overwrite with the one they’re loading in.

I have a solution I’ve been using for a while now to manage and use the families; I was just trying to streamline the exporting process a bit more. Like I mentioned above, I have a project file with all the families loaded in and placed - when one family is modified, the change is automatically carried through to the affected families. Once I’m done making changes, I export the families through the Revit interface to a folder. Once the export is done, I run a Dynamo graph I made that puts the families away into the appropriate folders. Usually, conflicting families aren’t an issue when using the families on projects, and when they are, it’s easily resolved.

Ah! You may already be at the most optimal solution then, without writing an add-in or changing up the system… just know that the add-in route could slow down other actions, so you might decide against it. There is only so much that can be done when saving to disk, unfortunately.

1 Like

That’s a bummer. I don’t necessarily mind if the overall process is a little slower if I can just hit play on a graph and let it finish whenever. Right now, without an exporting solution in Dynamo, I have to constantly check for when the family export process is done so I can hit play on a graph, and the export takes about 30-40 minutes. I guess it’s a small thing, but if possible, it would be nice to streamline. Maybe I’ll give @GavinCrump’s solution a try.
Thanks for your help, guys.

2 Likes

I have had some success using CleanMem with Revit to keep from running out of RAM.

I’ll have to play around with that and see if it solves my problem. Thanks!
One question though: Does this help while a Dynamo graph is running?

I haven’t tried this particular tool, but I’ve tried others like it in the past.

It might help, and it might even solve things entirely, but it also might corrupt the Revit instance and cause a crash. Proceed with caution and back things up before you start processing.

I figured as much. The website says it shouldn’t cause anything to crash, so hopefully it won’t, but I’ll still take precautions just in case. Thanks.

1 Like

Replying back with a potential solution to my problem, and to check whether there are any issues with it.

After some research, I found this reply with Python code to save out families from a project. I copied it into my graph and modified it a little, since I still had RAM issues with it as written.

Here is my modified code:

"""
SAVE OUT FAMILY TO DIRECTORY
This takes a collection of families from the active document and saves them as RFA files to the path given, overwriting the original.

Original author: Jacob Small
Modified by: LCarrozza4MGVV

Dynamo version: 2.12.0
Revit version: 2022
"""

## Imports
import sys
sys.path.append(r"C:\Program Files (x86)\IronPython 2.7\Lib")
import clr
clr.AddReference("RevitServices")
import RevitServices
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager
clr.AddReference("RevitAPI")
import Autodesk
from Autodesk.Revit.DB import *
import traceback

## Inputs
doc = DocumentManager.Instance.CurrentDBDocument # sets the doc variable to the active document
families = UnwrapElement(IN[0]) # sets the family variable to the contents of input 0
filePaths = IN[1] # the destination directory for each family; the family name and an .rfa extension are appended inside the loop
if not hasattr(families, '__iter__'): # checks if the input wasn't a list
    families = [families] # if not, wraps the object in a list so we can ensure we've got something to iterate over
    filePaths = [filePaths]

## Setting Variables
TransactionManager.Instance.ForceCloseTransaction()
saveAsOptions = SaveAsOptions() 
saveAsOptions.OverwriteExistingFile = True 
resultSet = [] 

# Iterate over the list of families
for family,path in zip(families,filePaths):
    try: # Try to save family to folder
        path = path + "\\" + family.Name + ".rfa"
        famDoc = doc.EditFamily(family) # Gets the family document from the parent family
        famDoc.SaveAs(path, saveAsOptions) # Saves out the family
        famDoc.Close(False) # Closes the family doc
        famDoc.Dispose() # Releases doc from memory
        resultSet.append(family.Name+".rfa saved!")
    except: # If try fails, output error message
        error = traceback.format_exc().split("Exception: ")[-1].strip()
        error = error.split("Parameter name: ")[0].strip()
        resultSet.append(family.Name+".rfa not saved.\nException:\n"+error) 

OUT = resultSet

I mainly changed how the for loop works and added in “famDoc.Close()” and “famDoc.Dispose()” (is .Dispose() redundant?) to manage memory usage. This seems to work perfectly and is just as quick as the in-UI method. My only hesitation is: why doesn’t the OOTB node “Document.SaveFamilyToFolder” (or any other package’s save-family node) seem to do this, since I still have RAM issues with them?

If anyone could take a look at the Python code and advise if there are any issues, that would be great!

Thanks!

Awesome find!

Some context:

famDoc.Close(False) closes the document.

famDoc.Dispose() marks the memory as free. You can’t do this until you close the document, as Revit still needs the data in memory.

Individual nodes may not call on the dispose, as the information in memory might still be needed downstream, or it could be an oversight on the author’s end.
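One small tweak worth considering for your snippet: if SaveAs throws, the family document is never closed and that memory stays allocated for the rest of the run. Moving the close into a finally block guards against that (a sketch of just the loop, reusing the variables from your code):

for family, path in zip(families, filePaths):
    famDoc = None
    try:
        famDoc = doc.EditFamily(family)
        famDoc.SaveAs(path + "\\" + family.Name + ".rfa", saveAsOptions)
        resultSet.append(family.Name + ".rfa saved!")
    except Exception as e:
        resultSet.append(family.Name + ".rfa not saved: " + str(e))
    finally:
        if famDoc:
            famDoc.Close(False) # always close...
            famDoc.Dispose()    # ...and release, even when SaveAs fails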

Awesome! Thanks for the context. Funny enough, that was my initial thought on how those methods worked, which is why I put them in that order.

That makes sense as to why the other nodes don’t use them. After looking at the nodes, it does seem that they intend to keep the file open for use later down the line.

I marked my comment as the solution since this seems like a perfect fix!

Thanks for your help!

1 Like