Reload Family from Folder

For anyone interested, I am pasting my findings below. Note that I haven’t entirely tested this, but it should suffice as something you can build a POC from. Runs fine for me in Dynamo Sandbox 3.4 and Dynamo for Revit 2.12 using the CPython engine.

The Python Code
########################################
__author__ = 'Jacob Small'
__version__ = '0.1.0'
__description__ = "Attempts to parse the Dynamo environment to pull all custom nodes (*.dyf files) and checks each to determine whether it (0) uses an unsupported Python engine, (1) uses only supported Python engines, or (2) uses no Python at all."
__DynamoBuilds__ = "3.4"
__ReleaseNotes__ = "Runs in the CPython3 engine in Dynamo. Not completely tested in all environments and configurations. Test thoroughly before implementing."
__Dependencies__ = "None."
__Copyright__ = "2025, Autodesk Inc."
__License__ = "Apache 2"

########################################
### configure the Python environment ###
########################################
import sys, clr, os, json #import sys, clr, os, and json modules into the Python environment
clr.AddReference("DynamoServices") #add DynamoServices to the CLR
from Dynamo.Events import ExecutionEvents #import the execution events class from the Dynamo.Events namespace
import traceback

########################################
#### get dyf files from environment ####
########################################
packagePaths = list(ExecutionEvents.ActiveSession.GetParameterValue("PackagePaths")) #get the package paths from the current Dynamo environment
userFolder = [i for i in packagePaths if "AppData\\Roaming\\" in i][0] #get the user folder - sequence can vary here, but this should be fairly robust; parsing from the Dynamo engine or a property of the workspace might be better but I don't have time to solve everything today
defsFolder = userFolder.replace("packages","definitions") #replace the user package folder with the definitions folder
packagePaths.append(defsFolder) #append the definitions folder to the package paths
dyfs = [] #empty list to hold all DYFs found
walk = [os.walk(path) for path in packagePaths] #walk the package paths to get all files and directories recursively
for step in walk: #for each step in the walk
    for root, subdir, files in step: #for the root folder, subdir, and files in each step
        [dyfs.append(os.path.join(root,file)) for file in files if file.endswith(".dyf")] #append any file paths to the dyfs directory if the file ends with .dyf

########################################
###### prepare the output strings ######
########################################
unsupportedFileVersion, unsupportedCustomNodes, supportedCustomNodes, noPythonCustomNodes = [], [], [], []


########################################
###### parse dyf for python nodes ######
########################################
for file in dyfs: #for each file
    with open(file) as dyf: #open the file
        dyfStr = dyf.read() #read the dyf as a string
        if dyfStr[0]=="<": #if the first character is a "<" the dyf is in file version 1 (XML) and incompatible by default
            unsupportedFileVersion.append("\t"+file) #append anything older than dirt to the appropriate list
        else:
            data = json.loads(dyfStr) #read the JSON structure from the string
            nodes = data["Nodes"] #get the nodes
            pythonNodes = [node for node in nodes if "Python" in node["NodeType"]] #filter to only nodes which are Python type
            if not pythonNodes: #if there are no Python nodes
                noPythonCustomNodes.append("\t"+file) #append to the appropriate list
                continue #skip to the next file so it isn't also counted as supported below
            failing = [] #build a list for failing nodes
            for pythonNode in pythonNodes: #for each python node
                if "Engine" not in pythonNode: #if there is no engine tag the graph is really old and will try to default to IronPython2
                    failing.append([file, pythonNode["Id"]]) #append to the appropriate list
                elif pythonNode["Engine"] == "IronPython2": #if the engine tag is IronPython2
                    failing.append([file, pythonNode["Id"]]) #append to the appropriate list
            
            if failing: #if there is data in failing
                nodes = "\n".join(["\t\t\t"+i[1] for i in failing]) #get the nodes formatted as a string with triple indent and new lines between each
                count = "\t\t"+str(len(failing)) + " python nodes with unsupported engine" #count the failing nodes and format as a string
                root = [i for i in packagePaths if i in file][0]
                shortPath = file.replace(root,"")
                result = "\n".join(["\t"+shortPath,count,nodes]) #join the result as a string with line breaks
                unsupportedCustomNodes.append(result) #append to the appropriate list 
            else: #otherwise
                supportedCustomNodes.append("\t"+file) #append the indented title to the appropriate list 

#########################################
##### Return the results to Dynamo ######
#########################################
report = {"Invalid or unsupported file format":unsupportedFileVersion, "Utilized Python Engine is unsupported": unsupportedCustomNodes, "Utilized Python Engine is supported": supportedCustomNodes, "No Python in use": noPythonCustomNodes}
OUT = report #return the result to the Dynamo environment
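To sanity-check the classification logic outside Dynamo, here is a minimal sketch that runs a hand-written JSON snippet through the same engine check. The `Nodes`, `NodeType`, `Engine`, and `Id` keys mirror what the script above reads from a 2.x-format .dyf; the sample values themselves are made up for illustration:

```python
import json

# Hypothetical minimal .dyf (2.x JSON format) with one node per case the script handles.
sample = json.dumps({
    "Nodes": [
        {"NodeType": "PythonScriptNode", "Engine": "CPython3", "Id": "aaa"},    # supported engine
        {"NodeType": "PythonScriptNode", "Engine": "IronPython2", "Id": "bbb"}, # unsupported engine
        {"NodeType": "PythonScriptNode", "Id": "ccc"},  # no Engine tag -> defaults to IronPython2
        {"NodeType": "FunctionNode", "Id": "ddd"},      # not a Python node at all
    ]
})

data = json.loads(sample)
pythonNodes = [n for n in data["Nodes"] if "Python" in n["NodeType"]] #same filter as the script
failing = [n["Id"] for n in pythonNodes
           if "Engine" not in n or n["Engine"] == "IronPython2"] #same engine check as the script
print(failing)  # -> ['bbb', 'ccc']
```

Both the missing-tag node and the explicit IronPython2 node land in the failing list, which is exactly the behavior the loop above relies on.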

A quick test of the 8 most installed packages on the package manager in Revit 2022 gave me:

  • 61 invalid files
    • These are still Dynamo 1.0 formatted so they aren’t compatible with the new engine.
    • They were entirely from steam nodes, which hasn’t been updated in over a 7 years so it’s well and truly deprecated at this point.
  • 220 used no Python
    • These are custom nodes which just use OOTB nodes for execution.
    • No worries here - you’ve got bigger fish to fry.
  • 1 using a supported Python version.
    • Clockwork’s CustomNode.Properties.dyf node takes the cake here.
  • 834 using an unsupported Python version.
    • 89 in Archi-Lab, which are likely unsupported; the package author has indicated over the years that the package should not contain any Python nodes.
    • 293 in Clockwork. There is an effort underway to modernize on a CPython3 engine, but it is as yet unreleased. The repository is here for anyone interested: ClockworkForDynamo/nodes/3.x at master · andydandy74/ClockworkForDynamo · GitHub
    • 349 in Genius Loci. I am not sure where this stands; last I heard it was firmly in camp ‘not going to upgrade at this time’.
    • 103 in spring nodes, which hasn’t been updated in nearly 5 years now, so also likely deprecated.

So not a very pretty picture for those looking for Revit automation packages at first glance. But it’s important to keep that 834 in perspective: as of Dynamo for Revit 3.2 (what I have handy to test at the moment) you’ve got 946 nodes in the core Dynamo library and another 776 in Dynamo for Revit. Add Rhythm (~276 nodes) and Archi-Lab’s C# nodes (499 C# nodes after removing the unsupported Python nodes) and you’re looking at 2586 nodes which don’t run into any issues with users’ configuration of their Python engine. The net result is a Dynamo environment with 3552 nodes, 3/4 of which work when the correct package version is installed (totaling 2718 nodes), and 1/4 of which require the proper version of the deprecated Python engine to be installed (834).
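As a quick check on the back-of-the-napkin split above (using the quoted figures, which are approximate):

```python
# Figures quoted above; package node counts are approximate, not authoritative.
working, ironpython_only = 2718, 834
total = working + ironpython_only            # the 3552-node environment quoted above
share = round(working / total, 2)            # fraction working without engine fuss
print(total, share, round(ironpython_only / total, 2))  # -> 3552 0.77 0.23
```

That comes out to roughly the 3/4 vs 1/4 split described.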

Now that node count omits deprecated packages, and your library might look quite a bit different (e.g. installing Orchid or Synthesize would add more Python-independent nodes, while installing Datashapes would increase the IronPython woes). But for a tool where office-wide content management is usually best described as “we alternate between applying duct tape and spraying with WD40 until it works,” I’d say that’s pretty good. For ‘better’ results, though, it’s less about end user education (what you pointed to above @GavinCrump), or the Dynamo team “locking in” current versioning (the clean slate method), or someone somehow supporting all previous code forever (the ideal state which never existed, but which we were all ignorant to the reality of), and more about firms developing, maintaining, and sharing best practices…
