ForceRun a node on Revit DocumentChanged

I am trying to force a node rerun based on the Revit DocumentChanged event.

I used the Python code in this link, which makes it possible to force-rerun any node whose name starts with *Force. It uses the Dynamo API for this. I got this part working.
How to force nodes to rerun without changes / When nodes rerun? - Developers - Dynamo (dynamobim.com)

I am trying to combine this with the Revit API OnDocumentChanged event in the Application class. RevitAPI documentation:
Application Class (revitapidocs.com)

The idea is to subscribe to app.DocumentChanged and execute the exact same function as the one attached to the Dynamo API event. It sounds simple, but it doesn't work.

I am wondering what could be wrong. Is the app.DocumentChanged event even working from Dynamo?

Please help, would be much appreciated.

Extended code:

import clr

#import Dynamo API
clr.AddReference('DynamoRevitDS')
import Dynamo
dynamoRevit = Dynamo.Applications.DynamoRevit()
currentWorkspace = dynamoRevit.RevitDynamoModel.CurrentWorkspace

#import RevitAPI
clr.AddReference('RevitServices')
import RevitServices
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

#get handle to app, which contains documentChanged event
uiapp = DocumentManager.Instance.CurrentUIApplication
app = uiapp.Application

#the force-rerun handler: mark every node whose name starts with "*Force"
#as modified so Dynamo re-executes it on the next run
def onEvaluationcompleted(obj, e):
	for i in currentWorkspace.Nodes:
		if i.Name.StartsWith("*Force"):
			i.MarkNodeAsModified(True)

#attach Dynamo API handler to forceRun function
currentWorkspace.EvaluationCompleted += onEvaluationcompleted

#attach Revit API handler to the same function (the (sender, args)
#signature is compatible with DocumentChanged as well)
app.DocumentChanged += onEvaluationcompleted

#output
OUT = "Done"
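One side issue worth flagging (an observation of mine, not something from the original post): every time the node above runs, it subscribes the handler again, so handlers can stack and fire multiple times per event. The toy below is plain Python (no Revit or Dynamo assemblies, illustrative names only) showing a .NET-style event and why re-subscribing the same function doubles the calls:

```python
# Toy illustration (plain Python, no Revit needed) of handler stacking:
# each run of the node adds the same function to the event again, so one
# event then fires the handler once per subscription.

class Event:
    """Minimal .NET-style event: handlers append with +=, all fire on raise."""
    def __init__(self):
        self._handlers = []

    def __iadd__(self, handler):
        self._handlers.append(handler)
        return self

    def fire(self, sender, args):
        for h in self._handlers:
            h(sender, args)

calls = []

def on_changed(sender, args):
    calls.append(args)

evt = Event()
evt += on_changed           # first run of the node subscribes
evt += on_changed           # second run subscribes the same handler again
evt.fire(None, "doc-change")
print(len(calls))           # prints 2: the handler ran twice for one event
```

In the real Dynamo case you would need to keep a reference to the delegate and unsubscribe with -= before re-subscribing, which is awkward because each run gets a fresh Python engine.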

Is this all happening within Python? Are you automating the document change or are you expecting it to run when the user manually changes documents?

Keep in mind Dynamo runs in a singular transaction. A document change occurring outside that transaction won’t be caught. I would assume this is still the case in Automatic mode as the graph still wouldn’t change, but I don’t know that for sure.


I haven’t used this yet, but I believe it will cause nodes which previously ran to re-run on the subsequent execution if renamed to carry the correct tag (though I dislike using the node’s name for this, I appreciate the simplicity).

The use case would be if you have a graph, say one which appends a time stamp to the ‘review session’ parameter of a sheet, which relies on the DateTime.Now node (not sure why you’d want this, but bear with me). You open the graph, run it once, and all subsequent runs keep the original time stamp, even if the selected sheet changes. No bueno. This code forces the tagged object(s) to re-execute, allowing you to refresh that time stamp. A force re-execute command (available in the TuneUp view extension) can also cause a complete re-execution, but that is the entire graph, not a small portion thereof. Some nodes (i.e. Math.RandomList) can have a trigger before their input to cause a new value, but that requires navigating the entire graph and toggling a boolean or slider - and the DateTime.Now node doesn’t have an input, so you have to replace the node or open/close the graph. Having a means of marking the DateTime.Now node as ‘dirty’ and forcing it to re-execute solves the issue.

However, as Nick pointed out, document changed events don’t help us trigger a new execution of the graph. In fact, if you change documents your Dynamo graph often won’t run at all as it’s already tied to the previous document. Events in general can’t trigger Dynamo actions directly, though you can utilize a Revit add-in to send a command to run a Dynamo graph elsewhere (i.e. external to your Revit session), or within Revit as part of your active session. The stability of such tooling can vary depending on what you’re doing with the graph you’re forcing to execute, and Dynamo isn’t by any means stable for overly frequent calls (i.e. document changed could mean I swapped from document A to document B and five minutes later back again, or it could mean I cycled through 20 documents while switching between open views…), so if you do go down that route I recommend extreme caution.


Well said. And the Dynamo team is actively looking at how integrations consume it in a way which will change things quite a bit. That is further out, and you never know where things will eventually land, but I expect some really good stuff all the same; if only 25% of the ambitious goals are attained we’ll be looking at some pretty groundbreaking stuff.

One point of related clarity though.

This would actually need to be undertaken by the Revit team, as they own the way in which Dynamo is integrated into Revit. Similarly the Civil 3D team owns the way the tool interacts with Civil 3D. And the FormIt team… Advance Steel team… Alias… And so on. The management of transactions is something which those teams take pretty seriously, and there are always trade-offs in any system. Using a hypothetical here of data integrity and speed: if something is faster but has a higher incidence of unresolved data corruption, it’s likely a no-go. Similarly, if it’s slower but increases data integrity, it’s likely out; the trade-off works both ways.

This isn’t to say that the integration is idealized now - see my first paragraph - but that in systems as complex as Revit no one aspect can be taken in a vacuum. What could be faster from a transaction standpoint may actually cause issues with worksharing.

I’m also not convinced that the transactions (referring to the commitment of content to the Revit environment) are the slow bit; element binding, session trace, and marshaling are all playing a role, and at a certain point the dataset for these items will consume more RAM than is available, and page faults at that point are very costly from a performance standpoint. Optimizing code base and/or process to reduce datasets, node counts, transaction frequency, round tripping to Revit, and the like certainly comes in handy at this point.

Your results may vary of course, and I’d be interested (and perhaps the larger team as well) in any analysis you have on the topic; the larger community could likely benefit from a ‘creating custom nodes built for speed’ share out as well.

This is a part of the Dynamo for Revit repo, and thereby a part of the Revit team’s efforts, not the Dynamo team’s efforts. It’s an odd distinction to be sure, but an important one to make. Effectively if the work doesn’t apply to both Dynamo for Revit and Dynamo for Civil 3D then it is quite unlikely the Dynamo team owns it (again, for now), but rather the integrating team.

From what I can see, the transaction manager utilizes one of two modes - debug mode or automatic mode. The default appears to be automatic, which shows a single transaction in the undo menu by opening a transaction group and wrapping sub-transactions under it. This is what happens for me when I use the SetParameterByName node to change the comment parameter of all doors in a project to “Jacob was here”.
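To make the group-wrapping idea above concrete, here is a plain-Python sketch (no Revit required; class and entry names are illustrative, not the Revit API) of a transaction group that assimilates its sub-transactions so the user sees a single undo entry:

```python
# Plain-Python sketch of the pattern described above: many sub-transactions
# are committed inside a group, and closing the group collapses them into
# one named undo entry. Names are illustrative only.

class UndoStack:
    def __init__(self):
        self.entries = []

class Transaction:
    """Appends one undo entry when it closes."""
    def __init__(self, stack, name):
        self.stack, self.name = stack, name

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.stack.entries.append(self.name)
        return False

class TransactionGroup:
    """On close, replace everything committed inside with one undo entry."""
    def __init__(self, stack, name):
        self.stack, self.name = stack, name

    def __enter__(self):
        self._mark = len(self.stack.entries)
        return self

    def __exit__(self, *exc):
        del self.stack.entries[self._mark:]      # assimilate sub-entries
        self.stack.entries.append(self.name)
        return False

undo = UndoStack()
with TransactionGroup(undo, "Dynamo-Revit Script"):
    for door in ["door-1", "door-2", "door-3"]:
        with Transaction(undo, "Set comment on " + door):
            pass  # the parameter write would happen here

print(undo.entries)  # prints ['Dynamo-Revit Script'] - one entry, not three
```

This mirrors why the undo menu shows a single entry after a SetParameterByName run even though many elements were touched.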

Conversely, debug mode creates a separate transaction for each parameter being set, allowing the team to run automated testing in a more robust way - if one of the transactions fails they can look into why, and the machine continues happily (ok, begrudgingly) onto the next element in the test. Automated testing at this scale is a HUGE part of the development efforts, as things have to be tested against (my offhand count) 6 different worksharing environments for data integrity. As such it would not surprise me if development builds of Dynamo forced transactions to debug mode, with shipping versions toggling it over. I do see a few references to setting the transaction mode to automatic elsewhere in the repository. As you noted, debug mode wouldn’t be as useful in day to day use (excepting cases where a transaction needs to be committed to get data to update before the next action in the sequence).

It’s also worth noting that a good amount of data can be read via the API without an open transaction. This may be somewhat unique to Revit, as other tools (AutoCAD certainly) require opening a transaction even for read actions (in the case of AutoCAD with a ‘ForRead’ tag), while in Revit reading is generally free, and only writing requires the transaction.
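The read/write rule described above can be modeled in a few lines. This is a toy (plain Python, illustrative names only, not the Revit API): reads succeed without a transaction, while writes raise unless a transaction is open.

```python
# Toy model of the rule described above: reading the document is always
# allowed, modifying it outside an open transaction raises an error.

class Document:
    def __init__(self):
        self._params = {"Comments": ""}
        self._in_transaction = False

    def get(self, name):
        return self._params[name]            # reading needs no transaction

    def set(self, name, value):
        if not self._in_transaction:
            raise RuntimeError("Modifying the model requires an open transaction")
        self._params[name] = value

class Transaction:
    def __init__(self, doc):
        self.doc = doc

    def __enter__(self):
        self.doc._in_transaction = True
        return self

    def __exit__(self, *exc):
        self.doc._in_transaction = False
        return False

doc = Document()
print(doc.get("Comments"))                   # read without transaction: fine
try:
    doc.set("Comments", "Jacob was here")    # write outside transaction
except RuntimeError as e:
    print(e)                                 # fails as expected
with Transaction(doc):
    doc.set("Comments", "Jacob was here")    # write inside transaction: fine
print(doc.get("Comments"))                   # prints 'Jacob was here'
```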

Interesting stuff either way, but we’re a good bit off topic. I may push this into its own thread later (too tired right now and I don’t want to miss something while my brain wakes up) so the community can refer to it more directly. Good stuff in here that I’d hate to see lost in the shuffle while diverting attention from the original topic. :slight_smile:

Thanks for the response guys. For now I solved my problem with a workaround.

I am glad to see that the question triggered such a vivid, in-depth and insightful discussion among some key players :slight_smile:

To keep it simple for myself: for now I accept Nick’s answer that the nature of Dynamo’s transactions doesn’t work with DocumentChanged.

Many thanks all
