I am trying to combine this with the Revit API DocumentChanged event on the Application class. Revit API documentation: Application Class (revitapidocs.com)
The idea is to subscribe to app.DocumentChanged and execute the same function as the one attached to the Dynamo API event. It is simple in principle, but it doesn't work.
I am wondering what could be wrong. Is the app.DocumentChanged event even working from Dynamo?
Please help, would be much appreciated.
Extended code:
import clr
#import Dynamo API
clr.AddReference('DynamoRevitDS')
import Dynamo
dynamoRevit = Dynamo.Applications.DynamoRevit()
currentWorkspace = dynamoRevit.RevitDynamoModel.CurrentWorkspace
#import RevitAPI
clr.AddReference('RevitServices')
import RevitServices
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager
#get handle to app, which contains documentChanged event
uiapp = DocumentManager.Instance.CurrentUIApplication
app = uiapp.Application
#the ForceRun function
def onEvaluationcompleted(obj, e):
    for i in currentWorkspace.Nodes:
        if i.Name.StartsWith("*Force"):
            i.MarkNodeAsModified(True)
#attach Dynamo API handler to forceRun function
currentWorkspace.EvaluationCompleted += onEvaluationcompleted
#attach Revit API handler to forceRun function
app.DocumentChanged += onEvaluationcompleted
#output
OUT = "Done"
Is this all happening within Python? Are you automating the document change or are you expecting it to run when the user manually changes documents?
Keep in mind Dynamo runs in a single transaction. A document change occurring outside that transaction won't be caught. I would assume this is still the case in Automatic mode, as the graph still wouldn't change, but I don't know that for sure.
I haven't used this yet, but I believe it will cause nodes which previously ran to run again on the subsequent run if they are renamed to carry the correct tag (though I dislike using the node's name for this, I appreciate the simplicity).
The use case would be a graph which, say, appends a time stamp to the 'review session' parameter of a sheet and relies on the DateTime.Now node (not sure why you'd want this, but bear with me). You open the graph, run the code, and all subsequent runs keep the original time stamp, even if the selected sheet changes. No bueno. This code forces the tagged node(s) to re-execute, allowing you to refresh that time stamp. A force re-execute command (available in the TuneUp view extension) can also cause a complete re-execution, but that covers the entire graph, not a small portion thereof. Some nodes (e.g. Math.RandomList) can have a trigger before their input to cause a new value, but that requires navigating the entire graph and toggling a boolean or slider; the DateTime.Now node doesn't have an input, so you have to replace the node or close and reopen the graph. Having a means of marking the DateTime.Now node as 'dirty' and forcing it to re-execute solves the issue.
However, as Nick pointed out, document changed events don't help us trigger a new execution of the graph. In fact, if you change documents your Dynamo graph often won't run at all, as it's already tied to the previous document. Events in general can't trigger Dynamo actions directly, though you can utilize a Revit add-in to send a command to run a Dynamo graph elsewhere (i.e. external to your Revit session) or within your active Revit session. The stability of such tooling can vary depending on what you're doing with the graph you're forcing to execute, and Dynamo isn't by any means stable under overly frequent calls (e.g. "document changed" could mean I swapped from document A to document B and back five minutes later, or it could mean I cycled through 20 documents while switching between open views...), so if you do go down that route I recommend extreme caution.
Well said. And the Dynamo team is actively looking at how integrations consume it, in a way which will change things quite a bit. That is further out, and you never know where things will eventually land, but I expect some really good stuff all the same; if only 25% of the ambitious goals are attained we'll be looking at some pretty groundbreaking stuff.
One point of related clarity though.
This would actually need to be undertaken by the Revit team, as they own the way in which Dynamo is integrated into Revit. Similarly, the Civil 3D team owns the way the tool interacts with Civil 3D. And the FormIt team... the Advance Steel team... Alias... and so on. The management of transactions is something those teams take pretty seriously, and there are always trade-offs in any system. Using a hypothetical of data integrity versus speed: if something is faster but has a higher incidence of unresolved data corruption, it's likely a no-go. Similarly, if it's slower but increases data integrity, it's likely out; the trade-off works both ways.
This isn't to say that the integration is idealized now (see my first paragraph), but that in systems as complex as Revit no one aspect can be taken in a vacuum. What could be faster from a transaction standpoint may actually cause issues with worksharing.
I'm also not convinced that the transactions (referring to the commitment of content to the Revit environment) are the slow bit; element binding, session trace, and marshaling all play a role, and at a certain point the dataset for these items will consume more RAM than is available, and page faults at that point are very costly from a performance standpoint. Optimizing the code base and/or process to reduce datasets, node counts, transaction frequency, round-tripping to Revit, and the like certainly comes in handy at this point.
Your results may vary of course, and I'd be interested (and perhaps the larger team as well) in any analysis you have on the topic; the larger community could likely benefit from a "creating custom nodes built for speed" share-out as well.
This is a part of the Dynamo for Revit repo, and thereby a part of the Revit team's efforts, not the Dynamo team's. It's an odd distinction to be sure, but an important one to make. Effectively, if the work doesn't apply to both Dynamo for Revit and Dynamo for Civil 3D, then it is quite unlikely the Dynamo team owns it (again, for now); it falls to the integrating team instead.
From what I can see, the transaction manager utilizes one of two modes: debug or automatic. The default appears to be automatic, which shows a single transaction in the undo menu by opening a transaction group and wrapping sub-transactions under it. This is what happens for me when I use the SetParameterByName node to change the comment parameter of all doors in a project to "Jacob was here".
Conversely, debug mode creates a separate transaction for each parameter being set, allowing the team to run automated testing in a more robust way: if one of the transactions fails they can look into why, and the machine continues happily (ok, begrudgingly) onto the next element in the test. Automated testing at this scale is a HUGE part of the development effort, as things have to be tested across (by my offhand count) 6 different worksharing environments for data integrity. As such, it would not surprise me if development builds of Dynamo forced transactions into debug mode, with shipping versions toggling it over. I do see a few references to setting the transaction mode to automatic elsewhere in the repository. As you noted, debug mode wouldn't be as useful in day-to-day use (excepting cases where a transaction needs to be committed to get data to update before the next action in the sequence).
It's also worth noting that a good amount of data can be read via the API without an open transaction. This may be somewhat unique to Revit, as other tools (AutoCAD certainly) require opening a transaction even for read actions (in AutoCAD's case with a "ForRead" flag), while in Revit reading is generally free and only writing requires a transaction.
Interesting stuff either way, but we're a good bit off topic. I may push this into its own thread later (too tired right now, and I don't want to miss something while my brain wakes up) so the community can refer to it more directly. There's good stuff in here that I'd hate to see lost in the shuffle while diverting attention from the original topic.