Tracking Time

Hi All,

I am wondering if anyone here has used Dynamo for time-based queries - or knows how it might be done. For example: how long it takes each user to open a model, how long it takes to sync, or even how many hours are spent working on sheets as compared to modelling. I am looking to gather this information from each of our projects and push it to an office-wide database.

This could be helpful in costing/resourcing more complex projects as well as monitoring large project teams that may be having speed issues.

Many thanks in advance!

Assuming you’re just on the local network and not in a C4R environment, I recommend looking into Worksharing Monitor as it has most (all?) of the data you mentioned.

Thanks for the reply!

Yes I am just looking at our local network arrangement right now (are there more tools/data available in C4R?).

We use Worksharing Monitor, but where is its data stored? Can it be referenced in Dynamo?

Regarding WS Monitor - I think it pulls the data from the *.SLOG file in the central model’s backup directory.
Regarding C4R - I’ve never used it and am not planning to, so I can’t offer any insights on that. Your best bet would be to see if C4R has any analytics functions. Otherwise, a look at the Forge API might help, although I’m pretty sure that kind of data isn’t exposed there.
Journals also contain that information but are a lot harder to parse than *.SLOG files.
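For anyone wanting to roll their own parser, here’s a rough Python sketch. The line format is an assumption on my part (session id, timestamp, then a `>`/`<` prefixed event name) - check it against one of your own *.SLOG files before relying on it, and note the files are often UTF-16 encoded:

```python
import re
from datetime import datetime

# Assumed SLOG line format, e.g.:
#   $1a2b 2024-01-08 09:15:42.123 >Open:Local
# Verify against a real file - this regex is a starting point, not a spec.
SLOG_LINE = re.compile(
    r"^\$(?P<session>\S+)\s+"
    r"(?P<stamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})\s+"
    r"(?P<direction>[<>])(?P<event>\S+)"
)

def parse_slog_lines(lines):
    """Yield (session, timestamp, direction, event) for each matching line."""
    for line in lines:
        m = SLOG_LINE.match(line.strip())
        if m:
            yield (
                m.group("session"),
                datetime.strptime(m.group("stamp"), "%Y-%m-%d %H:%M:%S.%f"),
                m.group("direction"),
                m.group("event"),
            )

def event_durations(records, event):
    """Pair each '>' (start) with the next '<' (end) of the same event
    within a session; return the durations in seconds."""
    starts, durations = {}, []
    for session, stamp, direction, name in records:
        if name != event:
            continue
        if direction == ">":
            starts[session] = stamp
        elif session in starts:
            durations.append((stamp - starts.pop(session)).total_seconds())
    return durations
```

With that in place, `event_durations(records, "STC")` would give you sync-to-central times, for example.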

As an FYI: I am in the process of developing a Dynamo package for journal analysis. It will also have the capability to parse *.SLOG files - in fact, I worked on that part during the Christmas holidays but haven’t pushed it to my repository yet. If you’re interested in test driving the package, you can find it here:
Will probably add the SLOG-Reader tomorrow since it just needs a little more polishing.
Package is not available on the Dynamo package manager yet since it is still a bit of a work-in-progress.
Feedback welcome, though.

EDIT: The sheet vs. modeling data would be something that you could only pull from a journal. SLOG files only contain information that is relevant in terms of worksharing (i.e. model open, sync etc.).


Note to self: Spend more time stalking Andydandy74’s GitHub…

Seriously good work and an excellent idea. If you’re looking into modifying these informative files, you may want to consider some basic .ini editing as well. Therein lies a handy means of clearing out the ‘previous files’ list shown when you open Revit, which is nice for presentations where you don’t want the person viewing your screen to know what else you’ve been working on. Rather easy to write, and it has been useful for me at least. :wink:
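Along those lines, a minimal sketch of the idea. I’m assuming the list lives under a `[Recent File List]` section header in Revit.ini - verify the section name against your own copy, and back the file up first:

```python
# Sketch: drop every entry under an assumed [Recent File List] section
# of a Revit.ini-style file, keeping all other sections untouched.
# Plain text handling is deliberate - ini parsers can choke on the
# quirks of real Revit.ini files.
def clear_recent_files(ini_text, section="Recent File List"):
    out, skipping = [], False
    for line in ini_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("[") and stripped.endswith("]"):
            # New section header: only skip entries in the target section.
            skipping = stripped[1:-1].lower() == section.lower()
            out.append(line)  # keep every section header
        elif not skipping:
            out.append(line)
    return "\n".join(out)
```

Read Revit.ini, run it through this, and write the result back (while Revit is closed).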

C4R and BIM360 stuff is a good bit different from normal synchronization, so if/as you migrate you’ll have to change up the tools a good bit. Generally the same data is available if needed. If you go into Worksharing Monitor, click the History button, and then Export History Shown, you can save a .txt of the last five days of synchronization data. A detailed explanation can be found here:

I don’t recall if it has open times on there or not (I’m not in an environment to check), but I know that sync times and errors are given. Having to open each model and export the history could be rather time-consuming, though. Likely worth reviewing the work @Andreas_Dieckmann started and providing him the feedback he asked for above. I’d test it myself, but again, I’m not in an environment to test anything like this.

Data for sheet work vs modeling work will be mostly worthless as the journals don’t typically indicate when a user stops working to take a phone call, read an email, run to the kitchen for a cookie delivered from a vendor, and the like. If everyone was always working 100% the data would be more useful but that’s not a likely scenario in my experience.

@Andreas_Dieckmann Great stuff thanks. I will take a look and see how it works!

@JacobSmall We are not using C4R as yet, but that’s good to know for future reference. Breaks in the journal time should be fine. The exercise is not to measure individual productivity but to see how long a task takes under typical working conditions - coffee breaks notwithstanding. :slight_smile:

@e.c - FYI: I just added worksharing log support to the package.
Some things you could do with it:

  • Find out how long it takes to load a model
  • Find out how long synching takes
  • Find out who was working in central instead of local
  • Find out if the entire team is on the same Revit build
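To illustrate the kind of summary you could build once the events are extracted (by whatever means - the package nodes or your own parser), here’s a small sketch. The `(user, sync_seconds)` input shape is purely illustrative:

```python
from collections import defaultdict
from statistics import mean

# Illustrative only: assumes you have already extracted
# (user, sync_seconds) pairs from the worksharing log.
def sync_stats(sync_records):
    """Return {user: (sync_count, average_seconds)}."""
    per_user = defaultdict(list)
    for user, seconds in sync_records:
        per_user[user].append(seconds)
    return {u: (len(s), mean(s)) for u, s in per_user.items()}
```

Grouping per user like this makes it easy to spot who is consistently seeing slow syncs on a large team.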

I can think of no fewer than ten current jobs of ours that would benefit hugely from those four examples. Great work @Andreas_Dieckmann :slightly_smiling_face:


Apologies for dragging up an old thread, but @Andreas_Dieckmann, can you let me know how I can install this package from GitHub? I can’t find it in the package search within Dynamo itself.

Thanks in advance

What version of Dynamo are you on? It’s certainly available on the package manager here:

Thanks Jacob, but that link just takes me to the homepage of
If I try searching for the package “journalysis”, nothing comes up:

Yeah, I should really put that on the package manager - here’s hoping I’ll get that done this month.
Meanwhile you can download the entire repo here and then just take the nodes you need:


Thanks very much Andreas. Regarding tracking time - is there a method for getting the Load Duration including loading of linked models?


There is indeed. Look for WorksharingSession.LoadedLinks and then feed the output into LoadedLink.LoadDuration.

Thanks for the tip @Andreas_Dieckmann. I thought i should give an update:

I added the load duration and the total of the loaded link durations, but this was still significantly less than the real overall opening time - presumably there are other processes that take time?

After digging through the SLOG files, what I ended up doing was to get the DateTime of the ‘>Open:Local’ worksharing event and the first event containing the string ‘<WSConfig’, and then find the time difference using the ‘TimeSpan.ByDateDifference’ node.

This is giving me opening durations which appear to be much closer to the actual durations measured using a stopwatch.
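For anyone doing this outside Dynamo, the same calculation is straightforward in Python. This is a sketch of the approach just described, assuming you’ve already parsed the timestamps into datetime objects and kept the raw event text:

```python
from datetime import datetime

# Sketch of the approach above: timestamp of the '>Open:Local' event to
# the first subsequent event containing '<WSConfig'. The event strings
# come straight from the SLOG file.
def open_duration(events):
    """events: list of (timestamp, event_text) in file order.
    Returns the open duration in seconds, or None if not found."""
    start = end = None
    for stamp, text in events:
        if start is None and ">Open:Local" in text:
            start = stamp
        elif start is not None and "<WSConfig" in text:
            end = stamp
            break
    if start and end:
        return (end - start).total_seconds()
    return None
```

Returning None when either event is missing keeps a half-written log (e.g. a session still in progress) from producing a bogus number.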

The only issue I’m having now is that the worksharing log files appear to be wiped periodically, so opening a log file on Monday morning doesn’t appear to contain the data from Friday. Have you come across this?


Thanks for the info. I think the WSConfig part is about applying the selection of which worksets are going to be opened. I had originally included that in the load time calculation but we got some scarily high numbers for sessions where users went for coffee breaks while in the Select Worksets dialogue. Will see if I can filter out such cases. Regarding the logs: make sure to also process the *.slog.bak file - that’ll give you a few more days’ worth of data.
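A small sketch of that tip - read the .bak sibling first so its older events come before the current log’s. The UTF-16 encoding is an assumption on my part; check a sample file from your own backup folder:

```python
from pathlib import Path

# Sketch: read the current worksharing log plus its .bak sibling so a
# Monday-morning run still sees last week's events. The UTF-16 encoding
# is an assumption - verify against your own files.
def read_slog_with_backup(slog_path, encoding="utf-16"):
    """Return lines of <model>.slog.bak (older) followed by
    <model>.slog (newer), skipping whichever file is missing."""
    lines = []
    for path in (Path(str(slog_path) + ".bak"), Path(slog_path)):
        if path.exists():
            lines += path.read_text(encoding=encoding).splitlines()
    return lines
```

Feeding the combined list into the same parser as before covers the wiped-on-Monday case.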


Yeah, it’s not ideal admittedly! I’m assuming there isn’t actually a worksharing event triggered when the model finally opens, so there’s no timestamp?
Thanks for the .bak file tip, will look at that.

@paulstevens90 You were absolutely right about the WSConfig part. I re-added it to the load duration calculation. Will publish the package on the package manager soonish.


Awesome, glad I could contribute! I can probably cut a third off my script when the package is updated!