I am wondering if anyone here has used Dynamo for time-based queries, or knows how it might be done. For example: how long it takes to open a model per user, how long it takes to sync, or even how many hours are spent working on sheets as compared to modelling. I am looking to gather this information from each of our projects and push it to an office-wide database.
This could be helpful in costing/resourcing more complex projects as well as monitoring large project teams that may be having speed issues.
Assuming you're just on the local network and not in a C4R environment, I recommend looking into Workshare Monitor, as it has most (all?) of the data you mentioned.
Regarding WS Monitor - I think it pulls the data from the *.SLOG file in the central model's backup directory.
Regarding C4R - never used it, not planning on using it either, so can't offer any insights on that. Best bet would be to see if C4R has any analytics functions. Otherwise a look at the Forge API (https://developer.autodesk.com/en/docs/viewer/v2/reference/javascript/) might help, although I'm pretty sure that kind of data isn't exposed there.
Journals also contain that information but are a lot harder to parse than *.SLOG files.
As an FYI: I am in the process of developing a Dynamo package for journal analysis. It will also have the capability to parse *.SLOG files - in fact, I worked on that part during the Christmas holidays but haven't pushed it to my repository yet. If you're interested in test driving the package, you can find it here: https://github.com/andydandy74/Journalysis
Will probably add the SLOG-Reader tomorrow since it just needs a little more polishing.
Package is not available on the Dynamo package manager yet since it is still a bit of a work in progress.
Feedback welcome, though.
EDIT: The sheet vs. modeling data would be something that you could only pull from a journal. SLOG files only contain information that is relevant in terms of worksharing (i.e. model open, sync etc.).
Note to self: Spend more time stalking Andydandy74's GitHub…
Seriously good work and an excellent idea. If you're looking into modifying those informative files, you may want to consider some basic .ini editing as well. Therein lies a handy means of clearing out the 'previous files' list shown when you open Revit, which is nice for presentations where you don't want the person viewing your screen to know what else you've been working on. It's rather easy to write and has been useful, for me at least.
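In case it helps, here's a minimal Python sketch of that kind of .ini cleanup. Be aware that the section name [Recent File List] and the FileN=... key pattern are my assumptions about the layout of Revit.ini, not a verified spec, so check your own file (and back it up) before relying on this:

```python
def clear_recent_files(ini_lines):
    """Return the ini lines with the recent-file entries removed.

    Assumes the recent files live in a [Recent File List] section
    as File1=..., File2=... entries - verify against your Revit.ini.
    """
    out, in_recent = [], False
    for line in ini_lines:
        stripped = line.strip()
        if stripped.startswith("["):
            # Track whether we are inside the (assumed) recent-files section
            in_recent = stripped.lower() == "[recent file list]"
        if in_recent and "=" in stripped and not stripped.startswith("["):
            continue  # drop the FileN=... entries, keep the section header
        out.append(line)
    return out
```

Read the file, run its lines through this, and write them back; the section header stays so Revit can repopulate it on the next session.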
C4R and BIM360 stuff is quite different from normal synchronization, so if/as you migrate you'll have to change up the tools a good bit. Generally the same data is available if needed. If you go into Workshare Monitor, click the history button, and then export the history shown, you can save a .txt with the last five days of synchronization data. A detailed explanation can be found here:
I don't recall if it has open times on there or not (I'm not in an environment to check), but I know that sync times and errors are given. Having to open each model and export the history could be rather time-consuming, though. It's likely worth reviewing the work @Andreas_Dieckmann has started and providing him the feedback he asked for above. I'd test it myself but, again, I'm not in an environment to test anything like this.
Data for sheet work vs. modeling work will be mostly worthless, as the journals don't typically indicate when a user stops working to take a phone call, read an email, run to the kitchen for a cookie delivered by a vendor, and the like. If everyone were always working at 100%, the data would be more useful, but that's not a likely scenario in my experience.
@Andreas_Dieckmann Great stuff, thanks. I will take a look and see how it works!
@jacob.small We are not using C4R as yet, but that's good to know for future reference. Breaks in the journal time should be fine. The exercise is not to measure individual productivity but to see how long a task takes under typical working conditions, coffee breaks notwithstanding.
Apologies for dragging up an old thread, but @Andreas_Dieckmann, can you let me know how I can install this package from GitHub? I can't find it in the package search within Dynamo itself.
Thanks Jacob, but that link just takes me to the homepage of https://dynamopackages.com/
If I try searching for the package 'journalysis', nothing comes up:
Yeah, I should really put that on the package manager - here's hoping I'll get that done this month.
Meanwhile you can download the entire repo here and then just take the nodes you need: https://github.com/andydandy74/Journalysis/archive/master.zip
Thanks for the tip @Andreas_Dieckmann. I thought I should give an update:
I added the load duration and the total of the loaded link durations, but this was still significantly less than the real overall opening time - presumably there are other processes that take time?
After digging through the SLOG files, what I ended up doing was to get the DateTime of the '>Open:Local' worksharing event and the first event containing the string '<WSConfig', and then find the time difference using the 'TimeSpan.ByDateDifference' node.
This is giving me opening durations which appear to be much closer to the actual durations measured using a stopwatch.
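For anyone wanting to reproduce this outside Dynamo, here's a rough Python sketch of the same calculation. The sample line layout (session token, then date, time, and event name) is only illustrative of how SLOG entries tend to look; check your own files for the exact format before using it:

```python
from datetime import datetime

# Illustrative SLOG-style lines; real files contain session IDs and
# additional fields, so treat this layout as an assumption.
sample_lines = [
    '$3f58 2017-01-09 08:01:12.345 >Open:Local "C:\\Projects\\model.rvt"',
    '$3f58 2017-01-09 08:03:47.912 <WSConfig (all worksets)',
]

def event_time(line):
    """Parse the date + time fields that follow the session token."""
    parts = line.split(None, 3)  # [token, date, time, rest-of-line]
    return datetime.strptime(parts[1] + " " + parts[2],
                             "%Y-%m-%d %H:%M:%S.%f")

def open_duration_seconds(lines):
    """Seconds between the '>Open:Local' event and the first '<WSConfig'."""
    start = next(l for l in lines if ">Open:Local" in l)
    end = next(l for l in lines if "<WSConfig" in l)
    return (event_time(end) - event_time(start)).total_seconds()
```

On the sample above this yields roughly 155.6 seconds, i.e. the same difference the TimeSpan.ByDateDifference node would report for those two timestamps.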
The only issue I'm having now is that the worksharing log files appear to be wiped periodically, so a log file opened on Monday morning doesn't appear to contain the data from Friday. Have you come across this?
Thanks for the info. I think the WSConfig part is about applying the selection of which worksets are going to be opened. I had originally included that in the load time calculation but we got some scarily high numbers for sessions where users went for coffee breaks while in the Select Worksets dialogue. Will see if I can filter out such cases. Regarding the logs: make sure to also process the *.slog.bak file - that'll give you a few more days' worth of data.
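A quick sketch of combining the two, assuming the backup sits next to the current log with a plain .bak suffix and that both are UTF-16 encoded (which I believe is the case for Revit's worksharing logs, but verify against your own files):

```python
from pathlib import Path

def read_worksharing_log(slog_path):
    """Read a .slog file together with its .slog.bak sibling, if present.

    Hypothetical helper: prepends the rotated backup so events from
    earlier days aren't lost when the current log gets wiped.
    """
    slog = Path(slog_path)
    bak = slog.parent / (slog.name + ".bak")  # e.g. model.slog.bak
    lines = []
    for path in (bak, slog):  # backup first, then the current log
        if path.exists():
            # Assumed UTF-16 encoding; decode errors ignored for robustness
            text = path.read_text(encoding="utf-16", errors="ignore")
            lines.extend(text.splitlines())
    return lines
```

The combined line list can then be fed straight into whatever open-duration calculation you are already doing.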
Yeah, it's not ideal admittedly! I'm assuming there isn't actually a worksharing event triggered when the model finally opens, so there's no timestamp?
Thanks for the .bak file tip, will look at that.
@paulstevens90 You were absolutely right about the WSConfig part. I re-added it to the load duration calculation. Will publish the package on the package manager soonish.