Tracking Used Graphs

I’m creating a lot of Dynamo scripts lately and I believe my company will start implementing them soon. I work for an 800-ish employee company and would like to track who uses them and when.

Idea 1) I first thought about having each script send an email to a dedicated Gmail account with any associated data. I was going to use @john_pierson 's Rhythm, but to no avail (issue here: Send email by Python?).

Idea 2) Next was to use @Radu_Gidei 's DynaSlack to send a Slack message; however, those nodes are locked and each user would have to download that package. I know I can have their package manager look to a network drive, but I’d like to keep it as simple as possible.

My main question, then: is DynaSlack the best possibility? Are there any other ways to have scripts send data back to you?


Yeah, I actually initially built the email node with this kind of stuff in mind. Something changed with Google’s authentication and it was dead in the water (even after enabling less secure apps). After this happened I kind of abandoned the project, unfortunately.

That being said, you could even do something like have users write something locally and then have that get injected into a server or database?

If, as John says, sending an email with the SMTP protocol isn’t working, then perhaps you can find a way to enable the Google API, create an app, and then use OAuth2 to send emails. That’s a pretty secure protocol. Here’s an example of the actual code to send emails, but you will need to set up the Google API first.

https://stackoverflow.com/questions/24195508/smtp-and-oauth-2
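Something along these lines once you have an access token from your Google API app (just a sketch; the account addresses, token, and script name are placeholders):

```python
import smtplib
import base64

# Placeholders: the dedicated account and an OAuth2 access token obtained
# from your Google API app (Python 2 style, to match Dynamo's IronPython)
user = "dynamo-logger@yourcompany.com"
access_token = "ya29.your-oauth2-access-token"

# Gmail's XOAUTH2 string: user and bearer token separated by \x01 characters
auth_string = "user=%s\1auth=Bearer %s\1\1" % (user, access_token)

smtp = smtplib.SMTP("smtp.gmail.com", 587)
smtp.ehlo()
smtp.starttls()
smtp.docmd("AUTH", "XOAUTH2 " + base64.b64encode(auth_string))
smtp.sendmail(user, "tracking@yourcompany.com",
              "Subject: Dynamo script run\r\n\r\nModelCheck.dyn was just run.")
smtp.quit()
```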

Personally, if you wanted something robust I would set up a database and then send it a packet of data every time a user opens a given definition. Now, that’s not exactly easy and would require some coding. If no coding is the desired approach, then I would actually use Radu’s DynaSlack app and set up a network share for the packages. That’s something you want to do anyway. Trying to do everything with the OOTB nodes in Dynamo is insane, and some of the developers are moving away from Python nodes. cc: @john_pierson

With the database approach I think you can actually pretty easily set up a NoSQL database and then maybe use the Slingshot package by Nate.
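Just to show how small the record can be, here’s a rough sketch in plain CPython with sqlite3 (the path, table, and graph name are made up; inside Dynamo’s IronPython you’d go through Slingshot or a .NET data provider instead):

```python
import sqlite3
import getpass
from datetime import datetime

# Placeholder path to a shared database file; one row per run: who, which graph, when
conn = sqlite3.connect(r"\\server\dynamo\usage.db")
conn.execute("""CREATE TABLE IF NOT EXISTS runs
                (user TEXT, graph TEXT, run_at TEXT)""")
conn.execute("INSERT INTO runs VALUES (?, ?, ?)",
             (getpass.getuser(), "ModelCheck.dyn", datetime.now().isoformat()))
conn.commit()
conn.close()
```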


Unfortunate about Google’s authentication, but it was cool while it lasted.

I like that idea. Could it be as simple as writing to a network-shared Excel file? Or were you thinking of something else?


Why not write it to a database? It’s a lot less disruptive than receiving a message every time somebody runs a script. Also a lot easier to generate meaningful statistics from the stored data…


Definitely. Moving to ZeroTouch has changed my life.

I love this idea. This can start to lead to some great stuff like Don Rudder’s AU class.
http://au.autodesk.com/au-online/classes-on-demand/class-catalog/classes/year-2016/revit/sd20868


Although I have little experience with OAuth2, after reading through it, it doesn’t seem horribly hard to implement. The database method sounds super sexy; however, I’m afraid I’m not quite at that point skill-wise to be able to implement it. DynaSlack sounds like the easiest since, you’re right, I should set up a network share anyway.

Actually, looking through what @john_pierson posted on Don Rudder’s AU class, it doesn’t look undoable. I’d have to pick up JSON and C#, but that class looks like it covers most of the stuff.

@Andreas_Dieckmann, with the email idea I had a specific Gmail account set up for it, not my personal one. However, at a certain point it would have been pretty funny to get a beep on my personal Gmail whenever one of my scripts was used. Kind of like crack.

I guess I’ll attempt a database and some C# with John’s suggestion and see what I can make of it. If not, I’ll use DynaSlack as an easy fix or in the interim. Thanks everyone for the ideas.

The database approach is really easy with Slingshot package. I use it for usage tracking of Dynamo scripts as well.
FYI: I recently added a couple of nodes to Clockwork that you may find useful if you want to know more about the machines that are running your scripts (pulling data on CPUs, memory use etc.). New package version will likely be released by the end of the month.
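In the meantime, a Dynamo Python node can already pull some of that machine data straight from .NET; this is just an illustrative sketch, not how the Clockwork nodes are implemented:

```python
# Inside a Dynamo Python node (IronPython): basic machine/session info
# is available straight from .NET, no extra packages needed.
import clr
clr.AddReference("System")
from System import Environment

OUT = [
    Environment.MachineName,                 # machine name
    Environment.UserName,                    # logged-in user
    Environment.ProcessorCount,              # logical CPU count
    Environment.OSVersion.ToString(),        # OS version string
    Environment.WorkingSet / (1024 * 1024),  # working set of this process, MB
]
```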


And now I begin stalking the package manager website accordingly…

I briefly tested the idea of writing the user name, date, and run time to a text file a while back. Sadly, I got pulled off onto other, less fun stuff. The database is a good idea and seems way more powerful, so I may look into that soon. Let us know how it goes, @nathaniel.g.macdonal.
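For reference, the text-file version really is only a few lines (sketch only; the network path is a placeholder, and simultaneous writes from several users can collide, which is one argument for the database approach):

```python
# Inside a Dynamo Python node: append one line per run to a shared log file.
import clr
clr.AddReference("System")
from System import Environment, DateTime

# Placeholder path on a network drive
LOG_PATH = r"\\server\dynamo\usage_log.csv"

# Run time in seconds, e.g. measured elsewhere in the graph and wired into IN[0]
run_time = IN[0] if IN[0] is not None else 0.0

# One CSV line per run: user, timestamp, run time
with open(LOG_PATH, "a") as log:
    log.write("%s,%s,%.1f\n" % (Environment.UserName,
                                DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss"),
                                run_time))

OUT = "logged"
```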

Hey everyone, a few thoughts here:

1. OAuth2
I’d recommend steering clear of OAuth2 if you have limited experience with web work and no desire to go down a deep rabbit hole. You’ll be dealing with expiring tokens and all sorts of issues. Personally, I find Google’s APIs quite nasty to work with.

2. DynaSlack
There are a few people around who have used DynaSlack for precisely this; see here by @Jesper_Wallaert and here by @Martin_Romby. Having caught up with them at BiLT, I found out you can export all data from a Slack channel to CSV, so @Jesper_Wallaert is parsing that into charts. The flow is DynaSlack > Slack (muted channel) > CSV export > PowerBI/whatevs. I’m sure they’ll happily chime in with more guidance; there’s some really nice work there!

3. Google Analytics
If you can set up a simple page on a server, you could use the built-in Web.Request node to fetch a specific URL, either appending URL parameters to it or simply building up the URL. You would do both of those on the Dynamo side with simple string operations. On that page you’d have Google Analytics set up, so all you have to do to get charts is pass some data to it. I imagine you could even host that HTML page with GitHub static pages? There’s a thought…

Examples:

URL: http://your-server-here.com/dynamoLibrary/standards/modelCheck.dyn
parameters: http://your-server-here.com/dynamoLog.html?dyn=MyScript.dyn&user=Radu&date=231017
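Building that parameter string on the Dynamo side could look something like this (sketch only; the page address and script name are the placeholders from the example above, and a real user name may need URL encoding):

```python
# Inside a Dynamo Python node: build the logging URL as a plain string.
import clr
clr.AddReference("System")
from System import Environment, DateTime

# Placeholders taken from the example above
base = "http://your-server-here.com/dynamoLog.html"
params = "?dyn=%s&user=%s&date=%s" % ("MyScript.dyn",
                                      Environment.UserName,
                                      DateTime.Now.ToString("ddMMyy"))

# Feed this to the OOTB Web.Request node (or fetch it from Python directly)
OUT = base + params
```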

4. DynaWeb
If you can set up a server/service, you could use the DynaWeb package (shameless plug, I made it) to send very specific web requests to a service. For example, you could add a row to a Google Sheet, send a request to Zapier, insert a row in a database that has a REST service, etc. I’m sure there are enough services out there that offer analytics as a service using webhooks/URLs/REST (check out Keen.io, but you could also use the Google Analytics API directly). You can also use this to set up idea #1 or #3 if you want better control over all of that.

5. Python route
You can’t use the requests library in IronPython, so it’ll have to be something else or vanilla IronPython. There’s plenty of code out there on how to call/get a URL, so you can use this to implement any of the above. Maybe @Gui_Talarico or some other Python-savvy folk can help here; I’m not into it.
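One way that avoids external libraries entirely is to go through .NET from the Python node; this is just a sketch using the placeholder URL from idea #3:

```python
# Inside a Dynamo Python node: fire a GET request via .NET, no external
# Python libraries required. The URL is the placeholder from idea #3.
import clr
clr.AddReference("System")
from System.Net import WebClient

url = "http://your-server-here.com/dynamoLog.html?dyn=MyScript.dyn&user=Radu&date=231017"

client = WebClient()
try:
    # DownloadString does a GET; UploadString(url, data) would POST instead
    response = client.DownloadString(url)
finally:
    client.Dispose()

OUT = response
```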

Hope the above helps or at least provides some discussion points :slight_smile:

Radu


:raised_hands: :vulcan_salute:


I presume you’re referring to the fact that it’s not a custom node and you can’t open it up to see the code inside?
That’s because it’s a compiled zero-touch node, but if you want to delve into it, all the code is available to view in the package’s repository on GitHub:


Thank you very much. Given I haven’t gotten into much web stuff, DynaSlack seems like the easiest/most efficient way. Like you said, I can take the CSVs and plop them into PowerBI and have a whole bunch of colorful graphs to show how successfully the graphs are being implemented. Thanks for making the package and for all the work you’ve put into the community.
