Lifting Python code out of custom package nodes?

I’m working on a batch of custom tools and scripts to use across my practice, and running into the inevitable issue of having all users download the relevant packages.

I know that a lot of custom nodes have the ability to access the code behind them. Is there any issue with lifting this code out of the node itself and into a Python node? I’ve tried it on a couple of nodes and it behaves as desired (I understand this won’t be the case with all), but is this an acceptable method, or is it frowned upon?

I’m not looking to pass off work as my own, just looking to streamline my process.

@dan.buckingham ,

I would recommend looking at GitHub. A lot of users manage their content there, e.g. ClockWork…

KR

Andreas

1 Like

That is going to depend on the individual package developer’s TOS. Most of the major packages are fine with just about anything that credits the original developer. You’ll even see that some of the published packages contain nodes that already credit other package authors. But it’s always your job to confirm you’re using the content correctly.

3 Likes

I would also add to Nick’s point that you will have future proofing issues with your chosen method.

Say some methods within the API change or update: the package author will capture all of this as they develop their package. But since you have extracted the Python, tracking those changes is a job you will have to do retrospectively, as the errors occur.

I would personally suggest downloading packages to a server location then asking your IT department to push all the packages from this location to all users’ C: drive. This should be well within their capabilities…

If you do not have an IT department it is something you could script yourself in Dynamo then ask all users to run the script.
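If you do go the self-scripted route, the copy step itself is small. Here is a minimal sketch of what such a Dynamo Python node could do; the server and local paths in the comments are placeholder examples, not your actual environment:

```python
# Sketch of a package-sync script. Copies any package folder found in a
# shared server location into the user's local Dynamo packages directory
# if it isn't already there. Paths shown in the comments are examples only.
import os
import shutil

def sync_packages(server_dir, local_dir):
    """Copy each package folder from server_dir into local_dir,
    skipping packages the user already has. Returns the names copied."""
    copied = []
    if not os.path.isdir(local_dir):
        os.makedirs(local_dir)
    for name in os.listdir(server_dir):
        src = os.path.join(server_dir, name)
        dst = os.path.join(local_dir, name)
        if os.path.isdir(src) and not os.path.exists(dst):
            shutil.copytree(src, dst)
            copied.append(name)
    return copied

# Example usage (adjust both paths to your environment):
# server = r"\\server\Dynamo\Packages\2.13"
# local = os.path.expandvars(r"%APPDATA%\Dynamo\Dynamo Revit\2.13\packages")
# OUT = sync_packages(server, local)
```

Note this only fills gaps; it doesn't overwrite packages the user already has, so updated packages on the server would need extra handling (e.g. a date comparison before copying).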

3 Likes

Thanks, I have a script that copies package folders from a common server to users local PCs, but of course this is one more button that certain users “simply don’t have time” to press…

This just kicks the problem down the line. Eventually ALL of that Python will need to be updated for a new version/API change/whatever breaks the current code. There is no such thing as code which doesn’t have to change - even the calculator in Windows is drastically different underneath than it was back in 1993.

When that change is forced, you now have to open EVERY SINGLE GRAPH with Python in it, do a save-as to get a new .dyn, and edit the Python to work with the new environment. Then you have to make sure that users point at the right .dyn for their environment (i.e. a 2021 user should point at the graph for 2021, etc.). And they have to always go to the right version, unlike packages, which are ‘set once and forget’.

Beyond that, every python node will have to be edited for the particular lacing strategy of the situation. If the graph worked with one list of doors, but now you want it to process a list of doors and a list of windows you’re writing new code (and now you have 101 graphs to deal with). And that issue with ‘two lists’ will require specialist knowledge (likely that only you have) of list handling in Python to fix, not a random Dynamo user who could just check “@L2” on the node.

Lastly, you can’t address zero touch nodes this way, which means the following packages can’t be used in your environment:

  • Rhythm
  • Archi-Lab
  • MeshToolkit
  • Pattern toolkit
  • Dynamo Text
  • Refinery Toolkit
  • Civil 3D Toolkit
  • Camber

That list includes some really heavy hitters… Basically 4 of the top 8 packages for Revit aren’t viable, the three top packages for Civil 3D, and most of the top packages for Generative Design are no-goes for your org in perpetuity until you teach all your users how to leverage the package manager, or you start configuring their environments.

Instead, I recommend you deploy the packages just like you deploy other content.

  1. Build a ‘source’ directory of packages for each environment you have (one for Revit 2021, one for 2022, one for 2023, one for 2024, one for Civil 3D 2021, 2022…)
  2. Utilize Robocopy, Group Policy logon scripts, installers, or other ‘push’ techniques to shift content onto users’ systems.
  3. Provide a ‘hot link’ to allow updating the content in their packages directory on demand.
  4. Build one directory of graphs built to work with the packages deployed in any given environment.

1 Like

I’m sure you know this already but, this is a horrible reason to reduce package use. You’re throwing out all the reasons packages are helpful and efficient because some people don’t want to follow the necessary steps to use automation. It’s not a valid request. It’s a non sequitur. I would strongly push back on this.

2 Likes

Thanks all, I’ll stick to using the packages & nodes as intended!

Regarding Rhythm specifically:

Even though I use C# primarily, I also provide some of my nodes as python scripts for those interested.

Although, as @jacob.small said, you’re the one on the line for future proofing those python scripts as it’s not code I “ship” per se.

5 Likes

Just a suggestion that might suit you if you don’t mind the pain: you could stop using Dynamo packages and embed Python script nodes directly within the .dyn file, then update that same script code over time, adding conditions that check the current Revit version and run the appropriate variant, so you don’t need a separate .dyn for each Revit version.
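As a minimal sketch of that idea: the version cut-offs and variant names below are hypothetical examples, and in a real Dynamo-for-Revit node the version string would come from the Revit API (e.g. via RevitServices, `DocumentManager.Instance.CurrentUIApplication.Application.VersionNumber`) rather than being passed in:

```python
# Hypothetical sketch of branching on the host Revit version inside a
# single Python node, so one .dyn can serve several releases. The actual
# version lookup is stubbed here so the branching logic is testable.

def pick_variant(version_number):
    """Return which (hypothetical) code path to run for a given
    Revit version string such as "2021" or "2023"."""
    version = int(version_number)
    if version >= 2022:
        return "use_ForgeTypeId_units"    # example: newer unit API path
    elif version >= 2021:
        return "use_UnitTypeId_shim"      # example: transitional path
    else:
        return "use_DisplayUnitType"      # example: legacy path
```

The trade-off is exactly the future-proofing burden described earlier in the thread: every branch in that conditional is now yours to maintain.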

You can use the plugin Orkestra to distribute the scripts to all users.

Most package authors won’t mind as long as you handle any errors that may arise, and ideally keep any comments in the core node that attribute the source. Most packages with significant effort behind them will list URLs or author names at the top using # comments. Generally, if a license is involved it will either be in the node or in a link from the node comments (if not the package description); otherwise they’re really not giving people an easy way to check that.

Personally, you’re welcome to lift whatever you want from Crumple if it helps. Most nodes are just a Python block in a node, with the intent that people can lift them out both to learn about the nodes and to modify the code in situ to suit more fringe applications. Keep in mind you will sometimes lose iteration/list-level behaviours and assumptions, depending on how the input structures are noted for the nodes.

The main no-no, which I think most package authors will agree on, is lifting code into a secondary package that becomes a chimera package, stitched together from multiple packages. Some firms do this so their users only have to install one company package, but it ultimately just puts a bad package out there. It’s pretty disrespectful and will inevitably lead to unmaintained code with breadcrumb trails leading back to authors who have since updated and maintained their own packages, and who will probably have to tell whoever stumbles across their code to go bother the maker of said chimera (who inevitably stops maintaining it anyway).

2 Likes

Without knowing your company structure, I’m going to give some feedback here. If yours is a small organisation with 20-25 people using Dynamo across the organisation, this might work, but if you are bigger you are going to need buttons, that is to say pyRevit or nodes coded in C#, so people can just push a button.

Most people using Revit are hesitant to use Dynamo since they don’t understand it.
It works great in small groups where only a couple of people use it, but getting a larger group to use it doesn’t work very well, since the packages need to be maintained and the group needs to understand how to use the different nodes.

1 Like

Why not use a logon script to automatically copy the files when users log in on their PCs?

To deal with the hassle of updating and maintaining packages, we have a logon script that checks whether our shared network folder has packages that are missing from (or newer than those in) the default user package location, and copies everything automatically to the user’s folder.
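The “missing or older” check at the heart of such a script can be sketched like this. This is a pure-Python illustration using folder modification times as the staleness test; a robocopy call in the actual logon script achieves the same effect, and the folder layout is assumed:

```python
# Sketch of the "copy only if missing or older" decision a logon script
# can make per package, comparing folder modification times.
import os

def needs_update(server_pkg, local_pkg):
    """True if the local package folder is absent, or older than the
    server copy, judged by each folder's modification time."""
    if not os.path.exists(local_pkg):
        return True
    return os.path.getmtime(server_pkg) > os.path.getmtime(local_pkg)
```

A real script would loop over the package folders on the server, call a check like this for each, and copy only the ones that return True, so the logon delay stays small when nothing has changed.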

1 Like

My practice is approx 80 people, with perhaps 60 Revit users. I’ve created a ribbon via pyRevit which I’m trialling with a few users, but I’ve made this using Dynamo scripts as I’m not strong with Python yet.

This would probably be the neatest and most efficient solution. It’s a bit beyond my current knowledge, so it will need some researching.

Thanks Gavin, I appreciate that. In fact, your Export to PDF node was one that I trialled. I think on balance I will keep the nodes as intended, and look into more efficient ways of distributing packages. Orkestra looks promising.

1 Like

Your IT team should be able to help with building that into your Group Policy logon scripts, but this article should provide some self-help: Using Startup, Shutdown, Logon, and Logoff Scripts in Group Policy | Microsoft Learn

You could also look at the robocopy example around 27:15 here: https://m.youtube.com/watch?v=FKgNmMlaChc&list=PLdlF7MirPEC2yNFTGymESd3t7Xosfk9c2&index=2&pp=iAQB

4 Likes