Converting a GeometryObject to its geometry type. IronPython type casting?

I’ve asked this once before, but never got an answer and have encountered the issue again.

Basically, I want to get the Center Left/Right reference plane out of a family. So I use the GetReferenceByName method, for example, to pull the reference, then convert that with GetGeometryObjectFromReference, which leaves me with… a GeometryObject that I can’t seem to do anything with.

Previously someone suggested I take a look at Genius Loci’s Wall Edges References node; however, this issue doesn’t appear to happen with walls. I’m thinking it’s specific to FamilyInstances that the method returns a plain GeometryObject rather than an instance of whatever derived class the geometry actually is.

If I were in C#, I would be able to use the as keyword, so far as I’m aware, but I don’t know how to translate that into IronPython. I’ve seen some people use .Convert, but haven’t had any luck with that myself.

I’ve realized I can create a sketch plane using the reference pulled from the family instance, so I can use that here. Still curious if there’s a way to convert the GeometryObject though.
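For what it’s worth, the usual answer here is that (Iron)Python doesn’t need C#’s as at all: the object you get back already carries its runtime type, and you can test it with isinstance against the imported API classes. Below is a minimal pure-Python sketch of that idea; the class and method names mirror the Revit API, but these are toy stand-ins defined locally, not the real classes:

```python
# Toy stand-ins for illustration only. GeometryObject and Curve are real
# Revit API class names, but these local definitions are NOT the real classes.
class GeometryObject(object):
    pass

class Curve(GeometryObject):
    def GetEndPoint(self, index):
        # Dummy implementation: return a fake point tuple.
        return (float(index), 0.0)

def get_geometry_object_from_reference():
    # In C# this method's return type would be GeometryObject,
    # which statically hides the derived type. Python has no such
    # static type: the caller just gets whatever object this is.
    return Curve()

geo = get_geometry_object_from_reference()

# No cast needed -- the runtime type is already Curve:
print(type(geo).__name__)      # Curve
print(isinstance(geo, Curve))  # True
print(geo.GetEndPoint(1))      # (1.0, 0.0)
```

If IronPython genuinely reports only the base class, calling obj.GetType() (the standard .NET reflection method available on any .NET object) should still reveal the true runtime type, which can help confirm whether you actually have a derived geometry object or a bare GeometryObject.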

Hi,

Out of curiosity, why don’t you use the Genius Loci nodes that do exactly what you’re after?

get the Center Left/Right reference plane

They contain the keyword FamilyInstance.


For starters, I wasn’t aware of the node. However, I’ve also moved into writing my scripts exclusively in Python. As I got more into using the API, I eventually found I prefer to stay in the Python space rather than jump in and out.
Also I suppose I just like writing my own code, and the independence that comes with it.

This, conveniently, also really simplifies deployment. I don’t have to deal with making sure everyone’s Dynamo is pathed properly and/or has the needed packages installed. I realize this has since been addressed in 2023, but we have yet to have projects that use 2023.

It also means you’ll need to re-write and re-test all of your .dyns between each Revit build and deployment. A few years of helping firms adopt full-scale integration has taught me that this just kicks the can down the road into a far worse condition rather than simplifying things.

Best bet is to wrap the Python in a custom node, package that into a custom package locally deployed for each Dynamo build, and distribute that to all users at log-in or via other robocopy means (something your IT team should be able to do without batting an eyelash). This allows altering your Python contents for a given build to address the many Revit, Revit API, add-in, and Dynamo changes which can occur with any given release or update.


It also means you’ll need to re-write and re-test all of your .dyns between each Revit build and deployment.

I’m aware; it’s a fair criticism. The ultimate goal here is to move out of Dynamo into an actual addin, at which point all of that will be required anyway, so far as I’m aware. I just need to find time to finish learning the basics of C# and then actually foray into the world of addins.

I wouldn’t be surprised if not all my current scripts got converted and some remained in Dynamo, but at that point I’m going to have to keep watching for changes in the API regardless.

Edit:
Moreover, it was mostly just something of an added benefit, not the main reason I tend to write all my own code. I’ve learned significantly more by doing that than I feel I would have using pre-made packages.
There’s also always the possibility that a dev drops support for their package, or is slow to update it. If an important script relies on said package, I’m stuck between a rock and a hard place until it’s fixed or an alternative is created. Less of a sure eventuality than API changes breaking something, but still a possibility.