This is my Bounding Box:
Nothing at the origin.
The above example is different from my first post, but the navigation troubles are the same. It especially happens when panning; the geometry flies off to who knows where.
@John.DeLeeuw I haven’t seen this behavior, or perhaps I am not understanding it. Can you do a quick screencast recording illustrating the behavior?
@Michael_Kirschner2 does this ring any bells for you?
Woah. I haven’t seen this behavior previously. Wondering if the preview geometry is calculating the full length of an arc here.
@John.DeLeeuw The default geometry scaling in Dynamo for Civil 3D is on Medium for a reason: when extracting feature lines from corridors, you can get the wrong representation if you have it set to Large or Extra Large. It has been one of the well-known issues in CivilConnection as well since Geometry Scaling was introduced.
There is also a reason why people set the geometry scaling to Large.
I agree with @John.DeLeeuw that navigating with this setting is not pretty. You say it is a well-known issue; will it be solved one day?
I’ve seen tips that users should correct large coordinates to the WCS origin and shift them back after the process, but that is not a real solution.
This works for Revit workflows, but for operating on the data (e.g. debugging) it will not work in Civil 3D.
The panning is annoying but is not the biggest issue IMHO: the definition of the PolyCurves is affected.
I don’t have any information on the resolution of this bug: LINK
@Paolo_Emilio_Serra1, this is what Dynamo is telling me to do when I select an object in my Civil 3D CS:
@John.DeLeeuw we need to cope with the warning for the time being.
I appreciate the explanations in this thread, but it seems to be a denial of the fact that in Civil3D we work with real world coordinates.
The actual area covered on a project will be ‘medium’, but the script won’t run unless it’s set to Extra Large … and then you get these branching errors, or other apparent glitches where a continuous polycurve is treated as if it has gaps.
We have from 0 to 100,000,000 to play with … but these errors still happen with typical UTM coordinates with eastings around 500,000 and northings around 5,500,000.
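To make the mismatch concrete, here is a small sketch that picks the smallest geometry scaling setting whose working range covers a set of coordinates. The upper bounds are my assumptions based on the numbers in this thread (Medium ~1e4, Large ~1e6, Extra Large ~1e8 working units), not official figures; the helper name is mine.

```python
# Assumed working ranges, largest coordinate each setting can hold.
RANGES = [("Medium", 1e4), ("Large", 1e6), ("Extra Large", 1e8)]

def needed_scaling(points):
    """Return the smallest (assumed) scaling setting that covers the data."""
    extent = max(abs(c) for p in points for c in p)
    for name, upper in RANGES:
        if extent <= upper:
            return name
    return "out of range"

# A typical UTM point: the northing alone forces Extra Large.
print(needed_scaling([(500000.0, 5500000.0, 120.0)]))  # -> Extra Large
```

Which is exactly the complaint: a 1.6 km site with UTM coordinates gets pushed into the coarsest range even though its actual extent would fit comfortably in Medium.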
We shouldn’t have to shift anything. The Dynamo devs know the limitations of their systems. They should have a mechanism in place where Dynamo recognizes (or is provided with) the bounding box of a task - which would dictate the scaling needed - and silently shifts the numbers if they can’t be digested, runs the process, and shifts them back to present the results.
I don’t understand this quote - when Dynamo encounters a large number it raises a warning suggesting that you change the scaling factor; it does not stop the run or return null. I think the warning should perhaps be more informative and less instructive though, since, as you point out, in some cases the solution is not to follow its advice.
Assuming you are measuring in feet, Extra Large coordinates will get you 3/4 of the way around the world but will round things off to the size of a subway sandwich, while Large will get you across the English Channel but will round stuff off to the size of a 1/2 carat diamond.
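I can't say exactly where Dynamo rounds internally, but viewports (like most renderers) end up in single precision, and the spacing between adjacent single-precision values at typical coordinate magnitudes gives a feel for the sandwich/diamond trade-off. A quick sketch:

```python
import struct

def ulp32(x: float) -> float:
    """Gap between x and the next representable single-precision float."""
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    here = struct.unpack('<f', struct.pack('<I', i))[0]
    nxt = struct.unpack('<f', struct.pack('<I', i + 1))[0]
    return nxt - here

# Spacing grows with magnitude: at a UTM northing of 5.5 million,
# single precision can only resolve steps of half a unit.
for coord in (10_000.0, 1_000_000.0, 5_500_000.0, 100_000_000.0):
    print(f"coordinate {coord:>13,.0f} -> spacing {ulp32(coord):.4f} units")
```

So at 5,500,000 (a typical northing) the representable steps are 0.5 units apart, and at 100,000,000 they are 8 units apart, which is why geometry drawn at raw UTM coordinates can visibly snap and jitter.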
If your geometry is both bigger than the English Channel and needs precision finer than a sandwich, then you have to get creative. Shift your relevant geometry to the origin prior to working with it, and shift it back afterwards. The data will be correct, but you may see display issues in Dynamo.
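A minimal sketch of that shift-and-restore pattern, using plain XYZ tuples (inside Dynamo the same idea would use Geometry.Translate; the helper name is mine):

```python
# Hypothetical helper: move data to a local origin, operate, move it back.
def with_local_origin(points, operation):
    ox, oy, oz = points[0]                        # first point as local origin
    local = [(x - ox, y - oy, z - oz) for x, y, z in points]
    result = operation(local)                     # all work happens near (0, 0, 0)
    return [(x + ox, y + oy, z + oz) for x, y, z in result]

utm = [(500000.0, 5500000.0, 120.0), (500012.5, 5500004.0, 121.5)]
round_tripped = with_local_origin(utm, lambda pts: pts)  # no-op operation
print(round_tripped == utm)  # the shift is lossless for these values
```

Because subtraction of nearby doubles is exact here, the round trip loses nothing; only the in-viewport display suffers from the temporary relocation.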
Serious question that may sound sarcastic but isn’t: how do you account for the curvature of the earth distorting your dimensions on projects which are both ‘smaller than a sandwich and bigger than the English Channel’?
We didn’t make the coordinate systems we are required to work in. Can’t help that we have coordinates in the 500,000 - 1,000,000 range.
The Large range has been working well in most cases for me. I’ll be the first to admit that I’m becoming increasingly confused on this issue since we get varying recommendations on geometry scaling from different Autodesk representatives. Also, why include a geometry range setting in the software if you’re not supposed to use anything but medium?
Your question warrants a dive into surveying principles like map projections, scale factors, and grid vs. ground coordinates. It’s a discussion that is probably outside the scope of this thread, but might make for some good reading.
Thanks for the introductory search terms.
I always enjoy getting to learn about new stuff as the community grows.
I think part of the point is being missed. The coordinates are big because that’s how the UTM coordinate system works, while the actual area of interest is small - much smaller than any area where the curvature of the earth matters.
My specific example involves a pipeline segment just 1.6 km long. I’m trying to ‘thicken’ or ‘offset’ some survey points that follow a linear path adjacent to the pipeline so that I can build a narrow surface with a width that extends a few metres to the other side of the pipeline alignment.
The point is actually to apply some ‘depth of cover’ values to the pipeline, where the survey data is not actually directly on the pipeline polyline (don’t ask why, it’s not relevant). I’m using the Civil3D nodes to drape the vertices of my pipeline on Topo and on my Depth of Cover surface so I can subtract one from the other and put the top of my pipeline at its actual proposed elevation below topo.
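The subtraction step itself is simple arithmetic; here is a hedged sketch with plain lists standing in for the draped elevations (I'm assuming the Depth of Cover surface reports the cover value as its elevation at each vertex, and the numbers are made up):

```python
topo_z  = [101.20, 101.85, 102.40]   # pipeline vertices draped on Topo
cover_z = [  1.20,   1.50,   1.10]   # same vertices draped on the Depth of Cover surface

# Top of pipe sits below topo by the cover depth at each vertex.
pipe_top_z = [t - c for t, c in zip(topo_z, cover_z)]
print(pipe_top_z)
```

In Dynamo the two lists would come from the draping nodes; everything after that is just list arithmetic.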
As part of this, I import the Depth of Cover survey data. It’s UTM coordinates, XYZ, 3 decimals.
I make a PolyCurve from that data.
PolyCurve.ByPoints demands that I use the Extra Large scaling Or It Will Not Run. However, with the Extra Large scaling, It Will Not Run either, giving the “PolyCurves may be branching” warning.
Whether or not you think this is actually running or not is semantics when the output of the node is Null.
Alternatively, to avoid the problem with PolyCurves, I can use the survey data to make a 3D polyline in Civil3D with Polyline3D.ByPoints, and then suck that into Dynamo using Object.Geometry … where again I get the “PolyCurves may be branching” error, despite it most definitely being one single non-branching polyline.
Either way, in the Dynamo background, you can see gaps between some of the vertices of the resulting PolyCurve.
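One way to confirm those gaps quantitatively is to compare each segment's end point against the next segment's start point. Plain tuples here; in Dynamo you could pull the endpoints from Curve.StartPoint and Curve.EndPoint on the curves returned by PolyCurve.Curves (the helper name and tolerance are mine):

```python
import math

def find_gaps(segments, tol=1e-6):
    """Return indices where a segment's end doesn't meet the next start."""
    gaps = []
    for i in range(len(segments) - 1):
        end, nxt_start = segments[i][1], segments[i + 1][0]
        if math.dist(end, nxt_start) > tol:
            gaps.append(i)
    return gaps

segs = [((0, 0, 0), (5, 0, 0)),
        ((5, 0, 0), (9, 3, 0)),
        ((9, 3, 0.01), (12, 3, 0))]   # 10 mm jump before the third segment
print(find_gaps(segs))                # [1]
```

If the reported gap sizes cluster around the representable-step sizes for your coordinate magnitude, that points at precision loss rather than bad input data.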
If Dynamo for Civil3D is going to make us fake (shift) all our datasets just to make the nodes function, or so we can be sure the nodes are actually doing what we expect without rounding off significant digits, it’s going to make Dynamo much less useful.
One method may be to scale your geometry down at the beginning of your graph (say by a factor of 10 or 100), perform your operations, and then scale it back up by the same factor before committing your changes back to Civil3D.
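A sketch of that idea with bare coordinates (in Dynamo the equivalent would be Geometry.Scale about a fixed base point; the factor and helper name here are arbitrary choices of mine):

```python
FACTOR = 100.0

def scale_points(points, factor):
    """Uniformly scale XYZ tuples about the origin."""
    return [(x * factor, y * factor, z * factor) for x, y, z in points]

utm = [(500000.0, 5500000.0, 120.0)]
working = scale_points(utm, 1.0 / FACTOR)   # operate on these near-origin numbers
restored = scale_points(working, FACTOR)    # scale back before writing to Civil 3D
print(restored)
```

Note that unlike a pure translation, multiplying by 1/100 is not exact in floating point, so verify the round trip with a tolerance; in practice the residual error is far below survey precision, but the shift-to-origin approach avoids it entirely.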
Hi - I am trying to understand whether the node is not executing at all, or executing and failing, because the “demand” you describe should not be a demand: the node should run and should not return null (in most cases, even if the numbers are large). But I think the situation you are encountering is that you have coordinates very far away (large numbers) AND you are trying to operate on a small section of the model, as you described, and in that case, yes, the geometry scaling function will not help you at all.
For now I would scale your subset of geometry down manually (or translate it) before the operations you must perform, and back afterwards - then you can decide which scale factor to use. Automatically doing either of these is possible, but there’s no roadmap for that feature at this time.
ok, so apologies for not replying sooner, and thanks for the replies. I did eventually get my script to work after changing a fundamental step in how I attacked my problem. I thought that the script wasn’t working when the geometry scaling warning appeared, but it was something else. I just had to grit my teeth and ignore the geometry scaling warnings.