Revit file size means very little; on its own it tells you effectively nothing. A LOT of data can be generated from very little content.
- Assume a vector takes 3 doubles, so 192 bits to store.
- A mesh of, say, 501 points and 167 faces can therefore be broken down into roughly 0.14 megabits of data.
- Assume a transform is made up of four vectors, so you can store one in 768 bits.
- With the remaining 5.36 megabits of space you can create 6,973 instances of said geometry, which results in something like 3.5 million mesh points.
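A quick sketch of that back-of-envelope arithmetic, with everything in bits. The face storage (four 64-bit indices per face) is my assumption to make the numbers land; it's not a claim about how Revit or Dynamo actually lay meshes out in memory:

```python
# Back-of-envelope check of the numbers above (all sizes in bits).
DOUBLE = 64                        # one double-precision float

vector = 3 * DOUBLE                # a 3D vector: 3 doubles = 192 bits
transform = 4 * vector             # a transform as 4 vectors = 768 bits

# Hypothetical mesh: 501 points, plus 167 faces assumed to store
# four 64-bit indices each (an assumption, not a Revit internal).
mesh = 501 * vector + 167 * 4 * DOUBLE

instances = 6973
instance_data = instances * transform

print(mesh / 1e6)                  # ~0.14 megabits for the mesh itself
print(instance_data / 1e6)         # ~5.36 megabits for all the transforms
print(instances * 501)             # ~3.5 million mesh points drawn
```

The point being: one small mesh plus a few thousand lightweight transforms multiplies out into millions of rendered points for well under a megabyte of raw data.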
All those numbers are way off, as they don't take everything into account. For example, the available room shrinks way down when you add the other stuff in the file (materials and textures); our hypothetical file is standard text, so things grow again when you take into account Revit's roughly 1:20 compression ratio, then shrink again when you take into account everything else it stores in there. But no matter how you slice it: file size isn't directly related to performance these days.
As far as why it’s slow: the conversion and expansion of that geometry from Dynamo to Revit can be slow (multiple sets of …). Then you’re doing intersection testing in a cross-product structure, which will be slow past a certain level of complexity, and then you’re converting to a bounding box and then a cuboid… it’s a lot of added back and forth, which is going to slow stuff down. You may also have Element Binding at play, and converting those cuboids to family instances isn’t going to be quick, as Dynamo has to convert the geometry to SAT (so first it tessellates, then exports to a fancy text file), then Revit needs to start the family, then import the SAT, then load the family into the project…
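The cross-product intersection testing is the part that scales worst. A minimal sketch of why, using toy axis-aligned bounding boxes (the helper names here are mine for illustration, not Dynamo API calls):

```python
from itertools import combinations

# A toy bounding box is a pair of 3-tuples: (min_xyz, max_xyz).
def boxes_overlap(a, b):
    (amin, amax), (bmin, bmax) = a, b
    # Overlap on every axis means the boxes intersect.
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

def count_overlaps(boxes):
    # Cross-product testing: every box against every other box.
    # Work grows with n*(n-1)/2, so 1,000 boxes is ~500k tests
    # and 10,000 boxes is ~50 million. That's why it crawls.
    return sum(1 for a, b in combinations(boxes, 2) if boxes_overlap(a, b))
```

Real implementations dodge this with spatial structures (grids, BVHs, octrees) so most pairs are never tested at all, but a naive lacing/cross-product setup in a graph pays the full quadratic price.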
How long would it take you to do all of this by hand?