Checking Element file size

Hi gurus,

Just wondering if there are any packages or nodes that help report a Revit element’s actual size in Dynamo?
We sometimes deal with heavy file sizes, and it helps to be able to target the specific elements that are actually causing them.

I’ve done a fair bit of research online and in Dynamo, but nothing of the sort is popping up.

Cheers,
J

Nice thought…

@solamour @john_pierson any leads…

I don’t recall this being possible unless you get the families outside the model. Most firms and apps that do this save the families out to Explorer and then verify their sizes that way. Revit has an option under Save As to save out all the families in a project.

Thanks for the response!
Well, it’s not quite the family listing I’m after, but the actual modelled instances instead. These do make up quite a huge portion of the model size in general, and there’s generally no way to measure their size.

First up: the Model Checker from the BIM Interoperability Tools is free and does a good job of reporting this. I believe you can even set up a rule to count and/or list families over a given size. I’d recommend using that (or a similar tool) if identifying large families is what you’re after. It’d be significantly faster than Dynamo (purpose-built tools are typically faster than Dynamo as they operate at a lower level and only have to do the one thing they do, rather than EVERYTHING, which is what Dynamo aims for).

Second up: family size has far less impact on file performance than most people think. I’ve proven this in a few demos over the years by building a 500 MB family which was well constructed and swapping it out for a 25 MB family which was not, on a job with a few hundred instances of each. Edits to the former were quick, while edits to the latter stunk.

Lastly… If you did want to do this via Dynamo, you’d have to do the following:

  1. Get a list of all the families (not the types or instances but the families) in the document.
  2. For each family, perform the following loop:
  • Open the family document
  • Save the family document to a temp location on disc
  • Close the family document
  • Get the size of the temp file
  • Delete the temp file

This is best achieved as one action per family, as otherwise you’d have to open ALL THE FAMILY DOCUMENTS AT ONCE, then save each (passing around the list of all the open families), then close them all (finally freeing up that memory). That would put pretty much any CPU into a “TOO MANY THINGS WITHOUT ENOUGH MEMORY” state, and would take a lot longer than doing what you need to do on each family in a loop (this is also why I’m not a fan of passing documents between nodes; at scale the method is less effective).

The loop could be written in Python or as a zero touch node. While you’re at it, you might want to add auditing, purging a few times, enabling shared status, setting the room calculation point to true, checking stuff like categories/subcategories, and loading back into the project. The capability to add those sorts of functions makes this method more useful than the BIM Interoperability Tools method.
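
If it helps, here’s a minimal sketch of that loop as a Dynamo Python node. The temp directory, the skipping of non-editable families, and the output shape are my assumptions rather than a tested tool:

    import clr, os
    clr.AddReference('RevitAPI')
    from Autodesk.Revit.DB import FilteredElementCollector, Family, SaveAsOptions
    clr.AddReference('RevitServices')
    from RevitServices.Persistence import DocumentManager

    doc = DocumentManager.Instance.CurrentDBDocument
    tempDir = os.getenv('TEMP')               # temp location on disk (assumption)
    saveOpts = SaveAsOptions()
    saveOpts.OverwriteExistingFile = True

    results = []
    for fam in FilteredElementCollector(doc).OfClass(Family):
        if not fam.IsEditable:                # in-place/system families can't be opened
            continue
        famDoc = doc.EditFamily(fam)          # open the family document
        tempPath = os.path.join(tempDir, fam.Name + '.rfa')
        famDoc.SaveAs(tempPath, saveOpts)     # save it to the temp location
        # (auditing, purging, shared status etc. would slot in here, before closing)
        famDoc.Close(False)                   # close without saving changes
        results.append([fam.Name, os.path.getsize(tempPath)])  # size in bytes
        os.remove(tempPath)                   # delete the temp file

    OUT = results

Doing everything inside the one loop iteration is what keeps only a single family document open at a time.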

I think I discussed, if not demonstrated, opening a document and performing a loop of actions with one of your colleagues back when I worked with you guys on a regular basis, so reach out if you really want to achieve this with Dynamo.

Actually Jacob, Gavin, thanks for that.
I do use the first, the second, and the external method for obtaining family sizes…
So, this thread is not about the family itself but the modelled elements in Revit.

Like Jacob has mentioned, the family sizes are quite efficient.
Even if we swap out a heavier family for a slightly leaner one, it doesn’t show a significant reduction in model size. I’m assuming that it’s the amount of data pumped into the model itself that’s causing the large spike in model size.

Hence, this is more about…

  1. Can we actually obtain the file size of, let’s say, “a wall instance of 20m length” versus “a wall instance of 2m length”?
  2. My plan is to use the node to report on the culprit so it can be modelled properly (whether it’s a warning or a floor with a 1,000,000 m² area).

Hope that makes sense.

P.S. Model Checker is perfect for listing family file sizes. I run it pretty frequently, in case anyone is after that solution.

The ‘file size’ impact of a wall is pretty much zero. In fact, this is the case for pretty much any standard system family. Element associations define most of the content; the only stuff stored on the wall is its instance parameters and properties. Element.Parameters and the list of properties in the API docs should cover most of that. The file size difference between a 2m wall and a 200m wall should be 0, but the impact on performance won’t be, and even that will be VERY context dependent (i.e. the 200m wall will add more obstructions to calculation in any given 3D view, but the performance in a schedule view would be the same).
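
For example, here’s a quick (illustrative, untested) Dynamo Python snippet to dump what’s actually stored on an instance, assuming IN[0] is a Dynamo-wrapped element such as a wall:

    import clr
    clr.AddReference('RevitNodes')
    import Revit
    clr.ImportExtensions(Revit.Elements)

    elem = UnwrapElement(IN[0])  # unwrap the Dynamo element to the Revit API element

    # Each Parameter carries a definition (its name) and a stored value;
    # AsValueString covers unit-bearing values, AsString covers text.
    OUT = [[p.Definition.Name, p.AsValueString() or p.AsString()]
           for p in elem.Parameters]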

To identify the impact on file size of having or not having an element, you could try a loop like this (a rough Python sketch follows the list):

  • Open the file detached, discarding worksets
  • Do a save as, get the file size, and append it to a list or a new CSV
  • Get all element instances
  • For each element:
    • Use a pre-delete method to confirm it can be deleted; if not, skip it
    • Delete the element
    • Save
    • Pull the file size
    • Append the file size to the CSV or list
  • Take the CSV or file size list, and use a List.DropItems to build two new lists, one without the first item and one without the last (DSCore.List.DropItems(lst, [1,-1]@L1<1>); should do the trick).
  • Get the difference between the values in each list. This is the ‘impact’ which deleting that element had on the file size.
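
A hedged sketch of that loop as a Dynamo Python node, assuming IN[0] is the path to the file; the temp path and variable names are illustrative and this isn’t battle-tested:

    import clr, os
    clr.AddReference('RevitAPI')
    from Autodesk.Revit.DB import (FilteredElementCollector, Transaction,
        SaveAsOptions, ModelPathUtils, OpenOptions, DetachFromCentralOption,
        DocumentValidation)
    clr.AddReference('RevitServices')
    from RevitServices.Persistence import DocumentManager

    app = DocumentManager.Instance.CurrentUIApplication.Application

    # Open the file detached, discarding worksets
    mp = ModelPathUtils.ConvertUserVisiblePathToModelPath(IN[0])
    opts = OpenOptions()
    opts.DetachFromCentralOption = DetachFromCentralOption.DetachAndDiscardWorksets
    doc = app.OpenDocumentFile(mp, opts)

    # Do a save as and record the baseline size
    tempPath = os.path.join(os.getenv('TEMP'), 'size_probe.rvt')  # illustrative
    saveOpts = SaveAsOptions()
    saveOpts.OverwriteExistingFile = True
    doc.SaveAs(tempPath, saveOpts)
    sizes = [os.path.getsize(tempPath)]

    # Get all element instances
    ids = [e.Id for e in FilteredElementCollector(doc).WhereElementIsNotElementType()]

    for id in ids:
        if doc.GetElement(id) is None:                        # already cascade-deleted
            continue
        if not DocumentValidation.CanDeleteElement(doc, id):  # pre-delete check
            continue
        t = Transaction(doc, 'Delete one element')
        t.Start()
        try:
            doc.Delete(id)
            t.Commit()
        except:
            t.RollBack()
            continue
        doc.Save()                                            # save and pull the size
        sizes.append(os.path.getsize(tempPath))

    doc.Close(False)
    OUT = sizes

From there, the DropItems trick above gives the two offset lists to subtract.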

Some notes on the differences: the file size might actually go UP in some cases. This is because Revit files are not typical files; a LOT of data about elements persists in the file even after deleting the element, a thing which was previously simple might become more complex, or new elements could have been added to the file (imagine deleting a corridor wall, causing ALL the rooms to overlap and adding 100 or so new warnings to the file). Cleaning up that excess data is why we have stuff like Purge, Audit, and Compact Central. So while the size impact might not be what you expect, and it likely won’t get you what you’re after, it is a valuable exercise as it will help you understand the structure of Revit - EVERYTHING is connected.

To respond to your list:

  1. Not really; but it could be fun to try, and you’d learn something (just not what you’re after).
  2. The number of vertices and faces in an element does have some impact on performance for anything which exposes geometry (so most things). Mapping of textures onto these will also impact things (although that’s much faster now). The count of vertices can be calculated pretty quickly with something like DSCore.List.Count(Topology.Vertices(Element.Geometry(e)));. It won’t be an exact science in terms of file impact though. DBG lines in journals are another good way to find ‘problem’ elements, but there isn’t an easy ‘find out’ tool without getting deep into journal automation. (A rough Python version of the vertex count is sketched below.)
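
A hedged sketch of that vertex count as a Dynamo Python node, assuming IN[0] is a list of Dynamo elements (names and structure are illustrative, not a definitive implementation):

    import clr
    clr.AddReference('ProtoGeometry')
    from Autodesk.DesignScript.Geometry import Topology

    counts = []
    for e in IN[0]:
        n = 0
        try:
            # Element.Geometry returns Dynamo geometry; solids and surfaces
            # derive from Topology and expose a Vertices property.
            for g in e.Geometry():
                if isinstance(g, Topology):
                    n += len(g.Vertices)
        except:
            pass                   # some elements have no usable geometry
        counts.append(n)

    # Pair each element with its vertex count, heaviest first
    OUT = sorted(zip(IN[0], counts), key=lambda p: p[1], reverse=True)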

Thanks @jacob.small. I think I get what you mean. I’ll give it a go with that method.
Always good to have some new things to experiment with.
