I have created 2 overlapping, near-identical structural floors with the same properties. I want to delete the duplicate, but first I want to understand how Revit identifies duplicates. The only difference is that these 2 floors were created 1 minute apart.
Attached is the screenshot. The IDs are different.
I don’t know the specifics of how or what Revit checks for duplicates, but the warning does get stored in the model. The node you’re using finds duplicate items within a list, which is completely different from Revit’s duplicate-element warnings.
You can instead use the Revit Warning nodes to get exactly what you’re after.
Thanks a bunch. Now I can select duplicate items.
I can easily delete the duplicates by removing index 0 from each list.
What if I have more than 2 duplicates? Is there a way to instruct ‘keep all except 1’?
List.Deconstruct separates the first item in a list from the rest of the items, so it will always leave you with exactly one item.
List.DropItems might be more effective (use @L2 for the list level) as it reduces the data set rather than splitting it. Less stuff for the memory to deal with this way.
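Outside Dynamo, the “keep all except 1” logic of those two nodes can be sketched in plain Python (hypothetical helper names, each sublist standing in for one group of duplicate elements):

```python
# Plain-Python sketch of the two Dynamo nodes applied per sublist.

def deconstruct(items):
    """Like List.Deconstruct: split off the first item from the rest."""
    return items[0], items[1:]

def drop_items(items, n=1):
    """Like List.DropItems with amount 1: drop the first item, keep the rest."""
    return items[n:]

groups = [["A1", "A2", "A3"], ["B1", "B2"]]

# Keep the first element of each group, mark the rest for deletion,
# so every group always retains exactly one survivor.
to_keep = [deconstruct(g)[0] for g in groups]
to_delete = [drop_items(g) for g in groups]
```

Either approach leaves one item per group no matter how many duplicates the group contains, which is why both nodes answer the “more than 2 duplicates” question.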
Thank you guys. Both nodes work fine.
The problem persists with duplicates. I tried to simplify the problem by copying one element 2 times. Now when I check the warnings, it shows indexed copied elements, with one element copied twice. When I copied, Revit asked me to unjoin elements; I tried both ways (with and without unjoining). I think each copying method indexes elements differently, but I’d like to understand what is happening behind the copying mechanism, why it is different, and how I can make the copying methods behave the same.
Ideally it should show me 3 warnings, one for each of the 3 elements.
One more thing: the warning node does not update itself; it has to be toggled manually before it works. Ideally there should be a Refresh button.
One test with two types of duplicate (with and without joining).
I tried another test with one element copied 3 times, all unjoined. The indexing is still confusing; it should be the same for all of them.
As far as I’m aware, the internal warnings only compare two elements for duplicate instances, so you’ll only ever have them in pairs. You would either have to clean them up iteratively or use Python to loop through and group “overlapping” pairs ahead of time. Even then, the second option could end up requiring iteration to cover all duplicates.
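A sketch of that second option in plain Python (hypothetical function and mock element ids, no Revit API): merging the pairwise warnings transitively, so a chain like A-B plus B-C collapses into one group containing A, B, and C.

```python
# Merge pairwise duplicate warnings into groups, so that elements
# linked through a shared member end up in the same group.

def group_pairs(pairs):
    groups = []  # list of sets of element ids
    for a, b in pairs:
        # Find any existing groups that touch either element of this pair.
        hits = [g for g in groups if a in g or b in g]
        merged = {a, b}
        for g in hits:
            merged |= g
            groups.remove(g)
        groups.append(merged)
    return groups

# Mock warning pairs: 101-102 and 102-103 chain into one group.
pairs = [(101, 102), (102, 103), (200, 201)]
groups = group_pairs(pairs)
```

Because each incoming pair can merge any number of existing groups, one pass over the pairs is enough here, though against live warning data you may still want to re-run after each deletion round, as noted above.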
This is correct, and why everyone should always keep on top of the warnings before they become problematic. I believe that it is possible to query the warnings for a given object, and then group all of the items that way;
ie: object A has duplicates B, C, and D; instead of looking for A:B, A:C, A:D individually pull all warnings of type “duplicate” for object A, then grab the element ids for B, C, and D, and take action.
‘Individually pull all warnings of type “duplicate” for object A’: is there a node for doing that? It would be super helpful.
It finally worked, thanks guys. I have to run the graph several times before it cleans everything.
Without getting into the API either.
- Take the first item out of each list using a List.FirstItem set to @L2, and if needed use a List.Flatten node to clear any sublists.
- Convert those elements into ids so they are compatible using an Element.Id node.
- Take the second item from each list using a List.LastItems set to @L2 and if needed use a List.Flatten node to clear any sublists.
- Group all the elements in the second list by the element IDs in the first list.
- Run the delete element node as desired.
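The node chain above can be mimicked in plain Python with mock data (no Revit API; the `(id, name)` tuples are stand-ins for elements):

```python
# Mock warning data: each warning is a pair of elements, where an
# "element" here is just (id, name) for illustration.
warnings = [
    [(1, "Floor A"), (2, "Floor A copy")],
    [(1, "Floor A"), (3, "Floor A copy 2")],
    [(5, "Wall B"),  (6, "Wall B copy")],
]

# Steps 1-2: first item of each pair, converted to its id (the group key).
keys = [pair[0][0] for pair in warnings]

# Step 3: last item of each pair (the deletion candidate).
candidates = [pair[-1] for pair in warnings]

# Step 4: group the candidates by the key element's id, like List.GroupByKey.
grouped = {}
for key, elem in zip(keys, candidates):
    grouped.setdefault(key, []).append(elem)

# Step 5 would feed each group of candidates to a delete-elements node.
```

Grouping by the first element’s id is what collapses A:B, A:C, A:D into one bucket per surviving element, which is why step 4 matters even though each warning only ever names a pair.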
A few notes: you don’t want to just delete the second item in every possible warning, only in the duplicate-items ones (e.g. two walls overlap, you run your code as is, and you’ve just deleted a wall and all its hosted objects!). Add a filter for the warning type to prevent extending the scope of deletion beyond your duplicate items.
The code above might not resolve the case of object C being in a list with object A and object B; simply put, the code may try to delete object C twice. That might not matter, or it might. If it does, look into adding a List.Unique, a List.FirstIndexOf, and a List.GetItemAtIndex node to the canvas. Those three might not resolve the issue, but they’ll be a good start!
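One way to sketch that duplicate-deletion guard (the same idea those unique-item nodes express, in plain Python with mock ids):

```python
# Guard against deleting the same element twice when it appears in
# more than one warning pair (e.g. C paired with both A and B).

to_delete = [102, 103, 102, 201]  # ids collected from all warning pairs

seen = set()
unique_to_delete = []
for elem_id in to_delete:
    if elem_id not in seen:        # keep only the first occurrence
        seen.add(elem_id)
        unique_to_delete.append(elem_id)
```

Deduplicating the flat id list before the delete step means a repeated id is simply skipped, rather than triggering a failed second deletion.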