Dynamo with C4R / BIM 360 Team

Hey there!

Hope everyone is having an efficient Thursday :slight_smile:

Curious to hear your experiences with Dynamo working in the cloud. Let me know your successes and where you see opportunities for improvement.

Thanks :sunny:


Hey @Emily_Dunne, what do you mean by Dynamo working in the cloud?
Did you mean having it interact with web services, or did you imagine using Dynamo on a cloud-based machine (like an AWS or Azure virtual machine)?

1 Like

Given the mention of C4R and BIM 360 Team, I believe it's a request for information on using Dynamo when accessing a Revit model that's cloud-based.

1 Like

If we are talking about Dynamo accessing cloud-based Revit files, it works just fine. I also tested it on cloud-based linked files and it works properly. I managed to access information from a linked model in the cloud, so it is no different.

1 Like

Thanks for the feedback! Yes, I am asking specifically about running Dynamo scripts on a model hosted by BIM 360 Team / C4R and in “the cloud”.

Curious if anyone has encountered unexpected behavior with specific nodes, with accessing data from linked files, or with lag in processing times, for example.

Well, I wouldn’t expect Dynamo to even be able to tell the difference between models hosted on a local network and those on BIM 360, since the latter are locally cached anyway (including all links); the only difference is that the path of the central model starts with BIM360:// .

In practical terms, we’ve not noticed any issues with C4R-hosted models :+1:
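If you ever need to branch on this inside a script, here's a minimal Python sketch. The BIM360:// prefix is taken from the observation above, not from any official API contract, so treat the check as an illustration:

```python
# Minimal sketch: classify a Revit central-model path string by its prefix.
# The "BIM360://" prefix is an assumption based on the thread above,
# not a documented API contract.
def is_cloud_central_path(path: str) -> bool:
    """Return True if the central-model path appears to point at BIM 360 / C4R."""
    return path.strip().lower().startswith("bim360://")

# Hypothetical example paths:
print(is_cloud_central_path("BIM360://Project/Model.rvt"))  # True
print(is_cloud_central_path(r"\\server\share\Model.rvt"))   # False
```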

1 Like

Hi @Emily_Dunne,

We are currently working on our first C4R project.
I’ve created a Dynamo definition which creates floor plans, elevations, and 3D views and places them on sheets for each room specified by a y/n parameter. This runs smoothly on local server-based projects.
When it runs on the C4R project, it seems to commit each view creation to the Revit model separately, and the model then has to regenerate the view - this takes a lot of time.
On a local server-based model, the definition runs a lot faster and only commits the change once!

Is this something that you also are aware of?
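To illustrate why the per-view pattern hurts so much, here's a toy Python sketch. The cost numbers are completely made up; the point is only that the commit/regenerate overhead scales with the number of commits:

```python
# Toy cost model (made-up numbers): compare committing after every view
# creation vs committing once at the end. COMMIT_COST stands in for the
# regenerate/sync overhead described above; CREATE_COST for creating a view.
COMMIT_COST = 50   # arbitrary units per commit + regenerate
CREATE_COST = 1    # arbitrary units per view created

def total_cost(n_views: int, commit_each: bool) -> int:
    commits = n_views if commit_each else 1
    return n_views * CREATE_COST + commits * COMMIT_COST

print(total_cost(100, commit_each=True))   # 100*1 + 100*50 = 5100
print(total_cost(100, commit_each=False))  # 100*1 +   1*50 =  150
```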


I’m on a large C4R model with many people working in it, and I’ve noticed the same thing: it takes much longer and seems to regenerate views more frequently. I’ve also noticed that when someone is syncing (about every 5 minutes in this project, given the number of people in the model), Dynamo seems to pause until the sync is finished, then starts working again. C4R projects are the biggest pain. Why can’t they set up a sync queue, so you aren’t looping through your sync time after time to reload the latest changes as multiple people sync at once?

Schedule syncs and make use of the Communicator to avoid a lot of these issues. Dividing models and managing worksets within them can also help. In my experience, and from the data I’ve seen and feedback I’ve heard, it’s actually a lot faster than Revit Server for similar projects, but users are far more cavalier about BIM 360 projects than they were with Revit Server (or perhaps having a hand in

Out of curiosity, have you confirmed that all users in each project are on the same Revit build? Believe it or not, this is a huge difference maker, and a case of ‘one bad apple’ being able to spoil the barrel, as the bad data syncs slow everyone else down. Considering there haven’t been any updates in 4 months, everyone should be on the current builds.


We require everyone to use the Communicator, but still have some who don’t look before syncing. You are dead on about the issue when everyone isn’t on the same build. We had that issue a couple of years ago when we were on Revit Server, and it was such a mess. I wish they would let us split up our electrical model even more than we currently do; that way there would be fewer people in each of the models.

The only problem I have is retrieving some file-related information, due to the nature of C4R.
File size, for example.

You can get the file size from the FileSizeOnOpen line of the respective journals. Published versions also indicate file size rather easily (and publishing as part of your maintenance is a good idea so you can keep a backup handy).
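A hedged sketch of scraping that value from journal text. The exact journal line format varies by Revit build, so the pattern and the sample line below are assumptions for illustration, not a documented format:

```python
import re

# Hedged sketch: pull the first number from a line mentioning
# "FileSizeOnOpen" in Revit journal text. The journal syntax varies by
# build, so both the regex and the sample fragment are assumptions.
def file_size_on_open(journal_text: str):
    for line in journal_text.splitlines():
        if "FileSizeOnOpen" in line:
            match = re.search(r"(\d+)", line)
            if match:
                return int(match.group(1))
    return None  # no matching line found

# Hypothetical journal fragment, for illustration only:
sample = "FileSizeOnOpen: 245760"
print(file_size_on_open(sample))  # 245760
```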

I tried it, but sadly the performance is really bad.

Yes, I’ve done it on a project. I tried not to run the definitions during my collaboration teams’ peak hours, to prevent crashes or corruption… You see, I experienced some crashes when executing changes across the whole model (changing a parameter of an object while someone is using it at the exact same sync time can corrupt the file)…

My advice is to test Dynamo with this in mind: try not to run definitions when everyone is working on the file, or at least chat with the users about the plugin to make them aware of possible problems… I mean, it’s just like working on your local network: when do you usually run your definitions?

Hope my post was clear :slight_smile: