I crash often. Two or three times a day if I’m lucky. In fact, if I don’t crash in a given day or while building a graph, it either means I’m just using a tool I’ve already perfected, or that I’m documenting what I’ve produced for a webinar, presentation, or delivery to a client. Otherwise, if I’m doing something new for the first time and I don’t crash, I’ll typically kick it up a notch until I do.
To me, crashes are an indication that you are at or beyond the limits of your skillset, your hardware, or the software you are using. That is where I want to be - forcing myself, my tools, and my product to get better. By going out of your way to never crash, you’ll hold yourself back quite a bit. Make many mistakes and learn from them; you’ll be better off.
In general, here are some steps I take to keep my crashes productive:
- Don’t run in Automatic mode unless you’re confident the graph won’t go belly up on you. If it doesn’t work for a single object, it’ll only lock up your system trying to do it with 1,000,000 different ones.
- Start with small, clean datasets and work your way up to bigger ones. If your Revit model is corrupt or has 10,000,000 warnings, programmatic adjustments will grow the issue exponentially.
- Don’t work with files others are actively using, or which are live syncing, while you’re writing a graph. This is common sense - you won’t know why something changed if others are also playing with the files, and someone else could put something in there that causes you to crash.
- Restart Dynamo every hour to free up resources. This is painful sometimes, but it really does help. The unlimited undos stack up datasets like nobody’s business.
- Restart associated programs every 2 hours to ensure things haven’t changed and you’re not chasing a unicorn. Sometimes software acts odd due to a glitch, and chasing an oddity can be very frustrating. I’ve had this happen with everything from Revit to Excel - even Notepad++ has thrown me for a loop before.
- Keep every software and package involved completely up to date, but not into beta (e.g., Revit 2018.2 and Dynamo 1.3.2). Updates almost always add stability, and this is VERY helpful. I’ve noticed a big difference in stability running Dynamo 1.3.2 on Revit 2018 vs. on Revit 2016.
- Be ready to crash every time you place a node you haven’t used before - you don’t know what it’s going to do, up to and including crashing. When you THINK something might be the right node, try it in a small isolated test first, then incorporate it into the rest of the workflow accordingly.
- Have a workstation strong enough to handle what you’re doing. This is another no-brainer, but it often goes overlooked. If the size of your dataset means the first run needs 8 GB of RAM, you’ll be adding to that on each successive run and each edit, so a system with 4 GB of RAM just isn’t going to cut it. Reduce the horsepower required or get some upgrades.
- When working with larger graphs, datasets, or complex items, be patient. A lot of the time it isn’t actually crashing - it’s users pounding on the mouse, frantically bringing up Task Manager, and clicking Revit/Dynamo/whatever else, expecting the added input to fix things, when in reality this makes a crash more likely. Let it pinwheel a bit while you take five minutes to fill up the coffee cup, eat lunch, respond to emails, call your client or family, or do something else productive.
- Stick to pre-built nodes and stay away from Python and DesignScript. This will handicap you, but I’ve found I’m less likely to crash with the basics than when I try to go off script. You’re less likely to cause an irrecoverable error this way, as those nodes have been tested more.
- This is the most important: save everything often. While this doesn’t prevent crashes, it will save a lot of heartache when they inevitably occur. This includes the .dyn file, any .dyf files or other sources of code, and any associated files (.rvt, .rfa, .rte, .doc, .txt, .xls, etc.).