Generative Design - does not show large data results

Is there a way for me to edit this setting to work with a larger value?
The full script I am working on will need more than 750 MB of memory.
The average site is 5 km x 5 km, and I need to calculate and display how many tables (60 m x 4 m) will fit on it.

Hi Neel,

Thanks for testing with Three Box. Since that is working, we can rule out any issues related to conflicts with other software installed on your machine.

It's also worth mentioning that I've been testing your graph in Sandbox 2.5, since that's the version of Dynamo that's running inside of Generative Design for all of our 2021 releases. It's possible that your graph runs more efficiently in a newer version of Dynamo Sandbox that has some of the recent performance enhancements, but fails in GD.

Currently, there's no way to edit that memory limit as a user. We have a task filed to expose that memory limit as an editable user setting in the GD UI. I'll add your use case and share the issue you're seeing with the GD team.


Thank you Nate, let me know how this progresses.
Is there any other generative design package / workflow for Dynamo that I could use for the same tests?

I recommend simplifying the display - honestly, you have so many objects over so large an area that it likely doesn't matter anyway - everything is going to look like an ant.

Leveraging dictionaries should reduce the RAM usage as well.

Both of those in combination may or may not solve the issue, though, and I feel the toy problem you have presented will produce an abstraction level that fails to account for something someone without industry expertise would skip - after all, the maximum count of packed rectangles of a given size in a container rectangle of a given size is List.Sort([Math.Floor(cW/pL) * Math.Floor(cL/pW), Math.Floor(cW/pW) * Math.Floor(cL/pL)])[-1];, and the order of packing can be attained in a similar manner.
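A minimal Python sketch of that packing count, assuming axis-aligned placement and no mixing of orientations (the names cW/cL and pW/pL mirror the container and packed-item dimensions from the snippet above, and the counts are floored since only whole tables fit):

```python
import math

def max_packed(cW, cL, pW, pL):
    """Max count of axis-aligned pW x pL rectangles in a cW x cL
    container, trying both orientations but never mixing them."""
    rotated = math.floor(cW / pL) * math.floor(cL / pW)  # items turned 90 degrees
    upright = math.floor(cW / pW) * math.floor(cL / pL)  # items in original orientation
    return max(rotated, upright)

# The 5 km x 5 km site with 60 m x 4 m tables from the question:
print(max_packed(5000, 5000, 4, 60))  # prints 103750
```

Since the site here is square, both orientations give the same count; on a non-square site the two terms differ and taking the max matters.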

You may want to try reaching out to the team from the University of Toronto who put together the last study in this article, as they were working at a similar macro scale. But even they used a higher degree of abstraction instead of getting into the micro level the way you are. Perhaps try testing groups rather than small sets? I.e., use clusters of squares to see if 100x100 table sets fit, and if so, don't break them down further. Or optimize for sets at the macro level, then again at the micro (producing 1 macro study and N micro studies on subsections of the larger area).
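The macro-then-micro idea could be sketched like this (a hedged sketch, not GD code: it counts whole n x n table clusters first, then packs single tables into the two leftover strips; the cluster size and the strip handling are my assumptions):

```python
import math

TABLE_W, TABLE_L = 4.0, 60.0   # table footprint in metres (from the question)

def tables_in(w, l):
    """Single-orientation table count for one rectangular region."""
    return math.floor(w / TABLE_W) * math.floor(l / TABLE_L)

def macro_then_micro(site_w, site_l, n=10):
    """Macro pass: count whole n x n table clusters.
    Micro pass: pack single tables into the leftover strips."""
    cw, cl = n * TABLE_W, n * TABLE_L      # cluster footprint
    kx = math.floor(site_w / cw)           # whole clusters across
    ky = math.floor(site_l / cl)           # whole clusters up
    macro = kx * ky * n * n
    right = tables_in(site_w - kx * cw, site_l)   # strip right of the clusters
    top = tables_in(kx * cw, site_l - ky * cl)    # strip above the clusters
    return macro + right + top

print(macro_then_micro(5000, 5000))  # prints 103750
```

On this square site the total matches the single-orientation formula mentioned earlier in the thread, but the point is that each micro study could now run independently on a small subsection instead of one graph holding the whole 5 km x 5 km site in memory.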

Hi,

Could you direct me to a link where I could see any other limitations that Generative Design has?
Or maybe a source code / issues page?

Does GD use only the CPU, or the GPU as well? I think it uses the CPU. If so, do you have plans to shift GD to the GPU where possible, when a qualifying GPU is installed?


Hi Nate,

Any update on the 750 MB memory limit on the Dynamo instance when running GD?