I’m trying to create a definition to optimize the composition of a linear block of flats, but I’ve had no success so far.
The developer I’m working with has created a selection of model blocks to be used in their projects. A linear block is “sliced” into its constituent parts: end slices (E) with flats that have views in two directions, a stairway slice (S), and middle slices (M) with flats that have views in one direction. The stairway slice always serves two slices of flats. The minimum composition would be E-S-E, followed by longer compositions with additional middle slices, and so on, you get the drift. There are variations of end and middle slices with defined lengths.
What I’m trying to do here is optimize the composition of a linear block given a guide curve from a zoning plan or a layout sketch. I’ve calculated the possible length variations of the E-S-M and M-S-M blocks, which serve as constants in the definition. The script lays out at least two end blocks and, depending on the length of the guide curve, a variable number of middle blocks. It should then determine which length variations of the blocks to use so that the difference between the total length of the blocks and the length of the guide curve is minimized.
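To make the objective concrete, here is a minimal sketch of the fitness function I have in mind, written as plain Python. The function name and inputs are illustrative, not the actual nodes in my graph: a candidate solution is just the list of chosen block lengths, and the fitness is the absolute difference from the guide curve length.

```python
def fitness(block_lengths, guide_length):
    """Absolute difference between total block length and the guide curve length.

    block_lengths: list of chosen lengths, one per placed block (illustrative input)
    guide_length:  length of the guide curve from the zoning plan / sketch
    """
    return abs(sum(block_lengths) - guide_length)

# Example with made-up numbers: two 12.0 m end blocks and three 7.5 m
# middle blocks against a 45.0 m guide curve.
print(fitness([12.0, 12.0, 7.5, 7.5, 7.5], 45.0))  # 1.5
```

The solver should be driving this value toward zero; values here are purely for illustration.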
I first set up the logic in Grasshopper, where it works great with Galapagos as the optimization solver. I then migrated to Dynamo and was able to build up the definition just fine: it works without the optimization part when I set the number of each block type manually. I then set up the fitness function and the complete definition:
But when I try to run it, I get an error message from the NSGA_II.AssignFitnessFuncResults node:
I haven’t been able to figure out what the problem is with the definition. Moreover, the fitness value the function returns doesn’t seem particularly small either.
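For reference, since the sets of length variants are small, a brute-force check like the sketch below could tell me what the true optimum should be, so I can compare it against what the solver returns. The variant lists here are made-up placeholders for my actual E-S-M and M-S-M length constants:

```python
from itertools import product

# Illustrative stand-ins for the precomputed length constants.
END_VARIANTS = [10.0, 12.0]     # possible E-S-M block lengths (made up)
MIDDLE_VARIANTS = [6.0, 7.5]    # possible M-S-M block lengths (made up)

def best_composition(guide_length, max_middles=8):
    """Enumerate all compositions up to max_middles middle blocks and
    return (smallest difference, list of block lengths used)."""
    best = None
    for e1, e2 in product(END_VARIANTS, repeat=2):      # two end blocks
        for n in range(max_middles + 1):                # 0..max middle blocks
            for mids in product(MIDDLE_VARIANTS, repeat=n):
                diff = abs(e1 + e2 + sum(mids) - guide_length)
                if best is None or diff < best[0]:
                    best = (diff, [e1, *mids, e2])
    return best

print(best_composition(45.0))
```

This is only feasible because the variant counts are tiny; it’s meant as a sanity check on the solver, not a replacement for it.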
I hope you’ll be able to give me advice on how to move on with this.