Hi @mostapha , @devang and @Mathiassn
Thanks for sharing your thoughts.
For testing purposes, please find a small sample file attached below. The same logic will eventually serve to process urban areas nearly 50 times larger. I’ll try to clarify as much as possible:
The data structure I’m trying to preserve in Grasshopper is the following:
[ Properties [ Buildings [ TestSurfaces/SensorGrid ] ] ]
Each property can hold 1 or more buildings.
Each building can have 1 or more test surfaces.
I’m trying to compute Direct Sun Hours on the facades of buildings, and to be able to query the results for a given Property by looking up its path.
Option 1:
I’ve managed to write and run a medium-size simulation in Grasshopper.
I decided to flatten all test points to generate only 1 sensor grid, read the results, and match them back to their data tree structure. This saves the overhead time (which can be really huge) of decomposing all grids into subgrids, and makes it easier to tune the settings for an optimal sensor count (number of sensors / number of CPUs).
The problem comes with large-scale simulations: Gh becomes the bottleneck, with a lot of waiting time between clicks to generate test points, and the definition becomes hard to handle.
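In case it helps, the bookkeeping behind this flattening step can be sketched in plain Python (the function name and points are made up for illustration, not an actual Honeybee API):

```python
def flatten_with_counts(points_per_surface):
    """Flatten nested per-surface test points into one sensor grid,
    recording how many points each surface contributed so results
    can later be matched back to the original data tree structure.
    `points_per_surface` is a list of point lists, one per test surface."""
    flat, counts = [], []
    for pts in points_per_surface:
        counts.append(len(pts))
        flat.extend(pts)
    return flat, counts

# Hypothetical example: two surfaces with 2 and 1 test points.
grid, counts = flatten_with_counts([[(0, 0, 0), (1, 0, 0)], [(2, 0, 0)]])
# grid is one flat sensor list; counts == [2, 1]
```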
Option 2:
Using Honeybee-3DM could help bypass Grasshopper limitations.
@devang, following @mostapha’s strategy and @Mathiassn’s logic, would it be possible for Honeybee-3DM to generate only 1 sensor grid for all test surfaces within a single layer?
And to generate a list (csv or something else) of the number of points generated per surface?
That way I can restructure the results accordingly in Grasshopper.
I attach here a sample file.
Sample_Urban_DataStructure.gh (1.2 MB)
Thanks so much to all,