Parallel Analyses Running on CPU and/or GPU Using Wallacei, Galapagos, or Similar

Hello,

I am currently doing my postgrad, and one of the issues I have to tackle is running environmental analyses and gathering other factors for optimization.

I have a strong GPU and a decent 12th gen Intel processor with 64 GB of RAM. I really do not know much beyond very basic Python, though I know some advanced things inside Grasshopper. Is there a way to run simulations in parallel so that in each iteration about 6 or 7 simulations run? Simultaneously or in batch, I have no preference; I just want to automate the entire process and then gather all of the data using data recorders or Wallacei.

If anyone has experience in this field I would love to discuss this further.
Thank you.

Hi,

I am doing something similar for my graduation project, with a focus on solar irradiance simulations. Here are some of my experiences:

For solar simulations you can use AcceleRad, which significantly improves speed compared to Radiance. I am not familiar with other analyses, so I cannot advise you on those.

If you are doing a lot of simulations in Grasshopper, you do need a lot of memory, and depending on your goal I am not sure 64 GB is enough. From my experience it was better to run simulations outside GH, because it does not always clean up garbage from earlier simulations properly.

If you want to automate the process and are okay with sequential simulations, the easiest approach (meaning: without programming) is probably to use Grasshopper's Anemone plugin to iterate over models and store the results in your data recorder.

For more advanced parallel solutions, I am afraid you need some in-depth knowledge of Python or C#. My personal solution was based on Python: I load the geometry in an external Python environment and then use the LBT Python package to run my simulations with AcceleRad. Around this function I use the multiprocessing package to run things in parallel. For now (I am not finished with the project), the GPU seems to be the bottleneck. Afterwards I save my results as a JSON file and read them back into Grasshopper. A minimal sketch of that pattern is below.
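Roughly, it looks like this, where `run_simulation` is a hypothetical stand-in for the actual LBT/AcceleRad call and the model file names are placeholders:

```python
import json
from multiprocessing import Pool

def run_simulation(model_path):
    # Hypothetical worker: load the geometry, run the LBT/AcceleRad
    # simulation for this model, and return the results as a dict.
    # Replace the body with your actual recipe call.
    return {"model": model_path, "irradiance": []}  # placeholder result

if __name__ == "__main__":
    # Placeholder model files exported from Grasshopper
    models = ["model_01.hbjson", "model_02.hbjson", "model_03.hbjson"]

    # One worker process per model here; with AcceleRad the GPU can
    # become the bottleneck, so more workers is not always faster.
    with Pool(processes=3) as pool:
        all_results = pool.map(run_simulation, models)

    # Dump everything to one JSON file that Grasshopper can read back
    with open("results.json", "w") as f:
        json.dump(all_results, f)
```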

In your case, you could run your simulations in an external cmd window using Python calls with the os lib, without Grasshopper freezing (see the sketch below). However, since Wallacei defines input parameters sequentially, I am not sure how you imagine using parallelization in your simulations. Perhaps you can elaborate on that.
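For example, from a GH Python component you could do something like this; the script path and arguments are hypothetical placeholders, and `start` opens a new cmd window on Windows so the call returns immediately:

```python
import os

# Launch the simulation script in its own cmd window so the
# Grasshopper UI stays responsive while it runs. The path and
# argument below are hypothetical placeholders.
os.system('start cmd /k python "C:\\sims\\run_simulation.py" model_01.hbjson')
```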

If you want an easier solution for parallel simulation from GH Python, take a look at the ghpythonlib.parallel package. If you wrap it around your Ladybug simulations (where the simulation is called inside the LBT GH node), they should also run in parallel, for example:
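Here is a minimal sketch of that pattern, where `simulate` is a hypothetical stand-in for whatever calls your Ladybug simulation for one set of inputs:

```python
import ghpythonlib.parallel

def simulate(params):
    # Hypothetical per-item work: replace this with the call into
    # your Ladybug/LBT simulation for one parameter set.
    return params * 2  # placeholder computation

# 'inputs' would normally come from the component's input parameter
inputs = [1, 2, 3, 4]

# run() maps simulate() over all items on parallel threads and
# returns the results in input order; the last argument controls
# whether nested result lists are flattened.
results = ghpythonlib.parallel.run(simulate, inputs, False)
```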

Hope this helps you get going in the right direction.

PS: I am afraid that parallelization is not easy without some programming experience, but perhaps I can help by suggesting the right things to study.


Mr Vogel,

I was afraid of that, but I know you are on the right path with Python for sure.

I would love to learn how you solved this particular issue.

Thanks again.