Batch running several daylight simulations

Hi all,

I was wondering if there is a good way of utilizing the “queueing” features of Queenbee locally?

So I could maybe write 20 daylight simulations during a day at work and then batch-run all the Radiance files in the xxx folder locally when I leave the office in the evening?

Or, alternatively, in a parametric workflow, would I just load my 20 HB_Models into the AnnualDaylight component? This could have performance implications, though, if the models are large.

I know this contradicts the Pollination business case, but on the other hand it would be a waste not to utilize the possibility.

If you internalize all the geometry and have all the sims, you can have Colibri act as a kind of ‘batch’ and iterate through each of the simulations. Not really best case by any means, buuut it does work :sweat_smile:

Thanks Trev, and yes this is what I’m hoping not to do with 200k+ polygon simulations :wink:

I guess the easiest workflow is to use the Model2rad component and set up a Python script to look for the .rad folders. We have done this for Legacy, but I’m unsure how it’ll work with the Queenbee stuff.
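A rough sketch of what such a batch script could look like, just using the standard library. The folder layout and the `run.bat` script name are assumptions for illustration, not an official Honeybee/Queenbee convention:

```python
import os
import subprocess


def find_rad_folders(root):
    """Return all subfolders of `root` that contain at least one .rad file."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if any(name.endswith('.rad') for name in filenames):
            found.append(dirpath)
    return sorted(found)


def batch_run(root, command_name='run.bat'):
    """Run each simulation folder's command script in sequence.

    `command_name` is a hypothetical per-folder script; adapt it to
    whatever your Model2rad export actually writes out.
    """
    for folder in find_rad_folders(root):
        script = os.path.join(folder, command_name)
        if os.path.isfile(script):
            # check=True stops the batch if one simulation fails
            subprocess.run([script], cwd=folder, check=True)
```

You could then schedule `batch_run(r'C:\sims')` with Task Scheduler (or cron) to kick off after office hours.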


There are plenty of solutions to do this in Grasshopper as Trevor pointed out. You can also install the LBT core libraries on your system Python and just use this command to run the recipe from your own Python script.
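For illustration, a sketch of how running a recipe from your own script might look, assuming the `lbt_recipes` package is installed on your system Python and that its `Recipe`/`RecipeSettings` API is available (the weather file path and model folder are hypothetical):

```python
import glob
import os


def find_models(folder):
    """Return all HBJSON model files in a folder, sorted for a stable run order."""
    return sorted(glob.glob(os.path.join(folder, '*.hbjson')))


def run_all(folder, wea_file='weather.wea'):
    """Run the annual-daylight recipe on every model in `folder`.

    Assumes `pip install lbt-recipes` has been run; the import is kept
    inside the function so the file scanner above works without it.
    """
    from lbt_recipes.recipe import Recipe
    from lbt_recipes.settings import RecipeSettings

    settings = RecipeSettings(workers=2)  # tune to your machine
    for model_file in find_models(folder):
        recipe = Recipe('annual-daylight')
        recipe.input_value_by_name('model', model_file)
        recipe.input_value_by_name('wea', wea_file)
        recipe.run(settings)
```

Treat this as a starting point and check it against the actual lbt-recipes documentation; the exact input names and settings may differ between recipe versions.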

And this definitely doesn’t undermine the business case of Pollination. Sure, things could probably run a bit faster on Pollination, but the main reason most people run something on the cloud is the same reason we develop software with GitHub instead of doing everything locally, or why you’d use Dropbox/GoogleDrive/OneDrive to store your files. It’s just much easier to collaborate, share, manage versions, and track the history of things when it’s on the cloud.