[WIP] is there a way to save fully specified honeybee thermal zones in a component for reuse?

May I ask if there is a way to save Honeybee thermal zones with adjacency already solved (which can take a long time for a large model) and with the zone program, loads, etc. all defined, so as to save that time the next time the file is opened, or to reuse the zones in a different workflow?


Hi Grasshope,

I have been thinking about this and was waiting for someone to ask for it so I could go ahead and implement it. Interesting that no one has ever asked about this!

Check the attached file. There are two components that let you dump the objects to a file and load them back from the file. This way, once you have a large model worked out, you can save it and load it from the file next time. I have tested it on small models, but let me know how it works on a large one.
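The idea behind the two components can be sketched in a few lines. Legacy Honeybee runs on IronPython and uses `cPickle`; the sketch below uses Python 3's `pickle`, and the zone dictionaries are hypothetical stand-ins for solved HBZones (the real objects carry geometry, program, loads, etc.), just to show the dump/load round trip:

```python
import os
import pickle
import tempfile

# Hypothetical zone data standing in for solved HBZones: once adjacency,
# programs and loads are worked out, serialize the result so later
# sessions can skip the expensive steps.
zones = [
    {"name": "zone_1", "program": "Office", "adjacency_solved": True},
    {"name": "zone_2", "program": "Retail", "adjacency_solved": True},
]

hb_file = os.path.join(tempfile.gettempdir(), "zones.hb")

# "Dump" component: write once after the heavy computation...
with open(hb_file, "wb") as f:
    pickle.dump(zones, f)

# "Load" component: ...and restore instantly in a later session.
with open(hb_file, "rb") as f:
    restored = pickle.load(f)

print(restored == zones)  # True
```

The actual components do more bookkeeping (re-attaching the loaded objects to the Honeybee library in memory), but the core mechanism is exactly this serialize/deserialize pair.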

dumpObjects.gh (469 KB)

Wow this is amazing! Can’t wait to test it out!

Mostapha, could we use these for parametric optimizations? Put them right after the part that won't change anymore, so that the whole chain won't be re-run/re-read? Perhaps there's no need; I'll test it out soon!

Kind regards,


Thank you very much, Mostapha!

These two components are very handy, and they should significantly increase the efficiency of energy modeling workflows that involve intensive thermal zone geometry preparation and calculation!

Hi Mostapha, I found the following problem with the two components:

  1. use the 1st component to save the .HB file

  2. disable the write HB file component

  3. save the GH file, close and reopen GH and Rhino, then open the GH file again

  4. the .HB file cannot be loaded by the 2nd component

It seems that the Load component only works when the Write component is activated.

I'd appreciate it if you could kindly check whether you can duplicate the error.


Ah! I see where the problem is. I over-simplified the issue last night. It can actually be pretty hard, as cPickle has limitations on de-serializing. Let me think about it more.
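The limitation Mostapha is referring to is, broadly, that `cPickle`/`pickle` serializes classes and functions *by reference*: the stream stores an import path rather than the code itself, so anything whose definition can't be found again by that path at load time breaks. A minimal illustration (module-level lambdas have no importable name, so they can't be pickled at all; classes defined inside a transient GHPython script hit the same wall at load time):

```python
import pickle

# pickle stores classes and functions by import path, not by value.
# A lambda has no importable path, so serialization fails outright:
try:
    pickle.dumps(lambda zone: zone)
    ok = True
except (pickle.PicklingError, AttributeError):
    ok = False

print(ok)  # False
```

This is why naively pickling HBObjects whose classes live in Grasshopper's compiled-on-the-fly `"<string>"` modules is harder than it first looks.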

PS: You may want to change the topic to “[Maybe] is there…” until I get this fixed! :wink:

Hi Grasshope and Theodore. Let’s see if I get it right this time!

PS: Grasshope, please also consider these components a token of gratitude for your recent post.

dumpObjects_001.gh (481 KB)

Dear Mostapha, thank you very much!

Apologies for my late reply, as I was rushing a conference paper in the past few days.

Hi Mostapha,

I tried it and ran into some trouble.

The process I tried is:

Save/dump the HBZones into a file.

Open a new GH file and load the previously saved file.

There I get this: 1. Solution exception: KeyError

Runtime error (KeyNotFoundException): KeyError
line 136, in updateHoneybeeObjects, “<string>”
line 155, in loadHBObjects, “<string>”
line 177, in main, “<string>”
line 180, in script

Is this process supposed to be valid? Or do both dump and load have to happen inside the same file?



Hi Abraham, This is how it is supposed to work. I can’t recreate the error. Can you share your .hb file with me? Thanks.


Here it is.

I dumped it after some E+ settings components. Can this be the cause?


testSaba.HB (888 KB)

Thanks Abraham! The issue was with the exporter. Let’s try once more.

You’ll still get an error, since the component doesn’t package schedules and constructions/materials (which we can add later), but if the materials are loaded into the library it should work fine.

Once we have that in place, this will open up new opportunities. For instance, you can break down the process and have a definition on a different machine waiting for the file to run the calculations while you’re working on your model locally. #collaboration #cloud #rain

dumpObjects_002.gh (496 KB)

PS: You need to re-export the file and then try to load it.

Thank you very much, Mostapha, for the updated components!

I did a quick test, and it seems that the default zone program set for the thermal zones can be “packed” and exported as the .HB file. It can be loaded back to generate an IDF file containing the schedules, materials, and constructions for the default zone program, and the E+ simulation ran with no errors.

Do you mean there might be errors when using custom materials, constructions, and schedules for the zones during the export and load process with these two components?


Great! Thanks for letting me know. As long as you’re using the default materials, constructions, and schedules, you should be fine.

Also, if you’re loading custom stuff that you have both on your machine and in the file, it should work fine.

The issue will only happen when there is custom stuff that is not available on the other machine. We can package it with the file later, but that’s not implemented yet.

Hi Mostapha,

Perfect now!!

Opportunities is a great word. Like it a lot.

For now, what are the definitions that are dumped and loaded? Just geometry? I didn’t understand what you wrote about the materials/constructions.

So would you say that the best stage to do the dump is after solveAdjacencies? Even there, can some definitions be lost (materials, others)?

Thanks a lot for this one.


For now, what are the definitions that are dumped and loaded? Just geometry? I didn’t understand what you wrote about the materials/constructions.

So would you say that the best stage to do the dump is after solveAdjacencies? Even there, can some definitions be lost (materials, others)?

Hi Abraham. Everything gets dumped and then loaded by the components. The issue I mentioned happens because, in the case of materials, constructions, and schedules, HBObjects carry the name and the address rather than the full data. For instance, if you have a schedule set to c:\yourschedule.csv, it will be saved correctly, but just as a string pointing to that address. This means that if you load the objects on a different machine and try to run the simulation, it will look for c:\yourschedule.csv, and if that’s not available, the simulation will fail. It’s the same with custom materials and constructions: HBObjects carry just the name, not the full definition.
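In other words, the by-reference fields round-trip through the dump/load cycle perfectly; it is only the resource they point at that may be missing on the loading machine. A small hypothetical sketch (the dictionary is a stand-in for an HBObject, not the real class):

```python
import os
import pickle

# Hypothetical stand-in for an HBObject: the schedule is stored as a
# path string, not as the schedule's contents.
zone = {"name": "zone_1", "schedule": r"c:\yourschedule.csv"}

# Dump and load: the string reference survives intact...
restored = pickle.loads(pickle.dumps(zone))
print(restored["schedule"])  # c:\yourschedule.csv

# ...but whether the referenced file actually exists depends entirely
# on the machine doing the loading.
print(os.path.exists(restored["schedule"]))
```

Packaging the full definitions into the .HB file, as discussed below in the thread, would remove that machine dependence at the cost of a larger file.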

I can add the option to save the full definitions of all materials, constructions, and schedules with the file, but it will potentially result in a much larger file. Currently there is no easy way to know whether a material/construction or schedule is custom or comes from the library, so I would have to save all the constructions and schedules with the objects. That also includes RADMaterials.

Thanks Mostapha,

This is clear now.

In most cases you’ll do everything on your own machine. But this is the time to start thinking about shared libraries, as you said: “in the cloud, in the rain, etc.” Opportunities, new opportunities.


Speaking of rain and its relationship to cloud computing > http://halfblog.net/2011/11/29/the-telegraph-thought-councillor-tho…

After all these years it still makes me laugh out loud. I feel bad for the guy.