Large Radiance models: memory errors and levels of simplification (HB-R)

Hi @mostapha

I ran into some memory errors with HB-R when executing a DF recipe on a big model (heavy meshes) with an HBJSON that weighs 850 MB.
Are there any limitations to bear in mind with the current workflow? Otherwise I’ll just go for simplifying the model much more.

best,
Olivier

Nice one! I love it when people break stuff. This seems to be an issue with loading the HBJSON file as a model in the first place, which shouldn’t happen. To test it in isolation, try something like this:

from honeybee.model import Model
import json
import pathlib

# path to the HBJSON file to test (placeholder)
fp = r"PATH-TO-HBJSON-FILE"

input_file = pathlib.Path(fp)

# parse the JSON and rebuild the Model to isolate the loading step
model_dict = json.loads(input_file.read_text())
model = Model.from_dict(model_dict)

If this fails, please send us a link so that we can download and test the file on our side. You know where to find us! :slight_smile:

Also, it looks like you are using Python 3.8. Luigi, which is what we use to execute the recipes, is not compatible with Python 3.8. Can you try with Python 3.7?
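
If it helps, a quick way to confirm which interpreter the recipes are actually running on (a minimal sketch, nothing Honeybee-specific):

import sys

# the Luigi-based recipes currently expect Python 3.7.x
print(sys.version_info)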

cc: @chris


Thanks for your reply,

I’ve downgraded to Python 3.7 and tried the script as follows:

I’ve uploaded the case folder here: https://bit.ly/3qBFd6U
For convenience, this is a smaller 400 MB HBJSON file that presents the same issue.


Just as a quick update.

I managed to run the models by breaking them down into smaller parts. There seem to be some memory limits, coming either from Radiance, Python, or the hardware.
@sarith I wonder if you’ve ever encountered memory errors when running large models through the command line.

Hi @OlivierDambron, yup, that has happened many times. Usually, if the model is too complicated, the octree generation process can run into memory issues; or, in the case of annual simulations, if the number of grid points is too large, the matrix multiplication runs into similar issues.
But based on your screenshot, it appears that the error is occurring upstream at the Python level (“translation failed”).
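
For context, the octree step is just a call to Radiance’s oconv; here is a minimal sketch of invoking it from Python and surfacing its errors (the file names are placeholders, not from the model above):

import pathlib
import subprocess

# placeholder scene description; oconv is Radiance's octree generator
scene = pathlib.Path('scene.rad')
octree = scene.with_suffix('.oct')

# -f writes a "frozen" octree so the scene is self-contained
with open(octree, 'wb') as f:
    proc = subprocess.run(['oconv', '-f', str(scene)],
                          stdout=f, stderr=subprocess.PIPE)

if proc.returncode != 0:
    # out-of-memory failures on complex models are reported on stderr
    print(proc.stderr.decode())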

Regards,
Sarith


Hey @OlivierDambron ,

So I have two new components for you to test to see if they solve your issue. They are called “HB Dump Compressed Objects” and “HB Load Compressed Objects.” They work just like the JSON-serializing components, except that they use pkl files instead of JSON:

You should see that the pkl file size is at least ~1/3 smaller and, most importantly, the serialization process should hopefully not max out your memory.
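
Under the hood this is essentially pickle in place of json. A rough sketch of the equivalent round trip in plain Python (assuming the .hbpkl file is simply a pickled model dictionary; the actual format the components write may differ):

import json
import pickle
import pathlib
from honeybee.model import Model

hbjson = pathlib.Path(r"PATH-TO-HBJSON-FILE")
hbpkl = hbjson.with_suffix('.hbpkl')

# load the model once from the JSON file
model = Model.from_dict(json.loads(hbjson.read_text()))

# dump: pickle the model dictionary (binary and more compact than JSON text)
with open(hbpkl, 'wb') as f:
    pickle.dump(model.to_dict(), f, protocol=pickle.HIGHEST_PROTOCOL)

# load: read the pickle back and rebuild the Model
with open(hbpkl, 'rb') as f:
    model = Model.from_dict(pickle.load(f))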

You can get the new components by running the “LB Versioner” component. I would test them myself, but it seems the link you posted is broken. If you can test them on your end and let us know whether they work, we’ll consider making the commands accept .hbpkl file inputs just as they accept .hbjson file inputs. Then you should be able to run them through recipes without issues.


hi @chris

Sorry for the broken link; I think I overwrote the file while rushing.
I went for drastically simplifying the model instead of a “select all/assign/run” approach.

Perhaps it is worth mentioning that the geometry I was trying to use was extracted from an IFC file (using geogym plugins). Many items behaved very strangely with the HB components:

  • Sometimes nurbs or meshes would turn the components orange/red due to invalid breps or attempts to divide by 0.
  • Sometimes a simple GH transformation applied to those geometries would make the HB Face component accept them.
  • When these did go through the HB Face components, HB Visualize often didn’t show geometries consistent with what appeared in the Spider Rad Viewer. Although this is a different issue, I thought I’d mention it, as all the fuss I made could very well be due to inappropriate geometry extracted from an IFC file (see the validation sketch after this list).
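
One way to catch this kind of bad geometry before running anything heavy is honeybee’s own validation. A minimal sketch, assuming the Model.check_all method of the installed honeybee-core (the report format may vary between versions):

from honeybee.model import Model
import json
import pathlib

# rebuild the model from the (placeholder) HBJSON file
fp = pathlib.Path(r"PATH-TO-HBJSON-FILE")
model = Model.from_dict(json.loads(fp.read_text()))

# run honeybee's geometry/attribute checks without raising, so the
# full report of invalid objects comes back as text
report = model.check_all(raise_exception=False)
print(report if report else 'Model geometry is valid')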

The two hbpkl components work on my end. Thank you, this will definitely help with heavy cases.
I look forward to the commands accepting hbpkl inputs.

As a side note, I noticed after updating that the HB Run Recipe component disappeared from the GH tab:


Do you have the same?

Best,
Olivier

The recipes are now located under the GH Radiance tab.
-A.