Grasshopper DataOut / DataIn with Honeybee

I’m using a DataOut component to gather the parameters for an energy simulation, then run the simulation in another Grasshopper definition and read the results back in via DataIn.

This works fine if both definitions are open in the same Rhino instance, but not across multiple Rhino instances.
Any ideas why? I’m using Honeybee for the simulation.

The reason I want to do this is for parallelization.

Cheers,
Thomas

I’m now getting this error message from “exportToOpenStudio”:

1. Solution exception:C:\Users\jenkins\git\OpenStudio\openstudiocore\src\model\RunPeriodControlSpecialDays.cpp@244 : 'G' is not correctly formatted

But “C:\Users\jenkins” doesn’t exist on my system (?).

@thomas.wortmann ,

It sounds like you are trying to pass Python objects back and forth between Grasshopper definitions. That can work while both definitions are open, but the objects can get deleted as soon as you close one of them. The legacy plugins weren’t really set up to do anything beyond passing objects within a single definition, so I don’t think there’s a way to get your specific example to work.

However, in the new LBT plugin that was released last week, you can serialize Python objects into JSON strings, which can then be deserialized back into objects in other definitions. Since you are only passing strings between definitions, you should be able to set it up so that it doesn’t matter whether you close one of them. So my recommendation for the time being would be to try the new plugin and see if you can get things to work there.
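Roughly, the idea looks like this. This is only a minimal sketch assuming the honeybee core `Model` class and its `to_dict()` / `from_dict()` methods; the file path is made up for the example:

```python
import json
from honeybee.model import Model

# Definition A: serialize the model to a JSON string and write it to disk.
model = Model("parametric_run_01")  # would normally come from upstream components
with open(r"C:\temp\model_01.hbjson", "w") as f:
    json.dump(model.to_dict(), f)

# Definition B (possibly another Rhino instance): rebuild the object from the JSON.
with open(r"C:\temp\model_01.hbjson") as f:
    rebuilt = Model.from_dict(json.load(f))
```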


Hi @chris,

Thanks for your reply!

For now, I’ve achieved parallelization by splitting the model into parts and using “Re-Run IDF”.
I’ll try DataIn / DataOut with the new plug-in soon, though.
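For reference, the same parallel idea outside of Grasshopper could look roughly like the sketch below. It assumes the EnergyPlus command-line interface, and the executable, weather-file, and IDF paths are made up for illustration:

```python
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

ENERGYPLUS = r"C:\EnergyPlusV9-0-1\energyplus.exe"  # assumed install location
EPW = r"C:\weather\city.epw"                        # assumed weather file

def run_idf(idf_path):
    """Run one IDF in its own directory so output files don't collide."""
    run_dir = os.path.splitext(idf_path)[0]
    os.makedirs(run_dir, exist_ok=True)
    cmd = [ENERGYPLUS, "-w", EPW, "-d", run_dir, idf_path]
    return subprocess.run(cmd, capture_output=True, text=True).returncode

idfs = [r"C:\sim\part_1.idf", r"C:\sim\part_2.idf", r"C:\sim\part_3.idf"]
with ThreadPoolExecutor(max_workers=3) as pool:
    codes = list(pool.map(run_idf, idfs))
print(codes)  # 0 for each successful run
```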

Cheers,
Thomas

Hi @thomas.wortmann,

For data in/out across separate Rhino instances, or even Rhino running on different machines,
I can recommend Speckle.
