I’m using a DataOut component to gather the parameters for an energy simulation, then running the simulation in a second Grasshopper definition and reading the results back in via DataInput.
This works fine when both definitions are open in the same Rhino instance, but not across multiple instances.
Any ideas why? I’m using Honeybee for the simulation.
The reason I want to do this is for parallelization.
It sounds like you are trying to pass Python objects back and forth between Grasshopper definitions. This can work while both definitions are open, but if you close one, the object can get deleted. The legacy plugins were never set up to do anything beyond passing objects within a single definition, so I don’t think there’s a way to get your specific example to work.
However, in the new LBT plugin that was released last week, you can serialize Python objects into JSON strings that can then be deserialized back into objects in other definitions. Since only strings are passed between definitions, you should be able to set things up so that it doesn’t matter whether one definition is closed. So my recommendation for the time being would be to try the new plugin and see if you can get things to work with it.
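To make that concrete, here is a minimal sketch of the round trip in plain Python, assuming the honeybee-core library that ships with the LBT plugin. The sample Room/Model and the idea of handing `model_json` through Data Output are illustrative, not the exact component internals:

```python
import json
from honeybee.room import Room
from honeybee.model import Model

# Definition A: build a model and serialize it to a JSON string that can
# be passed through Data Output (or written to a file on disk).
room = Room.from_box('sample_room', width=5, depth=10, height=3)
model = Model('sample_model', rooms=[room])
model_json = json.dumps(model.to_dict())

# Definition B: rebuild an equivalent Python object from the string alone,
# so it no longer matters whether the first definition is still open.
rebuilt = Model.from_dict(json.loads(model_json))
assert rebuilt.rooms[0].identifier == 'sample_room'
```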
For now, I’ve achieved parallelization by splitting the model into parts and using "Re-Run IDF".
I’ll try DataIn / DataOut with the new plugin soon, though.
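In case it helps anyone, the approach boils down to launching one EnergyPlus run per model part and letting the OS execute them side by side. Below is a hedged plain-Python analogue of that workflow, not the "Re-Run IDF" component itself; the IDF paths, the weather file, and the assumption that `energyplus` is on the PATH are all placeholders:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical inputs: one IDF per model part, plus a shared weather file.
IDF_FILES = ['part_1.idf', 'part_2.idf', 'part_3.idf']
WEATHER_FILE = 'site.epw'

def run_idf(idf_path):
    """Launch one EnergyPlus simulation as its own OS process."""
    # The EnergyPlus CLI takes the weather file via -w; each call blocks
    # its thread until that simulation finishes.
    subprocess.run(['energyplus', '-w', WEATHER_FILE, idf_path], check=True)

# Threads are enough for parallelism here because the heavy work happens
# in the separate EnergyPlus processes, not in Python itself.
with ThreadPoolExecutor(max_workers=len(IDF_FILES)) as pool:
    list(pool.map(run_idf, IDF_FILES))
```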