Results output for Annual Daylight simulation in alphabetical order, rather than input order?

Hi @chris,

I’ve noticed recently with the 1.5.1 version update (and perhaps this was already the case in 1.4.0) that the results for the daylighting grids are output in alphabetical order, rather than in the order of the input grid tree structure.

Is this intentional? If not, is it possible to have the results provided in the same order the grids are generated? I ask because aligning each mesh with the correct data grid can be a difficult, manual process, and it becomes very awkward if grids are added later on. Up until this point, I have been generating the grids as a tree structure, as below, which lets me align my meshes and ensure that the right data is analysed (especially when a grid’s point count happens to match, at a glance, that of another similarly sized space elsewhere).


If this is not possible, do you have any pointers on how I can always ensure I am grabbing the correct mesh to align with the results, without having to check manually? The DA output, for example, doesn’t include a header (like the Energy results do) that would let me know with certainty that branch 0;1 is actually L0_Secondary, and not my assumed L1_Primary.

Thanks in advance,
Elly

Tagging in @mikkel if this is another daylighting version 1.5.1 bug :slight_smile:

Hi @ElzineBraasch,

I can’t reproduce the issue. Are you able to share an example file?

It should not output the results in alphabetical order. It simply follows the order of the grid information file.
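If you want to double-check the order on your side, you can read that file directly. A minimal sketch, assuming the file is a JSON list with a name per grid (the exact path and keys may differ between recipe versions):

```python
# Quick check: print the grid order that the recipe reports.
# Assumes the grid info file is a JSON list with a 'name' entry per grid;
# adjust the path and keys to match your own results folder.
import json

with open('grid_info.json') as f:
    for i, grid in enumerate(json.load(f)):
        print(i, grid.get('name', grid))
```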

Since this is the only post that mentions it, why not bring it back from the grave?
This has been happening to me, and matching the results to their corresponding meshes is a nightmare. I haven’t figured out why it happens.
Did you ever figure out how to fix this?

I would infer that this is the result of people wrestling with list management and Data Trees, which is probably one of the toughest things to learn about Grasshopper. When I was first learning Grasshopper, it’s something that I struggled with for 2 years. And it still gets pretty complex for me sometimes.

If it makes your life easier for now, you can always join everything into a single sensor grid and mesh.
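If you go the scripting route, here is a minimal sketch of that joining approach, assuming the ladybug-geometry and honeybee-radiance Python libraries are available (in Grasshopper, the standard sensor-grid-from-mesh component does the same thing for a joined mesh):

```python
# Sketch: merge several analysis meshes into one mesh and build a single
# sensor grid from it (one sensor per mesh face). Assumes ladybug-geometry
# and honeybee-radiance are installed.
from ladybug_geometry.geometry3d.mesh import Mesh3D
from honeybee_radiance.sensorgrid import SensorGrid

def single_grid_from_meshes(meshes, identifier='combined_grid'):
    """Join a list of Mesh3D objects and return the joined mesh plus its grid."""
    joined = Mesh3D.join_meshes(meshes)
    grid = SensorGrid.from_mesh3d(identifier, joined)
    return joined, grid
```

With a single grid there is only one branch of results to match back to one mesh, so the ordering question disappears.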

But I can back @mikkel up that there’s no alphabetical sorting happening with the sensor grids now.

Hi Chris,

It’s the weirdest thing: I can’t reproduce the error anymore. Same model, same script, different behaviour.
Not only were the meshes being re-arranged alphabetically, but one or two of them also came back with a different sensor count.

For anyone that may experience something similar: I ended up reading the file at \annual_daylight__params\model\grid_info.json, loading the meshes with those names in that order, and then checking for sensor count discrepancies.
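Roughly, the matching step looks like this (just a sketch; the JSON keys I used, 'name' and 'count', are what I saw in my file and may differ between recipe versions):

```python
# Sketch of the workaround: order the meshes by the grid info file and
# flag any sensor-count mismatches. The 'name' and 'count' keys are an
# assumption based on my grid_info.json and may vary by version.
import json
import os

def order_meshes_by_grid_info(results_folder, meshes_by_name):
    """Return meshes in the same order as the grid info file.

    meshes_by_name: dict mapping grid name -> (mesh, face_count)
    """
    with open(os.path.join(results_folder, 'grid_info.json')) as f:
        grids_info = json.load(f)

    ordered = []
    for grid in grids_info:
        name, count = grid['name'], grid['count']
        mesh, face_count = meshes_by_name[name]  # KeyError here means a name mismatch
        if face_count != count:
            print('Sensor count mismatch for {}: grid has {}, mesh has {}'.format(
                name, count, face_count))
        ordered.append(mesh)
    return ordered
```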

My data tree is/was super neat.
The only thing that changed between my reply and today is that I re-saved the script under a different name and changed the save directory.

If this comes up again I’ll dig around and see what went on.
best, M


I saw it happen again on a model that had some windows slightly offset from the wall (I moved the wall with PO_Rhino but the apertures stayed in their original position) and some non-planar vertices in the shading surfaces.

I suspect it has something to do with the windows. It’s odd that the simulation succeeded and illuminance in those rooms was not null.

Anyway, just a quick update; it’s merely a suspicion because I haven’t had the time to figure it out.

Yea, that sounds like the Radiance design philosophy to me. There can be a lot of things that aren’t right with your model, like duplicate identifiers and non-planar geometries, yet the simulation will still finish. If you ask me, Radiance could have used some more checks and errors for these cases. But I should be careful what I wish for because I know EnergyPlus is at the other end of the spectrum, where I get way more errors, warnings, and enforcement of strict rules than I would ever want.

In any event, if you’re using the Pollination Rhino plugin, the PO_ValidateModel command may help you figure out what is fishy about your model and what you need to change to get it to be correct.

That’s exactly what I did: I used PO_ValidateModel to fix all the issues, and the mesh re-arranging went away.
The model had offset windows, duplicate identifiers and some non-planar vertices in the shading objects.
I didn’t fix them one at a time, so I can’t say precisely what resolved the issue.

Illuminance results before and after model validation are almost identical; Radiance did a good job looking past the model imperfections. I get what you mean, Energy+ will complain about every little detail.

Anyway, just dumping random info in the thread.
