Same inputs (I think)… but different outputs

I have a problem here: I'm putting the exact same geometry into a grid-based daylight analysis, using the exact same Radiance parameters and sky recipe… and I'm getting wildly different results. See images below.

I need to know why light is behaving the way it is. There seems to be a more intense fall-off in one scenario than the other, but I can't find any parameter that differs between the two models. Can you recommend a way to read the results files that would help track down the reason for the difference?

If you’d like the rhino model, you can download it from this link.

004_gridBasedAnalysis DOUGLAS - (345 KB)
004_gridBasedAnalysis DOUGLAS - (341 KB)

Hi Will, As far as I can see the changes don't look random. Are you sure that you are assigning the same geometries and materials in both cases? I need the Rhino file to see what's going on. You can also use importRad to import the .rad file back and see if everything is exported correctly.


Mostapha, see the link above for the rhino file.

So, after lots of investigating, I think I tracked it down to a syntax error in one of my material names. I still don't know what was wrong with the name I had (maybe a space or an invalid character), but just changing the name resulted in different materials being applied to the geometry (in the upper image above, all the context massing had misapplied glass).

The importRad component was the key to tracking it down, because everything going into Radiance looked OK. However, the honeybee_createHBSrf component's "readMe!" output doesn't seem to work, which is where I could potentially have caught the problem (maybe?).

Hi Will, I checked your file again and one common mistake that I can see is creating a couple of different materials with the same name. In that case one will overwrite the others. You need to pick a unique name for each material.

My guess is that, because of the order of the surfaces, the assignment has been reversed in one of the cases. You can also check the material assigned to each surface from importRAD. The branching of materials and surfaces should match.


Great. Good to know. Thanks for looking into that.

Also, can you look into the honeybee_createHBSrf component's "readMe!" output? What is supposed to be coming out of that?

Hi Will, Sure thing. Back to createHBSrf: unless you are making a mistake, there is nothing to read there. Maybe we should also add a simple report for success? Do you think that would make the process easier to understand?

Hi Will,

I share your desire for more readMe! outputs on the _createHBSrf component. I added a few that let you know, among other things, what material has been assigned. I'm not too sure this would have helped much in your case, because both of your materials had the same name, but it still gives you confirmation of what is assigned.


I think a report for success is helpful for learning what the components are doing, because the HBSurface output doesn't seem to expose everything the component does. In Grasshopper you're trained to ask yourself, "OK, what did I put into this component, and what am I getting out of it?" In this case I'm only able to read geometry, without seeing the materials assigned… or anything else. Is the intention of the readMe! output to replace the balloon on native components?

So, second question: why am I not allowed to label geometry with the same material name in different createHBSrf components? In terms of what you expect from Grasshopper, I'm shoving a bunch of different geometry into the component and giving it all the same material, so why is there a conflict? My Grasshopper instinct would lead me to believe that a duplicate component does duplicate actions, and if I just merge it into the same data tree downstream, it should all be fine. Does createHBSrf create some sort of group when materials are assigned, which can't be given the same attributes as another group… or something along those lines? Anyway… it threw me for a loop.

Another question. I've been evaluating the model in section, like you see above, and when the surface is facing one direction it produces different results than when it is facing the other direction. The surface itself is not part of the daylight model, so it shouldn't be shading anything. Do the sample points sample in a particular direction? I assumed they would be sampling for light spherically.

Hi Will,

I think I confused you about materials and constructions in HB. Let me try again: it is totally fine to apply the same material name to different breps in different components, as long as they refer to an identical material. What is problematic is to create different materials with the same name; each will be overwritten by the last one. Honeybee keeps track of materials by their names, so you can't have two different materials/constructions with the same name.
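A minimal sketch of why the last definition wins, assuming the material library is a name-keyed dictionary (the function and names here are illustrative, not Honeybee's actual API):

```python
# Illustrative: a name-keyed material library where a second definition
# registered under the same name silently replaces the first.
material_lib = {}

def add_material(name, definition):
    # No uniqueness check, so a duplicate name overwrites the earlier entry.
    material_lib[name] = definition

add_material("glass_60", {"type": "glass", "transmittance": 0.60})
add_material("glass_60", {"type": "glass", "transmittance": 0.30})  # same name!

print(material_lib["glass_60"]["transmittance"])  # 0.3 -- only the last survives
```

This is why two genuinely different materials need two different names, while reusing one name for one identical material is harmless.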

A ~5-10% difference between results in Radiance is just normal, especially if you haven't set the parameters high enough. How much of a difference are you getting? I suggest you check this presentation. It will clarify how Radiance works:…


Right, it makes sense that different material values with the same name would be in conflict.

Also, that document was very helpful in clarifying a few items.

I'm curious about the hemispheric sampling. If I'm interested in spherical sampling of a space, how would that work? I noticed that the results are highly dependent on the normal of the surface used to generate the sample points and mesh. This poses a problem when trying to get a good idea of how light behaves in an open space like an atrium, sampled in section (see images above).

Would you generate results for the two hemispheres and combine them additively?

Hi Will, Sorry for the late reply. I saw that you also posted this as a discussion today. I'll answer here and copy it over to the other thread.

If you want truly spherical sampling you need more than two normals: generate a number of random ray directions at each point and then average the results over the sphere. Also see Greg's answer to a similar question here:…
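A sketch of that random-direction idea using only the standard library (normalising a 3D Gaussian sample gives a uniform direction on the sphere); none of this is Honeybee API, just the geometry:

```python
import math
import random

def random_unit_vectors(n, seed=0):
    """Generate n directions uniformly distributed over the unit sphere."""
    rng = random.Random(seed)
    vecs = []
    for _ in range(n):
        # A normalised 3D Gaussian sample is uniform on the sphere.
        x, y, z = (rng.gauss(0.0, 1.0) for _ in range(3))
        length = math.sqrt(x * x + y * y + z * z)
        vecs.append((x / length, y / length, z / length))
    return vecs

# Each test point would be duplicated once per direction, the grid-based
# recipe run with those vectors, and the per-point results averaged to
# approximate a spherical sensor.
dirs = random_unit_vectors(32)
print(len(dirs))  # 32 candidate sensor directions per point
```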


I'm a bit confused. When I use a Rhino surface to generate a grid of test points, doesn't it run a hemispheric sampling automatically at every test point, with the hemisphere oriented along the surface normal?

I'm curious if I can just run the same mesh surface flipped, then combine the two additively (it's being measured in lux).
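Since illuminance in lux is additive, that combination is just a point-by-point sum of the two runs. A sketch in plain Python, assuming you have the two flat result lists in matching point order (the values are made up):

```python
# Hypothetical lux values for three test points, one list per run.
front_lux = [512.0, 348.5, 120.2]   # mesh normals as-is
back_lux  = [ 95.3, 210.0, 406.8]   # same mesh, normals flipped

# Sum point-by-point to approximate a spherical sensor at each point.
spherical_lux = [round(f + b, 1) for f, b in zip(front_lux, back_lux)]
print(spherical_lux)  # [607.3, 558.5, 527.0]
```

The caveat (per the discussion above) is that two opposing hemispheres are still a coarse stand-in for true spherical sampling.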

Greg's answer was way over my head. The Radiance syntax isn't familiar to me, not to mention my lack of instinct for radial geometry (I'm an architect… I like pictures).

I have no idea what happened to my text! In short, what I mentioned was that you are right, but Greg's method suggests a higher sampling of the space. Pink over blue.

If I want to create a Honeybee routine that samples a space using this spherical sensor, how would I go about doing that?

If you're interested, here is my circumstance and how I'm currently gathering results.

Currently I'm modifying the pts vectors in the grid analysis to always look up (+Z vector), rather than normal to the test surface, in order to get a better result in section (otherwise they'd be shooting out into a wall). It still only samples as a hemisphere. The hemisphere makes total sense when evaluating the performance of a work surface or a room, but when evaluating an atrium in section, light comes from all angles, especially if the building opens up to the side rather than just from above.
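What that vector override amounts to, sketched in plain Python with hypothetical section points (not the component's real inputs):

```python
# Hypothetical test points sampled on a vertical section plane.
pts = [(0.0, 0.0, 1.5), (2.0, 0.0, 1.5), (4.0, 0.0, 1.5)]

# Replace every per-point surface normal with a straight-up vector, so the
# hemispheric sensor at each point always looks toward +Z.
up = (0.0, 0.0, 1.0)
vectors = [up for _ in pts]  # one +Z vector per test point

print(vectors[0])  # (0.0, 0.0, 1.0)
```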

In this image you can see the atrium has a bright spot partway down. This is from an opening in the building, and with hemispherical sampling you're not getting a complete picture of light behavior.

Hi Will. Good point. I didn't realize that your vectors were not facing +Z. As you mentioned, they should be. If you leave the vectors input empty, +Z vectors will be automatically assigned to the points.

Back to your concern: do you want to open a new discussion and send a simple example file so I can clearly see your workflow? I can think of several ways of getting what you need but want to make sure that I clearly understand your question.

  • Mostapha