Question 1:
Is there an HB component that can generate a vertical sensor grid from HB rooms, e.g., a sensor grid offset from a wall? I see HB Sensor Grid From Rooms, but that creates a grid offset from the floor.
Question 2: Does sensor direction impact PMV results?
Sensor direction is intuitive with a horizontal plane (have the sensors face up), but the appropriate direction is less obvious to me with a vertical plane in the middle of a room. Currently I have ptsVectors pointing in the positive X-axis (1, 0, 0) direction, but I am curious whether that sensor direction limits radiant interaction on the back side of the plane (negative X-axis).
I’ve tried messing with the direction and it seems to change the results very slightly, but I’m looking for a definitive answer.
Good questions, @victorbrac. I can answer them both now:
There’s no component for “Sensor Grid from Walls” right now but you can make a sensor grid from virtually any geometry by using the LB Generate Point Grid component along with the HB Sensor Grid component. It looks like this is probably what you are doing in your screenshots. If there’s enough demand for it, I can eventually add components that generate sensor grids from Honeybee Faces and/or Apertures.
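In case it helps, the same thing can be scripted directly with the ladybug-geometry and honeybee-radiance Python SDKs. This is only a rough sketch of roughly what those two components do under the hood; the wall coordinates, grid dimensions and offset below are placeholder values.

```python
from ladybug_geometry.geometry3d.pointvector import Point3D
from ladybug_geometry.geometry3d.face import Face3D
from honeybee_radiance.sensorgrid import SensorGrid

# a 5m x 3m vertical wall in the XZ plane (placeholder geometry)
wall = Face3D([
    Point3D(0, 0, 0), Point3D(5, 0, 0),
    Point3D(5, 0, 3), Point3D(0, 0, 3)
])

# gridded mesh with 0.5m cells, offset 0.1m from the wall along its normal
grid_mesh = wall.mesh_grid(0.5, 0.5, offset=0.1)

# sensor positions/directions come from the mesh face centers and normals
sensor_grid = SensorGrid.from_mesh3d('wall_grid', grid_mesh)
print(len(sensor_grid.sensors))  # number of sensors generated on the wall
```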
No, it does not. At least not for any of the comfort mapping recipes (PMV, UTCI, Adaptive). The comfort mapping recipes attempt to show the temperature that a human would experience when positioned at each sensor point and, in order to do this, they automatically overwrite the direction of the sensor and ensure that we evaluate both sides of the sensor point (since the whole human body “sees” the thermal conditions of the surrounding environment).
This is different from all of the HB-Radiance recipes that evaluate some metric (e.g., irradiance or daylight autonomy) on a surface. For those cases, the direction of the sensor matters since it denotes the direction of the surface. These HB-Radiance recipes are also the reason why sensors natively include both a position and a direction. I debated developing some new object for the comfort mapping recipes that only involved the position without the direction, but it was simply more convenient to build off of everything we had already built with the HB-Radiance sensor grids. I can put something into the description of the comfort mapping recipes to note that the sensor directions do not matter.
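To make that concrete, here is a small sketch (the positions and vectors are made up): two grids built from the same points but with opposite direction vectors. The comfort mapping recipes overwrite the directions, so both grids give the same PMV/UTCI results, whereas an irradiance or daylight recipe would treat them as two different surfaces.

```python
from honeybee_radiance.sensorgrid import SensorGrid

# three points along a vertical line in the middle of a room (made-up values)
positions = [(2.0, 2.0, z) for z in (0.5, 1.0, 1.5)]

# same positions, opposite direction vectors
grid_pos_x = SensorGrid.from_position_and_direction(
    'grid_pos_x', positions, [(1, 0, 0)] * len(positions))
grid_neg_x = SensorGrid.from_position_and_direction(
    'grid_neg_x', positions, [(-1, 0, 0)] * len(positions))

# for PMV/UTCI/Adaptive maps the two grids yield identical results;
# for irradiance or daylight recipes they would not
```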
On another note about your sample results, if they represent a double-height space, then I would make sure there is an AirBoundary between the two Rooms. Otherwise, the results look like a good representation of a 2-story space.
This is exactly what I did.
I intersected the room geometry with a plane and created the grid from the intersection plane. It took me a second to find the new LB Generate Point Grid component because it’s under Ladybug rather than HB-Radiance, which seemed out of place to me.
Thanks for the detailed answer. This is exactly what I was hoping to hear. Because a vertical grid could not be generated through native components, I was not sure whether it was fully supported in the PMV recipe. In this case, the direction is not important, so there is no need to pass vectors to the directions input.
Thanks for the note and for watching out for analysis blunders. In this case, the study was focused on the upper room, which has an exposed floor (overhang), but it is not open to the room below. The boundary condition is approximately adiabatic between the two rooms where they meet. I’m just showing the lower-level space for context/visualization purposes.
I am working on a model where I aim to analyse Point-In-Time lighting conditions and have a couple of questions.
Question 1 - Model Preparation:
I’m encountering challenges defining the geometry for the “LB Generate Point Grid” component. While the “HB Visualize by Type” output looks accurate, the floors appear fragmented by the walls. I found a quick workaround by creating a plane in Rhino and assigning that geometry, but I wonder if there’s a more programmatic solution to streamline this process.
To address this issue temporarily, I divided the model by floors so that I could select the specific floor to analyse. However, my ideal workflow would involve analysing the entire model as a single entity. Could you suggest an approach to achieve this?
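For context, one scripted route I have been considering (just a sketch based on honeybee’s Room.generate_grid; the file path, grid dimensions and offset are placeholders, and I have not verified that this is the intended approach) would loop over all the rooms and generate a floor grid per room, so the whole model could be handled at once without pre-splitting it in Rhino:

```python
from honeybee.model import Model
from honeybee_radiance.sensorgrid import SensorGrid

model = Model.from_hbjson('model.hbjson')  # placeholder path

grids = []
for room in model.rooms:
    # gridded mesh offset 0.76m above the floor faces of each room
    floor_mesh = room.generate_grid(0.5, 0.5, offset=0.76)
    if floor_mesh is not None:  # skip rooms that yield no floor grid
        grids.append(SensorGrid.from_mesh3d(room.identifier, floor_mesh))

# attach the grids to the model so a single recipe can run on the whole model
model.properties.radiance.add_sensor_grids(grids)
```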
Question 2 - Sun Impact Evaluation:
I’m trying to evaluate the sun’s impact on a vertical surface, but I’ve encountered a similar issue to the one above. The surface appears divided by the wall behind it. When I attempt to use a single Rhino surface as geometry, the results still seem incorrect. Instead of a single, concentrically diffused light pattern, I see two separate patterns, one for each surface. Please refer to the attached image for clarification.