HB PMV Comfort Map - Plotting Results on Vertical Plane - Sensor Grid Direction Question

Hi All / @chris -

I have a case where I’d like to use the HB PMV Comfort Map component to plot operative temperature and PMV results on a vertical plane.

I’ve created a plane that bisects the two zones, then generated points on that plane to use as input for a sensor grid.

Question 1:
Is there an HB component that can generate a vertical sensor grid from HB Rooms, e.g., a sensor grid offset from a wall? I see HB Sensor Grid from Rooms, but that creates a grid offset from the floor.

Question 2: Does sensor direction impact PMV results?
Sensor grid direction makes sense for a horizontal plane (sensors face up), but the appropriate direction is less obvious to me for a vertical plane in the middle of a room. Currently I have the ptsVectors pointing in the positive X direction (1, 0, 0), but I’m curious whether that sensor direction limits radiant interaction on the back side of the plane (negative X direction).

I’ve tried messing with the direction and it seems to change the results very slightly, but I’m looking for a definitive answer.

Sample Results (screenshots): Cold Thermal Sensation (%) and Operative Temperature (°F).

Good questions, @victorbrac. I can answer them both:

There’s no component for “Sensor Grid from Walls” right now, but you can make a sensor grid from virtually any geometry by using the LB Generate Point Grid component along with the HB Sensor Grid component. It looks like this is probably what you are doing in your screenshots. If there’s enough demand for it, I can eventually add components that generate sensor grids from Honeybee Faces and/or Apertures.
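In case it helps anyone scripting this outside of Grasshopper, below is a minimal sketch of the same idea using the underlying Python libraries. It assumes `my_room` is an existing honeybee `Room`, and the grid size, offset, and `flip` behavior are assumptions to check against your own model:

```python
# Sketch: build a sensor grid offset from a Room's wall Faces using the
# honeybee / ladybug-geometry Python libraries (not a GH component).
from honeybee.facetype import Wall
from honeybee_radiance.sensorgrid import SensorGrid
from ladybug_geometry.geometry3d.mesh import Mesh3D


def wall_sensor_grid(room, grid_size=0.5, offset=0.1):
    """Create a SensorGrid offset from all of a Room's wall Faces."""
    meshes = []
    for face in room.faces:
        if isinstance(face.type, Wall):
            # mesh_grid() subdivides the Face3D; flip=True is used here on the
            # assumption that wall normals point out of the room, so the grid
            # is offset toward the inside
            meshes.append(face.geometry.mesh_grid(grid_size, offset=offset, flip=True))
    if not meshes:
        return None
    joined_mesh = Mesh3D.join_meshes(meshes)
    # sensors are taken from the mesh face centroids and normals
    return SensorGrid.from_mesh3d('{}_walls'.format(room.identifier), joined_mesh)


# usage with a placeholder Room object:
# grid = wall_sensor_grid(my_room, grid_size=0.5, offset=0.1)
```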

No, it does not, at least not for any of the comfort mapping recipes (PMV, UTCI, Adaptive). The comfort mapping recipes attempt to show the temperature that a human would experience when positioned at each sensor point and, in order to do this, they automatically overwrite the direction of the sensor and evaluate both sides of the sensor point (since the whole human body “sees” the thermal conditions of the surrounding environment).

This is different from the HB-Radiance recipes, which evaluate some metric (e.g., irradiance or daylight autonomy) on a surface. For those cases, the direction of the sensor matters since it denotes the direction of the surface. Those recipes are also the reason why sensors natively include both a position and a direction. I debated developing a new object for the comfort mapping recipes that only involves a position without a direction, but it just seemed too convenient to build off of everything that we had already built with the HB-Radiance sensor grids. I can put something into the description of the comfort mapping recipes to note that the sensor directions do not matter.
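To make that concrete, here is a minimal sketch (the point coordinates and identifiers are made up) showing that, for the comfort maps, whatever normal you assign to a vertical grid is just a placeholder:

```python
# Sketch: the normal assigned to a vertical grid of points is arbitrary for
# the comfort mapping recipes, since they overwrite sensor directions and
# evaluate both sides of each point.
from honeybee_radiance.sensorgrid import SensorGrid

# example positions on a vertical plane (placeholder values)
pts = [(5.0, 2.0, 0.5 * i) for i in range(1, 7)]

# (1, 0, 0) vs. (-1, 0, 0) makes no difference to the PMV/UTCI/Adaptive results
grid = SensorGrid.from_planar_positions('vertical_plane', pts, (1, 0, 0))
grid_flipped = SensorGrid.from_planar_positions('vertical_plane', pts, (-1, 0, 0))
```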

On another note about your sample results: if they represent a double-height space, then I would make sure there is an AirBoundary between the two Rooms. Other than that, the results look like a good representation of a 2-story space.
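If it turns out that the two Rooms really are open to one another, here is a minimal sketch of how that air boundary could be set up in Python (`lower_room` and `upper_room` are placeholders for existing honeybee Rooms, and the tolerance is an assumed value in model units):

```python
# Sketch: turn the matched floor/ceiling pair between two stacked Rooms into
# AirBoundary faces so they behave as one connected space.
from honeybee.room import Room
from honeybee.facetype import face_types

# pair up the interior faces between the two rooms
adj_info = Room.solve_adjacency([lower_room, upper_room], tolerance=0.01)

# assign the AirBoundary face type to every matched face pair
for face_1, face_2 in adj_info['adjacent_faces']:
    face_1.type = face_types.air_boundary
    face_2.type = face_types.air_boundary
```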


This is exactly what I did. :slight_smile:
I intersected the room geometry with the plane and created the grid from the intersection surface. It took me a second to find the new LB Generate Point Grid component because it’s under Ladybug rather than HB-Radiance, which seemed out of place to me.

Thanks for the detailed answer; this is exactly what I was hoping to hear. Because a vertical grid couldn’t be generated through the native components, I wasn’t sure whether it was fully supported in the PMV recipe. Since direction is not important in this case, there’s no need to pass vectors to the directions input.

Thanks for the note and for watching out for analysis blunders. In this case, the study focused on the upper room, which has an exposed floor (overhang) but is not open to the room below. The boundary condition is approximately adiabatic between the two rooms where they meet. I’m just showing the lower-level space for context/visualization purposes.
