VizThermalMap values issue

Hi,
Plotting the values from the output of the VizThermalMap component shows that all grid points are receiving the same value. The colors seem to be correct, but the values are not.
The image below shows this. I marked one specific room (the smaller one) in red to compare the plotted results against the calculated results from the comfort simulation (first column).


Any ideas?
Thanks,
-A.

Hey @AbrahamYezioro ,

Because I have not yet integrated surface view factors into the longwave MRT calculation, the current microclimate maps are using the room-averaged surface temperature (weighted by surface area) for every single point within a Room. This means that, for nighttime hours, you will get the same operative temperature for each and every point in a room. When the sun comes up and the shortwave solar begins to affect the MRT, you will see the variation that you expect, but the simplified longwave calculation is the reason why the operative temperature at midnight looks the way that it does.
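
In plain Python terms, the current simplification amounts to an area-weighted average of the room's surface temperatures, roughly like this (illustrative values only, not the plugin's actual code):

    def room_average_surface_temperature(surfaces):
        # surfaces: list of (area_m2, temperature_C) tuples for one room's surfaces.
        total_area = sum(area for area, _ in surfaces)
        return sum(area * temp for area, temp in surfaces) / total_area

    # Hypothetical room with three surfaces; at night, every sensor point in the
    # room currently gets this single value for its longwave term.
    print(room_average_surface_temperature([(20.0, 19.5), (12.0, 21.0), (8.0, 18.0)]))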

I know that the simple longwave assumption is a major limitation at the moment and this will change once I figure out how to get point-to-surface view factors out of Radiance. I have been educating myself about this by implementing the “Point-in-Time” Radiance recipes and I think it’s still feasible for me to implement the surface view factor calculation before the next stable release. In the meantime, it’s probably worth highlighting that the temperature variation you would see from view factors to different surfaces is MUCH less than that which you would see from shortwave solar. So I would still hold that the current LBT thermal maps are more accurate than Legacy because the shortwave calculation of the LBT plugin is greatly improved and accounts for reflected solar.


Hi @chris,
I still wonder whether your explanation covers why ALL zones are showing the same number even though the colors are different.
I believe the numbers are taken from the first zone on the list (checking the results CSV files). In short, I think there is a problem in the calculation of the values output of the VizThermalMap component.

Thanks a lot!
-A.

Ah, sorry that I did not understand. The values output from the component seem correct to me, but they come out flattened, so maybe you didn't flatten the list of points so that they align with the output values:


comfort_mapping_alignment.gh (71.8 KB)
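
In plain Python terms, the alignment issue looks roughly like this (a minimal sketch with made-up coordinates and values, not the component's actual data structures):

    # Sensor points come grouped per room, while the component's values are one
    # flat list for the whole model, so the points need to be flattened first.
    nested_points = [
        [(0, 0, 1), (1, 0, 1)],             # sensor points of room A
        [(5, 0, 1), (6, 0, 1), (7, 0, 1)],  # sensor points of room B
    ]
    values = [20.1, 20.4, 21.0, 21.2, 21.5]  # one flat list of results

    flat_points = [pt for room in nested_points for pt in room]
    assert len(flat_points) == len(values)
    pairs = list(zip(flat_points, values))  # point i now aligns with value i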

Let me know if there is something else I am missing.

This is absolutely correct. Everything is fine now, and I am looking forward to the implementation of the view factors.
Thanks @chris.
-A.

Hi @chris,
One more question, not directly related to the topic.
I want to check the IDF created and used for the comfort simulation. I noticed that it is deleted (?) from the simulation folder.
To overcome this, I'm just writing the IDF with the ModelToOSM component.
Is this the way to go, or do you have a better option?

Thanks,
-A.

Hi @chris,

I know that the simple longwave assumption is a major limitation at the moment and this will change once I figure out how to get point-to-surface view factors out of Radiance.

I’d be interested in helping test out that point-to-surface view factor method with Radiance if you’ve got one working. If not, I’ve got something similar built using vwrays and rtrace, counting the number of ray intersections with geometry to return a dictionary of {geometry_id: view_proportion}. Let me know if it might be useful.

Hi @chris,
Questions are piling up here … :slight_smile:
I have one more for you. It is related to the use of blinds/shades defined in the E+ model and how they are interpreted in the comfort model calculations.
I suspect that, since they are not modeled with “real” geometry, they are not taken into account. The reason for my suspicion is that I'm getting weird results, especially in summer. In the operative temperature results, I can see sun patches on the floor at times when I defined the blinds to be closed. Not to mention that the comfort results are “not good” - zero comfort for the whole period I tested (an extreme hot week) in unconditioned spaces.
I’ll be glad to hear your view on this one.

Thanks,
-A.

All good questions here and they are somewhat related to one another so I give everyone a pass for not opening new topics.

You are correct that the recipe only keeps the files that it needs in future steps. So the IDF and OSM are currently deleted after they are created inside the temp folders out of which the energy simulation runs. Only the eplusout.sql is copied to the simulation folder since that is all that is needed to provide the air temperature/humidity and surface temperatures for the comfort mapping step. If you think it would be really helpful for debugging, I’m open to editing the recipe to have it copy one of the two energy simulation input files (either OSM or IDF) into the simulation folder before it deletes them. But I would like to avoid copying both since they contain very similar information and they can take up a lot of disk space if we always keep them for every comfort mapping simulation. Am I correct in assuming that you would rather have the IDF copied instead of the OSM? Also, using the “Model To OSM” component should give you an equivalent IDF and OSM so your current practice is good.

We currently don’t have support yet for dynamic modifiers in honeybee-radiance. However, @mostapha is working on it and we at least know how dynamic objects should be representing in HBJSON format and in the radiance folder structure. Still, this means that dynamic window constructions (HB Window Construction Shade) are currently accounted for in the energy simulation (longwave MRT and air temperature) but not in the shortwave Radiance simulation of the thermal maps. We are planning to add support for dynamic Radiance modifiers in the next few months and you should see it in release notes when we finally get there.

Lastly, thank you for the offer of help, @tg359. Your method sounds like it will be useful for certain cases, but I am currently leaning towards using rcontrib to help compute the view factor for each surface geometry, particularly because Greg just removed the restriction of 9999 as the maximum number of rcontrib modifiers. So we can now use it for large energy models with more than 9999 surfaces. I'll definitely let both of you know once I have something to test.

Hi @chris,
Thanks for your prompt answers. They are good for me.

Right now i’m using the Model to OSM as you said. Glad it is a good practice. If it is not much trouble, i think keeping the idf would be nice.
As for the Window Construction Shade your answer is exactly what i was hoping for. Of course it would be nice to have now but it is ok that it will be. I was just wondering whether you are going to consider the scheduled shading materials for the Radiance part of the simulation. Now that i know you are, i can be more comfortable.
Thanks again,
-A

Ok, I opened an issue about copying the IDF file to the simulation folder and I’ll try to address it when I get the chance:

And you can definitely rest assured that dynamic constructions are on our agenda. We also have plans to support more than 2 construction states (for both energy and Radiance), which should come to fruition in the next few months.


Hey @AbrahamYezioro ,

I am happy to say that I just pushed an updated version of all three comfort-mapping recipes to the development version of the plugin. They have a load of improvements including the following:

  1. There is now a fully-detailed longwave temperature calculation. It uses Radiance’s rcontrib function to compute view factors to surfaces and then uses EnergyPlus surface temperature results to get longwave temperatures at each sensor (see the sketch after this list). All indoor shades (e.g. those representing furniture) are assumed to be at the room-average MRT. For sensors outside of any Room, the same rcontrib function is used to compute sky view in order to account for longwave radiant exchange with the sky. All outdoor context shades are assumed to be at the EPW air temperature unless they have been modeled as Honeybee rooms.
  2. For sensors that are close to AirBoundaries (within 2 meters of them), the air temperatures will be interpolated across the AirBoundary between the two adjacent rooms. This results in a more realistic depiction of air temperature, especially in passive buildings where the temperature of one room can be very different from that of another.
  3. The parallelization of the simulation has been greatly improved and all of the sensors in a model are now grouped evenly for both the Radiance calculations and the ultimate running of results through the comfort model.
  4. Some extra unnecessary steps were removed from the Radiance shortwave calculation. Notably, the evaluation of ground-reflected shortwave solar was simplified so that we don’t try tracing a ray directly from a downward-pointing sensor to the sun position, which is invariably going to be zero for all sun-up hours. This means that, overall, the simulation time has not changed much, since the run time of the new longwave calculation has been offset by the increases in efficiency and parallelization.
  5. The simulation can now be run with wintertime run_periods. Previously, the recipe didn’t let you use a run_period like “Dec 1 to Jan 31”.
  6. The IDF is stored in the simulation folder after the run. You will be able to find it here so that you can check that the assumptions are as you expect:
C:\Users\[USERNAME]\simulation\[MODEL_NAME]\pmv_comfort_map\energy\in.idf
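
To give a rough idea of the new longwave calculation in point 1, the combination of view factors and EnergyPlus surface temperatures amounts to something like the following view-factor-weighted average (an illustrative sketch with made-up values; the actual recipe also handles the shade and sky cases described above and may average the radiant temperatures differently):

    def longwave_temperature(view_factors, surface_temps):
        # view_factors: {surface_id: fraction of the sensor's view} (sums to ~1).
        # surface_temps: {surface_id: EnergyPlus surface temperature in C for the hour}.
        return sum(f * surface_temps[srf] for srf, f in view_factors.items())

    # Hypothetical sensor that mostly sees a warm window and the south wall:
    vf = {"south_wall": 0.35, "window": 0.25, "floor": 0.20, "ceiling": 0.20}
    temps = {"south_wall": 22.0, "window": 27.0, "floor": 21.0, "ceiling": 22.5}
    print(longwave_temperature(vf, temps))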

I think this addresses everything that you brought up here except for dynamic objects. At the moment, both Shade transmittance schedules and dynamic window constructions are modeled as static in the Radiance shortwave solar calculation. I know how I should be simulating them and I’ll try to implement them before the stable release this month, but I don’t know if I’ll be able to get it together in time. If not, you can expect that they will be implemented in early January.


Thank you @chris!!
This is excellent and beyond.

Can you point me to the part of the source code where these view factors are being calculated? I just need it for another purpose … I want to see if this can be used instead of the raytrace (ray shoot) that GH is using. If you know the answer to this as well, I will appreciate your view.

Thanks again,
-A.

Hey @AbrahamYezioro ,

Here are the docs for the two commands that are being run in the recipe to produce the CSV of surface view factors:

https://www.ladybug.tools/honeybee-radiance/docs/cli/viewfactor.html

You have to run this command first:

honeybee-radiance view-factor modifiers [MODEL_FILE]

… then you can use the outputs of that command along with a .pts file as input for this one:

honeybee-radiance view-factor contrib [OCTREE] [SENSOR_GRID] [MODIFIERS]

… and that spits out a CSV of spherical view factors.

The source code for these two commands can be found here:

If you wanted to perfectly mimic the “Ray Shoot” functionality of Rhino using Radiance, I think you can remove the -I option for the rcontrib function, which you will see being set under the hood of the command. And you can drop the -ad to a really low number so that the calculation runs fast. Then, you’ll just have to make sure that the .pts file you are feeding into rcontrib has directions representing the rays that you want to shoot. The output matrix from rcontrib will then tell you which of the input surface [MODIFIERS] you hit with each ray in the .pts file. This can all be simplified and streamlined further if you care about spherical or hemispherical view factors instead of whether a specific ray hit something. You’ll see these simplifications in the command source code.
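
As a rough illustration of reading that output matrix (the layout of one row per ray and one column per modifier, in the order of the modifiers file, is an assumption here, as are the file names):

    import csv

    def rays_to_hits(matrix_csv, modifiers_txt):
        # Read the modifier names in the same order they were fed to rcontrib.
        with open(modifiers_txt) as f:
            modifiers = [line.strip() for line in f if line.strip()]
        hits = []
        with open(matrix_csv) as f:
            for row in csv.reader(f):
                if not row:
                    continue
                values = [float(v) for v in row]
                best = max(range(len(values)), key=lambda i: values[i])
                # A ray that hit nothing has all-zero contributions.
                hits.append(modifiers[best] if values[best] > 0 else None)
        return hits

    # Hypothetical usage with the files produced by the commands above:
    # print(rays_to_hits("view_factor.csv", "scene.mod"))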


This is great @chris!!

Thanks a lot. Very useful.
Will report (or ask) if needed in a different thread.

-A.