Comfort Mannequin - Solar adjustment MRT: performance information

#1

I’m trying to estimate the change in mean radiant temperature for a reference surface (800 x 500 m), with the building stock as context shading. The parallel input of the SolarAdjustTemperature component is set to True. I have two questions:

  1. Why does the CPU show the trend displayed in the figure below?
  2. Why is memory usage only 43%?
#2

If I try the LB view analysis component, the performance is very different, with both constant CPU and higher memory usage:
[image]

#3

@Francesco661 ,
I’d recommend plugging the directNormalRadiation into the solar temperature adjustor, as this will be much faster and will give you pretty comparable results (the SolarCal method has been pretty close to the detailed mannequin in my experience).
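For reference, the SolarCal approach estimates the shortwave MRT delta from an effective radiant field (ERF) rather than ray-tracing a body geometry. A minimal sketch of that relationship is below; the coefficient values follow the ASHRAE-55 SolarCal formulation and are assumptions here, not a transcript of Ladybug's implementation:

```python
# Minimal sketch of the SolarCal relationship between the effective
# radiant field (ERF, W/m2) and the shortwave MRT delta (K).
# f_eff (fraction of the body exposed to radiant exchange) and h_r
# (radiant heat transfer coefficient, W/m2-K) are assumed values from
# the ASHRAE-55 SolarCal formulation, not Ladybug's exact defaults.

def mrt_delta_from_erf(erf, f_eff=0.725, h_r=6.012):
    """MRT delta (K) from ERF, for a standing person by default."""
    return erf / (f_eff * h_r)

# Example: an ERF of 100 W/m2 on a standing person.
delta = mrt_delta_from_erf(100.0)  # ~22.9 K
```

Because this is a closed-form expression per hour, it avoids the per-timestep radiation simulation that makes the detailed mannequin slow.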

The reason why the CPU is cycling when you use a detailed mannequin geometry is that it’s running a radiation simulation for each timestep of the period you have requested. Really, the mannequin is made more for point-in-time studies where you are trying to understand what parts of the person are causing the elevated MRT.
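As a back-of-the-envelope illustration of why this matters (the per-simulation time below is a hypothetical number, not a measurement), the mannequin's cost scales linearly with the number of analysis hours:

```python
# Hypothetical illustration: the detailed mannequin runs one radiation
# simulation per timestep, so total runtime scales with the number of
# analysis hours. The 30 s per-simulation cost is an assumed figure.

def mannequin_runtime_hours(analysis_hours, seconds_per_simulation):
    """Estimate total runtime (in hours) for a mannequin study."""
    return analysis_hours * seconds_per_simulation / 3600.0

# A point-in-time study runs a single simulation ...
point_in_time = mannequin_runtime_hours(1, 30)    # well under a minute
# ... while a full week (168 hours) runs 168 of them.
hot_week = mannequin_runtime_hours(168, 30)       # 1.4 hours
```

This is why the component suits point-in-time studies: the cost of one simulation is acceptable, but multiplying it by every hour of a period adds up quickly.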

#4

@chris,
first of all thank you for your reply. Here below my answers point by point:

I’d recommend plugging the directNormalRadiation into the solar temperature adjustor, as this will be much faster and will give you pretty comparable results (the SolarCal method has been pretty close to the detailed mannequin in my experience).

I have used the directNormalRadiation as an input in the solar temperature adjustor.

The reason why the CPU is cycling when you use a detailed mannequin geometry is that it’s running a radiation simulation for each timestep of the period you have requested. Really, the mannequin is made more for point-in-time studies where you are trying to understand what parts of the person are causing the elevated MRT.

I have used this component to get the spatial solarAdjustedMRT (SAMRT) for the hottest week taken from an .epw file, in order to then calculate the outdoor UTCI of a district of about 1000 x 750 meters with a resolution of 5 meters:


It took two days: the very long time to calculate the SAMRT doesn’t depend on the performance of the PC. The result, without considering the effect of trees, based on the average values over the considered period, is reported below:
[image]
Then I used the same extreme hot week on a smaller area (the yellow area in the first image), with a resolution of 1 meter and the tree effect included, and the performance is very different: more memory used and constant CPU usage.
The problem is that the LB Separate Data component can’t handle this amount of data:

Maybe I can solve this issue by using a lower resolution (2 meters instead of 1 meter) or a single day instead of the complete extreme hot week. Is that correct?
One more consideration: it would be great if there were a way to improve performance over a wide area at a good resolution!
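To put rough numbers on these trade-offs (simple arithmetic, not a measurement of Ladybug's actual memory use, and using the district's dimensions only as an assumed stand-in for the smaller area), the data volume grows with grid points times hours, so halving the resolution cuts the point count by four and a single day cuts the hours by seven:

```python
# Rough data-volume arithmetic (illustrative only; actual Ladybug memory
# use will differ). Grid points scale with 1/resolution^2, and the total
# number of hourly values scales with points * hours.

def grid_points(width_m, height_m, resolution_m):
    """Number of analysis points in a rectangular grid."""
    return int(width_m / resolution_m) * int(height_m / resolution_m)

def data_values(width_m, height_m, resolution_m, hours):
    """Total hourly values produced over the analysis period."""
    return grid_points(width_m, height_m, resolution_m) * hours

week = 7 * 24  # 168 hours in the extreme hot week

district_5m = data_values(1000, 750, 5, week)  # 30,000 pts -> 5.04M values
area_1m = data_values(1000, 750, 1, week)      # 750,000 pts -> 126M values
area_2m = data_values(1000, 750, 2, week)      # 187,500 pts -> 31.5M values
day_1m = data_values(1000, 750, 1, 24)         # 750,000 pts -> 18M values
```

So a 2-meter resolution reduces the data volume about 4x and a single day about 7x; either change on its own already brings the totals far closer to the 5-meter run that completed.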

#5

@Francesco661 ,
Thanks for the explanation, and it’s good to know that you are already using direct normal radiation. In that case, my previous explanation for why the CPU is cycling is likely wrong. It probably has to do with the component running one point in your grid at a time: it runs all of the hours for that point in parallel, then needs some time to move to the next point before ramping up the parallel calculation again.

Currently, the best I can suggest is to break up your analysis into smaller pieces. Using Honeybee microclimate maps instead of the solar adjusted temperature component to build the outdoor map might also give you some efficiency, but it comes with its own computer memory issues (and a massive building energy simulation in your case).
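One straightforward way to break the analysis up is to split the analysis period into chunks and run the workflow once per chunk, merging the results afterwards. A generic sketch, assuming hour-of-year bounds (the example hour range is illustrative, and nothing here is a Ladybug API):

```python
# Generic sketch of chunking a long analysis period into smaller runs.
# The chunk bounds would feed the analysis period of each separate run;
# the merge step depends on your workflow and is not shown.

def chunk_hours(start_hoy, end_hoy, chunk_size):
    """Split an hour-of-year range into (start, end) chunks."""
    chunks = []
    h = start_hoy
    while h < end_hoy:
        chunks.append((h, min(h + chunk_size, end_hoy)))
        h += chunk_size
    return chunks

# Example: a 168-hour hot week (hypothetical hours 4368-4536 of the
# year) split into daily runs.
daily_runs = chunk_hours(4368, 4536, 24)
print(len(daily_runs))  # 7 runs of 24 hours each
```

The same idea applies spatially: the grid can be split into tiles and each tile run separately, at the cost of some bookkeeping when stitching the maps back together.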

The reality is that this scaling issue is tough to solve, given that legacy Ladybug is tied to the Rhino desktop application, which limits the amount of memory you can pull from. So we won’t really have the best solution until the microclimate maps make it into Honeybee[+], which will take a year or more. For the time being, the best I can offer is to break up the analysis into a few runs.

#6

@chris I have another question for you: in the “Outdoor Solar Temperature Adjustor” component, it is only possible to define a single value for the ground reflectivity. I have read this post. I can’t see the model because the posted link no longer exists, but I want to know whether there is a way to account for both the context shading effect of building geometry and trees and the reflectance (or reflectivity) of vertical surfaces. For this purpose, could I use the Dragonfly component and the related UWG to define the urban properties and to revise/modify the usual airport weather data to reflect the climate of the urban area? It seems like a good approach. What do you think?

#7

FWIW, when existing posts from the grasshopper3d.com site were ported to this Discourse forum, the links in those old posts got somewhat messed up. Just look for the topic in the old forum, and the links there should still work.

The original location for this particular thread is here:

#8

I just wanted to note that I posted new recommended workflows for this case that use the [+] plugins:

The new workflows should address a lot of the shortcomings of the old one referenced here.