I am a complete beginner with Ladybug Tools. I am evaluating daylight performance (illuminance levels) in a deep office space using both grid-based and image-based analyses. Beyond that, I would also like to evaluate human-centric lighting design performance. At this point I have run into a problem: I cannot calculate the correlated color temperature (CCT) of the space.
Am I missing a component, or should I look for a different solution to find the color temperature on the Kelvin scale?
You will not get anything even remotely in the vicinity of a physically valid result from the conventional three-channel HDR images generated through Radiance (which is what Honeybee uses). One needs to be able to derive the x,y chromaticity coordinates of a direct or reflected light source to calculate its CCT.
The derivation of x,y chromaticity coordinates, in turn, requires the spectral power distribution across the visible spectrum, preferably at 5 nm (or finer) intervals. So, considering that the visible spectrum spans roughly 380 to 760 nm, one would need at least 77 spectral power values (at 5 nm intervals). What we get from a standard HDR image, by contrast, is just three channels.
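For reference, once valid x,y chromaticity coordinates are in hand, the final step to CCT is straightforward: McCamy's cubic approximation is the standard quick formula. A minimal Python sketch (this only illustrates that last step; it still assumes you already have physically meaningful x,y values, which a three-channel HDR cannot give you):

```python
# McCamy's (1992) cubic approximation for correlated color temperature,
# given CIE 1931 (x, y) chromaticity coordinates. Valid roughly in the
# 2000-12500 K range; NOT a substitute for a proper spectral simulation.

def cct_from_xy(x: float, y: float) -> float:
    """Approximate CCT in Kelvin from CIE 1931 chromaticity (x, y)."""
    # n is the inverse slope of the line from (x, y) toward the
    # "epicenter" at (0.3320, 0.1858) used by McCamy's fit.
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Sanity checks against well-known white points:
print(round(cct_from_xy(0.3127, 0.3290)))  # CIE D65 -> ~6505 K
print(round(cct_from_xy(0.4476, 0.4074)))  # CIE Illuminant A -> ~2856 K
```

The hard part remains upstream: getting from a measured or simulated spectral power distribution to X, Y, Z (via the CIE color matching functions) and then to x, y — which is exactly what a three-channel rendering cannot do reliably.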
The closest anyone claims to come to a proper spectral calculation for such purposes is the folks at Alfa. However, it is paid software, and their calculation methodology, as far as I know, is not openly published.
It’s been a while since I looked into the standardization efforts around circadian lighting. The last I checked, many of the metrics/methodologies being promoted by WELL had still not been ratified by scientific organizations like the CIE or IES.